GCR News Summary June 2014

1 July 2014

Russia has lost contact with its only missile-detection satellite in geostationary orbit above the US. The satellite was supposed to operate until at least 2017, but began malfunctioning shortly after its launch in 2012. Russia still has two remaining missile-detection satellites in elliptical orbits around the planet, but they are reportedly able to monitor US missile activity for just three hours a day. Russia began using geostationary satellites for missile detection after the 1983 “Autumn Equinox Incident,” when satellites in elliptical orbits mistook sunlight glancing off high-altitude clouds for a US missile launch. Russia’s ability to monitor US missile activity has degraded since the end of the Cold War as Soviet-era missile-detection satellites have stopped working. The decline in Russia’s satellite capabilities may make it harder for Russia to distinguish between a missile strike and a non-threatening event. That may in turn make it more likely that Russia will mistakenly respond to a nuclear false alarm.

Russian president Vladimir Putin asked the Russian Parliament to rescind the resolution it passed in March authorizing the use of force in Ukraine. Putin also called for pro-Russian separatists in eastern Ukraine to extend their temporary ceasefire. The Ukrainian military accused the separatists of shooting down a military helicopter with nine people on board in spite of the ceasefire. Putin tentatively endorsed Ukrainian president Petro Poroshenko’s plan to allow greater autonomy to eastern Ukraine. Poroshenko’s plan would also grant amnesty to separatists who have not committed serious crimes.

Ukraine signed an association agreement with the European Union, in spite of Russia’s warning that signing the agreement would have “serious consequences”. European foreign ministers also banned the import of goods made in the breakaway province of Crimea, but didn’t impose any additional sanctions. “The current crisis in Ukraine, which many see as a result of a US-Russia geopolitical confrontation, underscores once again the urgent necessity for developing a new foreign policy agenda that will benefit both American and Russian long-term strategic interests,” said Edward Lozansky, President of the American University in Moscow. “US-Russia relations are at their lowest point since the collapse of the Soviet Union.”

The US Environmental Protection Agency (EPA) proposed new guidelines for lowering carbon emissions from power plants. The rule would require US states to reduce emissions from existing power plants by 25% from 2005 levels by 2020 and by 30% from 2005 levels by 2030. The EPA noted that power plants produce about one-third of all domestic greenhouse gas emissions in the US. Although the US government regulates the amount of arsenic, mercury, sulfur dioxide, nitrogen oxides, and particle pollution power plants can emit, there are currently no limits on carbon emissions from power plants. Michael Grunwald argued in Time that the proposed rules are not very ambitious considering that coal-generated electricity—which is responsible for most carbon emissions from power plants—is declining rapidly even without regulation.

The day after the EPA released its proposed new emissions guidelines, He Jiankun, the chair of China’s Advisory Committee on Climate Change, said that China would also limit its carbon dioxide emissions when it embarks on its next five-year plan in 2016. China and the US are the two biggest emitters of greenhouse gases, jointly accounting for about 42% of global emissions. It would be the first time China publicly committed to limiting emissions. University College London climate policy professor Michael Grubb said Chinese action could be “the most important turning point in the global scene on climate change for a decade.”

A World Bank report estimated that if the US, the EU, Brazil, China, India, and Mexico adopted several sets of environmental policies—on clean transportation, energy efficiency in industry, and energy efficiency in buildings—they could achieve 30% of the total emissions cuts needed to limit global warming to 2°C (3.6°F) by the year 2030. The World Bank report estimated that implementing these policies would also save 94,000 lives and add between $1.8 trillion and $2.6 trillion to global economic output annually by 2030.

In The New York Times, former US Treasury Secretary Hank Paulson compared climate change to the credit bubble that caused the financial crisis in 2008. Paulson said that by failing to address climate change, we are ignoring an imminent crisis “that poses enormous risks to both our environment and economy”. He called for taxing carbon emissions to create an incentive for people to develop cleaner energy technologies. “If there’s one thing I’ve learned throughout my work in finance, government and conservation,” Paulson said, “it is to act before problems become too big to manage.”

Morocco’s health minister advised its citizens not to go on pilgrimage to Saudi Arabia this year because of the danger of contracting Middle Eastern Respiratory Syndrome (MERS). Bangladesh announced that it would require its pilgrims to wear masks while they travel. The World Health Organization (WHO) said that the recent surge in new MERS cases has abated and that there is still no evidence of sustained transmission among humans. WHO also said that Saudi Arabia has “made significant efforts to strengthen infection prevention and control measures”. But WHO said the disease continues to be a concern, especially given the large number of pilgrims expected to travel to Saudi Arabia for the Hajj. WHO does not currently call for any restrictions on travel to Saudi Arabia, but does recommend that people with preexisting medical conditions consult a doctor before going to the country. Nature wrote in an editorial that international efforts to control MERS have been less effective than international efforts to control Severe Acute Respiratory Syndrome (SARS) because WHO’s outbreak-response division is underfunded and has failed to show real leadership. “The WHO, as an intergovernmental agency with a direct line to health ministries,” the editorial said, “remains best placed to bang heads together and get things done cooperatively, but its efforts must be well funded and staffed.”

The Ebola outbreak in West Africa has killed more people than any previous outbreak of the disease. Although the outbreak seemed to be slowing in late April, it now appears to be in a second wave. Liberian president Ellen Johnson-Sirleaf called the outbreak a “national emergency”. The US Centers for Disease Control (CDC) said that controlling the disease is difficult because of “its wide geographic spread, weak health infrastructures, and community mistrust and resistance”. Médecins Sans Frontières said that controlling the outbreak will require “a massive deployment of resources by governments in West Africa and aid organizations”. Ebola expert Tom Geisbert said that one of the reasons this outbreak may be harder to contain than previous outbreaks is that the road system in West Africa is better than in Central Africa, allowing the disease to spread more easily. Ebola has never spread widely outside of Africa, although there is concern that new mutations may eventually enable it to spread around the world.

As many as 75 workers at a CDC lab may have been exposed to anthrax when live samples of the bacteria were not properly inactivated before being transferred to a lower-containment-level lab. The lab was testing a new way of killing the bacteria with chemicals rather than with radiation. When no new colonies grew from the treated samples initially, researchers wrongly assumed that the bacteria were dead. So far no workers have shown symptoms of anthrax infections. Anthrax is not contagious in humans, but Center for Infectious Disease Research and Policy director Michael Osterholm said that the incident raises questions about whether any lab is secure enough for “gain-of-function” research that deliberately engineers pathogens to be more deadly in order to study them.

It was widely reported that a Russian program passed a version of the Turing Test by convincing one out of three human judges that it was a 13-year-old Ukrainian boy. In a seminal 1950 paper, computing pioneer Alan Turing argued that when computers can pass for humans in a written conversation they will effectively have human-level intelligence. But computers would have to be indistinguishable from humans—and not merely fool some judges—to pass the Turing Test in its original formulation. Robert Gonzalez and George Dvorsky pointed out that the winning program was specifically designed to convince judges it was a teenager with a poor grasp of English, rather than a person who could be expected to answer questions clearly and accurately. In a conversation the program had with computer scientist Scott Aaronson, the program simply changed the subject when asked questions like “how many legs does a camel have?” that most humans could answer easily. “A chatbot pretending to be a 13-year-old boy for whom English is a second language,” Pranav Dixit said, “ain’t exactly HAL 9000.”

Elon Musk—who helped found both SpaceX and Tesla Motors—told CNBC that the development of artificial intelligence (AI) could be dangerous. Musk said that we have to be careful, because AI could behave in unexpected ways. Musk said that he didn’t know what could stop the development of AI, but said that “there are some potentially scary outcomes, and we should try to make sure the outcomes are good, not bad.” And on John Oliver’s comedy show, physicist Stephen Hawking reiterated his concerns that AI could pose a threat to human survival. “I know you’re trying to get people to be cautious there, but why should I not be excited about fighting a robot?” Oliver asked. “You would lose,” Hawking said.

This news summary was put together in collaboration with Anthropocene. Thanks to Seth Baum, Kaitlin Butler, and Grant Wilson for help compiling the news.

For last month’s news summary, please see GCR News Summary May 2014.

You can help us compile future news posts by putting any GCR news you see in the comment thread of this blog post, or send it via email to Grant Wilson (grant [at] gcrinstitute.org).

Image credit: Ed Brown
