GCR News Summary January 2015

by | 2 February 2015

The Bulletin of the Atomic Scientists moved the minute hand of its “Doomsday Clock” two minutes closer to midnight. The symbolic clock is now at three minutes to midnight, indicating that the Bulletin believes the “probability of global catastrophe is very high”:

In 2015, unchecked climate change, global nuclear weapons modernizations, and outsized nuclear weapons arsenals pose extraordinary and undeniable threats to the continued existence of humanity, and world leaders have failed to act with the speed or on the scale required to protect citizens from potential catastrophe. These failures of political leadership endanger every person on Earth.

The Bulletin called on us to demand that our leaders cap greenhouse gas emissions, focus on nuclear disarmament, develop safe facilities to store nuclear waste, and build institutions to address the potentially catastrophic misuse of new technology. It is the closest the Doomsday Clock has been to midnight since relations between the US and the Soviet Union hit a low in 1984.

The World Economic Forum’s 2015 Global Risks Report found that the most likely threat to the stability of the world in the next ten years is from international conflict. The report is based on a survey of almost 900 members of the World Economic Forum—which is known for its annual winter meeting in Davos, Switzerland—about the greatest global risks. The report defines a global risk as “an uncertain event or condition that, if it occurs, can cause significant negative impact for several countries or industries within the next 10 years”. While international conflict was seen as being the most likely threat to global stability, the report found that water crises, infectious disease, and weapons of mass destruction were all potentially even more damaging to the global community.

Former Soviet leader Mikhail Gorbachev told Der Spiegel that conflict between Russia and the West over Ukraine could lead to a nuclear war, saying that “things could blow up at any time if we don’t act.” The North Atlantic Treaty Organization (NATO) accused Russia of supporting separatist forces in Ukraine with soldiers and sophisticated weapons systems. In its recently revised military doctrine, Russia said that “the expansion of NATO’s infrastructure to Russia’s borders” poses a significant threat to Russia. Gorbachev criticized what he called the US’ “dangerous winner’s mentality” after the end of the Cold War and said that NATO’s eastward expansion “destroyed the European security architecture” defined by the 1975 Helsinki Final Act. “If one side loses its nerves in this inflamed atmosphere, then we won’t survive the coming years,” Gorbachev said. “I don’t say such things lightly. I am truly and deeply concerned.”

“The most important post-Cold War initiative to reduce nuclear dangers undertaken by the United States,” Michael Krepon wrote, came to a “quiet, unceremonious end” when Russia informed the US that it would no longer welcome US help protecting its stockpiles of weapons-grade uranium and plutonium. The US has helped Russia destroy nuclear weapons and nuclear submarines, install security at nuclear weapons facilities, train workers who have access to nuclear material, and pay those workers’ salaries. Russia said that it would continue on its own to ensure that its nuclear material is secure. But with the Russian economy faltering, some analysts worry Russia won’t have the resources to do the job properly. Former Senator Sam Nunn, who helped develop “cooperative threat reduction” programs, said that the breakdown in cooperation “greatly increases the risk of catastrophic terrorism”.

Separate analyses by the US’ National Aeronautics and Space Administration (NASA) and National Oceanic and Atmospheric Administration (NOAA) found that 2014 was the hottest year since the instrumental record began in 1880. The Earth’s average surface temperature has increased about 1.4°F (0.8°C) since 1880. Most of the warming has come in the last three decades. While global record high temperatures have been a regular occurrence in recent years, the last global record low was in 1909. “This is the latest in a series of warm years, in a series of warm decades,” said Goddard Institute for Space Studies Director Gavin Schmidt. “While the ranking of individual years can be affected by chaotic weather patterns, the long-term trends are attributable to drivers of climate change that right now are dominated by human emissions of greenhouse gases.”

The number of new cases of Ebola in West Africa appears to be shrinking, although there is still a steady stream of new infections. The World Health Organization (WHO) said that the next phase of the response would be to track down all cases of the disease and stop every chain of transmission. The Wellcome Trust released a draft road map for Ebola vaccine development calling for “multiple Ebola vaccines with different characteristics”. As the epidemic shrinks, vaccine trials may have to be expanded in order to ensure the results are statistically meaningful. Two Ebola vaccines will begin clinical trials in Liberia in January. WHO will consider ways to improve its preparation for future epidemics, since the Ebola outbreak revealed that it wasn’t entirely ready for an outbreak of that scale. “Ebola outbreaks in many ways are a symptom of deeper health system problems. One vaccine won’t stop the next Ebola, the next HIV from surfacing,” Bruce Lee, a director of Johns Hopkins’ International Vaccine Research Council, said. “We are a global community now. When something happens in any country, there’s a big chance that every other country is going to be affected.”

A paper in Science found that while ocean ecosystems have been less damaged than terrestrial ecosystems, the impact of humans on marine species is likely to “rapidly intensify as human use of the oceans increases”. Fishing, mining, and the acidification of ocean water have already begun to cause severe damage. “If by the end of the century we’re not off the business-as-usual curve we are now, I honestly feel there’s not much hope for normal ecosystems in the ocean,” Stephen R. Palumbi, one of the study’s authors, said. “But in the meantime, we do have a chance to do what we can. We have a couple decades more than we thought we had, so let’s please not waste it.”

A large number of prominent artificial intelligence (AI) researchers and scientists signed an open letter saying that while the potential benefits of AI are huge, we need more research aimed at ensuring that AI systems “do what we want them to do”. SpaceX and Tesla founder Elon Musk, who signed the open letter, also donated $10 million to the Future of Life Institute (FLI) “to run a global research program aimed at keeping AI beneficial to humanity”. Microsoft founder Bill Gates told Reddit that he is “in the camp that is concerned about super intelligence”:

First the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that though the intelligence is strong enough to be a concern. I agree with Elon Musk and some others on this and don’t understand why some people are not concerned.

“Building advanced AI is like launching a rocket,” FLI co-founder Jaan Tallinn (who has also helped support the Global Catastrophic Risk Institute) said. “The first challenge is to maximize acceleration, but once it starts picking up speed, you also need to focus on steering.”

This news summary was put together in collaboration with Anthropocene. Thanks to Tony Barrett, Seth Baum, Kaitlin Butler, and Grant Wilson for help compiling the news.

For last month’s news summary, please see GCR News Summary December 2014.

You can help us compile future news posts by putting any GCR news you see in the comment thread of this blog post, or by sending it via email to Grant Wilson (grant [at] gcrinstitute.org).

Image credit: US Department of Energy

Recent Publications

Climate Change, Uncertainty, and Global Catastrophic Risk

Is climate change a global catastrophic risk? This paper, published in the journal Futures, addresses the question by examining the definition of global catastrophic risk and by comparing climate change to another severe global risk, nuclear winter. The paper concludes that yes, climate change is a global catastrophic risk, and potentially a significant one.

Assessing the Risk of Takeover Catastrophe from Large Language Models

For over 50 years, experts have worried about the risk of AI taking over the world and killing everyone. The concern had always been about hypothetical future AI systems—until recent LLMs emerged. This paper, published in the journal Risk Analysis, assesses how close LLMs are to having the capabilities needed to cause takeover catastrophe.

On the Intrinsic Value of Diversity

Diversity is a major ethics concept, but it is remarkably understudied. This paper, published in the journal Inquiry, presents a foundational study of the ethics of diversity. It adapts ideas about biodiversity and sociodiversity to the overall category of diversity. It also presents three new thought experiments, with implications for AI ethics.