GCR News Summary March 2013

26 March 2013

This post marks the first of what we hope will be an ongoing series of global catastrophic risk news summaries. You can help by posting any GCR news you see in the comment thread of this blog post or by sending it via email to Grant Wilson (grant [at] gcrinstitute.org).

In Science, British Astronomer Royal Martin Rees argued that we need to take existential risk more seriously. Foreign Policy did a profile of the Cambridge Centre for the Study of Existential Risk, which Rees helped found. Aeon interviewed members of Oxford’s Future of Humanity Institute. And The Huffington Post hosted an expert discussion of the greatest threats to humanity’s survival.

In February, B612 Foundation chairman Ed Lu wrote—the day before a meteor exploded over Chelyabinsk, Russia—that the close flyby of asteroid 2012 DA14 should wake us up to the need to anticipate and prevent asteroid impacts. Russian Deputy Prime Minister Dmitry Rogozin renewed his call for an international initiative to protect the planet from asteroids. The US Air Force's Lt. Col. Peter Garretson called for a federal asteroid policy. NASA head Charles Bolden told the House Science Committee that if we detected an asteroid the size of the Chelyabinsk meteor three weeks before impact, there would be nothing we could do to prevent it. In an interview with The New York Times, Hayden Planetarium director Neil deGrasse Tyson described the meteor as "a shot across our bow."

A Science paper found that global temperatures are increasing faster now than at any other period in the Holocene, which encompasses roughly the last 11,000 years. The paper found that under Intergovernmental Panel on Climate Change projections for plausible greenhouse gas emission scenarios, temperatures will exceed the Holocene maximum by 2100. A separate Science article called for better governance of research on the use of geoengineering to mitigate the effects of climate change. A Brookings Institution paper argued that we need to create an archive of the DNA of every known species to protect against biodiversity loss. And a new Niels Bohr Institute model found that global warming has caused the frequency of storm surges of Hurricane Katrina's magnitude to double since 1923. The model projected that if global temperatures rise another degree Celsius, there may be 10 times as many Katrina-scale storm surges as there were 80 years ago.

In The Wall Street Journal, a group of former high-ranking US government officials—Secretaries of State Henry Kissinger and George Shultz, Secretary of Defense William Perry, and Senate Armed Services Committee Chair Sam Nunn—proposed that we take four steps to reduce the risk of nuclear war:

  • build an international regime for securing nuclear materials;
  • take US and Russian nuclear forces off high alert;
  • strengthen the New START Treaty; and
  • improve nuclear transparency and verification procedures.

The US cancelled a plan to add long-range interceptors to its missile defense system that Russia had opposed. Instead, the US will deploy 14 new missile interceptors in Alaska. In response, Russia proposed regular missile defense talks with the US. Members of the House Electromagnetic Pulse Caucus are pushing to protect the US electrical grid from an EMP attack or a geomagnetic storm like the 1859 Carrington Event. And in April, National Security Staff Senior Director for Europe Liz Sherwood-Randall will become the Obama administration's new WMD Czar.

Congress extended the Pandemic and All-Hazards Preparedness Act, which provides funds to prepare for biological attacks or pandemics. In The Atlantic Cities, Emily Badger looked at a new way to model the spread of disease through air travel networks. The US Office of Science and Technology Policy posted a request for comment on a proposed new policy on potentially dangerous biological research. In Wired, Bruce Schneier wrote that we can never completely protect ourselves from the misuse of technology. Schneier argued that we need to focus on becoming resilient enough to survive major unexpected attacks, because sooner or later "the technology will exist for a hobbyist to explode a nuclear weapon, print a lethal virus from a bio-printer, or turn our electronic infrastructure into a vehicle for large-scale murder."

This news summary was put together in collaboration with and is cross-posted at Anthropocene. Thanks to Tony Barrett, Seth Baum, Kaitlin Butler, Heath Rezabek, Steven Umbrello, and Grant Wilson for help compiling the news.

We hope to do this again next month. Please post links to any news pieces you think we should include in the comment section below, or send them via email to Grant Wilson (grant [at] gcrinstitute.org).
