January 2013 Newsletter

by Seth Baum | 10 January 2013

Dear friends,

Happy new year. 2012 was GCRI’s first full year in existence. It was a great year for us. We grew from three people to ten, launched a new website, released publications and resources for the GCR community, and more. Now, having survived the December 2012 apocalypse, we look forward to an even better 2013.

We’re sending out the January newsletter a few days late because we wanted to include two new papers that were recently accepted for publication – one on geoengineering and one on nuclear war. Each paper demonstrates a key theme that GCRI seeks to emphasize in its research. The geoengineering paper demonstrates a systems approach to global catastrophic risk analysis, considering interactions between different risks instead of just looking at one risk at a time. The nuclear war paper demonstrates the application of sophisticated risk analysis methods to global catastrophic risk, providing a detailed understanding of how global catastrophes could occur. So in addition to providing new analysis of geoengineering and nuclear war, these two papers also offer frameworks for analyzing global catastrophic risk in general. Looking ahead to the new year, we aim to expand on this research to help humanity understand the threats it faces.

As always, we’re delighted to hear your comments, questions, and criticisms. We’re also grateful for your efforts to help spread the word to others who may find this of interest. A quick tip: If you’d like to suggest that someone sign up for our newsletter, you can use this link: http://gcrinstitute.org/newsletter.html.

Sincerely,
Seth Baum, Executive Director

New Paper On Geoengineering “Double Catastrophe”

This paper develops a global catastrophe scenario involving climate change, geoengineering, and a separate catastrophe. This may be the worst-case scenario involving climate change. The basic idea is that an initial catastrophe, which could be a war, a disease outbreak, or something else, prevents humanity from continuing geoengineering. When the geoengineering stops, temperatures rapidly increase, causing a second catastrophe for an already vulnerable population. This “double catastrophe” may be difficult for humanity to survive. One key insight is that what happens with climate change and geoengineering depends on what happens with other types of catastrophes. See discussion on the GCRI blog.

Seth Baum, Tim Maher, and Jacob Haqq-Misra. Double catastrophe: Intermittent stratospheric geoengineering induced by societal collapse, forthcoming in Environment, Systems and Decisions.
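
To make the scenario’s dynamic concrete, here is a toy simulation in Python. This is not the paper’s model; the warming rate, collapse year, and aerosol decay rate are all illustrative assumptions. The point it shows is that stratospheric aerosols wash out quickly once deployment stops, so warming that had been masked for decades appears within just a few years.

```python
# Toy "double catastrophe" dynamic (illustrative assumptions, not the paper's model).
committed = 0.0       # warming (deg C) implied by accumulated greenhouse gases
offset = 0.0          # cooling (deg C) currently provided by aerosol geoengineering
collapse_year = 2060  # hypothetical catastrophe halts geoengineering this year

for year in range(2030, 2071):
    committed += 0.03        # assumed ongoing emissions add warming each year
    if year < collapse_year:
        offset = committed   # deployment keeps pace, masking all warming
    else:
        offset *= 0.5        # aerosols wash out of the stratosphere quickly
    realized = committed - offset
    if year % 10 == 0 or abs(year - collapse_year) <= 2:
        print(year, f"{realized:+.2f} C anomaly")
```

Running this, the realized anomaly stays near zero for three decades and then jumps by most of a degree within a few years of the collapse: a sudden shock to a population already weakened by the initial catastrophe.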

New Paper On Inadvertent Nuclear War

This paper analyzes the risk of inadvertent nuclear war between the US and Russia. Inadvertent war (as defined here) occurs when one nation mistakenly concludes that a false alarm is real, and then launches nuclear weapons in what it believes is a counterattack. There have been a few close calls with inadvertent US-Russia/USSR nuclear war, such as the 1995 Norwegian rocket incident. The paper analyzes the ongoing probability of inadvertent US-Russia nuclear war using a detailed fault tree model, a method used heavily in other areas of risk analysis. The paper also investigates some options for reducing the risk. It finds there to be significant risk, and also that the risk can be reduced. See discussion on the GCRI blog.

Tony Barrett, Seth Baum, and Kelly Hostetler. Analyzing and reducing the risks of inadvertent nuclear war between the United States and Russia, forthcoming in Science and Global Security.
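
For readers unfamiliar with fault trees: the method works backward from a top event (here, an inadvertent launch) through AND/OR gates to basic events whose probabilities are easier to estimate. Below is a minimal sketch in Python. The gate structure and every number in it are illustrative placeholders of our own, not the paper’s model or its estimates.

```python
# Minimal fault-tree sketch (placeholder structure and numbers, not the paper's).

def or_gate(*probs):
    """P(at least one of several independent events occurs)."""
    p_none = 1.0
    for p in probs:
        p_none *= 1.0 - p
    return 1.0 - p_none

def and_gate(*probs):
    """P(all of several independent events occur)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# Hypothetical annual probabilities for the basic events.
p_false_alarm = or_gate(0.05,   # false alarm from early-warning radar
                        0.03)   # false alarm from satellite sensors

p_judged_real = 0.1      # operators mistakenly conclude the alarm is real
p_launch_ordered = 0.2   # leadership orders a "counterattack"

p_inadvertent = and_gate(p_false_alarm, p_judged_real, p_launch_ordered)
print(f"Illustrative annual probability: {p_inadvertent:.4f}")
```

The paper’s actual fault tree is far more detailed, but its gates compose probabilities in this same way.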

IEET Article On International Regulation Of Emerging Technologies

Grant Wilson published an article, “Emerging Technologies: Should They Be Internationally Regulated?”, at the Institute for Ethics and Emerging Technologies. The article summarizes Grant’s recent law journal paper, “Minimizing global catastrophic and existential risks from emerging technologies through international law.”

Nuclear War Group Discusses US-Russia Nuclear War

At the end of November, GCRI’s nuclear war discussion group met to discuss the possibility that a nuclear war between the US and Russia could occur. US-Russia nuclear war is of particular interest because these two countries still hold most of the world’s nuclear weapons. Despite the end of the Cold War, the risk remains. Indeed, there have been some tense moments in recent years, such as Russian concerns about NATO expansion in Eastern Europe and US politicians suggesting that Russia is a major adversary. Details can be found in the blog post Nuclear War Group Discusses Ongoing Risk Of US-Russia Nuclear War.

New Time Zones Resource

GCRI has already been having a lot of conversations with people from around the world, and we anticipate even more in 2013. To help with this, we’ve released a simple time zones resource, including world time zone maps and suggestions for scheduling global conversations. This resource is not about global catastrophic risk per se, and indeed could be helpful to anyone working across many time zones. However, global catastrophic risk is an inherently global issue and benefits from global participation. As with GCRI’s other resources, we release this in the spirit of supporting the broader global catastrophic risk community and anyone else who may find it helpful.
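
For those who prefer code to maps, the core scheduling question can be sketched in a few lines of Python using the standard zoneinfo module (Python 3.9+). The cities, date, and working-hours window here are arbitrary examples, not part of the resource itself.

```python
# Find UTC hours that fall within waking hours (07:00-23:00) in every city.
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

zones = ["America/New_York", "Europe/London", "Asia/Tokyo"]
base = datetime(2013, 1, 15, tzinfo=ZoneInfo("UTC"))

for hour in range(24):
    t = base + timedelta(hours=hour)
    local = {z: t.astimezone(ZoneInfo(z)).hour for z in zones}
    if all(7 <= h < 23 for h in local.values()):
        cities = ", ".join(f"{z.split('/')[-1]} {h:02d}:00" for z, h in local.items())
        print(f"{t:%H:%M} UTC -> {cities}")
```

For these three cities in January, only 12:00 and 13:00 UTC clear the bar, which is exactly the kind of narrow window the resource is meant to help people find.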
