May Newsletter: Report from the United Nations

by Seth Baum | 2 June 2014

This was sent via email on 13 May. Click here to subscribe to the email newsletter.

Dear friends,

Last month I gave two talks at the United Nations. The first was at a small meeting of experts from the P5, i.e. the permanent members of the UN Security Council: China, France, Russia, UK, and USA. I presented new research on nuclear winter risk. The second was at the big annual meeting for the Nuclear Non-Proliferation Treaty (NPT). I presented the paper Analyzing and reducing the risks of inadvertent nuclear war between the United States and Russia. Both talks, and the activities surrounding them, went very well. I got to teach some diplomats and activists a bit about nuclear war as a global catastrophic risk, along with some fundamentals of quantitative risk analysis, and meanwhile they taught me a bit about how international diplomacy does (and doesn’t) work… and where to find the cafeteria at the UN headquarters, which is surprisingly hard to navigate for a place that welcomes people from around the world.

Some things I learned:

• Nuclear weapons are an excellent issue area for researchers to connect with diplomats. I was amazed at how few researchers were at the NPT meeting. I practically had the whole place to myself. In contrast, the annual UN climate change meeting has swarms of researchers competing for attention, while emerging technology risks get essentially no diplomatic attention – there are no UN meetings to attend. I intend to remain active on nuclear weapons in part to continue building relationships with diplomats that can later be leveraged for other global catastrophic risks. Other researchers could surely do the same.

• Nuclear weapons are, in my current estimation, the easiest major GCR to solve. There is broad international consensus against even small numbers of nuclear weapons. Nuclear weapons are also relatively hard to build and easy to monitor for. The main holdup is the nuclear weapons states’ claim that they cannot disarm until certain security conditions are achieved. Achieving those conditions – or persuading the countries to disarm anyway – seems more feasible than achieving what would be needed for other major GCRs like climate change, biological weapons, or AI. For discussion of the practical steps needed to achieve major disarmament, I recommend the Deep Cuts project, a trilateral Germany-Russia-USA initiative. Also keep an eye out for the Humanitarian Consequences Initiative, which is bringing a new sense of urgency to this decades-old issue, as is the ongoing Ukraine crisis, for better or worse.

• Public pressure on politicians matters. For example, when the New START treaty passed the US Senate in 2010, several Senators apparently voted for it only because pressure from their constituents gave them the political space they felt they needed. Even in Iran, the efforts of moderate, reformist citizens have translated into a government that is much more open to international diplomacy for resolving nuclear weapons issues. Those of us in the GCR community may be surprised at how willing government officials are to hear us out on the issues.

• Being there matters. The UN process has its share of quirks, with a bit of a learning curve, but not an insurmountable one. Many of the official statements can be found on a UN website called PaperSmart, but as with any conference, the unofficial side conversations are often more important. Likewise, while the diplomats and activists learned some risk analysis from me, I learned from them how to orient my research to make it more relevant to actual policy decisions. This underscores the importance of stakeholder engagement for research that aspires to have some impact on society.

If you value these sorts of insights, and these efforts to share GCR research with the international community (and with others), please consider donating to GCRI. Your support will help us continue producing leading GCR research in conversation with diplomats, policy makers, and other important global decision makers, so that the world can address these risks.

As always, thank you for your interest in our work. We welcome any comments, questions, and criticisms you may have.

Sincerely,
Seth Baum, Executive Director

GCR News Summary

Robert de Neufville’s latest news summary is available here: GCR News Summary April 2014. As always, the summary covers recent events across the breadth of GCR topics.

Call for Papers: Confronting Future Catastrophic Threats To Humanity

Seth Baum and Bruce Tonn are co-editing a new special issue of the journal Futures on the topic Confronting Future Catastrophic Threats To Humanity. The call for papers states: “This special issue seeks to identify and discuss opportunities for action now that can help humanity prepare for catastrophic threats it may face in the future. Of interest are both actions to prevent the catastrophes, or to reduce their probability, and actions to help humanity endure them.”

Initial paper submissions are due 1 September 2014. For further details please see the call for papers web page.

Recent Publications

Climate Change, Uncertainty, and Global Catastrophic Risk

Is climate change a global catastrophic risk? This paper, published in the journal Futures, addresses the question by examining the definition of global catastrophic risk and by comparing climate change to another severe global risk, nuclear winter. The paper concludes that yes, climate change is a global catastrophic risk, and potentially a significant one.

Assessing the Risk of Takeover Catastrophe from Large Language Models

For over 50 years, experts have worried about the risk of AI taking over the world and killing everyone. The concern had always been about hypothetical future AI systems—until recent LLMs emerged. This paper, published in the journal Risk Analysis, assesses how close LLMs are to having the capabilities needed to cause a takeover catastrophe.

On the Intrinsic Value of Diversity

Diversity is a major ethics concept, but it is remarkably understudied. This paper, published in the journal Inquiry, presents a foundational study of the ethics of diversity. It adapts ideas about biodiversity and sociodiversity to the overall category of diversity. It also presents three new thought experiments, with implications for AI ethics.
