February 2013 Newsletter

7 March 2013

Note from the editor: GCRI forgot to post the February 2013 Newsletter on our blog last month, so here it is. Enjoy!

Dear friends,

It is always a bittersweet moment when a close colleague moves on with his career. For the last nine months, Tim Maher has been working as a Research Assistant for GCRI through the Bard College M.S. program in Climate Science and Policy. Now Tim is transitioning back to Bard to finish his degree. His presence will be greatly missed. Everywhere you look within GCRI, Tim’s mark can be seen. He played a central role in getting GCRI’s resources online, in particular the blogs and newsfeeds resource and the organization directory. He co-authored the journal paper Double catastrophe: Intermittent stratospheric geoengineering induced by societal collapse and is lead author on a second paper now under peer review. He helped host GCRI’s nuclear war discussion group, providing technical support to group participants and helping write the summaries. But most importantly, he was part of countless conversations about how to grow the GCRI organization. It is in those conversations that he will be missed most.

Tim, we wish you the best of luck finishing up your M.S. program.

Sincerely,
Seth Baum, Executive Director

New Articles In IEET And Scientific American

There’s just one news update from January, as we spent most of the month on a private project and longer-term work. The update is two new popular articles, both by Seth Baum.

Seven reasons for integrated emerging technologies governance, published at the Institute for Ethics and Emerging Technologies. This article discusses why the various emerging technologies should be handled within one integrated governance regime instead of each being handled separately. The seven reasons are forecasting, politics, relationships, dual-use technology, risk driven by research and development, lab transparency, and whistleblowing.

When global catastrophes collide: The climate engineering double catastrophe, published at Scientific American Blogs. This article summarizes the paper Double catastrophe: Intermittent stratospheric geoengineering induced by societal collapse.

Help Spread The Word

As always, we ask that you help spread the word about GCRI and about global catastrophic risk in general. You can forward this email to anyone you know, or send them this link to sign up to GCRI’s monthly email newsletter: http://gcrinstitute.org/newsletter.html. The newsletter is for everyone: colleagues, friends, family, etc. Thanks!

