MIRI Updates
Nate and Eliezer’s forthcoming book has been getting a remarkably strong reception. I was under the impression that there are many people who find the extinction threat from AI credible, but that far fewer of them would be willing to...
This is part of the MIRI Single Author Series. Pieces in this series represent the beliefs and opinions of their named authors, and do not claim to speak for all of MIRI. Several promising software engineers have asked me: Should...
A huge announcement today: Eliezer Yudkowsky and Nate Soares have written a book attempting to raise the alarm about superintelligent AI for the widest possible audience — If Anyone Builds It, Everyone Dies. The book comes out September 16,...
We’re excited to release a new AI governance research agenda from the MIRI Technical Governance Team. With this research agenda, we have two main aims: to describe the strategic landscape of AI development and to catalog important governance research questions....
In an op-ed published in TIME Magazine in...
Okay, I’m annoyed at people covering AI 2027...