MIRI Updates

Shah and Yudkowsky on alignment failures

This is the final discussion log in the Late 2021 MIRI Conversations sequence, featuring Rohin Shah and Eliezer Yudkowsky, with additional comments from Rob Bensinger, Nate Soares, Richard Ngo, and Jaan Tallinn. The discussion begins with summaries and comments...

Ngo and Yudkowsky on scientific reasoning and pivotal acts

This is a transcript of a conversation between Richard Ngo and Eliezer Yudkowsky, facilitated by Nate Soares (and with some comments from Carl Shulman). It continues the Late 2021 MIRI Conversations sequence, following "Ngo's view on alignment difficulty." ...

Christiano and Yudkowsky on AI predictions and human intelligence

This is a transcript of a conversation between Paul Christiano and Eliezer Yudkowsky, with comments by Rohin Shah, Beth Barnes, Richard Ngo, and Holden Karnofsky, continuing the Late 2021 MIRI Conversations. Color key: Chat by Paul and Eliezer; Other...

February 2022 Newsletter

As of yesterday, we've released the final posts in the Late 2021 MIRI Conversations sequence, a collection of (relatively raw and unedited) AI strategy conversations: "Ngo's view on alignment difficulty", "Ngo and Yudkowsky on scientific reasoning and pivotal acts", "Christiano and Yudkowsky...

January 2022 Newsletter

MIRI updates: MIRI's $1.2 million Visible Thoughts Project bounty now has an FAQ, and an example of a successful partial run that you can use to inform your own runs. Scott Alexander reviews the first part of the Yudkowsky/Ngo debate. See also Richard...

December 2021 Newsletter

MIRI is offering $200,000 to build a dataset of AI-dungeon-style writing annotated with the thoughts used in the writing process, and an additional $1,000,000 for scaling that dataset a further 10x: the Visible Thoughts Project. Additionally, MIRI is in the...
