MIRI Updates
Ngo and Yudkowsky on AI capability gains
This is the second post in a series of transcribed conversations about AGI forecasting and alignment. See the first post for prefaces and more information about the format.
Ngo and Yudkowsky on alignment difficulty
This post is the first in a series of transcribed Discord conversations between Richard Ngo and Eliezer Yudkowsky, moderated by Nate Soares. We’ve also added Richard and Nate’s running summaries of the conversation (and others’ replies) from Google Docs....
Discussion with Eliezer Yudkowsky on AGI interventions
The following is a partially redacted and lightly edited transcript of a chat conversation about AGI between Eliezer Yudkowsky and a set of invitees in early September 2021. By default, all other participants are anonymized as “Anonymous”. I think...
November 2021 Newsletter
MIRI updates: MIRI won’t be running a formal fundraiser this year, though we’ll still be participating in Giving Tuesday and other matching opportunities. Visit intelligence.org/donate to donate and to get information on tax-advantaged donations, employer matching, etc. Giving Tuesday takes place on...
October 2021 Newsletter
Redwood Research is a new alignment research organization that just launched its website and released an explainer about what it’s currently working on. We’re quite excited about Redwood’s work, and encourage our supporters to consider applying to work there to help boost Redwood’s alignment...
September 2021 Newsletter
Scott Garrabrant has concluded the main section of his Finite Factored Sets sequence (“Details and Proofs”) with posts on inferring time and on applications, future work, and speculation. Scott’s new frameworks are also now available as a pair of arXiv papers:...