MIRI Updates
Soares, Tallinn, and Yudkowsky discuss AGI cognition
This is a collection of follow-up discussions in the wake of Richard Ngo and Eliezer Yudkowsky’s first three conversations (1, 2, and 3). Color key: Chat; Google Doc content; Inline comments ...
Christiano, Cotra, and Yudkowsky on AI progress
This post is a transcript of a discussion between Paul Christiano, Ajeya Cotra, and Eliezer Yudkowsky on AGI forecasting, following up on Paul and Eliezer’s “Takeoff Speeds” discussion. Color key: Chat by Paul and Eliezer; Chat by Ajeya ...
Yudkowsky and Christiano discuss “Takeoff Speeds”
This is a transcript of Eliezer Yudkowsky responding to Paul Christiano’s “Takeoff Speeds” live on Sep. 14, followed by a conversation between Eliezer and Paul. This discussion took place after Eliezer’s conversation with Richard Ngo, and was prompted by...
Ngo and Yudkowsky on AI capability gains
This is the second post in a series of transcribed conversations about AGI forecasting and alignment. See the first post for prefaces and more information about the format.
Ngo and Yudkowsky on alignment difficulty
This post is the first in a series of transcribed Discord conversations between Richard Ngo and Eliezer Yudkowsky, moderated by Nate Soares. We’ve also added Richard and Nate’s running summaries of the conversation (and others’ replies) from Google Docs....
Discussion with Eliezer Yudkowsky on AGI interventions
The following is a partially redacted and lightly edited transcript of a chat conversation about AGI between Eliezer Yudkowsky and a set of invitees, held in early September 2021. By default, all participants other than Eliezer are anonymized as “Anonymous”. I think...