December 2021 Newsletter

Filed under Newsletters.

MIRI is offering $200,000 to build a dataset of AI-dungeon-style writing annotated with the thoughts used in the writing process, and an additional $1,000,000 for scaling that dataset a further 10x: the Visible Thoughts Project. Additionally, MIRI is in the process of releasing a series of chat logs, the Late 2021 MIRI Conversations, featuring relatively… Read more »

Ngo’s view on alignment difficulty

Filed under Analysis, Conversations.

This post features a write-up by Richard Ngo on his views, with inline comments. Color key: Chat | Google Doc content | Inline comments. 13. Follow-ups to the Ngo/Yudkowsky conversation. 13.1. Alignment difficulty debate: Richard Ngo’s case. [Ngo][9:31] (Sep. 25) As promised, here’s a write-up… Read more »

Conversation on technology forecasting and gradualism

Filed under Analysis, Conversations.

This post is a transcript of a multi-day discussion between Paul Christiano, Richard Ngo, Eliezer Yudkowsky, Rob Bensinger, Holden Karnofsky, Rohin Shah, Carl Shulman, Nate Soares, and Jaan Tallinn, following up on the Yudkowsky/Christiano debate in 1, 2, 3, and 4. Color key: Chat by Paul, Richard, and Eliezer | Other chat. 12…. Read more »

More Christiano, Cotra, and Yudkowsky on AI progress

Filed under Analysis, Conversations.

This post is a transcript of a discussion between Paul Christiano, Ajeya Cotra, and Eliezer Yudkowsky (with some comments from Rob Bensinger, Richard Ngo, and Carl Shulman), continuing from 1, 2, and 3. Color key: Chat by Paul and Eliezer | Other chat. 10.2. Prototypes, historical perspectives, and betting. [Bensinger][4:25] I feel… Read more »

Shulman and Yudkowsky on AI progress

Filed under Analysis, Conversations.

This post is a transcript of a discussion between Carl Shulman and Eliezer Yudkowsky, following up on a conversation with Paul Christiano and Ajeya Cotra. Color key: Chat by Carl and Eliezer | Other chat. 9.14. Carl Shulman’s predictions. [Shulman][20:30] I’ll interject some points re the earlier discussion about how animal data… Read more »

Soares, Tallinn, and Yudkowsky discuss AGI cognition

Filed under Analysis, Conversations, Guest Posts.

This is a collection of follow-up discussions in the wake of Richard Ngo and Eliezer Yudkowsky’s first three conversations (1 and 2, 3). Color key: Chat | Google Doc content | Inline comments. 7. Follow-ups to the Ngo/Yudkowsky conversation. [Bensinger][1:50] (Nov. 23 follow-up comment) Readers who aren’t… Read more »

Christiano, Cotra, and Yudkowsky on AI progress

Filed under Analysis, Conversations.

This post is a transcript of a discussion between Paul Christiano, Ajeya Cotra, and Eliezer Yudkowsky on AGI forecasting, following up on Paul and Eliezer’s “Takeoff Speeds” discussion. Color key: Chat by Paul and Eliezer | Chat by Ajeya | Inline comments. 8. September 20 conversation. 8.1. Chess and Evergrande. [Christiano][15:28]… Read more »

Yudkowsky and Christiano discuss “Takeoff Speeds”

Filed under Analysis, Conversations.

This is a transcription of Eliezer Yudkowsky responding live to Paul Christiano’s “Takeoff Speeds” on Sep. 14, followed by a conversation between Eliezer and Paul. This discussion took place after Eliezer’s conversation with Richard Ngo, and was prompted by Richard’s earlier request that Eliezer respond to Paul on “Takeoff Speeds.” Color… Read more »