Shulman and Yudkowsky on AI progress

This post is a transcript of a discussion between Carl Shulman and Eliezer Yudkowsky, following up on a conversation with Paul Christiano and Ajeya Cotra. Color key: Chat by Carl and Eliezer; Other chat. 9.14. Carl Shulman’s...

Biology-Inspired AGI Timelines: The Trick That Never Works

– 1988 – Hans Moravec: Behold my book Mind Children. Within, I project that, in 2010 or thereabouts, we shall achieve strong AI. I am not calling it “Artificial General Intelligence” because this term will not be coined for another...

Soares, Tallinn, and Yudkowsky discuss AGI cognition

This is a collection of follow-up discussions in the wake of Richard Ngo and Eliezer Yudkowsky’s first three conversations (1 and 2, 3). Color key: Chat; Google Doc content; Inline comments. ...

Christiano, Cotra, and Yudkowsky on AI progress

This post is a transcript of a discussion between Paul Christiano, Ajeya Cotra, and Eliezer Yudkowsky on AGI forecasting, following up on Paul and Eliezer’s “Takeoff Speeds” discussion. Color key: Chat by Paul and Eliezer; Chat by Ajeya...

Yudkowsky and Christiano discuss “Takeoff Speeds”

This is a transcription of Eliezer Yudkowsky responding to Paul Christiano’s “Takeoff Speeds” live on Sep. 14, followed by a conversation between Eliezer and Paul. This discussion took place after Eliezer’s conversation with Richard Ngo, and was prompted by...

Ngo and Yudkowsky on AI capability gains

This is the second post in a series of transcribed conversations about AGI forecasting and alignment. See the first post for prefaces and more information about the format.

...