MIRI Updates

Ngo’s view on alignment difficulty

This post features a write-up by Richard Ngo on his views, with inline comments. ...

Conversation on technology forecasting and gradualism

This post is a transcript of a multi-day discussion between Paul Christiano, Richard Ngo, Eliezer Yudkowsky, Rob Bensinger, Holden Karnofsky, Rohin Shah, Carl Shulman, Nate Soares, and Jaan Tallinn, following up on the Yudkowsky/Christiano debate in 1, 2, 3, ...

More Christiano, Cotra, and Yudkowsky on AI progress

This post is a transcript of a discussion between Paul Christiano, Ajeya Cotra, and Eliezer Yudkowsky (with some comments from Rob Bensinger, Richard Ngo, and Carl Shulman), continuing from 1, 2, and 3. ...

Shulman and Yudkowsky on AI progress

This post is a transcript of a discussion between Carl Shulman and Eliezer Yudkowsky, following up on a conversation with Paul Christiano and Ajeya Cotra. ...

Biology-Inspired AGI Timelines: The Trick That Never Works

– 1988 – Hans Moravec: Behold my book Mind Children. Within, I project that, in 2010 or thereabouts, we shall achieve strong AI. I am not calling it “Artificial General Intelligence” because this term will not be coined for another...

Visible Thoughts Project and Bounty Announcement

(Update Jan. 12, 2022: We released an FAQ last month with more details; last updated Jan. 7.) (Update Jan. 19, 2022: We now have an example of a successful partial run, which you can use to inform how you do...
