MIRI Updates
Welcome to Intelligence.org
Welcome to the new home for the Machine Intelligence Research Institute (MIRI), formerly called “The Singularity Institute.” The new design (from Katie Hartman, who also designed the new site for CFAR) reflects our recent shift in focus from “movement-building” to...
We are now the “Machine Intelligence Research Institute” (MIRI)
When Singularity University (SU) acquired the Singularity Summit from us in December, we also agreed to change the name of our institute to avoid brand confusion between the Singularity Institute and Singularity University. After much discussion and market research, we’ve...
Yudkowsky on Logical Uncertainty
A paraphrased transcript of a conversation with Eliezer Yudkowsky. Interviewer: I’d love to get a clarification from you on one of the “open problems in Friendly AI.” The logical uncertainty problem that Benja Fallenstein tackled had to do with having...
Yudkowsky on “What can we do now?”
A paraphrased transcript of a conversation with Eliezer Yudkowsky. Interviewer: Suppose you’re talking to a smart mathematician who looks like the kind of person who might have the skills needed to work on a Friendly AI team. But, he says,...
2012 Winter Matching Challenge a Success!
Thanks to our dedicated supporters, we met our goal for our 2012 Winter Fundraiser. Thank you! The fundraiser ran for 45 days, from December 6, 2012 to January 20, 2013. We met our $115,000 goal, raising a total of $230,000...
New Transcript: Eliezer Yudkowsky and Massimo Pigliucci on the Intelligence Explosion
In this 2010 conversation hosted by bloggingheads.tv, Eliezer Yudkowsky and Massimo Pigliucci attempt to unpack the fundamental assumptions involved in determining the plausibility of a technological singularity. A transcript of the conversation is now available here, thanks to Ethan Dickinson...