MIRI Updates
Yudkowsky on Logical Uncertainty
A paraphrased transcript of a conversation with Eliezer Yudkowsky. Interviewer: I’d love to get a clarification from you on one of the “open problems in Friendly AI.” The logical uncertainty problem that Benja Fallenstein tackled had to do with having...
Yudkowsky on “What can we do now?”
A paraphrased transcript of a conversation with Eliezer Yudkowsky. Interviewer: Suppose you’re talking to a smart mathematician who looks like the kind of person who might have the skills needed to work on a Friendly AI team. But, he says,...
2012 Winter Matching Challenge a Success!
Thanks to our dedicated supporters, we met our goal for our 2012 Winter Fundraiser. Thank you! The fundraiser ran for 45 days, from December 6, 2012 to January 20, 2013. We met our $115,000 goal, raising a total of $230,000...
New Transcript: Eliezer Yudkowsky and Massimo Pigliucci on the Intelligence Explosion
In this 2010 conversation hosted by bloggingheads.tv, Eliezer Yudkowsky and Massimo Pigliucci attempt to unpack the fundamental assumptions involved in determining the plausibility of a technological singularity. A transcript of the conversation is now available here, thanks to Ethan Dickinson...
January 2013 Newsletter
Greetings from the Executive Director Dear friends of the Machine Intelligence Research Institute, It’s been just over one year since I took the reins at the Machine Intelligence Research Institute. Looking back, I must say I’m proud of what we...
December 2012 Newsletter
Greetings from the Executive Director Dear friends of the Singularity Institute, This month marks the biggest shift in our operations since the Singularity Summit was founded in 2006. Now that Singularity University has acquired the Singularity Summit (details below), and...