MIRI Updates

When Will AI Be Created?

Strong AI appears to be the topic of the week. Kevin Drum at Mother Jones thinks AIs will be as smart as humans by 2040. Karl Smith at Forbes and “M.S.” at The Economist seem to roughly concur with Drum...

Advise MIRI with Your Domain-Specific Expertise

MIRI currently has a few dozen volunteer advisors on a wide range of subjects, but we need more! If you’d like to help MIRI pursue its mission more efficiently, please sign up to be a MIRI advisor. If you sign...

Five theses, two lemmas, and a couple of strategic implications

MIRI’s primary concern about self-improving AI isn’t so much that it might be created by ‘bad’ actors rather than ‘good’ actors in the global sphere; most of our concern is with remedying the situation in which no one knows...

AGI Impact Experts and Friendly AI Experts

MIRI’s mission is “to ensure that the creation of smarter-than-human intelligence has a positive impact.” A central strategy for achieving this mission is to find and train what one might call “AGI impact experts” and “Friendly AI experts.” AGI impact...

“Intelligence Explosion Microeconomics” Released

MIRI’s new, 93-page technical report by Eliezer Yudkowsky, “Intelligence Explosion Microeconomics,” has now been released. The report explains one of the open problems of our research program. Here’s the abstract: I. J. Good’s thesis of the ‘intelligence explosion’ is that...

“Singularity Hypotheses” Published

Singularity Hypotheses: A Scientific and Philosophical Assessment has now been published by Springer, in hardcover and ebook forms. The book contains 20 chapters about the prospect of machine superintelligence, including 4 chapters by MIRI researchers and research associates. “Intelligence Explosion:...
