A recent xkcd comic described the Saturn V rocket using only the 1000 most frequently used words in English. The rocket became the “Up Goer Five,” and the liquid hydrogen feed line became the “thing that lets in cold wet air to burn.” This inspired a geneticist to create the Up-Goer Five Text Editor, which restricts your writing to those same 1000 words. Mental Floss recently collected 18 scientific ideas explained under this restriction.
What does this have to do with MIRI? Well, young philosopher Robby Bensinger has now rewritten MIRI’s Five Theses using the Up-Goer Five Text Editor, with amusing results:
- Intelligence explosion: If we make a computer that is good at doing hard things in lots of different situations without using much stuff up, it may be able to help us build better computers. Since computers are faster than humans, pretty soon the computer would probably be doing most of the work of making new and better computers. We would have a hard time controlling or understanding what was happening as the new computers got faster and grew more and more parts. By the time these computers ran out of ways to quickly and easily make better computers, the best computers would have already become much much better than humans at controlling what happens.
- Orthogonality: Different computers, and different minds as a whole, can want very different things. They can want things that are very good for humans, or very bad, or anything in between. We can be pretty sure that strong computers won’t think like humans, and most possible computers won’t try to change the world in the way a human would.
- Convergent instrumental goals: Although most possible minds want different things, they need a lot of the same things to get what they want. A computer and a human might want things that in the long run have nothing to do with each other, but have to fight for the same share of stuff first to get those different things.
- Complexity of value: It would take a huge number of parts, all put together in just the right way, to build a computer that does all the things humans want it to (and none of the things humans don’t want it to).
- Fragility of value: If we get a few of those parts a little bit wrong, the computer will probably make only bad things happen from then on. We need almost everything we want to happen, or we won’t have any fun.
That is all. You’re welcome.