Thanks to the generosity of several major donors,† every donation to the Machine Intelligence Research Institute made between now and July 31, 2012 will be matched dollar-for-dollar, up to a total of $150,000!
Note: If you prefer to support rationality training, you are welcome to earmark your donations for “CFAR” (Center for Applied Rationality). Donations earmarked for CFAR will only be used for CFAR, and donations not earmarked for CFAR will only be used for Singularity research and outreach.
Since we published our strategic plan in August 2011, we have achieved most of the near-term goals outlined therein. Here are just a few examples:
- We outlined the open research problems related to our work (Section 1.1).
- We recruited several more research associates and about a dozen remote researchers (Section 1.2e).
- We held our annual Singularity Summit and gained corporate sponsors for it (Section 2.1).
- We made progress in decision theory (example) via LessWrong.com and our research associates (Section 2.2b).
- We published How to Run a Successful Less Wrong Meetup Group (Section 2.2d).
- We released pre-prints of several forthcoming research articles, including How Hard is Artificial Intelligence?, Intelligence Explosion: Evidence and Import, and The Singularity and Machine Ethics (Section 2.3).
- We redesigned our primary website (Section 2.6).
- We acquired $40,000/month in free Google AdWords advertising to drive traffic to websites operated by the Machine Intelligence Research Institute (Section 2.6c).
- We began publishing monthly progress reports (Section 2.9b).
- We built up the Center for Applied Rationality such that it should be able to spin off from the Machine Intelligence Research Institute later this year (Section 3.1).
- We created a transparency section on our website, where visitors can find our IRS Form 990s and several standard organizational policies (e.g., our conflict of interest and non-discrimination policies) (Section 3.2).
In the coming year, the Machine Intelligence Research Institute plans to do the following:
- Hold our annual Singularity Summit, this year in San Francisco!
- Spin off the Center for Applied Rationality as a separate organization focused on rationality training, so that the Machine Intelligence Research Institute can focus more exclusively on Singularity research and outreach.
- Publish additional research on AI risk and Friendly AI.
- Eliezer will write an “Open Problems in Friendly AI” sequence for Less Wrong. (For news on his rationality books, see here.)
- Finish Facing the Singularity and publish ebook versions of Facing the Singularity and The Sequences, 2006-2009.
- And much more! For details on what we might do with additional funding, see How to Purchase AI Risk Reduction.
If you’re planning to earmark your donation to CFAR (Center for Applied Rationality), here’s a preview of what CFAR plans to do in the next year:
- Develop additional lessons teaching the most important and useful parts of rationality. CFAR has already developed and tested over 18 hours of lessons, including classes on how to evaluate evidence using Bayesianism, how to make more accurate predictions, how to be more efficient using economics, how to use thought experiments to better understand your own motivations, and much more.
- Run immersive rationality retreats to teach from our curriculum and to connect aspiring rationalists with each other. CFAR ran pilot retreats in May and June. Participants in the May retreat called it “transformative” and “astonishing,” and the average response to the survey question “Are you glad you came? (1-10)” was 9.4. (We don’t have the June data yet, but participants were similarly enthusiastic about that retreat.)
- Run SPARC, a camp on the advanced math of rationality for mathematically gifted high school students. CFAR has a stellar first-year class for SPARC 2012; most students admitted to the program placed in the top 50 on the USA Math Olympiad (or performed equivalently in a similar contest).
- Collect longitudinal data on the effects of rationality training, to improve our curriculum and to generate promising hypotheses to test and publish, in collaboration with other researchers. CFAR has already launched a one-year randomized controlled study tracking reasoning ability and various metrics of life success, using participants in our June minicamp and a control group.
- Develop apps and games about rationality, with the dual goals of (a) helping aspiring rationalists practice essential skills, and (b) making rationality fun and intriguing to a much wider audience. CFAR has two apps in beta testing: one training players to update their own beliefs the right amount after hearing other people’s beliefs, and another training players to calibrate their level of confidence in their own beliefs. CFAR is working with a developer on several more games training people to avoid cognitive biases.
- And more!
We appreciate your support for our high-impact work! Donate now, and seize a better-than-usual chance to move our work forward. Credit card transactions are securely processed using either PayPal or Google Checkout. If you have questions about donating, please contact Louie Helm at (510) 717-1477 or firstname.lastname@example.org.
† $150,000 of total matching funds has been provided by Jaan Tallinn, Tomer Kagan, Alexei Andreev, and Brandon Reinhart.