As the COVID-19 pandemic progresses, the LessWrong team has put together a database of resources for learning about the disease and staying updated, and 80,000 Hours has a new write-up on ways to help in the fight against COVID-19. In my non-MIRI time, I've been keeping my own quick and informal notes on various sources' COVID-19 recommendations in this Google Doc. Stay safe out there!
Updates
- “My personal cruxes for working on AI safety”: a talk transcript from MIRI researcher Buck Shlegeris.
- Daniel Kokotajlo of AI Impacts discusses “Cortés, Pizarro, and Afonso as Precedents for Takeover.”
- O'Keefe, Cihon, Garfinkel, Flynn, Leung, and Dafoe's “The Windfall Clause” proposes “an ex ante commitment by AI firms to donate a significant amount of any eventual extremely large profits” that result from “fundamental, economically transformative breakthroughs” like AGI.
- Microsoft announces the 17-billion-parameter language model Turing-NLG.
- Oren Etzioni argues that AGI is too far off to deserve much thought, approvingly citing Andrew Ng's “overpopulation on Mars” metaphor — but he also moves the debate in a very positive direction by listing specific observations that would make him change his mind.