Bostrom, Nick. Superintelligence (2014).
Superintelligence considers the danger that creating artificial intelligence that surpasses human intelligence could pose to the survival of the human race.
Bostrom, Nick and Milan Ćirković. Global Catastrophic Risks (2008).
Global Catastrophic Risks is a seminal collection of papers on the risk of global catastrophe.
Dartnell, Lewis. The Knowledge (2014).
The Knowledge outlines the basic, practical things we would need to know to rebuild civilization in the aftermath of a catastrophe.
Diamond, Jared. Collapse (2004).
Collapse makes the case that five societies—the Greenland Norse, the Easter Islanders, the Pitcairn Islanders, the Anasazi, and the Maya—may have collapsed because of environmental problems they themselves created.
Kahneman, Daniel. Thinking, Fast and Slow (2011).
Thinking, Fast and Slow is a wonderfully written summary of the work Kahneman did with Amos Tversky—for which Kahneman won the Nobel Prize in Economics after Tversky’s death—on the cognitive biases that affect judgment and decision-making. Thinking, Fast and Slow is one of the most popular books among superforecasters.
Kolbert, Elizabeth. The Sixth Extinction (2014).
The Sixth Extinction argues that we are in the middle of a global extinction event of our own making. At the rate species are dying off, the mass extinction we are causing may be on a par with the other major extinction events in Earth’s history.
Kurzweil, Ray. The Singularity Is Near (2005).
The Singularity Is Near argues that exponentially accelerating returns to technology will lead to a “technological singularity” in 2045 that will transform human life.
Meadows, Donella H., Jørgen Randers, and Dennis L. Meadows. Limits to Growth (2004).
Originally published in 1972, Limits to Growth models different levels of economic and population growth using an early computer simulation and outlines different scenarios in which the global system could “overshoot” sustainable limits and ultimately collapse.
Moravec, Hans. Mind Children (1988).
Mind Children is a dated but still classic look at the future of machine intelligence. Moravec argues that, given the pace of improvement in computing capabilities, machines will equal and begin to surpass humans in intelligence by 2030.
Naam, Ramez. The Infinite Resource (2013).
The Infinite Resource argues that knowledge is our greatest resource. We can overcome scarcity through innovation, but we have to act fast if we are to avoid doing irreparable damage to the planet we live on.
Parfit, Derek. Reasons and Persons (1984).
Reasons and Persons is an influential work of analytic philosophy on ethics, rationality, and personal identity. Reasons and Persons is known in particular for its discussion of what Parfit calls “the mere addition paradox”, which seems to show that under reasonable assumptions it would always be better if there were more people, even if their lives were only barely worth living.
Posner, Richard. Catastrophe (2004).
In Catastrophe, appeals court judge and economist Richard Posner makes a cost-benefit case for doing much more than we are currently doing to avert major catastrophes.
Rees, Martin. Just Six Numbers (2000).
In Just Six Numbers, British Astronomer Royal Martin Rees argues that six seemingly arbitrary fundamental constants determine the basic features of our universe. If any of these constants were even slightly different, intelligent life would have been impossible.
Rees, Martin. Our Final Hour (2003).
Our Final Hour makes the case that technological advances make the 21st century a pivotal moment in human history, when humans will either expand into space or face extinction. Rees gives humanity just even odds of surviving the century.
Schlosser, Eric. Command and Control (2013).
Command and Control is a history of the management of the US nuclear arsenal. Schlosser recounts a disturbing number of nuclear accidents and close calls—like the B-52 that dropped two nuclear weapons over North Carolina in 1961 or the Titan missile that blew up in Arkansas in 1980—as he examines the challenge of keeping nuclear weapons safe.
Tetlock, Philip E. Expert Political Judgment (2005).
Expert Political Judgment presents the results of Philip Tetlock’s original forecasting research. Tetlock found that subject-matter experts were essentially worthless as forecasters—that they were no better, in his famous phrase, than “dart-throwing chimps”. But he also found that some well-informed non-experts were able to predict near-term events with reasonable accuracy. In particular, he found that “foxes” who draw on a wide variety of different ideas and points of view were better forecasters than “hedgehogs” who use the same ideology or theory to explain everything that happens.
Tetlock, Philip E. and Dan Gardner. Superforecasting (2015).
Superforecasting draws on the results of the Good Judgment Project, a large forecasting tournament sponsored by the US intelligence community, to examine the habits of mind that make some people much better at predicting the future than others.
Ward, Peter D. and Donald Brownlee. Rare Earth (2000).
Rare Earth makes the case that planets and regions of space like our own that are conducive to the development of complex life may be rare.
Links to books I recommend, review, or cite are Amazon Affiliate links. I receive a small percentage of any purchases made through these links.
General Existential and Catastrophic Risk
Baum, Seth. “Is Humanity Doomed?” Sustainability Vol. 2 (February 2010).
Baum, Seth. “Space Colonization and the Meaning of Life.” Nautilus (January 2017).
Baum, Seth et al. “Long-Term Trajectories of Human Civilization.” Foresight Vol. 21, No. 1 (2019).
Beard, Simon et al. “An Analysis and Evaluation of Methods Currently Used to Quantify the Likelihood of Existential Hazards.” Futures Vol. 115 (January 2020).
Bostrom, Nick. “Existential Risks.” Journal of Evolution and Technology Vol. 9 (March 2002).
Bostrom, Nick. “Astronomical Waste.” Utilitas Vol. 15, No. 3 (November 2003).
Bostrom, Nick. “Existential Risk Prevention as Global Priority.” Global Policy (February 2013).
Bostrom, Nick. “The Vulnerable World Hypothesis.” Working paper (2018).
Joy, Bill. “Why the Future Doesn’t Need Us.” Wired (April 2000).
Matheny, Jason G. “Reducing the Risk of Human Extinction.” Risk Analysis (October 2007).
Russell, Bertrand. “The Russell-Einstein Manifesto” (July 9, 1955).
Sandberg, Anders and Nick Bostrom. “Global Catastrophic Risks Survey.” Future of Humanity Institute Technical Report 2008-1 (2008).
Wilson, E.O. “Is Humanity Suicidal?” The New York Times Magazine (May 30, 1993).
Artificial Intelligence
Allen, Paul. “The Singularity Isn’t Near.” MIT Technology Review (October 12, 2011).
Amodei, Dario et al. “Concrete Problems in AI Safety.” arXiv:1606.06565 (June 21, 2016).
Bostrom, Nick. “When Machines Outsmart Humans.” Futures (September 2003).
Butler, Samuel. “Darwin Among the Machines.” The Press (June 13, 1863).
Chalmers, David J. “The Singularity.” Journal of Consciousness Studies (2010).
Christiano, Paul et al. “Deep Reinforcement Learning From Human Preferences.” arXiv:1706.03741 (June 12, 2017).
Good, Irving John. “Speculations Concerning the First Ultraintelligent Machine.” Advances in Computers (1965).
Kelly, Kevin. “Thinkism.” The Technium (September 29, 2008).
Muehlhauser, Luke and Nick Bostrom. “Why We Need Friendly AI.” Think (Spring 2014).
Omohundro, Stephen M. “The Basic AI Drives.” Pei Wang, Ben Goertzel, and Stan Franklin eds., Artificial General Intelligence 2008: Proceedings of the First AGI Conference (2008).
Searle, John R. “Minds, Brains, and Programs.” The Behavioral and Brain Sciences (1980).
Turing, Alan. “Intelligent Machinery, A Heretical Theory.” The ’51 Society, BBC (c. 1951).
Urban, Tim. “The Road to Superintelligence.” Wait But Why (January 22, 2015).
Urban, Tim. “The AI Revolution.” Wait But Why (January 27, 2015).
Vinge, Vernor. “Technological Singularity.” Whole Earth Review (Winter 1993).
Wiener, Norbert. “Some Moral And Technical Consequences of Automation.” Science (May 6, 1960).
Diseases and Genetic Engineering
Mann, Charles C. “1491.” The Atlantic (March 2002).
Nanotechnology
Feynman, Richard P. “There’s Plenty of Room at the Bottom.” Lecture at the California Institute of Technology (December 29, 1959).
Joy, Bill. “Why the Future Doesn’t Need Us.” Wired (April 2000).
Kurzweil, Ray. “The Law of Accelerating Returns.” Kurzweil Accelerating Intelligence (March 7, 2001).
Phoenix, Chris and Eric Drexler. “Safe Exponential Manufacturing.” Nanotechnology (June 2004).
Smalley, Richard E. “Of Chemistry, Love and Nanobots.” Scientific American (September 2001).
Fermi Paradox/Doomsday Argument
Brin, David. “The ‘Great Silence.’” Quarterly Journal of the Royal Astronomical Society (1983).
Gott, J. Richard III. “Implications of the Copernican Principle for Our Future Prospects.” Nature (May 27, 1993).
Hanson, Robin. “The Great Filter—Are We Almost Past It?” Unpublished working paper (September 15, 1998).
Hawking, Stephen. “Life in the Universe.” Lecture (1996).
Forecasting
Kent, Sherman. “Words of Estimative Probability.” Studies in Intelligence (Fall 1964).
Mellers, Barbara et al. “Psychological Strategies for Winning a Geopolitical Forecasting Tournament.” Psychological Science (May 2014).
Mellers, Barbara et al. “Identifying and Cultivating Superforecasters as a Method of Improving Probabilistic Predictions.” Perspectives on Psychological Science (May 2015).
Popper, Karl. “An Approach to the Problem of Rationality and the Freedom of Man.” Lecture at the University of Washington (April 21, 1965).
Nuclear War
Baum, Seth et al. “A Model for the Probability of Nuclear War.” Global Catastrophic Risk Institute Working Paper 18-1 (March 13, 2018).
Baum, Seth and Anthony Barrett. “A Model for the Impacts of Nuclear War.” Global Catastrophic Risk Institute Working Paper 18-2 (April 2018).
Brown, Harold and John Deutch, “The Nuclear Disarmament Fantasy.” The Wall Street Journal (November 19, 2007).
Robock, Alan et al. “Nuclear Winter Revisited With a Modern Climate Model and Current Nuclear Arsenals.” Journal of Geophysical Research (July 6, 2007).
Sagan, Carl. “Nuclear War and Climatic Catastrophe.” Foreign Affairs (Winter 1983/84).
Shultz, George P. et al. “A World Free of Nuclear Weapons.” The Wall Street Journal (January 4, 2007).
Toon, Owen B. et al. “Environmental Consequences of Nuclear War.” Physics Today (December 2008).
Planetary Environment/Resource Use
Barnosky, Anthony D. et al. “Has the Earth’s Sixth Mass Extinction Already Arrived?” Nature (March 2011).
Brand, Stewart. “How Slums Can Save the Planet.” Prospect (January 27, 2010).
Cronon, William. “The Trouble With Wilderness; or, Getting Back to the Wrong Nature.” William Cronon ed., Uncommon Ground (1995).
Crutzen, Paul J. “Geology of Mankind.” Nature (January 2002).
Diamond, Jared. “The Last Americans.” Harper’s Magazine (June 2003).
Ellis, Erle. “The Planet of No Return.” Breakthrough Journal (Fall 2011).
Ellis, Erle. “Overpopulation Is Not the Problem.” The New York Times (September 13, 2013).
Krausmann, Fridolin et al. “Global Human Appropriation of Net Primary Production Doubled in the 20th Century.” Proceedings of the National Academy of Sciences (June 2013).
Kolbert, Elizabeth. “Enter the Age of Man.” National Geographic (March 2011).
McKibben, Bill. “Global Warming’s Terrifying New Math.” Rolling Stone (July 2012).
Mann, Charles C. “State of the Species.” Orion Magazine (November/December 2012).
Naam, Ramez. “The Limits of the Earth, Part 1.” Scientific American (April 17, 2013).
Naam, Ramez. “The Limits of the Earth, Part 2.” Scientific American (April 18, 2013).
Rockström, Johan et al. “A Safe Operating Space for Humanity.” Nature (September 24, 2009).
Smil, Vaclav. “Moore’s Curse and the Great Energy Delusion.” The American (November 19, 2008).
Political, Economic, and Social Issues
Brin, David. “The Transparent Society.” Wired (April 12, 1996).
Huebner, Jonathan. “A Possible Declining Trend for Worldwide Innovation.” Technological Forecasting and Social Change (October 2005).
Keynes, John Maynard. “Economic Possibilities for Our Grandchildren.” Essays in Persuasion (1931).
Resilience and Recovery
Dartnell, Lewis. “Out of the Ashes.” Aeon (April 13, 2015).