Should You Lose Sleep Over Global Catastrophic Risks?
I’ve finished the Bostrom-Cirkovic edited volume on Global Catastrophic Risks. The book is a fun read, but it ultimately failed to scare me – and I’m the author of one of the chapters! Out of a long list of conceivable horrors – everything from asteroids and super-volcanism to AI and nanotech – the only threat that gives me pause is nuclear war. As for the rest, it’s not clear that any of them are a greater threat to my life or yours than airplane travel.
I don’t mean to say that these risks aren’t worth thinking about. If you use a standard $10M/life value, then the expected cost of exposing seven billion people to a one-in-a-million lifetime risk is $70B. If a book chapter has a one-in-a-million chance of eliminating that risk, that’s a $70,000 piece of scholarship – not bad for a couple months’ research. Nevertheless, no sensible person loses sleep over a one-in-a-million lifetime threat to his own life, or the lives of the people he cares about. Life’s too short.
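The back-of-the-envelope arithmetic above can be checked directly. A minimal sketch (the constant names are my own, not the author's):

```python
# Expected-value calculation from the post, using the post's stated figures.
VALUE_PER_LIFE = 10_000_000    # standard $10M value of a statistical life
POPULATION = 7_000_000_000     # roughly seven billion people exposed
LIFETIME_RISK = 1e-6           # a one-in-a-million lifetime risk

# Expected cost of exposing everyone to the risk: $70 billion.
expected_cost = VALUE_PER_LIFE * POPULATION * LIFETIME_RISK

# A chapter with a one-in-a-million chance of eliminating that risk
# is worth the expected cost times that probability: $70,000.
CHANCE_CHAPTER_HELPS = 1e-6
chapter_value = expected_cost * CHANCE_CHAPTER_HELPS

print(f"Expected cost of the risk: ${expected_cost:,.0f}")
print(f"Expected value of the chapter: ${chapter_value:,.0f}")
```

Both numbers match the post: $70B for the risk, $70,000 for the chapter.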
Take my chapter on “the totalitarian threat.” The establishment of a world totalitarian order would be very bad, and it’s probably more likely than most people think. As I explain:
How seriously do I take the possibility that a world totalitarian government will emerge during the next one thousand years and last for a thousand years or more? Despite the complexity and guesswork inherent in answering this question, I will hazard a response. My unconditional probability – i.e., the probability I assign given all the information I now have – is 5%.
In fact, I argue that people who turn to world government to handle “global catastrophic risks” could easily be paving the way for the greater threat of totalitarianism:
But one of the main lessons of the history of totalitarianism is that moderation and inaction are underrated. Few problems turned out to be as “intolerable” as they seemed to people at the time, and many “problems” were better than the alternative. Countries that “did nothing” about poverty during the twentieth century frequently became rich through gradual economic growth. Countries that waged “total war” on poverty frequently not only choked off economic growth, but starved.
Along these lines, one particularly scary scenario for the future is that overblown doomsday worries become the rationale for world government, paving the way for an unanticipated global catastrophe: totalitarianism. Those who call for the countries of the world to unite against threats to humanity should consider the possibility that unification itself is the greater threat.
Nevertheless, I’ve never lost a night’s sleep over the possibility that crusades against global warming and asteroids will end in totalitarianism. From a world-historic perspective, this scenario is worth thinking about. But if my chapter makes you fear for your future, even I’d say you’re taking me too seriously.
Aug 2 2008 at 7:48pm
Is there any problem that is worth losing sleep over? I mean problems are worth thinking about, and protecting against, but it is hard to see that losing sleep helps.
Aug 2 2008 at 10:15pm
“Nevertheless, no sensible person loses sleep over a one-in-a-million lifetime threat to his own life, or the lives of the people he cares about. Life’s too short.”
But that is the point! Many of the authors and people associated with the conference and the book hope to live a lot longer through life extension and cryonics. It is a small step from radical life extension to concerns about the long-term future of the world. This context should be obvious.
Aug 3 2008 at 3:31am
I see AI death and nanotech death, at least, as orders of magnitude more probable than airplane travel death; I wonder what’s causing you to dismiss these risks. Moreover, human civilization dying is bad not just because it kills all of us but because it prevents a potentially billions-of-years-long future.
Aug 4 2008 at 7:28am
I’m way in Robin’s camp on this. And as the other commenters in this thread hint at, Prof. Caplan, if you accept death, why are you trying to change anything about our reality? Why not accept irrational voting? Why not accept loss of sleep? Are these arbitrary choices you’re making, arbitrary aesthetics you’re promoting on your journey to being worm food? I’m not being derisive, I’m genuinely curious. Because Robin I relate to, even if I don’t agree with him on everything. Your perspective baffles me more, starting on these first principles.
Aug 5 2008 at 12:02pm
Al Gore et al. are asking for the world economy to be revamped in their image.
Asteroid watchers are asking for a few tens of millions of dollars for some telescopes, computers, and gigapixel CCDs.
Aug 5 2008 at 12:08pm
I don’t deny that the asteroid risk assessors might ask for expensive space hardware, but only after their inexpensive equipment has found a substantial risk from a specific asteroid. Nobody is asking for expensive space hardware for the 40-per-million risks we already know about.