The University of Oxford’s Future of Humanity Institute recently held a conference on Global Catastrophic Risks. There’s an upcoming book that might be worth a read, but what I’m more excited about is that all of the conference’s lectures will soon be made available for free.
Global catastrophes have occurred many times in history, even if we count only disasters causing more than 10 million deaths. A very partial list of examples includes the An Shi Rebellion (756–763), the Taiping Rebellion (1851–1864), the famine of the Great Leap Forward in China, the Black Death in Europe, the Spanish flu pandemic, the two World Wars, the Nazi genocides, the famines in British India, Stalinist totalitarianism, and the decimation of the Native American population through smallpox and other diseases following the arrival of European colonizers. Many others could be added to this list.
Although the current and future risks are of various kinds, treating global catastrophic risk as a field of academic enquiry is a useful, coherent, and important endeavour.
Given that Sir Martin Rees, President of the Royal Society, recently estimated the chances of humanity surviving the twenty-first century at fifty-fifty, I’m inclined to agree that topics like this are worthy of academic discussion. It reminds me of an article I read about the Large Hadron Collider, in which I quoted the following from The New York Times:
One problem is that society has never agreed on a standard of what is safe in these surreal realms when the odds of disaster might be tiny but the stakes are cosmically high. In such situations, probability estimates are often no more than “informed betting odds”.