An existential risk is one that threatens the existence of our entire species. The Cambridge Centre for the Study of Existential Risk (CSER) — a joint initiative between a philosopher, a scientist, and a software entrepreneur — was founded on the conviction that these risks require a great deal more scientific investigation than they presently receive. CSER is a multidisciplinary research centre dedicated to the study and mitigation of risks that could lead to human extinction.
Our goal is to steer a small fraction of Cambridge’s great intellectual resources, and of the reputation built on its past and present scientific pre-eminence, to the task of ensuring that our own species has a long-term future.
CSER is now hosted within Cambridge’s Centre for Research in the Arts, Social Sciences and Humanities (CRASSH), under the management of Dr. Seán Ó hÉigeartaigh. We are currently funded by a seed donation from founder Jaan Tallinn, and are seeking sources of funding for a number of planned research projects. We welcome enquiries and offers of support — please see our News & Contact page for contact details and a sign-up link for our new mailing list, CSER News.
Source: Who We Are | CSER
The Centre for the Study of Existential Risk (CSER) is a research centre at the University of Cambridge, intended to study possible extinction-level threats posed by present or future technology. The co-founders of the centre are Huw Price (a philosophy professor at Cambridge), Martin Rees (a cosmologist, astrophysicist, and former President of the Royal Society) and Jaan Tallinn (a computer programmer and co-founder of Skype). CSER's advisors include philosopher Peter Singer, computer scientist Stuart J. Russell, statistician David Spiegelhalter, and cosmologists Stephen Hawking and Max Tegmark. Their "goal is to steer a small fraction of Cambridge’s great intellectual resources, and of the reputation built on its past and present scientific pre-eminence, to the task of ensuring that our own species has a long-term future."
Areas of focus
The centre's founding was announced in November 2012. Its name stems from Oxford philosopher Nick Bostrom's concept of existential risk: a risk "where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential". This includes technologies that might permanently deplete humanity's resources or block further scientific progress, in addition to ones that put the species itself at risk.
Among the global catastrophic risks to be studied by CSER are those stemming from possible future advances in artificial intelligence. The potential dangers of artificial general intelligence were highlighted in early discussions of CSER, and were likened in some press coverage to a robot uprising à la The Terminator. Price explained to the AFP news agency, "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology". He added that when this happens "we're no longer the smartest things around," and humanity will risk being at the mercy of "machines that are not malicious, but machines whose interests don't include us". Price has also described synthetic biology as dangerous because "[as a result of] new innovations, the steps necessary to produce a weaponized virus or other bioterror agent have been dramatically simplified", with the consequence that "the number of individuals needed to wipe us all out is declining quite steeply."
CSER has received coverage in many newspapers, particularly in the United Kingdom, addressing a range of topics. University of Cambridge Research News' coverage of the centre focuses on risks from artificial general intelligence.
- Biba, Erin (1 June 2015). "Meet the Co-Founder of an Apocalypse Think Tank". Scientific American. Retrieved 2 July 2016.
- Lewsey, Fred (25 November 2012). "Humanity's last invention and our uncertain future". Research News. Retrieved 24 December 2012.
- "About CSER". Centre for the Study of Existential Risk.
- Connor, Steve (14 September 2013). "Can We Survive?". The New Zealand Herald.
- "Cambridge to study technology's risk to humans". AP News. 25 November 2012.
- Bostrom, Nick (2002). "Existential Risks: Analyzing Human Extinction Scenarios" (PDF). Journal of Evolution and Technology. 9 (1). Retrieved 27 March 2014.
- Gaskell, Adi (27 November 2012). "Risk of a Terminator Style Robot Uprising to be Studied". Technorati. Retrieved 2 December 2012.
- Naughton, John (2 December 2012). "Could robots soon add to mankind's existential threats?". The Observer. Retrieved 24 December 2012.
- Hui, Sylvia (25 November 2012). "Cambridge to study technology's risks to humans". Associated Press. Retrieved 30 January 2013.
- Paramaguru, Kharunya (29 November 2012). "Rise of the machines: Cambridge University to study technology’s ‘existential risk’ to mankind". Time. Retrieved 2 May 2014.
- "Biological and Biotechnological Risks". Centre for the Study of Existential Risk. Retrieved 29 May 2015.
- "Molecular nanotechnology". Centre for the Study of Existential Risk. Retrieved 4 May 2014.
- "Extreme Climate Change". Centre for the Study of Existential Risk. Retrieved 29 May 2015.
- Osborne, Hannah (13 September 2013). "Doomsday list for apocalypse: bioterrorism, cyber-attacks and hostile computers threaten mankind". International Business Times. Retrieved 2 May 2014.
- "Systemic risks and fragile networks". Centre for the Study of Existential Risk. Retrieved 2 May 2014.
- "CSER media coverage". Centre for the Study of Existential Risk. Retrieved 19 June 2014.
Risks from artificial intelligence