Let's face it, one day the world as we know it is going to end. Sure, it might only happen in roughly 6 billion years, when our Sun reaches its peak size in the red giant stage, vaporises our planet's atmosphere, and strips away the crust and the mantle before swallowing up the leftover core.

But we probably won't be around for the centuries this devastating process will take. Chances are something will wipe humanity out - and perhaps even destroy all life on Earth - much sooner than the Sun gets around to it.

Earlier this year, researchers from Sweden's Global Challenges Foundation, in collaboration with the University of Oxford's Future of Humanity Institute, published a report outlining 12 global risks with potentially "infinite impact". The authors define such impact as "the end of human civilisation or even human life" - so, for all intents and purposes, we can call these scenarios apocalyptic.

Among these global risks are things like a global pandemic, ecological collapse, nuclear war, major asteroid impact, and climate change. The calculations that go into this type of research are tricky, and depend on so many unknowns that we simply can't predict with absolute certainty what humanity's downfall is going to look like. But it is possible to estimate probabilities, which the researchers did by compiling already available estimates covering the next 100 years (or 200 years, in the case of climate change).

So, what's the most likely thing to kill us all?

According to the Global Challenges report, over the next 100 years there's a one in 10 chance we will develop artificial intelligence systems that would get rid of mankind simply because AIs don't need humans to thrive. Nothing personal, then. However, it's not so simple - in recent years, AI experts have also predicted that a superintelligent computer could solve more problems than it would create, if we play our cards right.

"This makes extremely intelligent AI's a unique risk, in that extinction is more likely than lesser impacts," states the report. "On a more positive note, an intelligence of such power could easily combat most other risks in this report, making extremely intelligent AI into a tool of great potential."

Therefore, a killer robot overlord is at once the most likely scenario and not very likely at all.

But there are other apocalypses with great potential. According to the report, there's a 5 percent chance that in the next 100 years, humans will be wiped out by a global pandemic or a nuclear war. And, if we don't do something about it, over the next 200 years there's a 5 percent chance that the impacts of climate change will do us in.

It's also worth noting that, for the purposes of this report, the authors excluded the kinds of external impacts we couldn't do anything about - such as nearby gamma-ray bursts setting our planet on fire (although these don't have a high probability anyway).

By the way, if you're wondering about asteroid impacts, the probability is only one in 10,000 - at one in 20, a deadly pandemic is some 500 times more likely, so a supervirus will probably destroy us all long before a 5-km space rock gets a chance to dent our planet.

In the end, as astronomer Phil Plait wrote in his 2008 book Death from the Skies, the stuff we can't do anything about is not worth fretting over:

"It seems as if the whole cosmos is trying to snuff us out. In a sense, it is - there's danger aplenty in the Universe - but we have to take a practical view here. We have to appreciate the vastness of space and time, and our ability to manipulate events around us."

Besides, there are also things we can't even estimate the probability of - like an alien invasion.