Had you looked up at the sky roughly 2 million years ago, you would have seen a star die in a spectacular blaze of glory.

It's long been debated whether this supernova explosion was close enough to affect life on Earth, and now physicists have shown that while it most likely wouldn't have triggered mass extinctions, it would have been a pretty bad day for Earthlings.

The new study also updates the distance at which a supernova could be deadly for life on Earth – it was previously thought a supernova would have to be within about 25 light-years to trigger mass extinctions, but the new paper suggests even one 50 light-years away could be deadly.

Back in 2016, scientists announced they'd discovered traces of the isotope iron-60 in ancient ocean sediments and lunar soil, confirming a series of supernovae that lit up the sky between 3.2 and 1.7 million years ago.

Rough estimates put the supernovae at around 100 parsecs, or roughly 330 light-years away, suggesting they would have been visible during the day and about as bright as the Moon.

Since then, follow-up studies pretty much cut that distance in half, putting the dying stars about 60 parsecs, or 195 light-years away at the time.
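For anyone keeping track of the unit juggling, the parsec figures map onto light-years with a single constant (1 parsec is about 3.26 light-years). A quick sketch of the arithmetic behind the numbers above:

    # Rough parsec-to-light-year conversion behind the distances quoted above.
    LIGHT_YEARS_PER_PARSEC = 3.2616  # 1 parsec is about 3.26 light-years

    def parsecs_to_light_years(parsecs: float) -> float:
        """Convert a distance in parsecs to light-years."""
        return parsecs * LIGHT_YEARS_PER_PARSEC

    print(parsecs_to_light_years(100))  # ~326 ly – the original "roughly 330 light-years" estimate
    print(parsecs_to_light_years(60))   # ~196 ly – the revised "about 195 light-years" estimate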

University of Kansas researcher Adrian Melott wondered what this closer series of supernovae might have meant for life on Earth.

"The timing estimates are still not exact, but the thing that changed to cause us to write this paper is the distance. We did this computation because other people did work that made a revised distance estimate, which cut the distance in half," said Melott.

Supernovae occur when massive stars run out of fuel and collapse, resulting in a surge of energy that blasts a shock wave of radiation and particles across interstellar space.

Space is pretty damn big, so our Solar System rarely gets close enough to such awesome stellar events for that radiation and shower of high-speed particles to be a problem for delicate biochemistry on our planet's surface.

But what exactly is close enough?

Melott and a team of researchers considered current estimates of how close a supernova would need to be for Earth's biosphere to fall within its 'kill zone', and argued we might want to expand that zone a little.

"People estimated the 'kill zone' for a supernova in a paper in 2003, and they came up with about 25 light-years from Earth," said Melott.

"Now we think maybe it's a bit greater than that. They left some effects out or didn't have good numbers, so now we think it may be a bit larger distance."

Keeping in mind that the deaths would come from a gradual rise in cosmic ray exposure, Melott and his colleagues now think a supernova even 40 to 50 light-years away would probably cause some serious carnage.
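For a rough sense of scale – and this is just the textbook inverse-square falloff of radiation with distance, not the team's actual calculation – doubling the distance from 25 to 50 light-years cuts the raw flux reaching Earth to about a quarter:

    # A minimal sketch of inverse-square scaling, not the paper's model:
    # radiation flux from a point source falls off as 1 / distance^2.

    def relative_flux(distance_ly: float, reference_ly: float = 25.0) -> float:
        """Flux at distance_ly relative to the flux at reference_ly."""
        return (reference_ly / distance_ly) ** 2

    print(relative_flux(50))  # 0.25 – a supernova at 50 ly delivers ~1/4 the flux of one at 25 ly

In other words, pushing the kill zone out to 50 light-years amounts to saying that a quarter of the previously assumed 'lethal' flux can still do serious damage once the left-out effects are included.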

Even with that bigger bubble of death and the halved distance estimate, the supernovae that took place around 2.6 million years ago still wouldn't have been close enough to give rise to any mass extinctions.

Fossils happen to back this up. Melott's colleagues dug into Africa's fossil record, on account of the region being so geologically stable at the time of the supernovae, and found no solid evidence of a widespread die-off.

"There isn't a mass extinction, but there is kind of a lot of extinction going on at that time and species turnover," said Melott.

How much is due to a changing climate and how much to an increase in cosmic rays is kind of hard to tell.

So now we have some supernovae a few million years ago, just 200-odd light-years away, that didn't cause a bunch of stuff to die. To get a better idea of what headaches – if any – the exploding stars would have caused, the researchers considered the mechanisms behind a spreading wave of deadly particles and radiation.

It turns out this is a little more complicated than we might first assume – far from being a simple tsunami of gamma rays and charged particles cranked up to a fraction of light speed, the outcome depends on the magnetic fields between us and the star, which can deflect or funnel those charged cosmic rays.

"If there's a magnetic field, we don't know its orientation, so it can either create a superhighway for cosmic rays, or it could block them," said Melott.

Melott and his team assumed the supernova created its own 'bubble' of field lines, which Earth would fall inside as it expanded.

Far from providing a 'highway', those magnetic fields would be wiggling like a bowl of spaghetti.

Or as Melott put it: "The best analogy I can think of is more like off-road driving."

The results wouldn't have been deadly, but they might still have been spectacular. Cosmic rays striking our atmosphere would still have shed a weak, blue glow visible at night, potentially affecting the sleep cycles of some diurnal animals.

Elementary particles would also have penetrated down through the troposphere, with some making it to the ground, where they could give all life a radiation dose equivalent to a couple of CT scans.

Not enough to be deadly, but a number of unfortunate mammoths and sloths could have found themselves with an extra tumour or two as a result.
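To put "a couple of CT scans" in everyday terms – using typical published dose figures assumed here for illustration, not numbers from the paper – the extra exposure works out to a few years' worth of normal background radiation delivered on top of the usual amount:

    # Back-of-the-envelope dose comparison; the figures below are typical
    # published averages assumed for illustration, not values from the study.
    TYPICAL_CT_SCAN_MSV = 7.0      # a single CT scan is often quoted at roughly 5-10 mSv
    ANNUAL_BACKGROUND_MSV = 3.0    # average yearly background radiation dose

    extra_dose = 2 * TYPICAL_CT_SCAN_MSV  # "a couple of CT scans"
    print(extra_dose / ANNUAL_BACKGROUND_MSV)  # ~4.7 – several years' worth of background radiation

Noticeable, in other words, but far below the sievert-scale doses associated with acute radiation sickness.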

Lastly, cascades of particle interactions in the atmosphere are thought to promote conditions ripe for lightning, which could have meant more strikes and more frequent fires.

The research is currently available on the pre-print server arXiv.org, and is due to be published in The Astrophysical Journal.

One of the closest stars that could go supernova any time soon, a red supergiant called Betelgeuse, is about 650 light-years away – so should we warn people to start digging bunkers?

"I tell them they should worry about global warming and nuclear war, not this stuff," said Mellett