Oxygen is a fundamental part of life on Earth. Following a surge of this gas in the atmosphere roughly 2.5 billion years ago, multicellular life on our planet began to thrive.

The timing is no coincidence, yet oxygen can't take all the credit. According to some scientists, there's another element out there also crucial to this evolutionary boom, and its name is iron.

In a new review of iron's availability to life throughout our planet's history, University of Oxford Earth scientist Jon Wade and his team propose that fluctuations in this metal's supply helped to drive evolution on Earth.

Today, iron is a necessary element for virtually all life. It's what allows cells to sense oxygen, generate energy, replicate DNA, and express genes. In fact, there are only two known organisms on our planet that currently do not require this metal to survive.

In the early days of Earth, there was plenty of geological iron to go around, especially in the mantle and crust. The solid iron located here was probably 'seeded' by meteorites from outer space, and because this material could dissolve into ancient oceans, iron was also abundant in the marine environment.

Following the Great Oxidation Event (GOE), however, conditions began to change. Soluble iron began to grow scarce and competition for iron among cells increased.

Life-forms therefore had to figure out how to recycle iron from dead cells, steal iron from live cells, or live in another cell and use its iron-grabbing apparatus to stay alive.

These battles over iron are what some scientists believe first triggered multicellular evolution.

"Infection, predation, and endosymbiosis are all behaviors that switch the focus of iron acquisition from mineral sources to other life-forms, and each of the three behaviors may evolve into the others over time – for example, initially exploitative infections may become mutually symbiotic," the authors explain.

Compared to modern eukaryotes – organisms whose cells have a nucleus, a group that includes all multicellular life – older forms of single-celled life, such as bacteria and archaea, are thought to have relied more heavily on iron to survive.

This suggests modern organisms have learned to use the element more efficiently over billions of years as its availability in the environment fluctuated.

According to this new theory, Earth's oceans lost most of their soluble iron because of the rise in atmospheric oxygen. When dissolved iron encounters oxygen in water, it is rapidly oxidized into insoluble forms that are much tougher for living things to make use of.

Grabbing the element in this form required cells to evolve small, iron-scavenging organic molecules called siderophores. Today, almost all bacteria, plants, and fungi produce these molecules, but billions of years ago they represented a new survival strategy.

As life-forms with siderophores began to gather near a limited number of iron-rich geological sources, researchers think crowding inevitably led to "increasingly complex cell-cell interactions".

Archaea in the thermal springs of Yellowstone, for instance, can only really thrive on iron oxide mats. Modern eukaryotes, by contrast, can live away from such geological sources, as long as biological forms of iron are available.

"Despite the depletion of bioavailable iron, throughout the rebound of life post-GOE and its subsequent diversification (and passage through other successive mass extinction events), iron has retained its preeminence in biological systems," the authors write.

"Presumably, this is because iron has unique electrochemical properties that make possible, or make efficient, a range of biochemical processes such that other elements cannot be broadly substituted for iron within proteins without causing a significant disadvantage."

The sheer lack of a substitute for iron meant organisms had to compete, cheat, or cooperate to survive following the GOE, and these pressures could very well have driven extreme adaptations in genomes and cellular behavior over time.

When the more recent Neoproterozoic Oxygenation Event occurred, toward the end of the Neoproterozoic era roughly 800 to 540 million years ago, it only intensified these pressures.

Life on Earth may therefore have gotten its start amid an abundance of iron, but only once iron became scarce did those life-forms begin to grow in complexity.

Given that a rise in atmospheric CO2 could increase iron deficiency in the food chain, researchers say we need to know more about how life copes with the ebbs and flows of this crucial element.

The findings also suggest a way to gauge the potential for life on other planets, such as Mars, whose mantle also contains iron oxide. If a planet is rich enough in iron, it could harbor some of the simplest forms of life.

The research was published in PNAS.