On particularly busy days, you might feel like your brain is pushing out old information to make room for the new stuff, but it's actually got an incredible system to help it add more and more to its data banks each day - without sacrificing what's already there.

Now scientists say they've figured out a model that could explain how we're able to make new memories while keeping old ones intact – sometimes for our entire lives.

The mathematical model, developed by researchers at Columbia University, shows how different molecular clusters could act in tandem to achieve a huge storage capacity for memories. 

"The model that we have developed finally explains why the biology and chemistry underlying memory are so complex – and how this complexity drives the brain's ability to remember," said principal investigator Stefano Fusi, from Columbia University Medical Centre.

Scientists think that changes in the strength of synapses – the connections between neurons (nerve cells) – are used to store memories as new stimuli come in.

This system used to be characterised as being like a series of dials, which could be turned up or down to determine the signal strength of the synapse. 

But dials can only be turned so far in either direction - in other words, they're bounded, so new memories quickly overwrite old ones, and the memory capacity of the 'dials' model was far too limited to be a viable representation of the human brain.
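To see why that's a problem, here's a toy sketch in Python (our own illustration, not code from the study): a single bounded dial gets nudged by every new memory, and the very first nudge is soon washed out.

```python
import random

# Toy model: one synapse as a single bounded "dial" in [0, 1].
# Every new memory nudges the dial up or down, clamped to its limits.
LO, HI = 0.0, 1.0

def store(dial, nudge):
    """Apply one memory's nudge, clamped to the dial's range."""
    return max(LO, min(HI, dial + nudge))

random.seed(1)
dial = store(0.5, +0.3)              # the "old" memory we care about
for _ in range(1000):                # a flood of newer memories
    dial = store(dial, random.choice([-0.3, +0.3]))

# The final setting reflects only the most recent nudges - the old
# memory's trace has been overwritten, because the dial is bounded.
print(f"final dial setting: {dial:.2f}")
```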

Another model published in 2005 suggested that each synapse could hold several of these dials, but even that hypothetical framework wasn't enough to explain your brain's huge capacity.
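In the same toy terms (again our illustration, not the 2005 model's actual mathematics), you can give each synapse several dials that learn at different speeds – and the old memory still gets swamped:

```python
import random

rates = [0.5, 0.1, 0.02]            # one learning rate per dial: fast to slow
LO, HI = -1.0, 1.0

def store(dials, nudge):
    """Each dial takes the nudge at its own rate, clamped to its range."""
    return [max(LO, min(HI, d + r * nudge)) for d, r in zip(dials, rates)]

random.seed(1)
dials = store([0.0, 0.0, 0.0], +1.0)   # the "old" memory
for _ in range(1000):                  # newer memories pile in
    dials = store(dials, random.choice([-1.0, +1.0]))

# Even the slow dial's faint trace of the old memory is buried under the
# random walk of newer nudges - without communication between the dials,
# capacity is still limited.
print([round(d, 3) for d in dials])
```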

A new model of how memories are formed. Credit: Lab of Stefano Fusi

Now, Fusi and his fellow researchers are proposing that these dials not only operate on different timescales – they're also constantly communicating with each other.

"Once we added the communication between components to our model, the storage capacity increased by an enormous factor, becoming far more representative of what is achieved inside the living brain," explained team member Marcus Benna.

An easy way of imagining how the new model operates is to think of the system as a series of beakers connected by tubes. As liquid flows into the first beaker – a new memory arriving – it gradually levels out through the tubes into the others, and it's those slower-changing levels that act as the brain's long-term storage.
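Here's a rough simulation of that beaker picture in Python (a sketch with our own illustrative parameters – the paper's actual equations and constants differ). Liquid poured into the small, fast first beaker gradually seeps into the bigger, slower ones, where a trace of it lingers long after the first beaker has levelled off:

```python
import numpy as np

m = 5
sizes = 2.0 ** np.arange(m)        # beakers get wider down the chain (slower to fill)
tubes = 0.5 ** np.arange(m - 1)    # tubes get narrower down the chain (slower flow)

def step(u, inflow, dt=0.1):
    """One time step: pour `inflow` into beaker 0, then let the tubes equalise."""
    du = np.zeros_like(u)
    du[0] += inflow
    flow = tubes * (u[:-1] - u[1:])    # flow from each beaker into the next
    du[:-1] -= flow
    du[1:] += flow
    return u + dt * du / sizes         # wide beakers change their level slowly

u = np.zeros(m)                        # liquid levels = components of one synapse
u = step(u, inflow=5.0)                # store one memory...
for _ in range(500):
    u = step(u, inflow=0.0)            # ...then let time pass

print(np.round(u, 3))  # the memory's trace now sits in the slow, deep beakers
```

The important design choice is that the deep beakers barely move, so the trace of an old memory survives there even while the fast first beaker keeps absorbing new ones.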

According to the researchers, this helps to explain why the brain doesn't run out of room as we learn new skills and take in new memories.

With around 86 billion neurons in the average brain, it's estimated that the brain could store around a petabyte of data, though talking about it in computer terms isn't really fair or accurate – as our brains are much more complex than that.
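For a sense of where a number like that comes from, here's a back-of-envelope calculation (the synapse count and per-synapse storage are our own assumed round figures – real estimates vary widely):

```python
neurons = 86e9                  # roughly 86 billion neurons
synapses_per_neuron = 10_000    # assumed; often quoted as 1,000-10,000
bytes_per_synapse = 1           # assumed; real synapses aren't neat bytes

total_bytes = neurons * synapses_per_neuron * bytes_per_synapse
print(f"{total_bytes / 1e15:.2f} petabytes")  # ~0.86, i.e. about a petabyte
```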

But while we can't realistically measure human memory in petabytes, the researchers think the new model they've come up with could boost our attempts to mimic the complexity of synapse functioning in what's called neuromorphic hardware: computers designed to imitate a biological brain.

"Today, neuromorphic hardware is limited by memory capacity, which can be catastrophically low when these systems are designed to learn autonomously," said Fusi.

"Creating a better model of synaptic memory could help to solve this problem, speeding up the development of electronic devices that are both compact and energy efficient – and just as powerful as the human brain."

Details of the new model have been published in Nature Neuroscience.