A team of Russian physicists has figured out how to keep a key component in light-based computers from overheating, which means one of the biggest obstacles standing between us and processing data at the speed of light might have just been overcome.

The simple act of replacing electrons with light particles (photons) in our microprocessors would not only result in computers that run tens of thousands of times faster, but would also solve a very big problem that affects us all - we've just about hit the limit for how fast electrons can travel between the processor and the memory.

Known as the von Neumann bottleneck, this problem means there's no point developing faster processors for electron-based computer systems if we've already hit the limit for how fast information can be transported to and from the memory. We need to completely rethink the system, and that's where quantum computers (which replace bits with qubits) and light-based computers (which replace electrons with photons) come in.

While the idea of replacing electrons with photons sounds pretty simple, actually making it happen is anything but. As we explained back in September, while running current computers on light instead of electricity would effectively speed up the rate at which we could transmit data, silicon chips still require the photons to be converted back to electrons in order to be processed. 

This means everything would be slowed back down again, and the system would consume a whole lot of extra energy during the conversion process, which makes it even less efficient than if we'd just used electrons in the first place.

So we need to rebuild our computers from the ground up to handle photons, that much is clear, and the likes of IBM, Intel, HP, and the US Department of Defense are currently investing billions of dollars in developing the 'optoelectronic chips' required. These chips compute electronically, but use light to move information. 

If you've ever seen a microchip up close, you'll know they're composed of all kinds of tightly wound channels along which the electrons travel. The problem with building a photon-compatible version of this is that it's extremely difficult to get light to travel around bends. The answer? Plasmonic components, "which take advantage of the unique oscillating interactions of photons and electrons on the surface of metal", Patrick Tucker explains over at Defense One.

Sounds good, right? But once again, it's not that simple. The wavelength of light is approximately 1 micrometre (1,000 nanometres), but we're close to making transistors as small as 10 nanometres. So we have two options: transmit lightwaves 'as is' and destroy any efficiency gains by requiring enormous components, or confine the light into nanoscale surface waves known as surface plasmon polaritons (SPPs).
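To put that mismatch in numbers, here's a quick back-of-the-envelope comparison using the two figures above - a minimal sketch, nothing more:

```python
# Rough scale comparison between a lightwave and a modern transistor feature.
# Both values come straight from the figures quoted above.
wavelength_nm = 1000     # ~1 micrometre, the approximate wavelength of light
feature_size_nm = 10     # the smallest transistor features we're approaching

mismatch = wavelength_nm / feature_size_nm
print(f"A lightwave is ~{mismatch:.0f}x wider than a 10 nm transistor feature")
# -> about 100x, which is why the light has to be squeezed into
#    nanoscale surface plasmon polaritons to fit on a chip.
```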

We can do all of this already, but in the process, the plasmonic components will experience temperature increases of around 100 Kelvin, and basically fizzle out and die. And keeping them cool isn't as easy as simply running a fan over them. "You need a cooling system that works on the scale of the photonic chip's key features, less than a billionth of a metre in size," says Tucker. "It's one reason why many don't consider fully light-based transistors a practical possibility for decades."

In the words of George Costanza himself, "Why must there always be a problem?"

But for the first time, researchers from the Moscow Institute of Physics and Technology say they've come up with a solution. The heat is generated when the SPPs are absorbed by the metal in the components, so the Russian researchers have inserted what they call 'high-performance thermal interfaces' into the components to draw that heat away.

These interfaces are basically just layers of thermally conductive materials placed between the chip and a conventional cooling system to ensure efficient heat removal from the chip, the team explains in the journal ACS Photonics. They say this method can keep temperature increases to within 10 degrees Celsius.
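To see why a thermally conductive layer makes such a difference, here's a minimal sketch of the underlying physics - Fourier's law of heat conduction across a thin layer. The heat flux, layer thickness, and conductivity values below are assumptions chosen for illustration, not numbers from the paper:

```python
# Minimal 1-D heat-conduction sketch (Fourier's law): the temperature rise across
# a thin thermal-interface layer between a hot chip and a cooling system.
# All numerical values are illustrative assumptions, not from the ACS Photonics paper.

def temperature_rise(heat_flux_w_per_m2: float,
                     thickness_m: float,
                     conductivity_w_per_mk: float) -> float:
    """Temperature jump across a uniform layer: dT = q'' * L / k."""
    return heat_flux_w_per_m2 * thickness_m / conductivity_w_per_mk

heat_flux = 1e6          # 100 W/cm^2 = 1e6 W/m^2, a plausible chip-level heat flux
thickness = 5e-6         # a 5-micrometre interface layer (assumed)
copper_like_k = 400.0    # W/(m*K), a highly conductive interface material
polymer_like_k = 0.5     # W/(m*K), a poor conductor, for comparison

print(f"Conductive interface: {temperature_rise(heat_flux, thickness, copper_like_k):.3f} K rise")
print(f"Poor interface:       {temperature_rise(heat_flux, thickness, polymer_like_k):.1f} K rise")
# The better the interface conducts, the smaller the temperature jump between
# the chip and its cooler - the basic idea behind using thermally conductive layers.
```

The design choice is ordinary conduction physics: put a high-conductivity layer between the heat source and the cooling system, and the temperature jump across it stays small.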

It's now up to the researchers to demonstrate this working within a more complete computer system, and they've got their work cut out for them. Late last year, UK-based researchers made their own significant advances towards light-based computer technology, so it's 'game on' for everybody involved.