
Bee Brains Could Help Your Camera Take Better Photos

It's a trick of the light.

DAVID NIELD
5 JUL 2017
 

New research on how bees perceive colour could be put to good use in our digital cameras, meaning photos shot by drones or phones would look more natural than ever.

It's all to do with colour constancy, the way that bees (and humans) can tell a flower is red no matter what the colour or quality of the light – a mental trick that the digital cameras of today really struggle with.

 

Researchers found that bees use two colour receptors in their ocelli (the three extra eyes on top of the head) to judge the colour of ambient light, in combination with the two main compound eyes, which detect flower colours directly.

"Physics suggests the ocelli sensing of the colour of light could allow a brain to discount the naturally coloured illumination which would otherwise confuse colour perception," explains lead researcher Jair Garcia from RMIT University in Australia. 

"But for this to be true the information from the ocelli would have to be integrated with colours seen by the compound eyes."

In the past it was thought bees might use some kind of chromatic adaptation, like humans, to make colour constancy corrections. It's similar to adjusting the white balance on a photo to correct for the ambient light.
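The white-balance analogy can be made concrete. Below is a minimal sketch of a von Kries-style correction in NumPy: if you know (or can measure) the colour of the illuminant, you discount it by scaling each channel. The function name, array shapes, and the example illuminant are illustrative assumptions, not part of the study.

```python
import numpy as np

def von_kries_correct(image, illuminant):
    """Discount a known illuminant by scaling each colour channel (von Kries-style)."""
    # image: H x W x 3 float array in [0, 1]; illuminant: RGB triple of the light source
    illuminant = np.asarray(illuminant, dtype=float)
    corrected = image / illuminant            # divide out the colour of the light
    return np.clip(corrected / corrected.max(), 0.0, 1.0)

# A white surface under warm evening light looks reddish to the sensor...
warm_light = np.array([1.0, 0.7, 0.5])
white_patch = np.ones((1, 1, 3)) * warm_light
# ...but dividing out the measured illuminant recovers white.
restored = von_kries_correct(white_patch, warm_light)
```

The hard part in practice is the input this sketch takes for granted: knowing the illuminant. That is exactly what the bees' ocelli appear to supply.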

What the new research suggests is that bees are doing something different: the scientists traced neural activity from the ocelli, discovering that information was passed to the key colour processing areas of the bee brain.

Those three tiny upward-facing eyes measure the light coming from the sky and can make adjustments accordingly, correctly identifying flower colours – a very important skill when you're a bee on the hunt for pollen.

 

Until now the purpose of the ocelli has been a bit of a mystery, with some suggestions that they help keep bees stable in the air.

The team then worked out the mathematical principles behind combining data from the ocelli and the compound eyes – principles that could eventually be used to program the same trick into a smartphone camera or an exploratory robot.

"For a digital system like a camera or a robot the colour of objects often changes," says one of the team, Adrian Dyer from RMIT. "Currently this problem is dealt with by assuming the world is, on average, grey."

"This means it's difficult to identify the true colour of ripe fruit or mineral-rich sands, limiting outdoor colour imaging solutions by drones, for example."
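The grey-world assumption Dyer describes can be shown failing in a few lines. This toy NumPy sketch (the scene and shapes are made up for illustration) estimates the illuminant as the image's average colour, which works only if the scene really averages to grey – a genuinely red-dominated scene gets its red wrongly washed out.

```python
import numpy as np

def grey_world_correct(image):
    """Estimate the illuminant as the mean image colour and divide it out."""
    illuminant = image.reshape(-1, 3).mean(axis=0)  # assumes the world averages to grey
    corrected = image / illuminant
    return corrected / corrected.max()

# A scene of mostly red flowers photographed under perfectly neutral light:
rng = np.random.default_rng(0)
scene = rng.uniform(size=(64, 64, 3))
scene[..., 0] *= 2.0                 # the scene itself is genuinely red-heavy
balanced = grey_world_correct(scene)
# Grey-world mistakes the scene's redness for red illumination and
# suppresses it, so the "corrected" channel means all come out equal.
```

A sensor that measures the light directly, as the ocelli do, avoids this confusion between the colour of the scene and the colour of the illumination.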

The model put forward by the scientists matches up with certain real-world behaviours, such as the way honey bees struggle to find the right flowers under artificial light.

As well as making your Facebook photos look more realistic in unusual lighting situations, this model could help robots trek through sunlight and shade without getting confused about what they're looking at.

Cameras, robots, and drones could even make use of extra ocelli-like sensors to judge ambient light conditions, suggest the researchers, and it wouldn't require all that much extra processing power.

"The discovery provides a superb solution to a classic problem and makes colour constancy computationally inexpensive," says one of the researchers, John Endler from Deakin University in Australia.

The research has been published in PNAS.
