Echolocation is not just a skill that dolphins or bats possess. Believe it or not, humans can also 'see with sound', and it's surprisingly easy for people to learn.
Becoming an expert is another matter.
The best echolocators among us can use the clicks of their mouths or the taps of their canes to create an astonishingly accurate mental map of their surroundings, even in the absence of vision.
The information they gather from sound alone can reveal not just the location of surrounding objects, but also their size, distance, shape, and material.
A new experiment has now provided the first "fine-grained account" of how the human brain pulls this off.
The findings suggest that with each returning echo, the central nervous system gradually builds and refines its picture of the surrounding space, homing in on the details.
In other words, the brain doesn't rely on a single echo to perceive and navigate an environment, but on a symphony of returning sounds. What's more, other research has shown that the brain calls upon visual pathways, as well as auditory ones, to decipher these cues.
The research was conducted by neuroscientists at the Smith-Kettlewell Eye Research Institute, a nonprofit in San Francisco, California. It compared four expert echolocators to 21 sighted participants with no experience in echolocation.
In each session, participants were fitted with EEG caps to measure their brain activity. They then sat in a dark room and listened to sequences of up to 11 synthetic clicking sounds. Each click was followed by a simulated echo, mimicking the sound bouncing off a virtual object in the room.
Based on the echoes alone, participants had to determine whether this virtual object was located somewhere to their left or to their right.
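The article does not specify which acoustic cue the simulated echoes carried, but a standard cue for left-versus-right location is the interaural time difference (ITD): an echo from an off-center object reaches one ear slightly before the other. As a minimal illustrative sketch (not the study's actual stimulus code), the Woodworth far-field approximation shows how ITD grows with the object's angle from the midline; the function name and the default head radius are assumptions for illustration.

```python
import math

def interaural_time_difference(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Woodworth far-field approximation of the interaural time difference (in seconds)
    for a sound source at the given azimuth (0 = straight ahead)."""
    theta = math.radians(azimuth_deg)
    # Path difference around a spherical head: r * (theta + sin(theta))
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# ITD rises with azimuth, so more lateral objects give a stronger left/right cue
for azimuth in (5, 15, 45):
    print(f"{azimuth:>2} deg -> {interaural_time_difference(azimuth) * 1e6:.0f} microseconds")
```

Consistent with the study's finding that laterally placed objects took fewer clicks to locate, the cue is largest well off the midline: at 45 degrees the time difference is several times what it is at 5 degrees.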

Just as the researchers suspected, the expert echolocators were significantly better at figuring out where the virtual object was, scoring above chance every time.
Sighted participants, meanwhile, guessed at rates no better than chance: around 50 percent.
Still, it was the three expert echolocators who had become blind early in life who scored the best by far. These three individuals correctly located the virtual object more than 70 percent of the time, even after hearing only a few clicks.
The findings suggest that early blindness may foster an enhanced sensitivity to sound. Interestingly, the farther the virtual object sat to a participant's right or left, the fewer clicks it took them to locate it. The best angle for the human brain was about 45 degrees from the midline.
The study authors also found that each returning sound stimulated the brain's spatial networks faster than the last. This may reflect how sensory information is rapidly extracted, integrated, and refined into a coherent picture.
The study is small, but it aligns with broader evidence suggesting that when vision is lacking, the brain may become more attuned to spatial acoustic cues.
In two expert echolocators with early-onset blindness, there was a "steep improvement" between the seventh and eighth clicks.
This suggests their "perceptual system effectively integrates echoacoustic features over time, then plateaus or saturates as ceiling performance is reached."
The study is among the first to use EEG recordings to explore how the human brain processes echolocation information from click to click. While more research is needed to understand the skill, this experiment "showcases the remarkable flexibility of the brain's perceptual systems in the absence of vision."
The brain's plasticity is not to be underestimated.
The study was published in eNeuro.
