Many species in the animal kingdom are naturally skilled echolocators as a matter of survival. These animals have highly developed biological sonar: they emit auditory signals, receive the reflected echoes, and use them to make perceptual sense of their environment. This includes some birds, rodents, cetaceans, and of course, bats.
Humans have evolved to rely on vision as the primary sensory system for perceiving the world. However, some individuals with nonfunctional visual systems have demonstrated the possibility of sensory substitution. They can develop refined auditory and vestibular systems that let them navigate the external environment using auditory feedback, or echolocation.
A white cane is the universal symbol of access for the blind and visually impaired, and a tool for helping individuals travel through their environment. Some individuals practice echolocation by tapping the cane against the ground, or learn to use tongue or finger clicks for auditory feedback. Expert echolocators can identify the physical properties of objects, describing their texture and whether they are hard, soft, or dense. Daniel Kish, an expert echolocator who has been blind since age 2, can even navigate while riding a bicycle using echolocation.
The outer ear collects reflected sound and directs it to the middle and inner ear, where vibrations are produced that propagate along the basilar membrane of the cochlea, a spiraled, conical chamber of bone. There, hair cells tuned to respond to particular frequencies become stimulated and send nervous impulses to the auditory central nervous system. Specialized ganglia in the brain stem and the auditory cortex then interpret these frequency-modulated signals.
Recently at the Canadian Association of Neuroscience meeting, Dr. Mel Goodale of the University of Western Ontario's Brain and Mind Institute presented research showing that expert echolocators use the same area of the brain as sighted individuals to make perceptual sense of their environment. He used functional magnetic resonance imaging (fMRI) to compare brain activity across three groups: sighted individuals, blind non-echolocators, and blind echolocators. The echolocators used clicks to gauge different materials, while the non-echolocators, whether visually impaired or blindfolded, were encouraged to try echolocating by emitting auditory signals. Only the blind echolocators showed activation in the parahippocampal cortex (PHC), the brain region associated with scene perception in sighted individuals. This is not surprising, as the echolocators were the only group able to extract any information from the auditory cues, and thus the only group with the functional integration of neural networks required for perception.
The expert echolocators were also capable of identifying the true physical size of objects, independent of viewing distance and weight. This size constancy was long believed to be a purely visual phenomenon, computed from the visual angle an object projects on the retina, but these results demonstrate that it can also operate through auditory-based echolocation.
New technologies are also emerging to help visually impaired individuals navigate the environment, including electronic canes with ultrasonic transmitters and sensors. The SmartCane is a white cane with built-in ultrasonic transmitters and sensors that detects nearby objects and warns the user by sending vibrations through the cane. The UltraCane is another model, with dual short- and long-range modes to detect obstacles anywhere from 2 to 4 meters away. A portable ultrasound kit, the UltraBike, can attach to the handlebars of any bike. Its sensors detect obstacles up to 8 m ahead and send vibrations to the handlebars; the kit is recommended only for use on supervised cycling tracks.
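All of these ultrasonic canes rest on the same time-of-flight principle as a bat's sonar or a human's click: a pulse travels out, bounces off an obstacle, and returns, so the distance is the speed of sound times half the round-trip time. Here is a minimal sketch of that calculation in Python; the function name and the example timing are illustrative, not taken from any actual device's firmware.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in dry air at 20 °C


def echo_distance(round_trip_s: float) -> float:
    """Distance to an obstacle from an ultrasonic echo's round-trip time.

    The pulse travels to the obstacle and back, so the one-way
    distance is half the total path: d = v * t / 2.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0


# An echo returning after ~23.3 ms corresponds to roughly the 4 m
# upper end of the long-range mode described above.
print(round(echo_distance(0.0233), 2))
```

The division by two is the whole trick: a sensor (or an ear) only measures the total out-and-back travel time, and halving it recovers the one-way distance.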
An audiologist at the University of Southampton is developing a smartphone app for the visually impaired, and for anyone else interested in learning more about echolocation, in hopes of raising awareness of research on visual impairment. Another system, called iAid, was recently developed by a 17-year-old who was inspired at age 12 after interacting with a blind woman crossing the street; it consists of a belt with four ultrasonic sensors, controlled by a joystick and smartphone. There is even a new video game being launched that features a blind woman using only echolocation to solve mysteries in a haunted house.
Individuals such as Daniel Kish who have highly functional echolocation skills also promote education about the self-directed achievement of people with blindness, and teach blind people how to develop echolocation and integrate it into their lives. For this purpose, and to increase public awareness of the strengths and capabilities of blind people, he founded a non-profit educational organization called World Access for the Blind, which has helped over 10,000 students in 40 countries. Daniel Kish gave a TED Talk this March discussing his perspective on his life, blindness, and helping others navigate their challenges.
With increasing awareness and education about the possibilities of echolocation, we can appreciate, with compassion, that every organism has its own unique perceptual sensations and experiences. The human capacity for sensory substitution demonstrates our resilience and our ability to adapt and flourish under whatever environmental and experiential circumstances we are given.
Feel free to share your opinions in the comment section below!