Prof. Amir Amedi of the Hebrew University answers questions from attendees at a Jan. 16 presentation.

A white cane has been used for generations to help guide the mobility of people who are blind. Researchers at the Hebrew University of Jerusalem created the EyeCane, which adds technology that indicates to the user the distance to obstructions. From there, further technological advances were added to identify the types of items in the area – a couch, chairs, a table, a lamp – and convey that information to the user’s ear. Still not satisfied, the scientists combined the invention with artificial intelligence and complex auditory accompaniments so that the user could identify the size, shape, colour, brightness and other attributes of the space around them to get a full “picture” of their surroundings.

In a talk presented by the Vancouver chapter of the Canadian Friends of the Hebrew University, Prof. Amir Amedi spoke at Schara Tzedeck Synagogue Jan. 16. Amedi is a professor in Hebrew University’s Department of Medical Neurobiology and an adjunct research professor at the Sorbonne in Paris. He is currently a visiting professor at McGill University in Montreal.

The interdisciplinary marriage of computer science, neurology, philosophy, rehabilitation, physics and other fields is leading to unprecedented advances in aids for people with disabilities. Some of the foremost innovation is taking place at Hebrew University, where Amedi works with a large team across many faculties.

The EyeCane is just one example of the sorts of tools being developed in Amedi’s lab, items that are known as sensory-substitution devices (SSDs). The most common SSD is the written word, Amedi explained. For millennia, humans communicated only verbally. Written language is a device that substitutes a different form – writing and reading – for speaking and hearing.

Amedi discussed the development of agriculture, then cities, then written language, then printing – advances that emerged gradually over millennia, allowing the human brain plenty of time to accommodate the changes. Today, though, new technologies come flying at us daily, and the question this raises, according to Amedi, is how our brains are able to adapt so readily to such sudden changes – an issue he refers to as a “real estate problem” in the brain.

“How can the brain, in the slow evolutionary process, adapt to more and more information, more and more technologies?” he asked.

One theory posits that parts of the brain get recycled to deal with cognitive tasks the brain has not previously confronted.

A parallel invention from Amedi’s lab is an auditory process that allows blind people to “see” with their brain. Sight is really a function of the brain, not the eyes, he said. The eyes are the conduit, but the brain does the cognitive work of seeing. By bypassing the non-functional eyes and sending information through the ears directly to the part of the brain where sight is computed, Amedi and his team have been able to create a complex musical language that allows blind people to absorb immense amounts of information about the environment around them.

In a demonstration, Amedi walked the audience through the first lesson users of the technology are taught. Simple sounds – similar to Morse code – represent lines. A musical scale going up or down represents stairs. A smile is depicted by a falling then rising tone. Pitch is added to indicate height. Timbre is introduced to depict different colours. In a remarkably short time, blind people are able to develop a detailed awareness of their visual environments.
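To make that mapping concrete, the following is a minimal sketch, in Python, of the kind of image-to-sound translation described above – left-to-right position becomes time, height becomes pitch and colour becomes timbre. The palette, constants and function names are illustrative assumptions, not Amedi’s actual system.

```python
# Illustrative sketch only: a simplified image-to-sound mapping in the spirit
# of the demonstration (time = horizontal position, pitch = height,
# timbre = colour). All constants and names here are assumptions.

# Assumed mapping from a few colour names to instrument-like timbres.
TIMBRE_FOR_COLOUR = {
    "white": "piano",
    "blue": "marimba",
    "red": "organ",
    "yellow": "strings",
}

def image_to_sound_events(pixels, lowest_hz=220.0, semitones_per_row=2,
                          column_duration=0.25):
    """Scan a small colour image column by column, left to right, and emit
    (start_time, frequency_hz, timbre) events. `pixels` is a list of rows
    (top row first); each cell is a colour name or None for background."""
    n_rows = len(pixels)
    events = []
    for col in range(len(pixels[0])):
        start_time = col * column_duration
        for row in range(n_rows):
            colour = pixels[row][col]
            if colour is None:
                continue
            # Row 0 is the top of the image, so it gets the highest pitch.
            semitones_up = (n_rows - 1 - row) * semitones_per_row
            frequency = lowest_hz * (2 ** (semitones_up / 12.0))
            events.append((start_time, round(frequency, 1),
                           TIMBRE_FOR_COLOUR.get(colour, "piano")))
    return events

# A tiny "staircase" image: three steps rising from left to right.
stairs = [
    [None,    None,    "white"],
    [None,    "white", None],
    ["white", None,    None],
]

for event in image_to_sound_events(stairs):
    print(event)
```

Run on the tiny “staircase” image, the sketch produces a rising sequence of tones, echoing the ascending-scale example from the demonstration.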

Significantly, Amedi added, brain imaging indicates that the part of the brain processing the information is identical, whether a sighted person is looking at something with their eyes or a blind person is “looking” at something using the auditory sensory-substitution process.

More information about Amedi’s work is online at brain.huji.ac.il.