Hillel's Tech Corner: Seeing with your ears

RenewSenses created a new language for individuals with visual impairment to see using their ears, and the results are astounding.

RenewSenses (photo credit: Courtesy)
Do me a favor and reach into your pocket, take out your smartphone, launch the camera app, and snap a picture of whatever is in front of you. Now look at that picture and think about the fact that we all have these cutting-edge cameras in our pockets, and yet, 250 million people around the world still cannot see the object you just captured with your phone.
My point is, given all the technology to which we have access, blindness and visual impairment should be something technology can solve, or at the very least, improve. Now, don’t get me wrong, there is a lot of innovation in this area, and we have covered Orcam, an incredible Jerusalem-based company that is changing the game with its device.
RenewSenses is another incredibly innovative company attacking this challenge from a whole other angle.
If I were to ask you, “What body part do you use to see?” you would most likely respond instantly “My eyes,” but did you ever stop to think about the brain’s central role here? Without the brain, can we really see? Would you recognize a pattern, an object, a person, without the brain’s involvement? Obviously not.
So, what if there was a way to bypass the eye’s transmission to the brain and use a camera to recognize that pattern, object or person and then send it straight to the brain by means of sound? Sounds like science fiction, right?
That is exactly what RenewSenses does. It has basically created a new language for individuals with visual impairment to see using their ears, and the results are astounding. I am talking about over 70% accuracy.
The company’s wearable devices combine cutting-edge computer vision and vision-to-audio sensory substitution methods, enabling users to detect and locate the objects, people and general visual characteristics in their immediate surroundings through sound.
One of the testimonials on the RenewSenses website says it better than I could: “This experience is a whole new dimension. It is like you need to use this other part of your brain that has been sleeping for years.” In another testimonial, a user from Germany expressed joy over how she was able to experience the decorations on her Christmas tree.
The company’s innovative approach and resulting technologies have been recognized by some of the leading experts in the market. The Israel Innovation Authority granted RenewSenses the maximum funding available under its R&D program for assistive technology to support its mission. RenewSenses also presented at the Tel Aviv demo day of the highly selective 8200 Impact accelerator, and it won a grant from the new accessibility program created by Bank Hapoalim, the largest bank in Israel, all while raising funds from impact angel investors.
Additionally, RenewSenses won first place in the Jerusalem-based MassChallenge Israel Accelerator.

The company was founded by CEO Tomer Behor and Prof. Amir Amedi, an internationally acclaimed neuroscientist with 15 years of experience in the field of brain plasticity and multisensory integration. RenewSenses was formed following years of research done in Prof. Amedi’s lab at the Hebrew University of Jerusalem. With the support of Yissum, the Hebrew University’s technology-transfer company, Behor and Amedi transitioned the work from academia to commercialization, with the goal of providing people who are visually impaired with the solution they need to achieve unprecedented independence.
RenewSenses currently has two products: EyeMusic and AI Cane.
EyeMusic is an IP-protected algorithm enabling users to “see” through sound. It is a free iOS and Android application that connects to glasses with a camera, allowing users to gain a more complete understanding of their surroundings. Artificial intelligence helps users identify the objects and people around them, including their spatial locations. All of this is achieved through a combination of musical notes and speech that conveys information about any given scene: colors, shapes, people and the locations of objects.
Users of EyeMusic wear a miniature camera connected to a smartphone and stereo headphones. The images are converted into sounds using an algorithm, allowing the user to listen and then interpret the visual information. After training, blind individuals can recognize the letters of the alphabet, “see” pictures of animals, and even find an object or person in a complex visual landscape.
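To make that more concrete, here is a minimal, illustrative sketch of how an image-to-sound mapping of this kind could work in principle: the frame is scanned column by column, higher rows become higher pitches, and brighter pixels become louder tones. The scan order, pitch range and timing here are my own assumptions for illustration, not RenewSenses’ proprietary EyeMusic encoding.

```python
# Illustrative sketch only: a toy image-to-sound ("sonification") mapping.
# The real EyeMusic encoding is proprietary; the column scan, pitch range and
# timing below are assumptions made purely for illustration.
import numpy as np

SAMPLE_RATE = 22050                  # audio samples per second
COLUMN_DURATION = 0.05               # seconds of audio per image column (assumed)
MIN_FREQ, MAX_FREQ = 200.0, 2000.0   # pitch range from bottom row to top row (assumed)

def image_to_audio(image: np.ndarray) -> np.ndarray:
    """Convert a 2-D grayscale image (values 0..1) into a mono waveform.

    The image is scanned left to right, one column per time slice.
    Within a column, higher rows map to higher pitches and brighter
    pixels map to louder tones - a common sensory-substitution scheme.
    """
    rows, cols = image.shape
    # One frequency per row, spaced logarithmically so equal steps sound even.
    freqs = np.geomspace(MAX_FREQ, MIN_FREQ, rows)   # row 0 = top = highest pitch
    t = np.arange(int(SAMPLE_RATE * COLUMN_DURATION)) / SAMPLE_RATE
    slices = []
    for c in range(cols):
        column = image[:, c]                          # brightness of each row
        # Sum one sine tone per row, weighted by that pixel's brightness.
        tones = np.sin(2 * np.pi * freqs[:, None] * t[None, :])
        slices.append((column[:, None] * tones).sum(axis=0))
    audio = np.concatenate(slices)
    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio        # normalize to avoid clipping

if __name__ == "__main__":
    # Toy "scene": a bright diagonal line on a dark background.
    img = np.zeros((32, 32))
    np.fill_diagonal(img, 1.0)
    waveform = image_to_audio(img)    # played back, this sweeps downward in pitch
    print(waveform.shape)
```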
Teaching the brain to “see” through sound can be a long process of trial and error, and the RenewSenses team is working on a new, higher-resolution vision-to-sound algorithm with greater accuracy, enabling users to receive far more information than before.
RenewSenses’ computer-vision algorithms enable real-time recognition of the objects and people around the user – all running on a smartphone, without the need for an internet connection. By tapping into the power of AI, RenewSenses has managed to create neural networks that serve as “auxiliary wheels” for the brain, helping it get a grasp of perception. Not only does this give users the ability to perceive the actual visual information, the app also tells them in real time what is in front of them. RenewSenses uses AI as a step toward restoring human perception and the brain’s ability to process sensory information.
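For readers who want a feel for what such a real-time loop looks like, here is a minimal sketch of the “what is in front of me” idea: detect objects in a camera frame on the device, pick the most prominent one, and speak its name and rough direction. The detect_objects and speak helpers are hypothetical placeholders, not RenewSenses’ actual API, and the canned detections are there only so the sketch runs end to end.

```python
# Illustrative sketch only: the shape of an offline "what is in front of me" loop.
# detect_objects() and speak() stand in for an on-device detection model and a
# text-to-speech engine; both names are hypothetical, not RenewSenses' API.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str       # e.g. "chair", "person"
    x_center: float  # horizontal center of the bounding box, 0.0 (left) to 1.0 (right)
    area: float      # fraction of the frame covered - a rough proxy for closeness

def detect_objects(frame) -> List[Detection]:
    """Placeholder for an on-device neural network (runs offline on the phone).
    Returns canned detections here so the sketch runs end to end."""
    return [Detection("chair", x_center=0.7, area=0.15),
            Detection("person", x_center=0.4, area=0.05)]

def speak(text: str) -> None:
    """Placeholder for the phone's text-to-speech engine."""
    print(text)

def direction(x_center: float) -> str:
    # Map the box center to a coarse spatial cue the user can act on.
    if x_center < 0.33:
        return "to your left"
    if x_center > 0.66:
        return "to your right"
    return "ahead of you"

def announce(frame) -> None:
    detections = detect_objects(frame)
    if not detections:
        return
    # Announce the largest (and likely nearest) object first.
    nearest = max(detections, key=lambda d: d.area)
    speak(f"{nearest.label} {direction(nearest.x_center)}")

if __name__ == "__main__":
    announce(frame=None)   # prints: chair to your right
```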
Thanks to RenewSenses, a user who is blind can enter a room for the first time and independently navigate toward a chair. After training, which is done through real-life “gamified” challenges, users are able to perceive their environment – as close to vision as it gets.
RenewSenses’ AI Cane is a handheld device that helps blind or visually impaired people navigate independently, with a special focus on obstacle detection. The AI Cane is designed to vibrate when it detects obstacles, while stating the name of the object in front of the user. In this way, a user is able to avoid walls, locate doors and stairs, and notice other obstacles on the ground. It is a flashlight-like orientation device that emits infrared rays to translate distance into auditory and tactile cues, enabling the user to sense objects within an adjustable range of up to five meters. After brief training, AI Cane users can estimate distances, avoid obstacles and successfully navigate simple environments with full awareness of the objects and people around them.
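As an illustration of the idea, here is a minimal sketch of how a distance reading could be turned into tactile and spoken cues. The five-meter maximum range is taken from the description above; the linear vibration ramp and the vibrate/speak helpers are assumptions of mine, not the AI Cane’s actual implementation.

```python
# Illustrative sketch only: turning a distance reading into tactile and audio cues.
# The 5 m maximum range comes from the article; the vibration curve is an
# assumption, and vibrate()/speak() are hypothetical stand-ins for the device's
# motor and voice output.
from typing import Optional

MAX_RANGE_M = 5.0  # adjustable detection range mentioned in the article

def vibrate(intensity: float) -> None:
    """Placeholder for driving the handheld device's vibration motor (0.0-1.0)."""
    print(f"vibration intensity: {intensity:.2f}")

def speak(text: str) -> None:
    """Placeholder for the device's voice output."""
    print(text)

def cue_for_obstacle(distance_m: float, label: Optional[str] = None) -> None:
    """Convert a measured distance into tactile and spoken feedback."""
    if distance_m > MAX_RANGE_M:
        vibrate(0.0)                       # nothing within range: stay quiet
        return
    # Closer obstacles produce stronger vibration (linear ramp, an assumption).
    intensity = 1.0 - distance_m / MAX_RANGE_M
    vibrate(intensity)
    if label is not None:
        speak(f"{label}, {distance_m:.1f} meters")

if __name__ == "__main__":
    cue_for_obstacle(4.5)                  # faint buzz: far obstacle
    cue_for_obstacle(0.8, label="stairs")  # strong buzz plus spoken warning
```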
While the device is not meant to replace the white cane, it is certainly a much-needed addition to it and can also be used on its own. A soft launch of the device is expected in New England within the next two months, in collaboration with Project Ray, another Israeli company, which creates smartphones for people who are visually impaired.
It is still early days for this company, but given the team’s track record, the creativity of its solution and the gigantic target market, I am sure we will be hearing (pun very much intended!) a lot more from RenewSenses in the near future.