
Vision via sound for the blind

Date:
October 25, 2023
Source:
University of Technology Sydney
Summary:
Smart glasses that use a technique similar to a bat's echolocation could help blind and low-vision people navigate their surroundings, according to researchers. 

Australian researchers have developed cutting-edge technology known as "acoustic touch" that helps people 'see' using sound. The technology has the potential to transform the lives of those who are blind or have low vision.

Around 39 million people worldwide are blind, according to the World Health Organisation, and an additional 246 million people live with low vision, impacting their ability to participate in everyday life activities.

The next-generation smart glasses, which translate visual information into distinct sound icons, were developed by researchers from the University of Technology Sydney and the University of Sydney, together with Sydney start-up ARIA Research.

"Smart glasses typically use computer vision and other sensory information to translate the wearer's surroundings into computer-synthesized speech," said Distinguished Professor Chin-Teng Lin, a global leader in brain-computer interface research from the University of Technology Sydney.

"However, acoustic touch technology sonifies objects, creating unique sound representations as they enter the device's field of view. For example, the sound of rustling leaves might signify a plant, or a buzzing sound might represent a mobile phone," he said.

A study into the efficacy and usability of acoustic touch technology to assist people who are blind, led by Dr Howe Zhu from the University of Technology Sydney, has just been published in the journal PLOS ONE.

The researchers tested the device with 14 participants: seven individuals with blindness or low vision and seven blindfolded sighted individuals who served as a control group.

They found that the wearable device, equipped with acoustic touch technology, significantly enhanced the ability of blind or low-vision individuals to recognise and reach for objects, without imposing undue mental effort.

"The auditory feedback empowers users to identify and reach for objects with remarkable accuracy," said Dr Zhu. "Our findings indicate that acoustic touch has the potential to offer a wearable and effective method of sensory augmentation for the visually impaired community."

The research underscores the importance of assistive technology in overcoming everyday challenges, such as locating specific household items and personal belongings.

By addressing these day-to-day challenges, the acoustic touch technology opens new doors for individuals who are blind or have low vision, enhancing their independence and quality of life.

With ongoing advancements, acoustic touch could become an integral part of assistive technologies, supporting individuals in accessing their environment more efficiently and effectively than ever before.


Story Source:

Materials provided by University of Technology Sydney. Note: Content may be edited for style and length.


Journal Reference:

  1. Howe Yuan Zhu, Shayikh Nadim Hossain, Craig Jin, Avinash K. Singh, Minh Tran Duc Nguyen, Lil Deverell, Vincent Nguyen, Felicity S. Gates, Ibai Gorordo Fernandez, Marx Vergel Melencio, Julee-anne Renee Bell, Chin-Teng Lin. An investigation into the effectiveness of using acoustic touch to assist people who are blind. PLOS ONE, 2023; 18 (10): e0290431 DOI: 10.1371/journal.pone.0290431

Cite This Page:

University of Technology Sydney. "Vision via sound for the blind." ScienceDaily. ScienceDaily, 25 October 2023. <www.sciencedaily.com/releases/2023/10/231025223433.htm>.
