Researchers develop app for Google Glass to help visually impaired

Washington, Apr 25 (AZINS): An application that can magnify a smartphone screen to potentially benefit low-vision users has been developed by researchers, including one of Indian origin.

The application projects a magnified view of the smartphone screen onto Google Glass; users navigate by moving their heads to view the corresponding portion of the magnified screen, the researchers said.

The technology can potentially benefit low-vision users, many of whom find the smartphone's built-in zoom feature to be difficult to use due to the loss of context, they said.

"When people with low visual acuity zoom in on their smartphones, they see only a small portion of the screen, and it is difficult for them to navigate around - they do not know whether the current position is in the centre of the screen or in the corner of the screen," said Gang Luo from Harvard Medical School.

"This application transfers the image of smartphone screens to Google Glass and allows users to control the portion of the screen they see by moving their heads to scan, which gives them a very good sense of orientation," said Luo.

People with low vision often have great difficulty reading and discerning fine details. Magnification is considered the most effective method of compensating for visual loss, researchers said.

They developed the head-motion application to address the limitations of conventional smartphone screen zooming, which does not provide sufficient context and can be painstaking to navigate.

Researchers observed two groups of subjects, one using the head-motion Google Glass application and the other using the built-in zoom feature on a smartphone, and measured the time it took for them to complete certain tasks.

They showed that the head-based navigation method reduced the average trial time by about 28 per cent compared with conventional manual scrolling.

Researchers now want to incorporate more gestures on the Google Glass to interact with smartphones.

They also want to study the effectiveness of head-motion-based navigation compared to other commonly used smartphone accessibility features, such as voice-based navigation.

"Given the current heightened interest in smart glasses, such as Microsoft's Hololens and Epson's Moverio, it is conceivable to think of a smart glass working independently without requiring a paired mobile device in near future," said Shrinivas Pundlik from Harvard Medical School.

"The concept of head-controlled screen navigation can be useful in such glasses even for people who are not visually impaired," said Pundlik.

The findings were published in the journal IEEE Transactions on Neural Systems and Rehabilitation Engineering.