Cupertino (California), May 20 (SocialNews.XYZ) To further help people with mobility, vision, hearing and cognitive disabilities, Apple has announced powerful accessibility software updates across iOS, watchOS and iPadOS, including a new SignTime service to connect Apple Store and Apple Support customers with on-demand sign language interpreters.
To begin with, customers visiting Apple Store locations in the US, the UK and France can use SignTime to remotely access a sign language interpreter right in their web browsers.
Later this year, people with limb differences will be able to navigate Apple Watch using AssistiveTouch.
iPad will also support third-party eye-tracking hardware for easier control, and for blind and low-vision communities, Apple's 'VoiceOver' screen reader will get even smarter, using on-device intelligence to explore objects within images, the company said in a statement late on Wednesday.
"With these new features, we're pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people -- and we can't wait to share them with our users," said Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives.
In support of neurodiversity, Apple is introducing new background sounds to help minimise distractions, and for those who are deaf or hard of hearing, Made for iPhone (MFi) will soon support new bi-directional hearing aids.
AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls.
Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench.
"iPadOS will support third-party eye-tracking devices, making it possible for people to control iPad using just their eyes," Apple said.
Later this year, compatible MFi devices will track where a person is looking onscreen and the pointer will move to follow the person's gaze, while extended eye contact performs an action, like a tap, the company added.
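To illustrate the idea of gaze-driven control described above, here is a minimal, hypothetical sketch of dwell-based selection in Swift: the pointer follows the gaze point, and when the gaze holds steady within a small radius for long enough, an action (like a tap) fires. This is not Apple's implementation or API; the types, names, and thresholds are illustrative assumptions only.

```swift
import Foundation

// Hypothetical gaze sample; a real eye tracker would supply these coordinates.
struct GazePoint { var x: Double; var y: Double }

// A minimal sketch of dwell-based selection (assumed behaviour, not Apple's code):
// the pointer tracks the gaze, and a sustained fixation triggers an action.
struct DwellSelector {
    let dwellDuration: TimeInterval = 1.0   // seconds of steady gaze required
    let tolerance: Double = 20.0            // max drift (in points) still counted as steady

    private var anchor: GazePoint?
    private var anchorTime: Date?

    // Feed each gaze sample; returns the point to act on once the dwell completes.
    mutating func process(_ gaze: GazePoint, at time: Date = Date()) -> GazePoint? {
        if let a = anchor,
           (gaze.x - a.x).magnitude <= tolerance,
           (gaze.y - a.y).magnitude <= tolerance,
           let t0 = anchorTime {
            if time.timeIntervalSince(t0) >= dwellDuration {
                anchorTime = nil            // reset so the same fixation doesn't re-fire
                return a                    // dwell complete: perform the tap here
            }
        } else {
            anchor = gaze                   // gaze moved: restart the dwell timer
            anchorTime = time
        }
        return nil
    }
}
```

In this sketch the caller would move the onscreen pointer to each incoming gaze point and perform a tap whenever `process` returns a point; the dwell duration and drift tolerance are the knobs an accessibility setting would typically expose.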
Source: IANS