Meta has announced a set of new accessibility and user support features, including audio descriptions in Ray-Ban Meta glasses, sign-language translation in WhatsApp, progress on wristband interaction, and more.
First, Meta is expanding the descriptive capabilities of Ray-Ban Meta glasses, giving wearers a richer sense of what's around them.
In Meta’s own words:
“Starting today, we’re introducing the ability to customize Meta AI to provide detailed responses on Ray-Ban Meta glasses based on what’s in front of you. With this new feature, Meta AI will be able to provide more descriptive responses when people ask about their environment.”
This will be especially valuable for blind and low-vision users, who will be able to request an audio description of their surroundings, delivered directly to their ears.
It could also broaden the appeal of Meta's smart glasses. On-demand AI has already helped drive sales of the device, and assistive features like this will widen its audience further.
Meta has announced that this feature will be gradually rolled out to all users in both the U.S. and Canada in the upcoming weeks, with plans to extend to additional markets soon thereafter.
“To get started, navigate to the Device settings section within the Meta AI app and toggle on detailed responses under Accessibility.”
Meta is also introducing a new feature called "Call a Volunteer," which connects people who are blind or have low vision to a network of sighted volunteers in real time for help with everyday tasks.
Meta has also highlighted its ongoing work on sEMG (surface electromyography) interaction via a wristband device, which reads the electrical signals produced by muscle activity at the wrist to drive digital interactions.
Meta has been actively developing wrist-controlled functionality for its forthcoming AR glasses, which will significantly enhance accessibility for users.
Meta is now building on that work, with a focus on usability and effectiveness:
“In April, we completed data collection with a Clinical Research Organization (CRO) to evaluate the ability of individuals with hand tremors (due to Parkinson’s and Essential Tremor) to utilize sEMG-based models for computer controls (like swiping and clicking) and for sEMG-based handwriting. We also have an active research collaboration with Carnegie Mellon University to empower individuals with hand paralysis caused by spinal cord injury to use sEMG-based controls for human-computer interactions. These individuals exhibit minimal motor signals, which our high-resolution technology can detect. We are able to instruct individuals on how to rapidly use these signals, facilitating HCI as soon as Day 1 of system use.”
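At its core, the sEMG approach described above means picking small electrical signals out of noisy sensor readings and mapping them to inputs like clicks or swipes. Below is a toy, purely illustrative sketch of the first step of such a pipeline — rectifying a signal, smoothing it, and flagging a muscle-activation event. All function names, thresholds, and the simulated signals are invented for illustration; they do not reflect Meta's actual models.

```python
import random

def emg_envelope(samples, window=50):
    """Rectify a raw trace (absolute value) and smooth it with a trailing moving average."""
    rectified = [abs(s) for s in samples]
    envelope = []
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1): i + 1]
        envelope.append(sum(chunk) / len(chunk))
    return envelope

def detect_activation(samples, threshold=0.3, window=50):
    """Return True if the smoothed muscle activity ever crosses the threshold."""
    return max(emg_envelope(samples, window)) > threshold

# Simulated traces: quiet baseline noise vs. a burst of muscle activity.
random.seed(0)
rest = [random.gauss(0, 0.01) for _ in range(1000)]
burst = rest[:400] + [random.gauss(0, 1.0) for _ in range(200)] + rest[600:]

print(detect_activation(rest))   # quiet baseline -> False
print(detect_activation(burst))  # simulated activation burst -> True
```

A real system would replace the threshold with a trained classifier — which is what makes it possible, per the quote above, to adapt to users whose motor signals are minimal.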
The potential applications for such technology could be immense, and Meta is making strides in developing enhanced wristband interaction devices that may one day enable direct interaction for individuals with limited movement capabilities.
Finally, Meta pointed to third-party uses of its AI models for new assistive features, including "Sign-Speak," a WhatsApp integration that translates spoken language into sign language (and vice versa) using AI-generated video clips.

Taken together, these projects could meaningfully improve communication for users with diverse abilities, with far-reaching implications for accessibility and inclusion.
You can read more about Meta's latest accessibility advancements here.