There has been a lot of talk about the technology Meta has rolled out with its new Ray-Ban Meta glasses. These glasses are intended to make everyday tasks easier for the wearer, which may be truer than ever for someone experiencing vision loss. As an Assistive Technology Specialist with Future In Sight, and someone with vision loss, I was very eager to try the Ray-Ban Meta glasses.
What Are They?
Simply put, they are smart eyewear that combines advanced technology with the classic look of Ray-Ban frames. Built-in cameras, speakers, and microphones let the glasses see and hear the environment, allowing the wearer to interact with the world in new ways. The glasses connect over Bluetooth and can be paired with iPhone and Android smartphones to assist with tasks like managing calls and using apps.
Originally, these fashionable wearables were designed for vloggers and streamers: people who create content for platforms like Facebook, YouTube, and Instagram. When the blind and visually impaired community explored this new technology, it brought the benefits of the Ray-Ban Meta glasses to an entirely new level. To better explain how these glasses can positively impact the life of someone living with vision loss – or in my case, total blindness – I documented a day in my life while using my own Ray-Ban Meta glasses.
Today began as any day in my house does, with a dog that needed to go out. With my glasses on my face, I asked, “Hey Meta, what’s the weather today?” With confirmation from my Ray-Ban Meta glasses that today’s forecast called for sunshine and warmer temperatures, I dressed and exited my house with our Mini Golden Doodle. White cane in one hand and dog leash in the other, I quickly realized that I hadn’t turned on my Spotify playlist. “Hey Meta, play music by Tori Kelly on Spotify.” Moments later I was humming along to the music, still confidently hearing my dog, the traffic, and other sounds around me, since the glasses use open-ear speakers that balance audio content with external sound. Something caught my attention, so I slid my finger backward along the right arm of the frames, easily adjusting the volume so I could better hear the noise around me – geese.

We were almost back home when my cane tapped something unexpected in my path on the sidewalk. My dog sniffed it to no avail, and now I was curious. “Hey Meta, look and tell me what you see.” This command asks the glasses to capture a photo and analyze it using artificial intelligence (AI). I learned via an audible response from the glasses that it was a Cozy Coupe! I laughed out loud at the thought of it. I followed up and asked for more detail, and the response thoroughly described an ice cream truck-style Cozy Coupe by Little Tikes, attached to a mailbox by a bungee cord, with a sign taped to the side of the toy vehicle. I asked what was on the sign and received another audible response: in bold lettering, it invited anyone interested to call the phone number on the sign. I couldn’t resist the photo op. My dog jumped into the toy ice cream truck, and I announced, “Hey Meta, take a photo.” I then instructed Meta to send the last photo to my husband in a text message. I knew exactly what would happen next! My phone rang through the Ray-Ban Meta AI glasses, so I quickly double tapped on the right arm of the frames and answered the call. My husband and I both laughed over the image I had shared. With another quick double tap on the frame the call ended, and my doodle and I navigated the rest of the way home.
Throughout my workday, my Ray-Ban Meta AI glasses served as an assistant, answering questions like the distance between certain towns and helping me schedule appointments at different locations with enough time in between for travel. The glasses also assisted by reading aloud what was on my computer screen when my usual screen reader stopped working, allowing me to troubleshoot the problem.
Later, my sister asked if I wanted to meet her for dinner after work. Thankfully, my glasses came in handy again, letting me capture a photo of the city bus that arrived at the bus stop so I could confirm it was the correct bus number. Once at the restaurant with my sister, I was able to use my glasses to review the menu, something I would previously have had to ask my guest or a waiter to do for me. I commanded my glasses, “Look and tell me what you see,” and listened as they read the menu. This was a little overwhelming at first, so I followed up with, “Read the salads,” and immediately heard the different salads spoken privately through the glasses. I was able to place my dinner order independently and confidently.

After dinner, my sister gave me a ride home. On the way, we stopped at a grocery store because we both had a couple of items to pick up. Instead of walking into the store with her, I told her I’d meet her back at the car. Using my orientation and mobility skills and my familiarity with the basic layout of the store, along with my glasses as an added AI assistant, I was able to identify aisles, products, and even specific ingredients, getting my shopping done independently – something I had never been able to do before now. Admittedly, this was an enjoyable personal win for me. The Ray-Ban Meta AI glasses offer a seamless experience: I don’t have to stop and pull out my phone to access everything. My movements were fluid as I spoke to Meta and received timely, helpful responses audibly through the glasses. At the end of my day, I placed my Ray-Ban Meta AI glasses on my bedside table, feeling a sense of satisfaction for the day and looking forward to their added assistance tomorrow.
As I continue to explore the capabilities of the Ray-Ban Meta AI glasses, it is important to point out that for anyone considering a tool like this, the skills and training offered by organizations such as Future In Sight are critical to success. We also need to remain mindful that every option has strengths and weaknesses; while this tool works for me, that does not necessarily mean it is the best tool for everyone. Based on my own experience, I have compiled a list of what I believe are the strengths and limitations of the Ray-Ban Meta AI glasses.
Strengths
- Object recognition and identification of everyday items such as chairs, doors and groceries
- Text reading (speaking aloud text printed on signs, menus and mail)
- Navigation assistance (reading bus numbers, street signs and room numbers)
- Scene description (providing a basic layout of a room or a detailed description of a scene)
- People detection (identifies individuals you encounter regularly; note that this is not immediate, as it takes time for the glasses to learn your routines)
Limitations
- Limited accuracy in crowded areas
- Reliant on a stable Bluetooth connection with a smartphone and internet access for many functions to work properly
- Battery life may run low quickly depending on extent of use (though the glasses case does provide a quick charge)
- Audio feedback can be difficult to hear clearly in noisy environments
- Limited interaction capabilities (glasses can respond to some follow-up questions, but cannot handle complex queries or provide detailed explanations in certain situations)
Overall, the Ray-Ban Meta AI smart glasses are a stylish and affordable option compared to most other wearable devices. Their capabilities and inclusive design are impressive now, and their future looks even more promising.
If you or someone you love is experiencing vision loss and could benefit from our services, please contact Future In Sight at [email protected] or 603-224-4039 today!
About the Author: Stephanie Hurd is the Assistive Technology Specialist at Future In Sight.