Smart Glasses vs. Smart Frames for the Blind

Accessibility, Technology, Research
Peter Crumley
4 minutes
January 17, 2021

Adaptive, accessible technology to support the blind continues to be developed at an ever-accelerating pace, and this exciting new technology is now incorporating LIDAR data. When smart frames or glasses are paired with a smartphone and specialized apps for the blind, they can allow detailed, real-time analysis of information captured from the surrounding environment.
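
To make the idea concrete, here is a minimal sketch, in Swift, of how a phone app could read the LIDAR depth data that recent iPhones expose through ARKit and report the distance to the nearest surface straight ahead. It illustrates the general approach on a LIDAR-equipped device; it is not the implementation of any particular app.

```swift
import ARKit

// Minimal sketch: read the LiDAR depth map that ARKit exposes on
// LiDAR-equipped iPhones/iPads and report the distance straight ahead.
// Illustration of the general approach only, not any specific app's code.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("This device has no LiDAR scanner")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth   // ask ARKit for per-pixel depth
        session.delegate = self
        session.run(config)
    }

    // Called for every camera frame (roughly 60 times per second).
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        // Sample the depth value at the center of the image (Float32, in meters).
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        let base = CVPixelBufferGetBaseAddress(depthMap)!
        let centerRow = base.advanced(by: (height / 2) * rowBytes)
            .assumingMemoryBound(to: Float32.self)
        let centerDistance = centerRow[width / 2]

        print(String(format: "Nearest surface straight ahead: %.2f m", centerDistance))
    }
}
```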

Developers continue to chase the holy grail of providing technologically assisted, accurate, and interactive mobility feedback to reduce the need for a guide dog or white cane skills. However, there remains a serious disconnect: most of these technological developments are pushed forward through individual companies' efforts, without partnerships with other companies working along the same path.

[Image: black smart glasses on the floor. Photo by Bram Van Oost on Unsplash]


What should be the first step toward smart mobility for the blind?

For example, developing affordable, functional, and interactive smart glasses in collaboration with developers of AI/AR applications for the blind could be the first step toward solving the current problems of the market. Before taking this step, however, there is another misconception that needs to be resolved. Developers are still failing to distinguish between blind and low vision people, categorizing both under the "visually impaired" umbrella. This confusion creates counter-productive efforts in developing smart glasses for the blind.

To date, smart glasses have been developed with a universal, one-size-fits-all approach, with only low vision people in mind. That means all available versions of smart glasses incorporate only AR displays and corrective and magnification properties. Besides being costly, they do not address the needs of blind users. On the other hand, this lack of inclusion presents an opportunity for the blind to educate developers about a blind person's perspective, and in doing so to replace the currently applied perceived needs, which stem from a sighted person's perspective.

While smart glasses can be developed for low vision users, who can benefit from the advantages of AR technology, smart frames should be designed with the needs of blind or severely low vision people in mind, for whom augmented reality displays mean very little but haptic and sound feedback are better options. So, without further ado, here are some ideas about how a smart frame for the blind could ideally look:

* Affordable - Maximum price point of $500

* Comfortable - Lightweight all-day wearability

* Stylish open-ear design

* Quality sound - Full dynamic range

* Augmented reality (AR) sound compatible with 3D sound delivery (see the sketch after this list)

* LIDAR 3D camera - mounted on the bridge of the glasses

* Look-ahead orientation technology

* Bluetooth connectivity - with multiple device options

* Weatherproof

* Extended battery life - approximately 10 hours

* Customizable lenses

* USB-C connection

* Voice-activated microphone

* Adjustable volume and EQ controls on the frame legs

* Touch surface on frame leg to perform basic VoiceOver gestures/taps
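
The AR and 3D sound items above are already achievable on the phone side with public audio APIs. As a minimal, hypothetical sketch, here is how an iOS companion app could place an audio beacon at a point in space so the wearer hears it coming from a real-world direction. The asset name beacon.wav and the chosen coordinates are placeholders; this is not any manufacturer's implementation.

```swift
import AVFoundation

// Minimal sketch: play a mono "beacon" sound positioned in 3D space, so the
// listener hears it coming from about two meters ahead and slightly to the left.
// "beacon.wav" is a placeholder asset, assumed to be mono (3D mixing needs mono input).
func playSpatialBeacon() throws {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)

    // Player -> environment (3D mixer) -> output.
    let url = Bundle.main.url(forResource: "beacon", withExtension: "wav")!
    let file = try AVAudioFile(forReading: url)
    engine.connect(player, to: environment, format: file.processingFormat)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // The listener sits at the origin facing down the negative z-axis;
    // place the beacon 2 m ahead and 0.5 m to the left of the listener.
    environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
    player.position = AVAudio3DPoint(x: -0.5, y: 0, z: -2)
    player.renderingAlgorithm = .HRTFHQ   // binaural rendering for headphones

    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
}
```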


The future of smart frames

Previously, smart frame development was directed towards addressing the audio needs of the sighted community. There were no direct development efforts towards blind users, nor any collaboration attempts between blind app developers and smart glasses companies. The exception, to date, has been the Bose Frames. However incomplete, those frames were a first step toward supporting the blind user.

The first such feature was support for forward-facing directional positioning and AR sound delivery, incorporated into the first-generation Bose Frames Alto and Rondo and controlled via the Bose Connect app installed on the phone. These were the first steps towards the inclusion of some essential needs of the blind user. Additionally, the introduction of Bose Frames created a platform for blind developers to explore new accessibility possibilities with audio properties.
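
Bose exposed that head-orientation data to third-party developers through its Bose AR SDK; that SDK's exact API is not reproduced here. As a generic, hypothetical sketch of the same idea, here is how an app can read head yaw from headphones that publish motion data through Apple's public API and use it to keep sounds anchored to real-world directions as the head turns.

```swift
import CoreMotion

// Generic sketch of "forward-facing" audio: read head yaw from headphones that
// expose motion data (Apple's CMHeadphoneMotionManager, iOS 14+; the Bose AR SDK
// offered a comparable rotation stream) and hand it to the audio layer, e.g. to
// rotate an AVAudioEnvironmentNode's listener orientation.
final class HeadTracker {
    private let motion = CMHeadphoneMotionManager()

    /// `onYaw` receives the head's yaw angle in radians (0 when facing the
    /// reference direction established when tracking started).
    func start(onYaw: @escaping (Double) -> Void) {
        guard motion.isDeviceMotionAvailable else {
            print("Connected headphones do not provide motion data")
            return
        }
        motion.startDeviceMotionUpdates(to: .main) { deviceMotion, error in
            guard let attitude = deviceMotion?.attitude else {
                if let error = error { print("Head tracking error: \(error)") }
                return
            }
            onYaw(attitude.yaw)
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```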

Recently, Bose introduced the second-generation Tenor and Soprano models, along with a new Tempo sport model. All three models are controlled and set up through the Bose Music app. While these new models have some significant new functions, including a microphone and the Tempo's sweat and weather resistance, Bose unfortunately took a step backward: the AR forward-facing function was removed through a firmware oversight within the Bose Music app. Fortunately, the AR hardware has been retained in the Tenor, Soprano, and Tempo models, which might allow blind users to keep their hopes up for a future reintegration of this critical function via an update to the Bose Music app.

Hopefully, Bose will recognize the potential that Bose Frames have for the independence and societal inclusion of blind people, especially if they manage to incorporate any of the essential smart frame parameters we discussed above. And again, hopefully, the third generation of Bose Frames will include the all-important LIDAR 3D camera. A bottom-up push might also help accelerate the change: blind users and blind app developers raising their voices and demanding what is necessary to make smart frames work for their needs.

Collaboration between smart frame manufacturers and blind app developers is critical to providing a complete package capable of delivering hands-free environmental information to the blind user. It is not convenient for a blind solo traveler to hold their phone with the rear camera facing forward, nor is it a good long-term solution for prolonged use of these apps. Pairing smart apps with smart frames will eliminate the need to hold one's phone all the time.

What does this mean for the emerging capability of Supersense's explore/find features to support the seamless mobility of a blind person? Video information collected hands-free, with interactive feedback delivered by sound and vibration. We hope that the upcoming LIDAR app, Super Lidar, will cover part of this, although it offers no hands-free option yet. The best way to use this collection of environmental information will be through fully refined smart frames.
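
As an illustration of what feedback "delivered by sound and vibration" can look like in code, here is a hypothetical mapping from the nearest obstacle distance (for example, taken from a LIDAR depth map as in the earlier sketch) to haptic intensity on the phone. The working range and the linear mapping are arbitrary choices for the example, not Super Lidar's actual logic.

```swift
import UIKit

// Hypothetical mapping from nearest-obstacle distance (meters) to haptic
// feedback intensity: the closer the obstacle, the stronger the tap.
final class ProximityHaptics {
    private let haptics = UIImpactFeedbackGenerator(style: .medium)

    func report(distanceMeters: Float) {
        haptics.prepare()

        // Clamp to a 0.25 m - 4 m working range, then map linearly so that
        // 0.25 m -> intensity 1.0 and 4 m -> intensity 0.0.
        let clamped = min(max(distanceMeters, 0.25), 4.0)
        let intensity = CGFloat((4.0 - clamped) / (4.0 - 0.25))

        // Skip near-zero intensities so distant surfaces stay silent.
        if intensity > 0.05 {
            haptics.impactOccurred(intensity: intensity)
        }
    }
}
```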

Peter Crumley

