Meta has added third-party app development support, writing gestures, and walking directions to its Ray-Ban Display smart glasses [1].
This shift transforms the glasses from a closed hardware ecosystem into an open platform. By allowing external developers to build apps and services, Meta aims to increase the wearable's daily utility and reduce reliance on a limited set of first-party features [1, 2].
Users can now interact with the glasses using writing gestures and access walking directions [1, 3]. These updates are designed to make the device more intuitive for navigation and communication without requiring constant interaction with a smartphone screen [2].
Control for these new features is managed through a dedicated wristband, known as the Neural Band [1, 4]. This hardware addition allows for more precise input, enabling the writing gestures and app navigation described in the update [1].
Meta said the goal of the update is to make the Ray-Ban Display glasses more useful by enabling new interaction modes [1, 2]. According to the company, a broader ecosystem of apps will drive adoption and provide a more versatile user experience [1].
Opening the platform allows developers to create specialized tools that leverage the glasses' display and sensors [1, 4]. This could lead to a variety of niche applications, ranging from productivity tools to augmented reality utilities, though the specific nature of the first third-party apps remains to be seen [4].
By transitioning to an open developer model and introducing the Neural Band, Meta is attempting to solve the 'input problem' that has plagued smart glasses. Moving away from simple voice commands toward gesture-based control and third-party extensibility suggests a strategy to position the hardware as a legitimate computing platform rather than a mere accessory.





