Meta released a software update for Ray-Ban Display smart glasses that introduces writing gestures and walking-direction overlays [1, 2].
The update also marks a shift that transforms the wearable from a closed system into an open platform. By allowing external developers to build services, Meta aims to increase the utility of the hardware and foster a broader ecosystem of augmented reality experiences [2, 3].
The update, announced May 14, 2024 [3], enables users to interact with the glasses through new gesture-based controls. The writing gestures allow for more intuitive input, while the navigation overlays provide real-time directions directly within the user's field of vision [1, 2].
Beyond these native features, Meta is now opening the platform to third-party apps and games [2, 3]. This move allows developers to create custom software tailored for the glasses' unique display and sensor array. The company said the goal is to make the device more useful for daily tasks and entertainment [2].
The hardware, which carries a retail price of $799 [4], has previously relied on a limited set of first-party features. The introduction of a developer ecosystem suggests a strategy to move beyond simple smart-glass functionality and toward a more comprehensive AR interface.
Meta did not specify a rollout timeline for the first wave of third-party applications, though the platform is now accessible for development [3].
By transitioning the Ray-Ban Display glasses to an open platform, Meta is attempting to replicate the 'app store' model that drove the success of smartphones. Moving from a fixed set of features to a developer-driven ecosystem allows Meta to outsource innovation and find niche use cases for AR without having to build every application internally.