Meta Platforms Inc. is facing lawsuits and investigations over allegations that its Ray-Ban Meta smart glasses record audio and video without consent.

These developments highlight a growing tension between wearable AI integration and personal privacy. If employees are accessing private footage, it points to a systemic failure to honor the company's data-protection promises.

The glasses were released in 2023 [1]. However, reports surfacing in 2024 indicate that the devices may capture and transmit visual and audio data that Meta employees can access. This has led to a class-action lawsuit filed in the U.S. and separate investigations conducted by two Swedish newspapers.

The core of the controversy involves the AI-enabled nature of the glasses. Because the devices transmit data to the cloud for processing, critics argue this creates a gateway for unauthorized surveillance. Reports indicate that Meta workers have reviewed footage captured by users, raising concerns about who has access to the private lives of those wearing the glasses.

Reports conflict on the efficacy of the company's safeguards. Some state that hidden cameras in the Ray-Ban Meta glasses are being misused. Conversely, other analysis suggests that Meta's privacy policy attempts to protect users, though doubts remain about how those protections are actually implemented.

Meta has not provided a specific rebuttal to the Swedish findings in the provided reports. The legal challenges in the U.S. focus on whether the company adequately informed users and bystanders about the extent of the recording, and the subsequent human review of that data.

This situation underscores the legal and ethical risks of 'always-on' wearable AI. While Meta positions the glasses as a productivity tool, the movement of data from a private device to a human reviewer at corporate headquarters transforms a consumer gadget into a potential surveillance tool, and is likely to prompt stricter regulatory scrutiny of wearable AI in the US and EU.