Meta Ray-Ban smart glasses

Meta may use your visual data to train Ray-Ban Meta AI; the only way to opt out is to stop using its AI features

Meta's Ray-Ban smart glasses may be using the images shared with them to train Meta AI. This means that any image you capture with the glasses, or ask them to analyse, could be used to teach the company's AI tools.

by · India Today

In Short

  • Ray-Ban smart glasses updated with AI integration
  • Images from Ray-Ban glasses may train Meta's AI model
  • Meta's AI policies raise privacy and data usage concerns

At Meta Connect 2024 last week, the company unveiled its latest AR glasses, Orion. These glasses are equipped with advanced AI capabilities, including hand tracking, an AI voice assistant, eye tracking and more. But that is not all. During the event, Meta also revealed that its existing Ray-Ban smart glasses had received several AI updates. The Ray-Ban Meta smart glasses are now deeply integrated with Meta AI, and feature an HD camera, a capture button, open-ear speakers and a touchpad. Interesting, isn't it? However, media reports claim that Meta is most likely using this visual data to train its AI systems.

According to TechCrunch, Meta has confirmed that images captured with the Ray-Ban smart glasses may be used to train its AI model. The report stated that the company initially stayed quiet on the question, but later indicated that this might be the case, based on its privacy policies. The report further explained that Ray-Ban Meta photos and videos remain private unless they are submitted to Meta AI for analysis; once analysed, they are subject to different policies. Meta does not use captured content for training unless the user initiates AI analysis. This distinction is crucial for understanding the data-usage and privacy implications of using Ray-Ban Meta with Meta AI features.

This means that users might unknowingly be giving away their personal details via images to the Meta AI model. In essence, the company is leveraging its first consumer AI device to amass a large amount of data, which can be used to develop increasingly advanced AI models. The only way to opt out is to avoid using Meta's multimodal AI features altogether.

This is even more alarming now that Meta has started rolling out updates with advanced AI features. These new features allow Ray-Ban Meta users to interact with Meta AI more naturally, making it easier to send new data that can be used for training. Additionally, at the Connect 2024 conference last week, the company introduced a live video analysis feature for Ray-Ban Meta, which continuously streams images to Meta's multimodal AI models. In a promotional video, Meta demonstrated how this feature could help users scan their closets and have the AI suggest an outfit.

In fact, Meta's smart glasses privacy policies clearly state, "Text transcripts and related data are stored by default to help improve Meta's products." Not just text: it appears that both voice transcripts and images are being used to train Meta AI. But this is hardly surprising.

Since its inception, the AI industry has grown by learning from the data we provide. Whether it is OpenAI, Microsoft or Meta, AI models thrive on user data. What is most concerning here is that Meta introduced its smart glasses with everyday use in mind. Wearing them throughout the day means they can potentially record everything around you.