Meta AI Glasses Add Conversation Focus

Meta has introduced new AI-powered features for its smart glasses, expanding how users interact with sound and music in everyday settings. The update focuses on improving conversations in noisy places while also linking visual context to music playback.

Clearer conversations in noisy environments

First, the update adds a conversation-focused mode that helps users hear nearby speakers, even in loud surroundings. The feature amplifies the voice of the person in front of the wearer while reducing background noise. In addition, users can fine-tune the amplification level by swiping the right temple or adjusting settings, giving them better control in places like restaurants, trains, and crowded venues. As a result, the glasses move closer to functioning as practical hearing-assistance tools rather than simple accessories.

Music that matches what you see

Meanwhile, the update introduces a visual-to-music feature powered by Spotify. When users look at certain objects, such as album covers or seasonal decorations, the glasses can play a related song automatically. Although this feature leans more toward experimentation, it also highlights a broader push to connect visual input with immediate actions across apps. Consequently, the glasses begin to blend contextual awareness with entertainment features.

Availability and rollout details

Initially, the conversation-focused feature will launch only in the United States and Canada on supported Ray-Ban Meta and Oakley Meta HSTN models. The Spotify integration, by contrast, will reach a wider set of countries where English support is available. Additionally, the software update will first reach users enrolled in an early access program before expanding to a broader audience over time.

© 2024 The Technology Express. All Rights Reserved.