- Meta has launched a new standalone app for its Meta AI assistant, powered by Llama 4
- The app connects across Meta platforms and devices, including Ray-Ban Meta smart glasses
- The Meta AI personalizes its behavior based on your Instagram and Facebook activity
Meta AI is moving into its own space with the launch of a standalone app. Fueled by Meta’s new Llama 4 AI model, the app is simultaneously a standalone product and a replacement for Meta View, which was previously used to connect to the Ray-Ban Meta smart glasses.
Meta’s making a big play here, positioning voice interactions as the most intuitive and natural way to interact with your AI. The app supports hands-free chatting and even includes a demo of full-duplex speech, a feature that lets you talk and listen at the same time.
That’s very useful considering how keen Meta is to connect Meta AI with the company’s larger product portfolio, especially the Ray-Ban Meta smart glasses. These AI-enabled spectacles will now operate through the Meta AI app, replacing the Meta View app they currently rely on.
That means you can start a conversation on one platform and pick it up on another. All you need to do is open the Devices tab in the app, which carries over your settings and saved information.
Ask a question through your smart glasses, get a reply from Meta AI, and then pick up that same thread on your phone or desktop later. You can switch from voice chat in your glasses to reading the conversation in your app’s history tab. For example, you could be on a walk and ask Meta AI through your glasses to find a nearby bookstore. The answer will be saved in your Meta AI app for later review.
The other major element of the Meta AI app is the Discover feed. There, you can browse publicly shared content like successful prompt ideas and the images other users have generated with them, then remix them for your own purposes.
The desktop version of Meta AI is getting a revamp as well, with a new interface and more image generation options. There’s also an experimental document editor for composing and editing text, adding visuals, and exporting the result as a PDF.
Meta has spent many months spreading Meta AI across Instagram, Facebook, Messenger, and WhatsApp, but this is the first time the assistant isn’t hosted within another mobile app.
The AI’s connection to Meta’s other apps does give it an edge (or a flaw, depending on your view) by allowing it to adapt its behavior based on what you do on those other apps. Meta AI draws on your Instagram and Facebook activity to personalize its answers.
Ask it where to go for dinner, and it might suggest a ramen spot your friend posted about last week. Ask for tips on an upcoming vacation, and it’ll remember you once posted that you love to “travel light but overpack emotionally” and suggest an itinerary that might fit that attitude.
Meta clearly wants Meta AI to be central in all your digital activities. The way the company pitches the app, it seems like you’ll always be checking in with it, whether on your phone or on your head.
There are obvious parallels with the ChatGPT app in terms of style. But Meta seems to want to differentiate its app from OpenAI’s creation by emphasizing the personal over the broader utility of an AI assistant.
And if there’s one thing Meta has more of than nearly anyone, it’s personal data. Meta AI tapping into your social data, voice habits, and even your smart glasses to deliver responses designed for you feels very on-brand.
The idea of Meta AI forming a mental scrapbook of your life based on what you liked on Instagram or posted on Facebook might not appeal to everyone, of course. But if you’re concerned, you can always put on the smart glasses and ask Meta AI for help.
Eric Hal Schwartz