- The Ray-Ban Meta smart glasses are getting two major AI updates
- AI camera features are rolling out to countries in Europe
- Live translation tools are rolling out everywhere the glasses are available
Following the rollout of enhanced Meta AI features to the UK earlier this month, Meta has announced yet another update to its Ray-Ban smart glasses that will make them closer to the ultimate tourist gadget.
That’s because two features are rolling out more widely: look and ask, and live translation.
Thanks to their cameras, your glasses (when prompted) can snap a picture and use it as context for a question, like “Hey Meta, what’s the name of that flower?” or “Hey Meta, what can you tell me about this landmark?”
This tool was previously available in the UK and US, but it’s now arriving in more European countries, including Italy, France, Spain, Norway and Germany – you can check out the full list of supported countries on Meta’s website.
On a recent trip to Italy I used my glasses to help me learn more about Pompeii and other historical sites as I travelled, though it could sometimes be a challenge to get the glasses to understand which landmark I was talking about because my pronunciation wasn’t stellar. I also couldn’t ask about a landmark until I’d learnt its name, so that I could say it out loud to the glasses.
Being able to snap a picture instead and have the glasses recognize landmarks for me would have made the specs so much more useful as a tour guide, so I’m excited to give them a whirl on my next European holiday.
Say what?
The other tool everyone can get excited about is live translation, which is finally rolling out to every country where the glasses are available (so the US, UK, Australia, and those European countries getting look and ask).
Your smart specs will be able to translate between English, French, Italian, and Spanish.
Best of all, you won’t need a Wi-Fi connection, provided you’ve downloaded the necessary language pack.
What’s more, you don’t need to worry about conversations being one-sided. You’ll hear the translation through the glasses, but the person you’re talking to can read what you’re saying in the Meta View app on your phone.
Outside of face-to-face conversations, I can see this tool being super handy in situations where you don’t have time to get your phone out – for example, to help you understand public transport announcements.
Along with the glasses’ sign-translation abilities, these new features will make your specs even more of an essential travel companion – I certainly won’t be leaving them at home the next time I take a vacation.