Mark Zuckerberg has unveiled Meta’s smart glasses that can translate foreign languages on the go and an upgraded artificial intelligence assistant that can respond to questions in the voice of Dame Judi Dench.
The Silicon Valley billionaire previewed the technology at Meta’s annual Connect conference at the company’s headquarters in Menlo Park, California.
The Ray-Ban Meta smart glasses, which are integrated with Meta AI, a rival to ChatGPT, can listen to someone speak in French, Italian or Spanish and translate it into English. The existing glasses can only translate written text.
The updated AI glasses will be able to capture what the wearer is seeing in real time and offer advice such as which restaurant to visit in a city.
Wearers can ask Meta AI to set a reminder or remember things they see. They can also use the glasses to scan QR codes or phone numbers.
Zuckerberg is hoping the glasses, which start at $299.99 and can already play music and take pictures, become a mainstream product, on a par with Apple’s AirPods.
Earlier this month he told the Acquired podcast: “What you ideally have is glasses and through the glasses there’s one part of it where the glasses can see what you see and they can hear what you hear, and in doing so they can be kind of the perfect AI assistant for you because they have context on what you’re doing.”
The AI features will initially be available in the United States, Canada, Australia and New Zealand.
Zuckerberg has previously claimed that Meta AI is “on track to be the most-used AI assistant in the world by the end of the year”. It is not currently available in the UK.
New uses for the technology unveiled on Wednesday include an audio feature whereby users can ask Meta AI questions aloud and hear a spoken response, rather than typing questions as text. The assistant can be found on Meta’s apps, including WhatsApp, Instagram and Facebook.
Users can select celebrity voices to respond to questions asking for information or advice. As well as Dench, the voice options include John Cena, the actor and WWE wrestler; Kristen Bell, the actress known for her roles in American television shows Gossip Girl and Veronica Mars; Awkwafina, the rapper and comedian; and the actor Keegan-Michael Key.
In the US, Meta AI will be able to “see”, for example by observing photos shared in a chat. It can tell users more about what is in a photo, or edit it.
Meta also revealed a significantly cheaper version of its virtual reality headset, which can be used to play video games, attend virtual concerts, or manage multiple virtual work screens.
Meta Quest 3S, priced at $299.99 (or £289.99 in the UK), is designed to attract more families and newcomers to the technology. The existing model costs $499.99.
Meta is hoping to attract Christmas shoppers: pre-orders are already available, and the product hits shelves on October 15.
About 1,200 developers and engineers gathered to listen to Zuckerberg unveil the product updates on Wednesday.
Earlier this month Meta confirmed it would begin using data from public posts of users in the UK to train its AI models.
Hundreds of thousands of people this week have shared a hoax image which claims to deny Meta the right to use their data to train artificial intelligence (AI) models.
The hoax image, titled “Goodbye Meta AI”, was shared on Instagram stories by the actresses Julianne Moore and Ashley Tisdale, as well as the England cricketer Jonny Bairstow. However, sharing the post on an Instagram story does not count as a valid form of objection to Meta’s data policies. Instead users can opt out of AI training through their account settings.
Zuckerberg also revealed a pair of smart glasses that can project holograms into the real world. Orion is the first prototype of fully holographic augmented reality glasses. They will allow users to project cinema screens, monitors and even holographic images of people who are far away. Texts will pop up in the wearer’s peripheral vision via holograms.
Zuckerberg described the glasses as “the most advanced in the world”. He did not say when they would be ready for release — Meta is working with partners to make them cheaper and more stylish.
As a sceptic of wearable technology, I was impressed by the Ray-Ban Meta smart glasses.
The glasses operate as a camera on your face as well as an audio device.
Wearers can say “Meta, take a picture”, or “take a video”, and the glasses will listen and respond. No more selfie arms in photos or videos for wearers of the glasses. There is also a small button above the Ray-Ban logo you can press if you don’t want to be overheard talking to your digital assistant.
The glasses also respond to an instruction to “play music on Spotify”, but risk annoying people nearby, who will also be able to hear it.
They can translate foreign languages on the go. I homed in on a sign in Spanish, said, “Hey Meta, translate this into English”, and could hear the translation in my ear.
The updated digital assistant can give you a description of what you are looking at, a tool I struggle to imagine being especially useful.
New features include being able to ask your Meta glasses to set a timer for five minutes, or to scan QR codes. Essentially, it is a product that will allow you to incorporate AI into your daily life. Crucially, they don’t look like AI glasses, so you can walk around in them without looking like a tech bro, unlike with Google Glass.
The glasses I tried had a blue-light effect, so they did make me feel like I was looking at a screen. However, I was told that Meta offers transition lenses which remove that effect. The battery lasts for about four hours, depending on how intensively you are using the AI functions. They charge while sitting in their Ray-Ban case.