Meta’s Ray-Ban glasses are about to get a major upgrade that will make them straight out of a sci-fi movie. The new artificial intelligence software will allow wearers to see the real world and get descriptions of what they are looking at, similar to the A.I. assistant in the movie “Her.”
These glasses, which start at $300 for frames and $17 for lenses, have primarily been used for taking photos and videos and listening to music. However, with the new A.I. software, they can now scan landmarks, translate languages, and identify animals and fruits.
To use the A.I. software, wearers simply say, “Hey, Meta,” followed by a prompt like “Look and tell me what kind of dog this is.” The A.I. then responds in a computer-generated voice through the glasses’ speakers.
Tech columnist Brian X. Chen and Meta reporter Mike Isaac got early access to the update and tested the technology in various settings, from the zoo to grocery stores. They were entertained by the A.I.’s errors, like mistaking a monkey for a giraffe, but also impressed when it correctly identified gluten-free cookies and a Bernese mountain dog.
While the A.I. isn’t always accurate, Meta says feedback from wearers will help the glasses improve over time. Beyond identifying what’s in view, the glasses can also assist in the kitchen, suggesting what to cook with the ingredients on hand.
Overall, Meta’s A.I.-powered glasses offer a glimpse into the future of the technology, with limitations and challenges still to overcome. But packing language translation and landmark recognition into a stylish pair of glasses shows how far that technology has already come.