across Dining, Translate, Text, Shopping, and Auto.
The new Lens features will overlay information on real-world objects more quickly and usefully, Google said.
The Dining filter will automatically highlight popular dishes on a menu, drawing on Google Maps to surface photos and reviews of specific dishes.
“And when you’re done with your meal, just point the camera at your receipt, and Lens can help calculate the tip and split the bill,” Google says.
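The arithmetic behind that tip-and-split feature is straightforward. Here is a minimal sketch of it in Python; the function and parameter names are illustrative and not part of Google's implementation, which has not been published.

```python
def split_bill(subtotal: float, tip_percent: float, diners: int) -> dict:
    """Return the tip, the total, and each person's share, rounded to cents.

    This mirrors what Lens reportedly does after reading the receipt total:
    apply a chosen tip percentage, then divide evenly among diners.
    """
    if diners < 1:
        raise ValueError("need at least one diner")
    tip = round(subtotal * tip_percent / 100, 2)
    total = round(subtotal + tip, 2)
    per_person = round(total / diners, 2)
    return {"tip": tip, "total": total, "per_person": per_person}

# Example: an $84.00 receipt, 18% tip, split four ways.
print(split_bill(84.00, 18, 4))
# → {'tip': 15.12, 'total': 99.12, 'per_person': 24.78}
```

A production version would likely use a decimal type rather than floats to avoid rounding surprises with currency, but the float version keeps the example short.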
With the Translate filter, Google Lens will detect a language and overlay a translation on top of the words. It supports more than 100 languages, according to Google.
The Text filter lets users copy and paste text from real-world objects, such as Wi-Fi passwords, gift card codes, and recipes, onto their phone. The Shopping filter surfaces similar items when the camera is pointed at clothing, furniture, or home decor, and can also scan barcodes.
Lastly, Auto provides search results based on whatever object a user is pointing their camera at.
“We’re taking Google Lens and taking it from, ‘oh, it’s an identification tool, what’s this, show me things like this,’ to an AR browser, meaning you can actually superimpose information right on the camera,” Aparna Chennapragada, vice president and general manager for camera and AR products at Google, told Technews earlier in May.
“One of the questions we had was, if we can teach the camera to read, can we use the camera to help people read?” she added on the Lens Translate feature. “This is obviously useful in cases where you’re in a foreign city and you can’t speak the language, but in many parts of the world, people can’t speak or read their own language.”