Google Lens is one of those apps that blew us away when we first heard about it but has since sat dormant on all our phones. It is, of course, a logical move for Google to let you “search what you see” by simply pointing your phone’s camera at something. It’s impressive that Google’s AI can take contextual information from what it sees and perform a relevant web search to help you identify what you’re looking at. It’s just not something we turn to very often.
In fact, the biggest use cases for Google Lens involve buying things you spot out in the world or learning about things while you’re traveling. Not everybody is interested in those activities, however, which is probably why many of us rarely think of Lens. Google is now adding new skills to Lens that give it the power to help with a far more universal activity: eating out in restaurants.
Google Lens is getting some great new ‘dining filters’ to help us out in restaurants
Not everybody wants to buy the latest clothes and gadgets, and not everybody does a lot of traveling. Almost all of us, however, eat out in restaurants occasionally, and these new Google Lens filters could help us decide what to eat. If you’ve ever felt rushed in a restaurant and didn’t know what to order, Google Lens can now help you out.
The first new Google Lens skill relates to the restaurant’s menu. Point Lens at the menu and, as long as the restaurant’s name is visible, it will overlay the restaurant’s most popular dishes on top of what you see. Stars appear next to those popular dishes, and tapping one shows more information, including pictures of what the dish looks like and reviews other patrons have written about it.
The other key dining-related skill that Google Lens is getting will help you out at the end of your meal. Once your waiter has brought you your bill, show it to Google Lens. Lens will then be able to calculate the tip you should leave and will even help you split the bill evenly, should you need to do so.
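The math Lens is doing here is simple percentage-and-division arithmetic. As a rough sketch of that calculation (the function name, rounding choices, and even-split behavior are our own illustrative assumptions, not details Google has published):

```python
def split_bill(total, tip_percent, diners):
    """Add a percentage tip to a bill total and split it evenly.

    Illustrative only -- not Google's actual implementation.
    Amounts are rounded to two decimal places (cents).
    """
    tip = round(total * tip_percent / 100, 2)
    grand_total = round(total + tip, 2)
    per_person = round(grand_total / diners, 2)
    return tip, grand_total, per_person

# Example: an $80 bill, a 15% tip, four diners.
tip, grand_total, per_person = split_bill(80.00, 15, 4)
print(tip, grand_total, per_person)  # 12.0 92.0 23.0
```

In a real currency app you would likely use a decimal type rather than floats to avoid rounding surprises, but for a quick tip estimate the arithmetic above is all that’s needed.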
It is good to see Google Lens getting more widely useful features. One of the main reasons Lens hasn’t really grabbed our attention is that it is often quicker to type a search query yourself than to wait for Lens to grab the information. Recognizing large amounts of text quickly, however, is one of Lens’ best skills, so the new menu feature could prove genuinely useful. The bill feature should be handy too, if it works quickly enough.