Google Lens, Google’s computer vision search engine, is coming to desktop Chrome. Google didn’t exactly share a timeline, but a teaser tweet showed what the feature will look like.
On desktop Chrome, you’ll soon be able to right-click an image and pick “Search with Google Lens,” which will dim the page and bring up a clipping tool so you can send a selected region of the image to Google’s vision AI. After a round trip to Google’s servers, a sidebar will pop up showing the results.
While Google.com’s image search just tries to find similar pictures, Lens can actually identify things in a picture, like people, text, math equations, animals, landmarks, products, and more. It can translate text through the camera and even copy text from the real world (with OCR) and paste it into an app. The feature has existed on Android and iOS for a while, first as a camera-driven search that brought up a live viewfinder, then in Google Photos, and more recently as a long-press option for web pictures in Chrome for Android.
Google Lens is also getting a bit smarter. A new feature coming to the service will let you ask follow-up questions about an image search. Google has two demos here that are very impressive. One has a user scan a picture of a shirt and ask for “socks with this pattern,” and Google brings up a match. It would be pretty much impossible to search for a specific clothing pattern otherwise. You could type in descriptors like “floral pattern,” but that would get you similar patterns you would have to scroll through, not the same pattern.
Another example is a great use case for vision search: finding a thing you don’t know the name of. In the demo, the user has a broken bike and needs to fix something near the rear cogset. They don’t know what the rear gear changer-thingy is called, though, so they just take a picture of it and ask Google. Apparently, it’s a “derailleur,” and from there the user types in “how to fix,” and Google finds instructions.
Basically, Lens is getting the ability to search for images and text at the same time. Both of these are impressive examples, but they’re canned demos, so it’s hard to know how well any of this will actually work. Google says the feature will arrive “in the coming months.”
https://arstechnica.com/?p=1799986