Google introduced “Google Lens” last year, and during its keynote at this year’s Google I/O, the company announced some new features for it, plus some other Google Lens-related news. Let’s start with the fact that Google Lens is coming to the camera apps on supported devices from LG, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, ASUS, and, of course, the Google Pixel.
Now, as far as features are concerned, “Smart Text Selection” has been added. This feature lets you select text from an image, for example, from a recipe or a piece of paper you took a picture of. After you select it, you can copy and paste it into Google Search, share it with a friend, or do whatever else you see fit, just as you would with text selected from a digital document.

The second feature Google announced is “Style Match”. If you’ve ever wanted to find similar home decor or outfit ideas based on something you’ve seen in real life, this feature may come in handy. After you take a picture of an outfit or home decor item, you can open Google Lens and, in addition to getting info about that specific item, see other items in a similar style.

Last, but not least, Google announced that Google Lens now works in real time. What does that mean, exactly? Well, it’s now able to proactively surface information instantly, offering it to you as you move your camera’s viewfinder. Google says that Google Lens uses “state-of-the-art” machine learning to do this, relying on both on-device intelligence and cloud TPUs, as it needs to identify billions of words, phrases, places, and things in a split second.
Google has announced that these new features will start rolling out in the next few weeks, in case you’re eager to test them out. At the moment, Google Lens can be quite useful if you’re trying to find info based on images you took, for example, to find a specific product to buy online or to figure out what dog breed you’re looking at. These new features, especially the last one, push Google Lens to a new level, though it remains to be seen whether it will work as well as Google says it will.