Google Lens is an AI-based feature that scans an image or screenshot to provide relevant information. The visual search feature first debuted in Google Photos on the Pixel 2 before expanding to the original Pixel and Pixel XL late last month. You might wonder what the point is of bringing it to Google Assistant when it is already available in Google Photos. Nevertheless, the roll-out appears to have started, as some users are now seeing Google Lens in Assistant on Pixel and Pixel 2 smartphones.
Users are now seeing a new Lens button in Google Assistant that opens the camera when pressed.
In case you didn't know, Google Lens lets you run visual searches not just on existing photos but on the objects around you, simply by pointing your camera at them.
Tapping any of the suggestions brings up links for performing a web search or taking an action through other relevant apps.
Google Lens was first announced back in May at the Google I/O developer conference, where CEO Sundar Pichai explained the search giant's push into artificial intelligence and machine learning. Eagle-eyed observers had already spotted signs that Assistant would be getting Lens support soon.