Google Lens is an AI-based feature that scans an image or screenshot to provide relevant information. The visual search feature first debuted in Google Photos on the Pixel 2 before expanding to the original Pixel and Pixel XL late last month. You might wonder what the point of having it in Google Assistant is when it is already available in Google Photos; the difference is that Assistant's Lens works live through the camera rather than only on saved photos. Now the rollout appears to have started, as some users are seeing Google Lens in Assistant on Pixel and Pixel 2 smartphones.
Users are now seeing a new Lens button in Google Assistant that opens the camera when pressed.
In case you aren't familiar with it yet, Google Lens lets you run a visual search not just on existing photos but on the objects around you, simply by pointing your camera at them.
Tapping any of the suggestions brings up links for performing a web search or taking an action through other relevant apps.
Google Lens was first announced back in May at the Google I/O developer conference, where CEO Sundar Pichai explained the search giant's push into artificial intelligence and machine learning. Eagle-eyed users had already spotted signs that Assistant would be getting Lens soon.