Google’s enhancing Google Lens, its computer vision-powered app that brings up information related to the objects it identifies, with new features.

Starting today, Lens can surface skin conditions similar to what you might see on your own skin, like moles and rashes. Uploading a picture through Lens will kick off a search for visual matches, which also works for other physical maladies you might not be sure how to describe in words (like a bump on the lip, a line on a nail or hair loss).

It’s a step short of the AI-driven app Google launched in 2021 to diagnose skin, hair and nail conditions. That app, which debuted first in the E.U., faced barriers to entry in the U.S., where it would have required approval from the Food and Drug Administration. (Google declined to seek approval.)

Google Bard skin search. Image Credits: Google

Still, the Lens feature might be useful for folks deciding whether to seek medical attention or opt for over-the-counter treatments.

Elsewhere, as previously announced at I/O, Lens is integrating with Bard, Google’s AI-powered chatbot experience. Users will be able to include images in their Bard prompts and Lens will work behind the scenes to help Bard make sense of what’s being shown. For example, shown a photo of shoes and asked what they’re called, Bard — informed by Lens’ analysis — will come back with an answer.

It’s the latest update to Bard, Google’s answer to ChatGPT, as Google invests an increasing amount of resources into generative AI technologies. Just last week, Google introduced a capability that allows Bard to write, execute and test its own code in the background — improving its ability to program and solve complex math problems. And in May, Google partnered with Adobe to bring art generation to Bard.
