The Google Assistant is getting more conversational

Google announced today at its Google I/O developer conference that the Google Assistant, the company’s virtual personal assistant available on devices like Google Home and the Pixel phone, will soon get more conversational.

As Google’s Scott Huffman announced today, 70 percent of Google Assistant requests are already in natural language — not the typical keyword queries you’d usually use in Google Search.

Huffman noted that the Assistant will get more conversational and will also let you have conversations about the things you see around you. To do that, it’ll be integrated with the new Google Lens and that product’s built-in image recognition technology, so you’ll be able to simply point your camera at something and ask the Assistant about it. This Lens integration won’t arrive right away, though; it’s set to roll out a few months from now.

If you don’t want to talk, you’ll also be able to type your queries to the Assistant on your phone. After all, there are times when you don’t want to speak a query out loud — maybe because you’re in a public place or simply don’t want other people to hear your conversation. This feature is rolling out today.

“The Google Assistant should be hands-down the easiest way to get things done,” Huffman noted. “We’re starting to crack the hard computer science of conversationality.”