At today’s Google I/O, Google announced big changes to how we’ll interact with the Google Assistant going forward. Perhaps the biggest for most people: you will no longer have to say “Hey Google” to trigger Assistant. This had previously been rumored, but Google has now made it official.
Nest Hub Max owners in the US will see the new options starting today. The first is called Look and Talk, and it works exactly as it sounds: you simply look at the Nest Hub Max and start talking. The device uses face and voice recognition to identify you, so you still get personalized results, and all processing happens entirely on-device. None of the facial recognition data is sent to the cloud, and the feature is opt-in.
The second is Quick Phrases, which extends how you interact with Google Assistant. It likewise removes the need for the familiar trigger phrase for common tasks, such as setting a timer, asking for the time, or turning your lights on and off.
Alongside this come improvements in how Google Assistant understands you. Assistant is now better at parsing natural language, including the umms and filler words we often throw in. The language models now run on-device to speed up processing, a step Google says was made possible by building more capable neural networks on its Tensor chip.
“Looking to the future, Assistant will be able to better understand the imperfections of human speech without getting tripped up – including the pauses and ‘umms’ – making your interactions feel much closer to a natural conversation.”
The example used on stage was asking for a song, pausing mid-request, and not quite knowing the artist’s full name. Google Assistant is now smart enough to handle the pause and work out what the missing part should be.