Google has announced a new 'Look and Talk' feature that can activate Google Assistant without requiring the use of its 'Hey Google' wake command.
The feature is beginning to roll out today in the U.S. on Nest Hub Max.
In its announcement, Google explains: “Once you opt in, you can simply look at the screen and ask for what you need. From the beginning, we’ve built Look and Talk with your privacy in mind. It’s designed to activate when you opt in and both Face Match and Voice Match recognize it’s you. And video from these interactions is processed entirely on-device, so it isn’t shared with Google or anyone else. Let’s say I need to fix my leaky kitchen sink. As I walk into the room, I can just look at my Nest Hub Max and say ‘Show plumbers near me’ — without having to say ‘Hey Google’ first.”
The company says 'Look and Talk' uses over 100 signals from both the camera and microphone to recognize if you're making eye contact with your device. These include proximity, head orientation, gaze direction, lip movement, context awareness and intent classification.
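To make the idea concrete, here is a minimal sketch of how such signal fusion could work. This is purely illustrative: the signal names, weights, and threshold are assumptions, not Google's actual on-device model, and the real system fuses far more than five signals.

```python
# Hypothetical sketch: gate assistant activation by fusing camera/mic signals.
# All weights and thresholds are made up for illustration.
from dataclasses import dataclass

@dataclass
class Signals:
    proximity: float         # 0.0 (far) .. 1.0 (close to the device)
    head_orientation: float  # 0.0 (facing away) .. 1.0 (facing the screen)
    gaze: float              # confidence the user is looking at the device
    lip_movement: float      # confidence the user is actually speaking
    intent: float            # classifier score that speech is addressed to the device

def should_activate(s: Signals, face_match: bool, voice_match: bool,
                    threshold: float = 0.8) -> bool:
    """Activate only if the enrolled user is recognized AND the fused
    engagement score clears the threshold."""
    if not (face_match and voice_match):
        return False  # identity checks act as a hard gate, per the opt-in design
    # Weighted fusion of the soft signals into a single engagement score.
    score = (0.15 * s.proximity
             + 0.20 * s.head_orientation
             + 0.30 * s.gaze
             + 0.20 * s.lip_movement
             + 0.15 * s.intent)
    return score >= threshold

# An engaged user facing the screen activates; a failed Voice Match never does.
print(should_activate(Signals(0.9, 0.9, 0.95, 0.8, 0.9), True, True))   # True
print(should_activate(Signals(0.9, 0.9, 0.95, 0.8, 0.9), True, False))  # False
```

The key design point mirrored here is that Face Match and Voice Match are a hard precondition, while the camera and microphone signals are combined into a soft score, so no single cue (such as glancing past the device) triggers activation on its own.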
Additionally, Nest Hub Max is getting quick phrases that let you skip saying 'Hey Google' for common daily tasks like "Turn on the hallway lights" or "Set a timer for 10 minutes". Quick phrases only work when you've opted in and Voice Match recognizes it's you.
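Conceptually, quick phrases amount to matching an utterance against a fixed allowlist of commands, with the opt-in and Voice Match checks as a gate. The sketch below is an assumption about the shape of that logic, not Google's implementation; the phrase list and normalization are illustrative.

```python
# Hypothetical sketch: a quick-phrase allowlist that bypasses the hotword,
# gated by opt-in status and Voice Match recognition.
QUICK_PHRASES = {
    "turn on the hallway lights",
    "set a timer for 10 minutes",
}

def is_quick_phrase(text: str, opted_in: bool, voice_match: bool) -> bool:
    """Return True if the utterance should run as a quick phrase."""
    if not (opted_in and voice_match):
        return False  # quick phrases never fire for unrecognized voices
    normalized = text.strip().lower().rstrip(".!?")
    return normalized in QUICK_PHRASES

print(is_quick_phrase("Turn on the hallway lights", True, True))   # True
print(is_quick_phrase("Turn on the hallway lights", True, False))  # False
print(is_quick_phrase("Play some jazz", True, True))               # False
```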
Finally, Google says it's working on new, more powerful speech and language models that can understand nuances of human speech.
Looking ahead, Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, “umms” and interruptions — making your interactions feel much closer to a natural conversation.
Check out the video below for more details...