Google makes its AI assistant more accessible with 'Look and Talk'


Google Assistant is already pretty handy, filling in your payment info on takeout orders, helping get the kids to school on time, and controlling your stereo system's volume and your home's smart light schedules. At its I/O 2022 keynote today, company executives showed off some of the new features arriving soon for the AI.

The first of these is "Look and Talk." Instead of having to repeatedly start your requests to Assistant with "Hey Google," this new feature relies on computer vision and voice matching to constantly pay attention to the user. As Sissie Hsiao, Google's VP of Assistant, explained on stage, all the user has to do is look at their Nest Hub Max and state their request. Google is also developing a series of quick commands that users will be able to shout out without having to gaze longingly at their tablet screen or say "Hey Google" first — things like "turn on the lights" and "set a 10-minute alarm."


All of the data captured in that interaction — specifically the user's face and voice prints, which are used to verify the user — is processed locally on the Hub itself, Hsiao continued, and not shared with Google "or anyone else." What's more, you'll have to specifically opt into the service before you can use it.

According to Hsiao, the backend of this process relies on a half-dozen machine learning models and 100 camera and mic inputs — e.g., proximity, head orientation and gaze direction — to ensure that the machine knows when you're talking to it versus talking in front of it. The company also claims that it worked diligently to make sure the system works for people across the full spectrum of human skin tones.

Looking ahead, Google plans to continue refining its NLP models to further enhance the responsiveness and fidelity of Assistant's responses by "building new, more powerful speech and language models that can understand the nuances of human speech," Hsiao said. "Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, 'umms' and interruptions — making your interactions feel much closer to a natural conversation."

Follow all of the news from Google I/O 2022 right here!
