Google's New Tech Can Read Your Body Language—Without Cameras





What if your computer decided not to blare out a notification jingle because it noticed you weren't sitting at your desk? What if your TV saw you leave the couch to answer the front door and paused Netflix automatically, then resumed playback when you sat back down? What if our computers took more social cues from our movements and learned to be more considerate companions?


It sounds futuristic and perhaps more than a little invasive—a computer watching your every move? But it feels less creepy once you learn that these technologies don't have to rely on a camera to see where you are and what you're doing. Instead, they use radar. Google's Advanced Technology and Products division—better known as ATAP, the department behind oddball projects such as a touch-sensitive denim jacket—has spent the past year exploring how computers can use radar to understand our needs or intentions and then react to us appropriately.

This is not the first time we've seen Google use radar to provide its gadgets with spatial awareness. In 2015, Google unveiled Soli, a sensor that can use radar's electromagnetic waves to pick up precise gestures and movements. It was first seen in the Google Pixel 4, which used the sensor to detect simple hand gestures so the user could snooze alarms or pause music without having to physically touch the smartphone. More recently, radar sensors were embedded inside the second-generation Nest Hub smart display to detect the movement and breathing patterns of the person sleeping next to it. The device was then able to track the person's sleep without requiring them to strap on a smartwatch.

The same Soli sensor is being used in this new round of research, but rather than using the sensor input to directly control a computer, ATAP is using the sensor data to enable computers to recognize our everyday movements and make new kinds of choices.

“We believe as technology becomes more present in our life, it's fair to start asking technology itself to take a few more cues from us,” says Leonardo Giusti, head of design at ATAP. In the same way your mom might remind you to grab an umbrella before you head out the door, perhaps your thermostat can relay the same message as you walk past and glance at it—or your TV can lower the volume if it detects you've fallen asleep on the couch.

### Radar Research


[Image: A human entering a computer's personal space.]

Giusti says much of the research is based on the study of how people use space around them to mediate social interactions. As you get closer to another person, you expect increased engagement and intimacy. The ATAP team used this and other social cues to establish that people *and* devices have their own concepts of personal space. 

Radar can detect you moving closer to a computer and entering its personal space. This might mean the computer can then choose to perform certain actions, like booting up the screen without requiring you to press a button. This kind of interaction already exists in current Google Nest smart displays, though instead of radar, Google employs ultrasonic sound waves to measure a person's distance from the device. When a Nest Hub notices you're moving closer, it highlights current reminders, calendar events, or other important notifications.

Proximity alone isn't enough. What if you ended up just walking past the machine while looking in a different direction? To solve this, Soli can capture greater subtleties in movements and gestures, such as body orientation, the pathway you might be taking, and the direction your head is facing—aided by machine learning algorithms that further refine the data. All this rich radar information helps the device better predict whether you are indeed about to start an interaction with it, and what type of engagement that might be.
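
To make that idea concrete, here is a minimal sketch of how a device might fuse several radar-derived cues into a single "about to interact" estimate. It is purely illustrative: the feature names, weights, and thresholds are invented assumptions, and ATAP's actual system relies on trained machine-learning models rather than hand-tuned rules like these.

```python
from dataclasses import dataclass

@dataclass
class RadarFrame:
    """Hypothetical per-frame features a radar pipeline might derive."""
    distance_m: float          # how far the person is from the device
    approach_speed_mps: float  # positive = moving toward the device
    body_angle_deg: float      # 0 = torso squarely facing the device
    head_angle_deg: float      # 0 = head turned toward the device

def engagement_score(frame: RadarFrame) -> float:
    """Combine cues into a rough 0..1 engagement score.

    A real system would learn these weights from labeled recordings;
    the weights and cutoffs below are made up for illustration.
    """
    proximity = max(0.0, 1.0 - frame.distance_m / 3.0)            # closer is stronger
    approaching = min(1.0, max(0.0, frame.approach_speed_mps / 1.5))
    facing_body = max(0.0, 1.0 - abs(frame.body_angle_deg) / 90.0)
    facing_head = max(0.0, 1.0 - abs(frame.head_angle_deg) / 60.0)
    return (0.35 * proximity + 0.25 * approaching
            + 0.2 * facing_body + 0.2 * facing_head)

# Someone walking past while looking elsewhere scores low...
passerby = RadarFrame(distance_m=1.5, approach_speed_mps=0.0,
                      body_angle_deg=85, head_angle_deg=70)
# ...while someone approaching and facing the screen scores high.
approacher = RadarFrame(distance_m=1.0, approach_speed_mps=1.0,
                        body_angle_deg=10, head_angle_deg=5)
for label, frame in (("passerby", passerby), ("approacher", approacher)):
    print(label, round(engagement_score(frame), 2))
```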

The team developed this improved sensing by performing a series of choreographed tasks in their own living rooms (they stayed home during the pandemic), with overhead cameras tracking their movements and real-time radar sensing.



“We were able to move in different ways, we performed different variations of that movement, and then—given this was a real-time system that we were working with—we were able to improvise and kind of build off of our findings in real time,” says Lauren Bedal, senior interaction designer at ATAP.

Bedal, who has a background in dance, says the process is quite similar to how choreographers take a basic movement idea—known as a movement motif—and explore variations on it, such as how the dancer shifts their weight or changes their body position and orientation. From these studies, the team formalized a set of movements, which were all inspired by nonverbal communication and how we naturally interact with devices: approaching or leaving, passing by, turning toward or away, and glancing.

Bedal listed a few examples of computers reacting to these movements. If a device senses you approaching, it can pull up touch controls; step close to a device, and it can highlight incoming emails; leave a room, and the TV can bookmark where you left off and resume from that position when you're back. If a device determines that you're just passing by, it won't bother you with low-priority notifications. If you're in the kitchen following a video recipe, the device can pause when you move away to grab ingredients and resume when you step back and express an intent to reengage. And if you glance at a smart display while you're on a phone call, the device could offer the option to transfer the call to a video call on the display so you can put your phone down.
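
As a thought experiment, the movement vocabulary Bedal describes could be wired up as a simple event-to-action table. The movement names below come from the article, but the mapping, function names, and reactions are hypothetical stand-ins, not Google's implementation.

```python
from enum import Enum, auto

class Movement(Enum):
    """Movement primitives named in ATAP's research."""
    APPROACH = auto()
    LEAVE = auto()
    PASS_BY = auto()
    TURN_TOWARD = auto()
    TURN_AWAY = auto()
    GLANCE = auto()

# Hypothetical reactions, mirroring the examples Bedal gives.
REACTIONS = {
    Movement.APPROACH: "show touch controls and highlight notifications",
    Movement.LEAVE: "pause playback and bookmark the current position",
    Movement.PASS_BY: "suppress low-priority notifications",
    Movement.TURN_TOWARD: "resume the paused video recipe",
    Movement.TURN_AWAY: "dim the screen",
    Movement.GLANCE: "offer to transfer the phone call to the display",
}

def react(movement: Movement) -> str:
    """Pick a device reaction for a detected movement primitive."""
    return REACTIONS[movement]

if __name__ == "__main__":
    for movement in (Movement.LEAVE, Movement.GLANCE):
        print(f"{movement.name}: {react(movement)}")
```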

“All of these movements start to hint at a future way of interacting with computers that feel very invisible by leveraging the natural ways that we move, and the idea is that computers can kind of recede into the background and only help us in the right moments,” Bedal says. “We're really just pushing the bounds of what we perceive to be possible for human-computer interaction.” 

### OK, Computer

Utilizing radar to influence how computers react to us comes with challenges. For example, while radar *can* detect multiple people in a room, if the subjects are too close together, the sensor just sees the gaggle of people as an amorphous blob, which confuses decision-making. There's also plenty more to be done, which is why Bedal highlighted (a few times) that this work is very much in the research phase—so no, don't expect it in your next-gen smart display just yet.



[Image: Radar technology can sense where you're looking without using cameras.]

There's good reason to think radar could also help devices learn your routines over time. This is one area ATAP's Giusti says is on the research roadmap, with opportunities like suggesting healthy habits tied to your personal goals. I imagine my smart display turning into a giant stop sign when it realizes I'm heading to the snack cabinet at midnight.

There's also a balance these devices will need to strike when it comes to performing actions they *think* you'd want. For example, what if I want the TV on while I'm in the kitchen cooking? The radar wouldn't detect anyone watching the TV and would pause it instead of leaving it on. “As we start to research some of these interaction paradigms that feel very invisible and seamless and fluid, there needs to be a right balance between user control and automation,” Bedal says. “It should be effortless, but we should be considering the number of controls or configurations the user may want on their side.”

The ATAP team chose radar because it's one of the more privacy-friendly methods of gathering rich spatial data. (It also has very low latency, works in the dark, and is unaffected by external factors like sound or temperature.) Unlike a camera, radar doesn't capture and store distinguishable images of your body, your face, or other means of identification. “It's more like an advanced motion sensor,” Giusti says. Soli has a detectable range of around 9 feet—less than most cameras—but multiple gadgets in your home with the Soli sensor could blanket your space, creating an effective mesh network for tracking your whereabouts in a home. (It's worth noting that data from the Soli sensor in the current Google Nest Hub is processed locally, and the raw data is never sent to the cloud.)



[Image: A device with ATAP's new technology inside can sense you approaching and then change its state based on what it anticipates you might want to do.]

Chris Harrison, a researcher studying human-computer interaction at Carnegie Mellon University and director of the Future Interfaces Group, says consumers will have to decide whether they want to make this privacy tradeoff—after all, Google is “the world leader in monetizing your data”—but he still thinks Google's camera-free approach is very much user-first and privacy-first. “There's no such thing as privacy-invading and not privacy-invading,” Harrison says. “Everything is on a spectrum.”

As more devices are inevitably equipped with sensors like Soli to collect more data, they become more capable of understanding us. Ultimately, Harrison expects the kinds of improved human-computer interactions ATAP envisions to show up in all facets of technology.

“Humans are hardwired to really understand human behavior, and when computers break it, it does lead to these sort of extra frustrating [situations],” Harrison says. “Bringing people like social scientists and behavioral scientists into the field of computing makes for these experiences that are much more pleasant and much more kind of humanistic.”

Google ATAP's research is one part of a new series called *In the Lab With Google ATAP*, which will debut new episodes in the coming months on its YouTube channel. Future episodes will take a look at other projects in Google's research division.


Source: Wired
