If you thought Google's surveillance capabilities were creepy before, they're about to get even creepier. The tech giant has developed a new AI system that can read your body language and predict what you're going to do next—without even needing cameras. That's right, Google's new technology can interpret the way you move, from the tiniest twitch of your fingers to the slightest shift in your posture, and use that information to divine your intentions. And it's all thanks to a little something called Project Soli.
Read the entire article here: https://bit.ly/3q9KioK
Why is this important?
What if your computer decided not to blare out a notification jingle because it noticed you weren’t sitting at your desk? What if your TV saw you leave the couch to answer the front door and paused Netflix automatically, then resumed playback when you sat back down? What if our computers took more social cues from our movements and learned to be more considerate companions?

It sounds futuristic, and perhaps more than a little invasive: a computer watching your every move? But it feels less creepy once you learn that these technologies don’t have to rely on a camera to see where you are and what you’re doing. Instead, they use radar. Google’s Advanced Technology and Products division, better known as ATAP (the department behind oddball projects such as a touch-sensitive denim jacket), has spent the past year exploring how computers can use radar to understand our needs or intentions and then react to us appropriately.

Recently, radar sensors were embedded inside the second-generation Nest Hub smart display to detect the movement and breathing patterns of the person sleeping next to it. The device was then able to track the person’s sleep without requiring them to strap on a smartwatch.
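To give a feel for how a radar-based device can notice presence without a camera, here is a minimal, purely illustrative sketch. This is not Google's actual Soli pipeline; the function name, window size, and threshold are all assumptions. The idea it demonstrates is simple: an empty room reflects a nearly constant signal, while a person moving or breathing modulates the reflection, so a rise in short-term signal variance can flag presence.

```python
import math

def detect_presence(samples, window=8, threshold=0.05):
    """Flag windows whose signal variance exceeds a threshold.

    samples: list of radar amplitude readings (floats).
    A still, empty room yields near-constant readings; a person
    moving or breathing modulates the reflected signal, raising
    the variance within each window.
    """
    flags = []
    for i in range(0, len(samples) - window + 1, window):
        chunk = samples[i:i + window]
        mean = sum(chunk) / window
        var = sum((x - mean) ** 2 for x in chunk) / window
        flags.append(var > threshold)
    return flags

# Synthetic data: a flat baseline (empty room) followed by an
# oscillating signal (someone breathing near the sensor).
empty = [1.0] * 16
occupied = [1.0 + 0.5 * math.sin(t) for t in range(16)]
print(detect_presence(empty + occupied))  # → [False, False, True, True]
```

A real system works on far richer data (range, velocity, and angle from millimeter-wave returns, fed into trained models), but the privacy point survives even in this toy version: the sensor never captures an image, only coarse motion statistics.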