Google doesn’t have to be a mind-reader to know what you want and need
“How long will it take before phones can read our minds? It’d be awesome if this thing could just know what I want,” a friend recently mused, while staring at his beloved Android device. He glanced up when I replied. “It doesn’t need to read your mind to know what you want.”
“A mobile phone has eyes, ears, a skin, and knows your location,” Google senior vice president Vic Gundotra said while demonstrating the good ol’ Nexus One in early 2010, according to Steven Levy’s In The Plex. “Eyes, because you never see one that doesn’t have a camera. Ears, because they all have microphones. Skin because a lot of these devices are touch screens. And GPS allows you to know your location.”
Two years after Gundotra’s remarks, the phone you’re carrying likely has a lot of extra parts: an accelerometer, a gyroscope, a barometer, ambient light sensors, a compass, and so on. It knows more than ever, too — thanks to tighter integration with the cloud.
In another year or two — or however long it takes for Google’s Project Glass to arrive in the hands of consumers — your phone will be even more powerful. It’ll be able to interact with a sleek device that sits on your face all day. The sheer quantity of data this connection will add into the mix is mind-boggling. Glass will see exactly what you see, it will know when you tilt your head with interest, it will understand the significance of your body movements, and perhaps it might even know when your eyes widen in excitement.
At this point, you can choose to sound the alarm and shout that it’s terrifying that some devices — or, since we’re focusing on Google, a single company — will know so much about you. Or you can take a leap of faith — as you do every time you send an email, type out a search query, add another card to your Google Wallet account, and so on — and embrace the convenience that’s to come.
“What may take 30 to 60 seconds on a phone will instead take two to four seconds on Glass,” said Steve Lee, product director for Project Glass.
“If you walk around the streets of New York, people have their smartphones out and they’re looking down. They do that while they’re standing around, waiting for a bus or a taxi, or while they’re walking. Even if you go out to dinner with a friend or a date, the technology is taking them away,” Lee explained. With Glass, there’ll be almost no interruption, no delay.
Don’t for a second think that all the magic happens because of the hardware, though. Everything always goes back to Google’s roots — its ability to put data into context.
“We’ve often said the perfect search engine will provide you with exactly what you need to know at exactly the right moment, potentially without you having to ask for it,” said Jon Wiley, lead user experience designer for Google Search. He was discussing a research exercise that sought to explore what information people need but don’t seek out on Google.
Wiley’s research exercise brings to mind not only what Google Now updates could offer, but also something that Google co-founder Sergey Brin explained to Levy back in 2004. In the future, he said, “you can have computers that pay attention to what’s going on around them and suggest useful information.”
That future — and those computers — are here now, in the form of Glass. What if that little device notices that you keep turning to look at a particular store’s window display? Perhaps it’ll recommend the dress you saw there the next time you’re idly searching through Google Shopping. What if Glass recognizes, based on your erratic body movements, that you’re doing one of those “I gotta pee! I gotta pee!” dances and hopping from one foot to the other? Maybe it’ll guide you to the nearest bathroom before you even need to enter a search query. What if you keep staring intently, without blinking, at the guy sitting across from you on the subway? Might Glass understand your interest and tell you that the man looks familiar because he was the lead actor in a romantic comedy you recently watched?
“[Y]ou can imagine your brain being augmented by Google,” Google co-founder Larry Page told Reuters in February 2004. “For example you think about something and your cell phone could whisper the answer into your ear.”
Or perhaps you won’t even need to think about something. Google will soon know what you want to know before you even realize it.