
After months of speculation, Google revealed some information about Project Glass (on April 4th, 2012, to be specific).
Instead of using a smartphone to find information about an object, translate a text, get directions, or compare prices, you can use smart glasses that augment reality and help you understand more about the things around you.
Google says, “We think technology should work for you: to be there when you need it and get out of your way when you don’t.” A group from Google[x] started Project Glass to build this kind of technology, one that helps you explore and share your world, putting you back in the moment.
Since it has been more than a week, I have had a good chance to read and analyze most of the views several tech blogs shared about this project. Here are a few more details about Project Glass, along with the opinions mine aligned with.
Google’s concept glasses have:
a) A camera,
b) A microphone,
c) A 3G/4G data connection (to send and receive data in real time), and
d) A number of sensors, including motion and GPS.
One of the people who used the glasses said that “they let technology get out of your way. If I want to take a picture, I don’t have to reach into my pocket and take out my phone; I just press a button at the top of the glasses and that’s it.”
The New York Times reported that “the glasses [could] go on sale to the public by the end of the year. People familiar with the Google glasses said they would be Android-based, and will include a small screen that will sit a few inches from someone’s eye.”
Seth Weintraub found that “the navigation system currently used is head tilting to scroll and click; I/O on the glasses will also include voice input and output, and we are told the CPU/RAM/storage hardware is near the equivalent of a generation-old Android smartphone”.
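To make that head-tilt navigation idea a little more concrete, here is a minimal, purely hypothetical Android sketch of how a tilt-to-scroll gesture could be detected with the standard SensorManager accelerometer API. This is not Google's actual Glass code (none of its APIs were public at the time); TiltScrollActivity, TILT_THRESHOLD, scrollUp and scrollDown are made-up names for illustration only.

// Hypothetical sketch: detect a forward/backward head tilt on an Android-based
// wearable and treat it as a scroll gesture. Not Google's implementation.
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class TiltScrollActivity extends Activity implements SensorEventListener {
    private static final float TILT_THRESHOLD = 3.0f; // m/s^2 on the y-axis; arbitrary assumed value

    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Listen to the accelerometer at a rate suitable for UI interaction.
        Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // A head tilt shows up as a change in acceleration along one axis;
        // here we naively map a large positive or negative y reading to scrolling.
        float tilt = event.values[1];
        if (tilt > TILT_THRESHOLD) {
            scrollDown();
        } else if (tilt < -TILT_THRESHOLD) {
            scrollUp();
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private void scrollDown() { /* move the on-screen content down */ }
    private void scrollUp()   { /* move the on-screen content up */ }
}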
As per TechCrunch, Apple and Facebook should get together and compete with Google on this, since Project Glass takes a ton of the things you use your iPhone and iPod for and puts them into your glasses. The glasses will likely run a version of Android, and since they're voice controlled (Google's competitor to Siri), people might buy Google glasses rather than snapping up the latest Apple device.

But Apple is the world's greatest hardware company, and it should seek to capitalize on Google's lack of hardware experience, spending some of its cash reserves to lock up critical component manufacturers. This technology sure seems like the future, so Apple needs to be ready to pounce. But the problem remains that it has no social network or other key services to power its own version.
Facebook Should Team Up With Apple
Not having its own mobile OS or device is hurting Facebook, and eyeglass computing could turn into round two. The video already showed Google+ as the preferred sharing method. Unlike an Android phone where you could just open the Facebook app, Project Glass won't necessarily allow third-party apps, at least at first, and could make them harder to access than Google+, which will be baked in.

Apple needs somewhere to share the content you'd create with its glasses, and Facebook needs to make sure Apple lets it get deeply embedded, with or without Twitter alongside it.
Postscript: If Apple or Facebook see eyeglass computing as marketable to the mainstream in the next few years, they should be terrified of Google Glasses today, and this should give them a jolt.
TechCrunch also reported this may end up being called Google Eye.
Google is beginning public tests of augmented reality glasses under the code name Project Glass. A promising mock-up video of what the device might eventually be capable of shows someone using voice commands to send messages, take photos, share to Google+, see the locations of friends, view maps, get directions, set calendar reminders, and more. Here's why and how the glasses could be helpful, along with a video to demonstrate what they might enable you to do:
Technology should work for you: to be there when you need it and get out of your way when you don't.
Google is sharing this information now because they want to start a conversation and learn from your valuable input. So they took a few design photos to show what this technology could look like.
Google's augmented-reality Project Glass is going to disrupt your business model. If you don't even have to pull your phone out to take a photo, get directions, or message with friends, why would you need to buy the latest iPhone or spend so much time on Facebook?
Kevin Tofel @ GigaOM reported that Google glasses make sense as the "next" mobile device, and here's why I agree with him: "We have gone from immobile desktops to portable laptops, and now we are toting tablets and pocketable smartphones. Where can we go from here if not to the growing number of connected, wearable gadgets? As silly as the idea may look or sound to some, I find merit in the approach, as it seems like a logical next step."

From a consumer perspective, Project Glass also advances another theme that has been growing. Touchable user interfaces have reinvented how we use mobile devices, but hardware design is advancing to the point where the interfaces are starting to disappear. Instead of holding a tablet, people are interacting directly with an app, web page, photo or other digital object in a reduced interface, with either voice or minute gestures. In essence, such glasses would allow people to digitally interact with the physical world around them without a device or user interface getting in the way.
It could be a year before this eyewear reaches stores, but that's why these and other tech companies need to strategize now. If they wait to see if the device is a hit, the world could be seeing through Google-tinted glasses by the time they adapt. It will be interesting to see if Google will actually sell these smart glasses. There are a lot of issues that need to be solved before releasing a commercial product: from battery life to packaging so much technology in such a small product, from improving Google Goggles to handling real-time video streaming.
How rapid prototyping was used to create Project Glass
At TEDYouth, Tom Chi explains how rapid prototyping (a method used to accelerate the innovation process) was used to create one of Google's newest inventions, Google Glass.
But here's the kicker: despite its lack of hardware experience, Google is the best-positioned company to make, or at least provide the software for, eyeglass computers. It has Android, Google+, Maps, Gmail, Calendar, Latitude, and more. That's why it's ridiculous when people call Project Glass a diversion or waste of resources.
Update: Feb. 2013. Time for #ifihadglass explorers to get their Project Glass

Google surely knows how to tease you.
Last year Google showed Glass to the world for the first time – by jumping out of airships, crashing New York Fashion Week, and even taking a ride on the subway. It's been an exhilarating journey so far, and there's a lot more to come. They're developing new technology that is designed to be unobtrusive and liberating, and so far they've only scratched the surface of the true potential of Glass.
Now Google wants you to get involved, and that's why today they're expanding the Glass Explorer Program. They're looking for bold, creative individuals who want to join in and be a part of shaping the future of Glass. Glass is still in the early stages, so there will be some twists and turns along the way. While they can't promise everything will be perfect, they can promise it will be exciting.
Eventually all may get a chance to be a Glass Explorer, but they're starting a bit smaller. So, if you want to be one of the first Explorers, go to www.google.com/glass/start/how-to-get-one to find out how.
Seeking Glass Explorers: goo.gl/PbKWr
You told Google what you would do #ifihadglass; now's your chance to show us.
In February, Google asked you to join. In just two and a half days, it went from 0 to 2,000 explorers.
Now itโs time for you to rejoice, Google is delivering Project Glass to you.
The Explorer Program started last year; these are the people who were the first to show their interest in trying Project Glass by signing up at Google I/O 2012. And we're now seeing amazing things coming from them. They've really embraced the Explorer spirit – the spirit of someone who's ready to jump in, get their hands dirty and take on a challenge.
In February, Google opened up the Explorer Program by asking people across Google+ and Twitter what they would do if they had Glass. Over the next few weeks, they'll be slowly rolling out invitations to the successful #ifihadglass applicants.
Share your hands-on experiences, unboxing videos, and first photos through Glass here with all of us. Please follow along as we share some of our ideas and stories. We'd love to hear yours, too. What would you like to see from Project Glass? Let us know in the comments below.


Deepak Ravlani: Yes, yes – UIs, design, and public familiarity/adoption rates are always important with something new. Haha, keep an eye on attendees who might be hanging out live on the air from a DIY pair :)
Deepak Ravlani
Hi, yes, thanks, I watched that as it occurred. And of course, anyone who is at any public or private event could be using a device like a Samsung S3/many others to broadcast in a hangout or hangoutontheair. Did that, done that, do that, but holding a smartphone in front of my face is a tad awkward; it draws attention to me, rather than what I see and hear at the event. Imagine being at a large stadium watching a large sporting event, with 10 people sitting in various locations in the stands, all wearing Project Glass, when they have become the norm, not drawing attention to the person wearing them. Will they be able to broadcast the game, liveontheair, switching between views? And what will the viewers of the liveontheair broadcast hear from the 10 Project Glass devices? If this would work well, it would obviously be quite disruptive. And that, I believe, is the purpose of beta testing a device like Project Glass.
Would it be possible to broadcast a live hangoutontheair to YouTube using Project Glass? Not that you would in this case, but imagine the potential if you could. It would be a powerful demonstration of what could be done with Project Glass.
Please note this is a UCSD event only, and is not open to the public.