There's been a lot of breathless talk around the launch of Google Glass. Self-righteous nerds the world over are now poised to attack the first person they see wearing Glass in a public space. But what if that person were blind? What if Glass were actually the best piece of assistive technology yet for blind and partially sighted people? Let's spend a little time exploring this...
[Picture credit: Sergey Brin wearing Google Glass on the New York Subway, via Twitter user Noah Zerkin]
Regular readers may recall a post from a couple of years ago where I demonstrated how you could use the Android scripting layer to drive the camera and speech recognition and, crucially, interact with product data online, the result being a trivial Python script that turns your phone into an audible barcode scanner. Cue witty aside here about the Google Product Search API being closed down as part of the latest spring clean at the Googleplex...
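For anyone who missed that post, the gist was roughly the following. This is a from-memory sketch rather than the original code: the SL4A calls (scanBarcode, ttsSpeak) are real, but lookup_product() and its URL are placeholders for whichever product data service you point it at, now that the original API has gone.

```python
# Sketch of an audible barcode scanner using SL4A (Scripting Layer for Android).
# Requires the ZXing Barcode Scanner app for scanBarcode(); lookup_product()
# is a hypothetical stand-in for a product data API of your choosing.
import json
import urllib2

import android  # the SL4A facade

droid = android.Android()

def lookup_product(barcode):
    """Hypothetical lookup against a product data service of your choosing."""
    url = "https://example.com/api/products?barcode=" + barcode
    return json.load(urllib2.urlopen(url)).get("name", "something I don't recognise")

# Launch the barcode scanner and wait for the user to scan something.
scan = droid.scanBarcode().result
if scan is not None:
    code = scan["extras"]["SCAN_RESULT"]
    droid.ttsSpeak("That looks like " + lookup_product(code))
else:
    droid.ttsSpeak("Sorry, I couldn't read that barcode")
```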
Jam and Jelly
But this is all a bit fiddly, particularly if you are trying to navigate your way independently through the local supermarket while using a white stick or being led by a guide dog. Now picture that same supermarket scene with Glass: "OK Glass, what's in the cans in front of me?" Or perhaps... "am I in the jam or pet food section?" (jam == jelly, for my American readers ;-)
We've also heard a lot of negative stuff about facial recognition technology. Now picture it from a blind person's perspective - "Hey Alice, I can see your friend Bob". Let's imagine we went one step further and added some opt-in location awareness à la Google Latitude; then your glasses could even help you to meet up with a blind friend. "Hey Bob, Alice is two blocks away from you. There's a locally run independent artisanal coffee shop coming up in 100 metres where you could meet - shall I send her the location and use their API to order you both a tall skinny mochaccino?"
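The proximity part of that scenario isn't science fiction even on today's phones. Here's a toy sketch: getLastKnownLocation() and ttsSpeak() are real SL4A calls, while get_friend_location() stands in for whatever opt-in shared location feed Alice has agreed to publish.

```python
# Toy "how far away is my friend" sketch. Only the SL4A calls are real;
# get_friend_location() is a placeholder for an opt-in shared location feed.
import math

import android

droid = android.Android()

def haversine_metres(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    radius = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius * math.asin(math.sqrt(a))

def get_friend_location(name):
    """Placeholder: in reality this would read an opt-in shared location feed."""
    return 51.5074, -0.1278  # hard-coded for the sketch

locations = droid.getLastKnownLocation().result
here = locations.get("gps") or locations.get("network")
if here is None:
    droid.ttsSpeak("I don't know where you are yet")
else:
    alice_lat, alice_lon = get_friend_location("Alice")
    distance = haversine_metres(here["latitude"], here["longitude"], alice_lat, alice_lon)
    droid.ttsSpeak("Alice is about %d metres away" % int(distance))
```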
A Grand Day Out
For some more practical examples of how Glass could help blind and partially sighted people with independent living, let's imagine that I am blind and aiming to catch a train to London. How could Glass help me out? (There's a rough sketch of how a couple of these commands might be wired up after the list.)
- OK, Glass: When is the next train to London?
- OK, Glass: Book me a ticket on the 1234 train to London and reserve me a seat
- OK, Glass: Which bus route is best to get me from here to the train station?
- OK, Glass: Directions to the nearest eastbound bus stop on the Number 7 route
- OK, Glass: Alert me when you see the Number 7 bus coming, and direct me to the doors when it stops
- OK, Glass: Direct me to the ticket collection kiosk
- OK, Glass: Tell me what you see (The ticket collection kiosks in the UK are pretty much inaccessible, so in reality you would probably find yourself going to the ticket office next. If it hadn't been downsized...)
- OK, Glass: Am I holding my train ticket the right way up for the barriers?
- OK, Glass: Direct me to the ticket barriers
- OK, Glass: Which platform is my train leaving from, and is it on time?
- OK, Glass: Direct me to platform 2, and alert me when my train has pulled in and stopped
- OK, Glass: Direct me to the nearest train door
- OK, Glass: Tell me when I reach my reserved seat on the train
- OK, Glass: Tell me when the kettle is directly above the coffee cup, and alert me when the cup is three quarters full
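As promised above, here's a rough sketch of how a couple of those commands could be wired up today, with a phone in your pocket rather than Glass on your face. recognizeSpeech() and ttsSpeak() are real SL4A calls; the handlers are just canned placeholders.

```python
# Toy voice command dispatcher: listen for a phrase, match it against a
# handful of keywords, and speak the handler's answer. The handlers return
# hard-coded strings purely for illustration.
import android

droid = android.Android()

def next_train_to_london():
    # In practice this would query a live departures feed; hard-coded here.
    return "The next train to London leaves at 12 34 from platform 2"

def directions_to_bus_stop():
    return "Turn left and walk about 50 metres; the number 7 stop is on your right"

HANDLERS = {
    "next train": next_train_to_london,
    "bus stop": directions_to_bus_stop,
}

heard = (droid.recognizeSpeech("OK Glass...").result or "").lower()
for phrase, handler in HANDLERS.items():
    if phrase in heard:
        droid.ttsSpeak(handler())
        break
else:
    droid.ttsSpeak("Sorry, I didn't catch that")
```

Swap the canned strings for real data sources and you are a surprisingly long way down the list above - minus the "direct me to the door" parts, which is where the camera comes in.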
The Android Connection
Android, the open source Linux distribution running on 750 million smartphones and tablets, is also the technology underpinning Glass. It will be interesting to see how much of the Glass software Google make open source - it's easy to picture other Glass-type products appearing that use Android but aren't aligned to the Google ecosystem, a business model already proven by Amazon with the Kindle Fire and by numerous Android devices in mainland China.
What's the big deal about Android? Well, bear in mind that much of what I have described above is already possible with existing apps and APIs on Android and those achingly fashionable open data feeds - it's just a bit impractical unless you strap your phone to your head, or something like that.
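To make that claim slightly more concrete, here's a sketch that pulls live departures from an open data feed and reads them out. The endpoint and JSON shape are invented for illustration - swap in whichever transport data feed you actually have access to.

```python
# Sketch: fetch departures from a (hypothetical) open data feed and speak
# the first one. The URL and response format are placeholders.
import json
import urllib2

import android

droid = android.Android()

FEED = "https://example.org/opendata/departures?station=CBG&destination=London"

departures = json.load(urllib2.urlopen(FEED)).get("departures", [])
if departures:
    first = departures[0]
    droid.ttsSpeak("The next train to London is the %s from platform %s" %
                   (first["time"], first["platform"]))
else:
    droid.ttsSpeak("No departures found, sorry")
```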
The science fiction part (for now - but perhaps you can change that, dear reader!) is the bit where Glass "directs" you in real time by processing the camera imagery, e.g. to the exact bit of the train where the door can be found, or to your reserved seat. But if you think about it, optical character recognition of images has already been demo'd to great effect by Google Goggles, and there are some interesting pointers from the lesser known but seriously awesome The vOICe for Android.
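None of this is real-time computer vision, but as a hedged sketch of the "tell me what you see" idea: grab a frame with the camera, ship it off to an OCR service, and speak whatever text comes back. The OCR endpoint below is hypothetical; only the SL4A calls are real.

```python
# Sketch: capture a photo and send it to a (hypothetical) OCR service that
# returns plain text, then read the result aloud.
import urllib2

import android

droid = android.Android()

OCR_ENDPOINT = "https://example.org/ocr"  # placeholder service returning plain text
IMAGE_PATH = "/sdcard/glass_sketch.jpg"

# Take a photo (blocks until the camera has captured and saved it).
droid.cameraCapturePicture(IMAGE_PATH)

with open(IMAGE_PATH, "rb") as f:
    request = urllib2.Request(OCR_ENDPOINT, f.read(),
                              {"Content-Type": "application/octet-stream"})
    text = urllib2.urlopen(request).read().strip()

droid.ttsSpeak(text if text else "I can't see any text, sorry")
```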
Closing Thoughts
To get a feel for potential revolutionary applications for Google Glass, just try walking around for a while with your eyes closed! See how I managed to get this far without mentioning collision avoidance?
Now listen to the Google Glass promo video, embedded above, with your eyes closed. OK, Glass: Take a picture, and tell me what you see...