[Electronics-talk] BBC News - Smartphone cameras bring independence to blind people

Hai Nguyen Ly gymnastdave at sbcglobal.net
Fri Aug 19 18:05:57 UTC 2011


http://www.bbc.co.uk/news/technology-14505748

Smartphone cameras bring independence to blind people

19 August 2011, last updated at 03:19 ET
By Damon Rose, Editor, BBC Ouch!

Image caption: VizWiz puts out the user's query to a panel of volunteer helpers
Snapping an image with your smartphone camera brings more than just a pretty picture if you are blind. With the right app, it can increase your independence. 

Knowing what food is inside a packet or details about the post which has just arrived on your doormat are everyday things that most people take for granted.

Blind people have traditionally sought this kind of visual information from family and friends, or from an employed personal assistant. But this has meant having to fit in with other people's time or spend significant money on help. Now there are an increasing number of alternatives.

As smartphones become more accessible, some with built-in speech and Braille output, it is possible for people with sight loss to get slivers of visual assistance when there's no one else around to ask.

Want to know what colour your shirt is? Use a colour detector app. Want to know if it is still daylight outside? Use a light detector app. Want to read a notice on your work's noticeboard? Use a text recognition app, of course.
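
To make the idea concrete, the sketch below shows roughly what a colour detector app does internally. It is not taken from any of the apps mentioned here; it assumes the Pillow imaging library, and the small colour palette and centre crop are purely illustrative. A light detector works along the same lines, using the overall brightness of the frame rather than its hue.

# Rough sketch of a colour detector: average the centre of a photo
# and report the nearest named colour. Palette and crop are illustrative.
from PIL import Image, ImageStat

PALETTE = {
    "black": (0, 0, 0), "white": (255, 255, 255), "red": (220, 30, 30),
    "green": (30, 160, 60), "blue": (40, 70, 200), "yellow": (230, 220, 50),
    "grey": (128, 128, 128), "brown": (120, 80, 40),
}

def dominant_colour(path):
    img = Image.open(path).convert("RGB")
    w, h = img.size
    # Look only at the central quarter of the frame, where the shirt
    # or object is most likely to be.
    centre = img.crop((w // 4, h // 4, 3 * w // 4, 3 * h // 4))
    r, g, b = ImageStat.Stat(centre).mean
    # Pick the palette entry with the smallest squared RGB distance.
    return min(PALETTE, key=lambda name: sum(
        (c - p) ** 2 for c, p in zip((r, g, b), PALETTE[name])))

print(dominant_colour("shirt.jpg"))  # e.g. "blue"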

What's in this jar?
The most recent visual assistance product to hit the app store is VizWiz. As well as giving you automated image recognition from intelligent software, it throws your questions open to a small band of volunteers standing by on the internet - a human cloud, willing to donate ten seconds of their time here and there to describe photos which come in.

On its website, VizWiz is described as: "Take a Picture, Speak a Question, and Get an Answer".

The free app and service, developed by the University of Rochester in New York, has received between 10,000 and 12,000 questions in its first two months. The volunteers are made up of staff and students who receive a sound alert when a question arrives, either via Twitter, text message or the web. They tap in a response, which is sent back to the original sender.
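
The flow just described - a photo and a question go out, volunteers are alerted, and the first typed answer is routed back to the sender - can be sketched roughly as follows. This is not the real VizWiz interface: the endpoint, field names and polling loop below are invented purely to illustrate the general pattern of such a service.

# Illustrative only: a minimal client for a crowd-answering service of this
# kind. The URL and JSON fields are made up; VizWiz's real interface may differ.
import time
import requests

SERVICE = "https://example.org/api"  # hypothetical endpoint

def ask(photo_path, question):
    # Upload the photo and the (already transcribed) question.
    with open(photo_path, "rb") as f:
        resp = requests.post(f"{SERVICE}/questions",
                             files={"photo": f},
                             data={"question": question})
    resp.raise_for_status()
    qid = resp.json()["id"]

    # The service alerts volunteers (e.g. by Twitter, SMS or the web);
    # the client simply polls until someone has typed an answer.
    while True:
        answers = requests.get(f"{SERVICE}/questions/{qid}/answers").json()
        if answers:
            return answers[0]["text"]
        time.sleep(2)

print(ask("soup_can.jpg", "What is this and when does it expire?"))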

"The most popular type of question is a product that they have which has text written on it, a label with instructions. People want to know what it says, how to cook it or when it expires," said Professor Jeff Bigham, the man behind the service.

"We can very clearly track the time of day," explained Prof. Bigham.

"In the morning people are asking about clothing, the colour or pattern. A few people ask if their shirt matches their pants."

"Around one or two eastern time we start getting questions about wine from what we assume is the UK, asking what label, what year, that kind of thing."

It is this kind of subjective answer that a piece of software can't give and that a human service can. But humans need sleep. Prof. Bigham admits that, though computer scientists are famed for staying up very late, the 6am to 7am timeslot can be a bit difficult to fill with volunteers from the university.

Human cloud
"It's a really exciting time to work in access technology. A great new resource is that there are people out there on the web. Everyone is connected and we can do a lot of interesting things with it," he said.

"People have been throwing around terms like Human Cloud for a while, and Crowd in the Cloud.

"A lot of work which happened in crowd sourcing before it, took time. Like Wikipedia, it 'took time' for articles to emerge. What's interesting with our service is the realtime aspect of it. Someone out there needs help from the cloud and, in almost real time, they get it."

Users know that it is humans at the other end and this has generated some "crazy" questions that could never have been answered by automated recognition software.

"We had one person who kept taking a picture of the sky and asking 'what is this"' every 5 minutes for a couple of hours," said Prof. Bigham.

"I had no idea what was going on. It also happens we loosely monitor Twitter. Someone later tweeted 'VizWiz just helped me watch the sunset'."

Blind photography
In a perhaps unexpected 21st century development, blind people are now finding they need to learn the basics of photography in order to take advantage of the growing number of text and image recognition services on smartphones.

How do you hold the camera up? And how close do you put it to the object you want to know more about? Angles, perspective, distance and light are concepts that don't come naturally to people who have never been able to see.

Image caption: The oMoby app is capable of recognising products from a photograph
Steve Nutt is an IT consultant in Hertfordshire who has been blind since birth. It took him two weeks to master how to frame a shot, which he does in a very functional way, quite different to how sighted people would do it.

He explains: "If you're taking a picture of, say, a tin, you need to make sure you get the whole tin in there. I would stand it up so you get all the sides with the label and snap from about 8 inches above it.

"If you are taking a picture of some text on a piece of paper, centralise the camera and lift it up about ten inches. Keep your hand dead straight and dead still when taking the image.

"You have to also bear in mind the size of the thing you're taking the picture of. the smaller the thing, the closer you need to be to it ... I'd be lying if I said it was easy."

Jeff Bigham's team sees the results of the camerawork coming from users like Steve. Not everyone gets it right with their first shot.

"We definitely get a few attempts sometimes. It's not always easy to frame the photos. Sometimes the centre is out of the photo. if they're asking what is on a can of soup label, we generally say 'we can't tell what this is, the label is likely on the other side of the can'."
