[Iabs-talk] Note-Taking Made Easy For Legally Blind Students : NPR

Glenn Moore glennmooreiii at yahoo.com
Thu Jun 2 06:09:33 UTC 2011


It's been a while since I got to hear Science Friday; pretty cool.

-G. Robert Moore III
(via iPhone)

On May 23, 2011, at 8:28, Hai Nguyen Ly <gymnastdave at sbcglobal.net> wrote:


http://www.npr.org/2011/04/15/135442950/note-taking-made-easy-for-legally-blind-students?ft=1&f=1019


Copyright © 2011 National Public Radio®. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

IRA FLATOW, host:

This is SCIENCE FRIDAY. I'm Ira Flatow.

Remember back in college, sitting in those big lecture halls, trying to follow what your professor was saying and read through those equations on the board and scribble it all down furiously at the same time in your notebook?

Well, for legally blind students like my next guest, it's not an easy thing to do. So he's invented a tool to help low-vision students like himself get those notes down and watch the lecture at the same time, a feat that has won him and his colleagues top prize in the U.S. finals of the Microsoft Imagine Cup.

Want to know how it works, how to get your hands on one of these things? Well, 1-800-989-8255 is our number. You can also tweet us, @scifri. Let me bring him on, see if he can answer some of those questions.

David Hayden is the inventor and team captain for the Team Note-Taker at Arizona State University in Tempe. He joins us by phone. Welcome to SCIENCE FRIDAY, David.

Mr. DAVID HAYDEN (Student, Arizona State University): Thank you for having me, Ira.

FLATOW: Tell us about the device you built. How does it work?

Mr. HAYDEN: So like you said, the basic problem we're trying to solve is helping low-vision students take notes in class. You've got this great big board, but to view that board, they need assistive technologies to provide magnification.

But every time they zoom in on that board, they have a limited field of view. So you look down at your notes and write something, then you look up, find the spot you were last at, commit something to memory, look back down, write the notes, and then you keep cycling like that.

And that incurs a delay in note-taking that fully sighted students don't have to deal with, and in a fast-paced classroom, that's going to accumulate to the point where the low-vision student is not going to be able to keep up in taking notes.

And so we developed a device. I was having some of these problems keeping up with note-taking after I started a math major in 2007 and had to withdraw from some classes, and I found that the existing assistive technologies either fall into that category of having the board-to-notes delay I just described, or they require classroom infrastructure or support personnel that you are relying on to make them effective.

So there's smart boards, which, you know, you can install in the classroom. The professor writes on them and they can stream the content. Or you have camera-based installations that are meant for archiving lecture content. But similarly, they could stream video to a student's laptop if you had the proper setup.

But you know, when we were looking at assistive technology solutions, I felt that you can't always assume that you're going to have an installation like that available. So I mean, maybe your university has the money and the will to set this up, but what happens when you go to a conference, or when you go to...

FLATOW: So you have a portable device that looks like it's about the size of a milk carton with a camera on it, and you can adjust where the camera looks, and then you plug it into your laptop?

Mr. HAYDEN: That's correct. So what we did is we set up - we figured out that for low-vision students to take notes in class, they need to be able to take handwritten notes if they can, since that's important for science, technology, engineering and math classes. And they need it to be portable so they're not relying on other people.

And similarly, they need to be able to view the board and their notes without a large delay in context switching between the two.

And so what we did - and by we I mean myself and the Center for Cognitive Ubiquitous Computing, a research lab I was volunteering programming time at in 2007, at about the same time I was having trouble in classes. And I realized that, you know, I had my own research problem on my hands.

So yeah, what we developed is, we put a tablet PC on the desk. So that's a Windows computer that sits flat on the desk. It can take multi-touch input like the modern-day tablets, but also, and importantly, it can take pen input, so that you can have sub-pixel-accurate inking and write, you know, fine detailed equations and diagrams.

Attached to this tablet PC is a custom-designed pan-tilt zoom camera. So that's a camera that provides a live view of the board, but with that live view, we need to zoom in, and once we're zoomed in, we need to be able to reposition where the camera is aiming.

So to do that, we put some servo-motors underneath the camera. So you can pan it left and right and tilt it up and down. So you can point at arbitrary spots on the board.

And then on the tablet PC, we put a split-screen interface. On half of the screen you've got a digital notepad for taking handwritten or typed notes, and in the other half you have live video of the board. And to control where the camera is zoomed in or where it's pointing, you have just your basic controls: you can tap any point on the image and it'll center on that, or you can drag any feature in the image, and whatever feature starts under your finger will remain under your finger as the camera is panning and tilting.
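(A rough sketch of the control idea Hayden describes here - converting a tap or drag on the live video into pan/tilt commands. The image size, field-of-view numbers, and function names below are illustrative assumptions, not the team's actual code.)

    # Illustrative only: map touch gestures on the live video to pan/tilt moves.
    IMAGE_W, IMAGE_H = 1280, 720        # assumed size of the video frame, in pixels
    FOV_H, FOV_V = 2.0, 1.1             # assumed field of view (degrees) at current zoom

    def tap_to_center(tap_x, tap_y, pan_deg, tilt_deg):
        """Return new pan/tilt angles that center the camera on the tapped pixel."""
        dx = tap_x - IMAGE_W / 2                    # pixels right of center
        dy = tap_y - IMAGE_H / 2                    # pixels below center
        new_pan = pan_deg + dx / IMAGE_W * FOV_H    # pan right toward the tap
        new_tilt = tilt_deg - dy / IMAGE_H * FOV_V  # image y grows down, tilt grows up
        return new_pan, new_tilt

    def drag_step(prev_x, prev_y, x, y, pan_deg, tilt_deg):
        """Move the camera so the feature that started under the finger follows the finger."""
        new_pan = pan_deg + (prev_x - x) / IMAGE_W * FOV_H    # drag right -> pan left
        new_tilt = tilt_deg + (y - prev_y) / IMAGE_H * FOV_V  # drag down  -> tilt up
        return new_pan, new_tilt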

FLATOW: We actually have a video on our website at sciencefriday.com, a video you provided us of how this all works. So it solves the problem for people with low vision like yourself.

Will it be available anytime soon for people to buy?

Mr. HAYDEN: We're hoping so. We've gone through three generations of prototypes, and this generation isn't quite ready for commercial production, but we're working toward it. For the third generation, we used 3-D printing - stereolithography - for printing the shell, and we have a bunch of industrial components that we combine inside.

And it's a little bit larger and noisier and more expensive than we'd like to have in a product that we'd bring out to the market. In a fourth-generation design, we'd construct an injection mold, and we would consolidate a number of the industrial components into it.

So we have (unintelligible) boards that each serve an individual function, whereas in a product we would take all of those boards and design a single board that does all of those tasks - things like digitizing the video and talking to the servo-motors.

FLATOW: Do you have a company interested in making this for you?

Mr. HAYDEN: Right now we're talking about spinning off a company. The two options, of course, are spinning off a company or licensing it to one of the assistive technology vendors. So we are in discussions right now about spinning off a company.

FLATOW: Might you have an iPad version or something for these new devices that are coming out?

Mr. HAYDEN: Sure, so we've certainly talked about the collection of new tablets. There's iPads and Android tablets, which are very thin and provide nice battery life. And a lot of educational programs are starting to use them, at least in pilot studies.

The difficulty with the current tablets, as opposed to the tablet PCs, is that they don't have this pen input. And so on the iPad, for instance, it has a capacitive touchscreen. So you can get a special stylus that will mark on the screen, but this isn't digital inking. It's not sufficiently high-resolution to write equations.

So the iPad, if we were to hook up a camera to an iPad, or similar tablets, we would only be providing a video magnifier. Now, that's nothing to scoff at. There's a lot of low-vision students, I think, who would be happy with such a solution. You have a video magnifier, and then you can just take your own notes on a notepad.

But one of the nice things that we do with the Note-Taker is we have the video and the audio being recorded on the device, the tablet PC. So we know what time everything's happening in the class, and similarly, we know when you make every keystroke or every pen stroke.

So when you want to go back and review your notes later, you don't just have your notes written on the notepad plus audio and video files that are purely linear - you have them all together.

And so, you know, if you want to look through your notes, you might find that there's a place where you weren't keeping up on the note-taking as well. Maybe you were falling behind, or maybe you fell asleep or something. Something that happened to me often enough.

(Soundbite of laughter)

Mr. HAYDEN: And...

FLATOW: All of us.

Mr. HAYDEN: Yes. So you can select those notes, and since we know what time you were taking those notes, and what time you were recording the video, we can just say: Okay, what video and audio was being recorded at that time? So you select those notes and it starts playing the audio and video from the stretch where you weren't taking enough notes. And then you can just augment - you know, add in the remaining notes.
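(A minimal sketch of the time-syncing idea just described: each keystroke or pen stroke is stored with the elapsed recording time, so a selection of notes maps directly to a seek position in the recorded audio/video. The class and method names are hypothetical, not the Note-Taker's actual implementation.)

    # Illustrative only: timestamp every note event against the class recording.
    import time
    from dataclasses import dataclass, field

    @dataclass
    class NoteEvent:
        t: float          # seconds since the recording started
        content: str      # typed text or a serialized pen stroke

    @dataclass
    class Session:
        start: float = field(default_factory=time.monotonic)
        events: list = field(default_factory=list)

        def record(self, content):
            """Store a keystroke or pen stroke with its elapsed recording time."""
            self.events.append(NoteEvent(time.monotonic() - self.start, content))

        def seek_position(self, selected_events):
            """Seek the recorded audio/video to the earliest selected note."""
            return min(e.t for e in selected_events)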

FLATOW: Well, I know you're going to be a contestant in July for the worldwide finals of the Imagine Cup. I want to wish you good luck.

Mr. HAYDEN: Yeah, thank you, appreciate it.

FLATOW: And thank you for taking time to be with us today. And do you foresee yourself as having a lot of competition out there? You think you're pretty confident about winning the whole, you know, ball of wax here in the finals?

Mr. HAYDEN: Well, I mean, so the nice thing about our project is that the problem is well-defined, and we have - we know the market. I mean, there's about 19 million low-vision adults in the U.S., and you know, fewer than 40 percent of them are participating in the workforce. And we think that the inaccessibility of education is in part to blame for that.

And so, yeah, I think that we have a very good entry into the competition. But I was at the Imagine Cup last year, in Poland, and there were over 400 students from 69 different countries. So there's going to be some stiff competition. But we think that our solution solves a real problem, so we're pretty confident.

FLATOW: Are you already thinking about improvements to the design you have already?

Mr. HAYDEN: Oh, absolutely, yes. I mean, the third generation we just recently designed, and the fourth generation is next up. And so we got some funding from the National Science Foundation in 2009.

And that's good through about 2011. And then we're talking about - you know, there are kind of two paths we can go. One is to continue doing the academic research with the user studies; we've got a half-dozen low-vision students using it right now, and we're engaging about 50 more as we speak.

So on the one path there's staying in academia for a little bit, letting the technology incubate a little longer, and on the other there's spinning off that company. These are the questions we're asking ourselves right now.

FLATOW: Let me give you one last high-tech question, coming in from Second Life: Why not use a high-res camera and digital pan instead of servo-motors? I think both of our listeners who understand that question - yeah, go ahead.

Mr. HAYDEN: Sure, so you can do that to some extent. I've seen some fully sighted students using 720p or 1080p cameras in the classroom; they'll record the lecture, and when they review it later, they're able to see the material on the board.

But low-vision students have the unique need for lots of magnification. A typical optical magnifier is going to provide them, let's say, 12x zoom. And if you look at, say, a 1080p webcam, which is, you know, 1920 by 1080 pixels, once you start zooming in digitally to 10x, you're not going to have enough resolution to read the material as well as if you had, say, a 36x optical zoom lens like our camera has.
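(Back-of-the-envelope arithmetic behind that point, using only the numbers mentioned above: digital zoom crops the sensor, so the pixels spanning the magnified view shrink with the zoom factor, while an optical zoom lens keeps the full sensor resolution on the narrower view. Illustrative sketch only, not a claim about the actual camera.)

    # Illustrative arithmetic only, using the numbers mentioned in the interview.
    sensor_width_px = 1920                   # a 1080p webcam is 1920 x 1080

    def pixels_across_view(digital_zoom):
        """Horizontal pixels left after cropping to a given digital zoom factor."""
        return sensor_width_px / digital_zoom

    print(pixels_across_view(10))            # 192 px span the 10x magnified view
    # With a 36x optical zoom lens, the lens does the magnifying, so all 1920
    # sensor pixels still span the (much narrower) view of the board.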

FLATOW: So you have 36 power on that? Wow.

Mr. HAYDEN: Yes. So I mean like in Seattle recently, at the Imagine Cup, we were on the 20th floor reading license plates.

FLATOW: Did you have to find or make a special lens? Did you go to special lens makers, or were they out there available?

Mr. HAYDEN: No, so there are industrial cameras. We were using industrial components, including an industrial camera. The only market that uses industrial cameras with zoom lenses is actually security.

So, you know, you want to be able to monitor a floor, so you put in a pan-tilt-zoom camera to reduce the number of cameras that cover a particular area. And zoom can help with that.

One of the unfortunate things about using these cameras, though, is that, you know, the security market is still stuck in analog - they're connected to CCTV systems. So we're actually looking forward to some of the newer interconnect technologies like USB 3.0 and Thunderbolt, which provide more power to the peripheral and might allow all-digital cameras with zoom lenses to come out here shortly.

FLATOW: Well, good luck to you, David, and thanks for coming on.

Mr. HAYDEN: Thank you.

FLATOW: David Hayden, inventor and team captain for Team Note-Taker at Arizona State University. He and his team won the top prize in the U.S. finals of the Microsoft Imagine Cup.

Copyright © 2011 National Public Radio®. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to National Public Radio. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.




