Waterloo students create a system to help the visually impaired use touch screens
WatVision is a three-part system: a smartphone app, industry-standard detection markers placed on each corner of a touch screen, and a ring worn on the user's finger.
Two mechatronics students from the University of Waterloo recently won the James Dyson Award for their project WatVision, a smartphone-based application that makes touch screens accessible to the visually impaired population.
Awarded to Craig Loewen and Lior Lustgarten, the annual James Dyson Award is an international design competition that celebrates, encourages and inspires the next generation of design engineers by challenging them to solve a real-world problem. In the case of WatVision, the solution is a three-part system: a smartphone app, industry-standard detection markers placed on each corner of a touch screen, and a ring worn on the user's finger.
The phone's camera identifies the perimeter of the touch screen by locating the four detection markers, after which it takes a photo and saves it to the device. Using the ring-wearing finger, the user then points to a button on the screen, whereupon the app locates the ring and uses the saved image to read the corresponding text aloud. The user can then select the correct action and use the touch screen service independently.
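To make the reading step concrete, here is a minimal Python sketch of the lookup described above: once the screen image is captured, the app maps the ring's position to a labelled on-screen button and speaks that label. The button names and coordinates here are hypothetical; a real build would recover them with computer vision and OCR rather than hard-code them.

```python
# Hypothetical screen regions recovered from the captured image:
# (x, y, width, height) -> button label found by OCR.
BUTTON_REGIONS = {
    (40, 300, 120, 60): "Withdraw cash",
    (200, 300, 120, 60): "Check balance",
    (360, 300, 120, 60): "Cancel",
}

def label_under_finger(finger_x, finger_y, regions=BUTTON_REGIONS):
    """Return the text to read aloud for the button under the ring, if any."""
    for (x, y, w, h), label in regions.items():
        if x <= finger_x <= x + w and y <= finger_y <= y + h:
            return label
    return None  # finger is not over any readable option
```

In a full implementation this lookup would run on every camera frame, feeding the returned label to the phone's text-to-speech engine.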
“Once we realized how much we used touch screens on a daily basis and how it would not only affect visually impaired people, but the aging population, we knew we could make a large impact on an underdeveloped market,” said Lustgarten. “After interviewing members of this community, it was apparent that this was an overlooked problem that needed a solution.”
WatVision works off a simplified version of a QR code, referred to as an RCO code, of the kind seen in a variety of applications. Each code is numbered, from the ring to the markers on the screen, which allows the camera to identify which markers are in its current field of vision.
WatVision gathers feedback from the markers and can tell if the user needs to move their phone, say, up and to the right, depending on which on-screen markers are found in the camera's field of vision. The application will never mistake the user's pointing finger for part of the screen, for example, because everything, from the ring to the screen corners, is identified by these RCO codes.
If the user is by themselves, WatVision will audibly tell them when it can't accurately locate the markers on screen, stating, for example, that they need to move the phone to the left or right.
WatVision also has a wearable glove with a microcontroller attached, adding a level of interactivity not available on the ring. The glove, which has sensors attached and includes a vibration motor, vibrates subtly when the user's finger is touching an applicable option, letting them know they can interact with what they're touching. Because the project is still in its prototype stage, the glove looks very tech-y, with wires stemming to and fro, but the team has plans to streamline the glove and make it more aesthetically pleasing.
“We created a kind of heat map version, where we looked for all the different text in the image and places where we could do text-to-speech when their finger was over it,” says Lustgarten. “And the closer their finger is to an option, the more feedback they get. So when you move your finger around you get a kind of warmer [colder] experience,” he says.
The core of the project is giving WatVision a natural-feeling design and a user-friendly interface for visually impaired people. It's why it took eight months and six iterative software designs for Loewen and Lustgarten to land on a working design for the project. After that, the duo released WatVision as open source under the MIT license, allowing others to make improvements and upgrades.
One of the interesting things about WatVision is its price point. Because the ring is made from simple plastic and smartphones have become ubiquitous, the WatVision team believes it can manufacture the part for less than $2. Both Lustgarten and Loewen said they could have created a standalone device using technology similar to Google Glass, but it would have made the device prohibitively expensive, in direct contrast to their WatVision objective.
“It’s literally a piece of plastic with a marker on top of it,” Loewen says. “We’re very confident that it would cost less than two dollars, because that was the goal: to make it very accessible for people to use. It’s why we hosted the app on the phone, because 80 per cent of Canadians have a phone. You aren’t asking people to buy another $800 device that only works for one thing,” he says.
The team is currently working on getting WatVision to the product stage, aiming to have it on the market in the next year.