Monday, November 16, 2015

Breaking a different kind of language barrier: Sign language becomes sensor-based



American Sign Language is the bridge that connects deaf and hard-of-hearing people, in large part, to the world of traditional interpersonal communication. But how do you communicate with ASL when a partner in a given conversation cannot interpret the visually based language?
Work to close that communications gap is underway at Texas A&M University. Roozbeh Jafari, associate professor and principal investigator with the school’s Department of Biomedical Engineering — and researcher at its Center for Remote Health Technologies and Systems — is developing a newly sophisticated tool to make ASL understandable to everyone.
The results of Jafari’s project, and the long-term implications that stem from it, could change the way we approach interfacing with each other — and even with technology — all based on our hands, muscles and movements.

Vision Quest: Recent Challenges for ASL Translation

The ASL translation system doesn’t have an official name yet, but what it’s doing — and what it stands to do — is concrete and apparent. The goal is to translate ASL for all participants in a way that proves more accurate, more portable and more reliable than ever before.
“There have been a few systems for translating American Sign Language automatically,” said Jafari, regarding devices that precede the new technology he is working to refine. “The most prominent among them have been based on cameras and vision … you would basically stand in front of a camera and the camera would track hand motion.”

It is a system for turning visually tracked movement into words. But the cameras that facilitate it only work well when the ASL gestures being tracked are precise enough for the computer on the other end to recognise. Miss the mark, and the conversation between an ASL user and a non-ASL-using participant becomes difficult: words get lost and communication breaks down. Add in the challenge of deciding where a camera can be placed in a room full of ASL-using participants, and the fact that users would have to carry a motion-tracking camera everywhere.
In all of these factors, Jafari saw the need for a different ASL-interpreting tool.

Beyond Vision: Jafari’s Motion- and Muscle-Tracking Approach to ASL Translation

In Jafari’s project, the camera is out of the picture. Instead, his technology applies an external motion sensor and a wearable muscle-tracking sensor to create a new version of ASL translation.
“The sensor is based on EMG, or electromyogram technology,” Jafari said, referring to sensors the Mayo Clinic describes as measuring electrical signals — ones that our motor neurons transmit to muscles, causing them to contract. EMGs can turn these signals into numerical values that computers and specialists are able to interpret.
“Combined with the external motion sensors, which show us the overall hand movement, the EMG allows us to discriminate between gestures,” he said. “A fine-grain of interpretation … motion sensors give us the overall sense and muscle activities give us information about the fine-grained intent.”
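As a rough illustration of what that kind of sensor fusion can look like, here is a minimal sketch in Swift. Everything in it is assumed for illustration — the sign names, feature layouts, numbers and the nearest-centroid classifier are not taken from Jafari's system. The point is only that two signs with near-identical hand motion can still be told apart once muscle-activity channels are added to the feature vector.

```swift
import Foundation

// A gesture sample concatenates coarse motion features (e.g. mean acceleration
// per axis) with fine-grained EMG features (e.g. average activation per muscle channel).
struct GestureSample {
    let motionFeatures: [Double]  // overall hand movement
    let emgFeatures: [Double]     // muscle activity, the "fine-grained intent"

    var featureVector: [Double] { motionFeatures + emgFeatures }
}

// Squared Euclidean distance between two feature vectors.
func distance(_ a: [Double], _ b: [Double]) -> Double {
    var sum = 0.0
    for i in 0..<min(a.count, b.count) {
        let d = a[i] - b[i]
        sum += d * d
    }
    return sum
}

// Nearest-centroid classifier: each known sign is an averaged template vector,
// built from per-wearer training samples (the "training" the article mentions).
struct SignClassifier {
    var templates: [String: [Double]] = [:]  // sign word -> template vector

    func classify(_ sample: GestureSample) -> String? {
        templates.min {
            distance(sample.featureVector, $0.value) < distance(sample.featureVector, $1.value)
        }?.key
    }
}

// Made-up example: the two signs share almost identical motion features
// (first three numbers) but differ in the EMG channels (last two).
var classifier = SignClassifier()
classifier.templates["please"] = [0.2, 0.1, 0.9, 0.7, 0.1]
classifier.templates["sorry"]  = [0.2, 0.1, 0.9, 0.1, 0.8]

let observed = GestureSample(motionFeatures: [0.21, 0.12, 0.88],
                             emgFeatures: [0.65, 0.15])
print(classifier.classify(observed) ?? "unknown")  // prints "please"
```

In a real system the feature extraction and classification would be far more involved, but the division of labour is the same one Jafari describes: motion gives the overall shape of the gesture, EMG resolves the ambiguity between similar-looking signs.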

Next Steps: Focusing on the Details of New ASL Tech

When it comes to the ASL-interpreting technology underway at Texas A&M, the team has produced an operational proof-of-concept model. The next step is to refine the sensitivity and accuracy of the devices.
  • Currently, every time a wearer dons the EMG sensor, they must be careful to position the wearable tech in a precise way; otherwise, the system must be “retrained” to register the ASL vocabulary the user employs. Jafari also stated that the team is working on ways to “make the system smarter, in a sense … to reduce or eliminate training time.”
  • At present, Jafari’s system recognises individual words, but requires a pause between them. As the team develops their work further, the goal is for the translation engine to combine the input it receives into whole phrases and sentences — more akin to the way humans naturally communicate.
  • The third prong of development is to increase the vocabulary of the technology, overall.
When all of Jafari’s developing tech is operating at the advanced level he describes, ASL users and their conversation partners will clearly benefit. But the applications of the sensor-based system extend beyond sign language and translation alone.
“When you think about it, you can use this for many other applications,” he said. “Think about your house … you might have a smart house, but right now to turn on and off all your devices you need to go to your mobile phone, each app, and then use them. What if you could control your house with hand gestures, communicating with each of your devices?”
For Jafari, starting with the mission to further facilitate ASL among all participants — and then extending into the home and future applications — the conversation is just getting underway.

Thursday, November 12, 2015

How an iPhone changed a paralyzed veteran's life



While serving in Iraq as a medic in the U.S. Army, Ian Ralston was hit with a tiny ball bearing from an IED in 2010. He was left paralyzed from the neck down. 
Despite the hardship, Ralston quickly adopted a "make the best of it" attitude. "To me, there's absolutely no point in being upset about it," he told the WCF Courier nearly a year after he was injured.
"I mean, yeah, it sucks. And if I had my choice, no, I wouldn't be in a wheelchair. But this is what I got right now. So I might as well make the best of it. If all you want to do is be upset about it, it's just going to make it that much harder for you to live with it, to cope with it. So I got past that quick."
That attitude has helped Ralston thrive. He's married, has newborn twins and uses a special wheelchair that helps him accomplish daily tasks—even operating his smartphone. 

Life changed by a smartphone


Like most new parents, Ralston posts photos of his children to Facebook, where he also posts status updates about his life.
Ralston does this using an iPhone 6 Plus — a phone he got eight months ago. The iPhone is Ralston's first smartphone.
"I got injured before the smartphone craze, so getting one now is really cool," Ralston tells me.
Many of us are familiar with accessibility options for visually impaired individuals; screen readers make it possible for users who are partially or fully blind to still read a computer screen. Dictation software has made it possible for those who can't move their arms or fingers to communicate.
But accessibility tech goes a lot further than that. Ralston, for example, is able to operate his iPhone 6 Plus using his mouth. His wheelchair uses what's called a sip-and-puff system: blowing or sucking into a tube can move his chair around. It can also control the screen on his iPhone.
The iPhone connects to Ralston's chair using a small device called a Tecla Shield. This is basically a Bluetooth adaptor that connects his phone — which is mounted on his wheelchair — to the wheelchair's sip-and-puff controls.
The Tecla Shield is designed to work with iOS devices running iOS 7 or higher. Using a feature called Switch Control, Ralston can control and access different parts of the screen using his mouth.
For Ralston, the impact having a smartphone has had on his life has been both big and small.
Having access to a phone gives him freedom and a sense of independence he didn't have before. "I'm on a vent because I can't breathe on my own," he explains. In the past, this meant he couldn't leave the house alone, in case something happened (a battery got too low, some tubing started to come out). Now he has some independence and can call his wife on her phone if he needs assistance.

"That was something I could never do before."
But it's not just the big stuff. Like most guys his age, Ralston is into fantasy football. He told me how he was recently out and got a notification about an important development that would affect his lineup. Using Yahoo's official Fantasy Football app, he was able to make changes to his team based on that news. "It sounds really trivial," Ralston concedes — but I disagree.
The reality is that most of us use our phones for a mix of reasons. The fact that Ralston can change his lineup no matter where he is — just like anyone else — is huge.
"I send texts [using the built-in dictation feature] and check Facebook like anyone else," he says. He also uses Siri to look some stuff up — a game score or the weather. 
The iPhone isn't Ralston's first experience with adaptive tech. He has used Dragon NaturallySpeaking software on his PC laptop for a few years to dictate emails or posts. He also uses his mouth to control his computer's mouse.
Still, the iPhone is a new experience because it is attached to the chair and as a result, can go everywhere with him.
A glitch between Ralston's phone and his chair meant he couldn't use the two together for a few weeks. "I was pissed off!" he tells me, surprised by how quickly he became reliant on the device.
Ralston almost seemed embarrassed by his addiction, but as I pointed out to him, that's the same way all of us, regardless of our physical abilities, feel about our phones. If my iPhone broke tomorrow, I know exactly how long I would be able to last without it: As long as it took to get to the nearest Apple Store to buy a new one.

Letting people know this is out there

Ralston's setup between his wheelchair and his iPhone was all covered by Veterans Affairs.
As far as he knows, he was the 16th person in Washington state to get it. More people, he thinks, should know that this is available.
"And it's not just for vets or those of us with spinal injuries," he says. "People with ALS can benefit, too."
Getting it set up and certified through the VA was seamless, he says, at least in Seattle. "It takes about four hours to install," he says, and then it took him an hour or two to get the hang of it.
Still, the setup was intuitive, and Ralston was able to play around with it on the way back from getting it installed on his chair.

Getting developers on board

Although Ralston has had success with most of the apps he wants to use, it's worth noting that not all apps are built with accessibility in mind.
When I asked him about apps that don't work well with his setup, he called out mapping apps like Apple Maps and Google Maps. "I like to look at maps and places," he explains, and it can be difficult with the current apps.
Apple makes it relatively easy for developers to add accessibility support — including access to Switch Control — in its apps. If an app supports Apple's screen reader, it probably also supports Switch Control.
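For developers curious what that support involves, here is a minimal, hypothetical Swift/UIKit sketch — the view controller, button and label text are invented for illustration, not taken from any real app. Controls exposed with accessibility labels and traits like this are readable by Apple's screen reader, and Switch Control can then step through and activate them.

```swift
import UIKit

// Hypothetical screen with a single action button, marked up for assistive technologies.
class LineupViewController: UIViewController {
    let saveButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        saveButton.setTitle("Save lineup", for: .normal)

        // Expose the control with a clear spoken label, a hint, and the button trait,
        // so VoiceOver can announce it and Switch Control can select and activate it.
        saveButton.isAccessibilityElement = true
        saveButton.accessibilityLabel = "Save lineup"
        saveButton.accessibilityHint = "Saves your fantasy football changes"
        saveButton.accessibilityTraits = .button

        view.addSubview(saveButton)
    }
}
```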
Developers don't always focus on accessibility until they hear from someone affected. And Ralston says he often doesn't want to complain because he's just so happy he has access to a phone at all.

Apple and other companies do actively work with vets and with the community to make accessibility better.
Ralston's advice for developers is to start playing with the accessibility settings on their own. There are YouTube videos that show how the features work, and by trying to use an app without typical input, developers can get a real-world idea of what it might be like for someone like Ralston to use an app.


Tuesday, November 10, 2015

Meet the runway model with one of the world's most advanced bionic arms



Rebekah Marine, a New Jersey-born congenital amputee, nearly gave up on her dream of becoming a fashion model when a casting director told her she would never make it. But she didn't let that stop her. In recent years, she has strutted down some of the fashion industry's most exclusive runways, all while modeling one of the most advanced bionic arms on the market.