Monday, April 8, 2019




The OrCam MyEye increases the independence of people who are blind or visually impaired. It can read text and barcodes; recognize faces; identify products, banknotes, and colors; and even tell the user the time and date. It does all of this by conveying visual information audibly.

Friday, March 29, 2019

How AI and machine learning are changing prosthetics


Imagine a prosthetic arm with the sensory capabilities of a human arm, or a robotic ankle that mimics the healthy ankle's response to changing activity.
Hollywood has long popularized imaginative versions of such ideas. While human engineering may not yet be able to produce superhero-enabling devices, prosthetics are getting "smarter" and more adaptive, approaching a reality in which amputees' artificial appendages offer near-normal function.
Bioengineers are increasingly looking to create "human-machine interfaces embodied by a prosthetic limb that really feel like an extension of the body," said Robert Armiger, project manager for amputee research at Johns Hopkins University's Applied Physics Lab, where scientists have developed an arm with human-like reflexes and sensation.
The FDA looked to spur development when it released 'leapfrog' draft guidance in February outlining a vision for the invention and testing of brain-implanted devices capable of controlling prosthetic limbs. And the Defense Advanced Research Projects Agency is funneling tens of millions of dollars into new products. 
The nonprofit Amputee Coalition estimates that about 2 million people in the U.S. are living with limb loss, a number expected to nearly double to 3.6 million by 2050. An estimated 185,000 new lower-limb amputations occur each year. 
Among those living with limb loss, below-knee amputations are the most prevalent, with nearly three-fourths related to circulatory problems. In fact, vascular disease, including diabetes and peripheral artery disease, accounts for 54% of all amputations in the U.S. Other major causes are trauma (45%) and cancer (less than 2%). 
With the number of amputations rising, activity in the space has been heating up.
Driving demand are aging populations and rising incidence of vascular diseases, as well as developments in artificial intelligence and machine learning. Advanced materials such as silicone and urethanes are also resulting in lighter-weight prosthetics with "memory" to respond to changes in pressure. 
Among those competing in the space are Ottobock, which sells the bebionic hand, and ReWalk, maker of a powered walking assistance system. Icelandic firm Össur markets a mind-controlled bionic prosthetic for lower-limb amputees.
But despite their potential, smart prosthetics are still a ways from becoming a reality for most people who need them, in large part due to the relatively small patient population, high costs and lack of reimbursement.
Building a robotic system that incorporates all the movements and sensory components of the missing limb, and doing so in a natural and intuitive way, is also challenging. 
Another bottleneck is transitioning technology out of the lab and into a company that will pursue FDA approval and take it to market.
"The risks are in working with insurance companies to reimburse for these types of devices that are already very expensive and have a high degree of abandonment," Armiger told MedTech Dive. "Insurance companies say, 'we're spending this much money on this device and people don't wear it, and now you're asking us to spend more money on a more advanced technology.'"
To expand access to these devices, companies will need to demonstrate that value proposition, he adds. 

Happy National Assistive Technology Awareness Day!


Earlier this month, the United States Senate unanimously adopted a resolution declaring today National Assistive Technology Awareness Day. The resolution noted that "assistive technology devices and services are not luxury items but necessities for millions of people with disabilities and older adults, without which they would be unable to live in their communities, access education, and obtain, retain, and advance gainful, competitive integrated employment."
ACL is proud to fund Assistive Technology Act programs in every state and territory that help older adults and people with disabilities discover, try, reutilize, and finance assistive technology. In FY18:
  • Over 72,000 individuals participated in assistive technology device demonstrations to find the right device for their community living needs.
  • Nearly 50,000 AT devices were provided on short-term loan to individuals with disabilities, service providers, and agencies.
  • More than 70,000 AT devices were reutilized, allowing consumers to save more than $28 million by obtaining a lightly used or refurbished AT device.
  • 96 percent of consumers who received financing loans from an AT program said they would not be able to purchase or obtain the AT without this financing.

Thursday, October 26, 2017

Toronto’s Pearson Airport now supports assistive app for people with cognitive special needs




Toronto’s Pearson International Airport has announced a partnership with MagnusCards, a free-to-download app that offers digital how-to guides (known as ‘Card Decks’) for people with autism and other cognitive special needs. Pearson says it’s the first airport in the world to work with MagnusCards to offer the assistive app. Developed by Waterloo-based MagnusMode, the MagnusCards app offers assistive Card Decks for a variety of venues and brands, including CIBC banks and Maple Leaf Sports and Entertainment. Pearson visitors can download up to ten different Card Decks, each offering a personal step-by-step guide to navigating a different part of Canada’s largest airport. The Pearson Card Decks include tips on how to check in with an airline, get help or ask questions, go through U.S. Customs and Border Protection, and more. Card Decks are offered in English and French and feature pictures, text and audio to help as many different users as possible. 
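The decks themselves are simple, structured content: ordered steps that each pair text with a picture and audio, offered per language. As a rough illustration, here is a hypothetical Swift sketch of how such a deck might be modeled; the type and field names are assumptions for illustration, not MagnusMode's actual schema:

    import Foundation

    // Hypothetical sketch only: names are illustrative assumptions,
    // not MagnusMode's actual data model.
    struct Card {
        let step: Int              // position in the step-by-step guide
        let text: String           // e.g. "Place your bag on the scanner belt"
        let imageName: String      // supporting picture for the step
        let audioFileName: String  // narrated version of the text
    }

    struct CardDeck {
        let title: String          // e.g. "Getting help or asking questions"
        let language: String       // Pearson decks come in English and French
        let cards: [Card]          // ordered steps, shown one at a time
    }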

Wednesday, July 26, 2017

Apple and Cochlear team up to roll out the first implant made for the iPhone


Apple has teamed up with Australia-based Cochlear to bring iPhone users the first Made for iPhone cochlear implant.
Approved by the U.S. Food and Drug Administration in June, Cochlear’s Nucleus 7 Sound Processor can now stream sound from a compatible iPhone, iPad or iPod touch directly to the externally worn sound processor, which relays it to the patient’s surgically implanted device.
The device also allows those with the implant to control and customize the sound from their iPhone.
There have been other implants and hearing aids that have used iOS apps to control sound and other features, and Nucleus’s own app can be downloaded to do the same. However, Cochlear’s newest processor is controlled by the phone itself and does not require an app download.
More than 50 million Americans have experienced some form of hearing loss. Apple saw the problem and spent a number of years developing a hearing aid program within the company.
It went on to develop a protocol that it offers for free to hearing aid and implant manufacturers to use with their devices.
“We wanted to see something that could become ubiquitous out in the world,” Apple’s Sarah Herrlinger, senior manager for global accessibility policy and initiatives told TechCrunch. “We want everybody to use our technology and to say ‘wow my iPhone is the best piece of technology I’ve ever used before’…with every iteration of our operating system our goal is to add in new accessibility features in order to expand the support that we can give to people all over the world.”
Accessing the control settings for your Cochlear implant is relatively easy. Those who get the new Nucleus 7 Sound Processor or another Made for iPhone hearing aid simply go to their iPhone settings, tap “General” and then “Accessibility.” Scrolling down the list of device options, you’ll find “Hearing Devices.” Tap that, and the processor shows up the way a Bluetooth device would in Bluetooth settings; the implant will then pair with your iPhone.
Just like headphones or another Bluetooth-enabled device, as soon as the implant is paired up it can be controlled using the iPhone’s volume controls. So, for example, when a phone call comes in, you can hear that call at the volume settings within your implant.
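Pairing happens at the system level, so there is no public API for an app to pair an implant itself. An app can, however, see where its audio is currently being routed through Apple's public AVAudioSession API. Here is a minimal Swift sketch, assuming the paired processor appears as a Bluetooth LE output the way Made for iPhone hearing devices typically do:

    import AVFoundation

    // Minimal sketch: inspect the current audio route and log any
    // Bluetooth LE output, which is how MFi hearing devices typically
    // appear. This observes routing only; pairing is done in Settings.
    func logHearingDeviceRoute() {
        let session = AVAudioSession.sharedInstance()
        for output in session.currentRoute.outputs where output.portType == .bluetoothLE {
            print("Audio is routing to: \(output.portName)")
        }
    }

    // Re-check whenever the route changes, e.g. when the processor
    // connects or disconnects.
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.routeChangeNotification,
        object: nil,
        queue: .main
    ) { _ in
        logHearingDeviceRoute()
    }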
The new Nucleus 7 comes with a longer battery life and is also smaller and 24 percent lighter than its predecessor, the Nucleus 6 Sound Processor, making it ideal for small children with hearing loss as well.
“The approval of the Nucleus 7 Sound Processor is a turning point for people with hearing loss, opening the door for them to make phone calls, listen to music in high-quality stereo sound, watch videos and have FaceTime calls streamed directly to their Cochlear implant,” Cochlear CEO Chris Smith said in a statement. “This new sound processor builds on our long-standing commitment to help more people with hearing loss connect with others and live a full life.”

Monday, November 21, 2016

Ava gives the deaf and hard-of-hearing a more present voice in group conversations


For those with hearing issues, simple dinner table group conversations can be pretty painful to stay on top of.
Ava is aiming to bring deaf and hard-of-hearing people back into group conversations with its threaded speech-to-text application, which gives people with hearing issues an easy way to stay on top of a conversation.
The app, formerly known as Transcence, starts with each participant in a conversation downloading the app and setting up a profile. This may seem like a bit of work to get a casual chat going, but especially for families or groups of friends, it’s a really simple way to bring everyone into a conversation regardless of their specific hearing abilities.
After getting everyone on board, people just talk normally near their phone’s microphone, and the speech-to-text transcription is organized into a threaded, group-text-style conversation, giving users who are deaf or hard-of-hearing a running record of the chat right in front of them to quickly respond to.
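Ava hasn't published its pipeline (and, as noted below, it currently relies on third-party recognition engines), but the pattern described here, per-phone recognition feeding a shared thread, can be sketched with Apple's public Speech framework. Everything apart from the SFSpeechRecognizer calls, including the ConversationThread type, is an illustrative assumption:

    import Speech

    // Illustrative types; not Ava's actual implementation.
    struct Utterance {
        let speaker: String
        let text: String
    }

    final class ConversationThread {
        private(set) var transcript: [Utterance] = []
        func append(_ text: String, from speaker: String) {
            transcript.append(Utterance(speaker: speaker, text: text))
        }
    }

    // Each participant's phone recognizes its owner's speech, then posts
    // the final result into the shared thread. (A real app would first
    // call SFSpeechRecognizer.requestAuthorization and stream live mic
    // audio rather than transcribe a recorded file.)
    func transcribe(fileURL: URL, speaker: String, into thread: ConversationThread) {
        guard let recognizer = SFSpeechRecognizer(), recognizer.isAvailable else { return }
        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        _ = recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                thread.append(result.bestTranscription.formattedString, from: speaker)
            }
        }
    }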
The company announced this past week that it has closed $1.8 million in funding to grow its team and accelerate product development.
I sat down with Ava CEO Thibault Duchemin to chat about tech meeting the needs of the deaf community and what’s next for his app.
The potential for the app expands well beyond dinner table conversations, fitting into pretty much any setting where multiple people are chatting and deaf people are present. The Ava team had a major opportunity to show off its tech last month when Salesforce used it to live-transcribe the majority of the breakout sessions for audience members at its Dreamforce conference. Duchemin and COO Pieter Doevendans also talked a lot about the potential in education markets for giving everybody in the classroom an equal voice.
For the most part, speech recognition tech has been slow to win wide user adoption because it’s not perfect. Even when systems reach 90 or 95 percent accuracy, it’s hard to focus on anything other than what the system got wrong. That makes sense: voice assistants are fragile, and even a single misheard word can throw off the usefulness of an answer.
For now, Ava is only as accurate as the third-party solutions currently powering it, though Duchemin says that once its system gathers more data on a user’s voice, it will get better at distinguishing that voice from background noise. But for deaf users, who often achieve only around 20 percent accuracy reading lips during a conversation and make up the rest through body language and context, the ability to reach 80 or 90 percent accuracy through Ava is quite empowering.
I reached out to TC contributor and accessibility writer Steven Aquino to hear if this is something the deaf community would actually use. “I grew up in the deaf community, as both of my parents were deaf, so I have that connection between both the deaf and hearing worlds. ASL is my first language,” Aquino said. “I know from experience how difficult it is to have deaf or hard-of-hearing folks involved in conversation. I was always acting as an interpreter when my parents were around hearing family or friends.”
Right now, the app allows users to host up to 5 hours of conversation per month. Any user can host a group conversation, and participating in another user’s session doesn’t eat into your allotted time. Ava users who move past this 5-hour time frame can upgrade to a paid unlimited version of the app for $30 per month.
Duchemin said that one of the most critical things is getting people onto the app and through the setup process as quickly as possible so that groups can get to chatting. The interface of the app is appropriately simple, with just a few controls, while much of the screen real estate is devoted to what’s being said. Users can sign in as someone who’s hard-of-hearing, deaf or hearing, with the app slightly altering the experience to best serve the user’s abilities. The app is available on both Android and iOS.
Duchemin emphasized that the app definitely isn’t perfect and that there are still a lot of improvements the team is hoping to roll out soon. For the team, the key was getting the app out in time for Thanksgiving, so that family conversations around the dinner table could be a bit easier and more accessible for the 15 million people in the U.S. with hearing issues.