Monday, November 21, 2016

Ava gives the deaf and hard-of-hearing a more present voice in group conversations


For those with hearing loss, even a simple group conversation at the dinner table can be painful to stay on top of.
Ava aims to bring deaf and hard-of-hearing people back into group conversations with its threaded speech-to-text application, which gives people with hearing loss an easy way to follow a conversation.
The app, formerly known as Transcence, starts with each participant in a conversation downloading the app and setting up a profile. This may seem like a bit of work just to get a casual chat going, but for families or groups of friends especially, it’s a simple way to bring everyone into a conversation regardless of their hearing abilities.
After everyone is on board, people just talk normally near their phone’s microphone. The speech-to-text output is organized into a threaded group message, giving users who are deaf or hard-of-hearing a running record of the conversation right in front of them so they can quickly respond.
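The flow described here, where each phone transcribes its owner’s speech and the results are merged into a single speaker-labeled thread, can be sketched in a few lines. The class and field names below are illustrative, not Ava’s actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Utterance:
    speaker: str     # profile name set up on the participant's phone
    text: str        # speech-to-text output for one spoken chunk
    timestamp: float

@dataclass
class Conversation:
    utterances: list = field(default_factory=list)

    def add(self, utterance):
        # Each phone contributes transcripts independently; sorting by
        # timestamp keeps the shared thread in spoken order.
        self.utterances.append(utterance)
        self.utterances.sort(key=lambda u: u.timestamp)

    def thread(self):
        # Render the conversation as a speaker-labeled message thread.
        return [f"{u.speaker}: {u.text}" for u in self.utterances]

convo = Conversation()
convo.add(Utterance("Bob", "Be right there.", 2.5))
convo.add(Utterance("Alice", "Dinner's ready!", 1.0))
for line in convo.thread():
    print(line)
```

Sorting on arrival keeps late-arriving transcripts in spoken order, which matters when several phones are contributing at once.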
The company announced this past week that they’ve closed $1.8M in funding to grow their team and accelerate product development.
I sat down with Ava CEO Thibault Duchemin to chat about tech meeting the needs of the deaf community and what’s next for his app.
The potential for the app expands well beyond dinner table conversations, applying to almost any setting where multiple people are talking and deaf participants are present. The Ava team had a major opportunity to show off their tech last month when Salesforce used it to live-transcribe the majority of the breakout sessions for audience members at its Dreamforce conference. Duchemin and COO Pieter Doevendans also spoke at length about the potential of the education market, where the app could give everybody in a classroom an equal voice.
For the most part, speech recognition tech has been slow to gain wide user adoption because it isn’t perfect. Even when systems reach 90 or 95 percent accuracy, it’s hard to focus on anything other than what the system got wrong. That makes sense for voice assistants, which are fragile enough that a single misheard word can throw off the usefulness of an answer.
For now, Ava is only as accurate as the third-party speech engines currently powering it, though Duchemin says that as the system gathers more data on a user’s voice, it will get better at distinguishing that voice from background noise. But for deaf users, who often achieve only about 20 percent accuracy reading lips during a conversation and fill in the rest through body language and context, reaching 80 or 90 percent accuracy through Ava is quite empowering.
I reached out to TC contributor and accessibility writer Steven Aquino to hear if this is something the deaf community would actually use. “I grew up in the deaf community, as both of my parents were deaf, so I have that connection between both the deaf and hearing worlds. ASL is my first language,” Aquino said. “I know from experience how difficult it is to have deaf or hard-of-hearing folks involved in conversation. I was always acting as an interpreter when my parents were around hearing family or friends.”
Right now, the app allows users to host up to five hours of conversation per month. Any user can host a group conversation, and participating in another user’s session doesn’t eat into your allotted time. Users who need more than five hours can upgrade to an unlimited paid version of the app for $30 per month.
Duchemin said that one of the most critical things is getting people onto the app and through the setup process as quickly as possible so that groups can get to chatting. The interface is appropriately simple, with just a few controls, while much of the screen real estate is devoted to what’s being said. Users can sign in as hard-of-hearing, deaf or hearing, and the app slightly alters the experience to best serve their abilities. The app is available on both Android and iOS.
Duchemin emphasized that the app definitely isn’t perfect and that there are still a lot of improvements the team is hoping to roll out soon. To them, the key was getting the app out in time for Thanksgiving, so that family conversations around the dinner table could be a bit easier and more accessible for the 15 million people in the US with hearing loss.

Thursday, October 20, 2016

This blind Apple engineer is transforming the tech world at only 22


Apple engineer Jordyn Castor has never been one for limitations.

She was born 15 weeks early, weighing just under two pounds. Her grandfather could hold her in the palm of his hand, and could even slide his wedding ring along her arm and over her shoulder. Doctors said she had a slim chance of survival.
It was Castor's first brush with limited expectations — and also the first time she shattered them.
Castor, now 22, has been blind since birth, a result of her early delivery. But throughout childhood, her parents encouraged her to defy expectations of people with disabilities, motivating her to be adventurous, hands-on and insatiably curious.

It was that spirit that led her to interact with technology, whether it was the desktop computer her family bought when she was in second grade or the classroom computer teachers encouraged her to use in school.
She says the adults in her life would often hand her a gadget, telling her to figure it out and show them how to use it. And she would.

"I realized then I could code on the computer to have it fulfill the tasks I wanted it to," says Castor, whose current work focuses on enhancing features like VoiceOver for blind Apple users. "I came to realize that with my knowledge of computers and technology, I could help change the world for people with disabilities.
"I could help make technology more accessible for blind users."

Bringing a personal perspective to Apple innovation

There's an often overlooked component of "diversity" in workplace initiatives — the need to include the perspectives of people with disabilities.
Keeping tabs on the needs of the blind and low-vision community is a key component of Apple's innovation in accessibility. Castor is proof of how much that can strengthen a company.
She was a college student at Michigan State University when she was first introduced to Apple at a Minneapolis job fair in 2015. Castor went to the gathering of employers, already knowing the tech giant would be there — and she was nervous.
"You aren't going to know unless you try," she thought. "You aren't going to know unless you talk to them ... so go."
Castor told Apple reps how amazed she was by the iPad she received as a gift for her 17th birthday just a few years earlier. It raised her passion for tech to another level — mainly due to the iPad's immediate accessibility.

"Everything just worked and was accessible just right out of the box," Castor tells Mashable. "That was something I had never experienced before."
Sarah Herrlinger, senior manager for global accessibility policy and initiatives at Apple, says a notable part of the company’s approach to accessibility is its dedication to making inclusivity features standard, not specialized. That serves a dual purpose: the features reach more users, and costs stay down.

"[These features] show up on your device, regardless of if you are someone who needs them," Herrlinger tells Mashable. "By being built-in, they are also free. Historically, for the blind and visually impaired community, there are additional things you have to buy or things that you have to do to be able to use technology."
At that job fair in 2015, Castor's passion for accessibility and Apple was evident. She was soon hired as an intern focusing on VoiceOver accessibility.
As her internship came to a close, Castor's skills as an engineer and advocate for tech accessibility were too commanding to let go. She was hired full-time as an engineer on the accessibility design and quality team — a group of people Castor describes as "passionate" and "dedicated."
"I'm directly impacting the lives of the blind community," she says of her work. "It's incredible."

Innovation with blind users in mind

Increased accessibility for all users is one of Apple's driving values, under the mantra "inclusion inspires innovation."
Herrlinger says the company loves what it makes, and wants what it makes to be available to everyone. She describes the need to continuously innovate with accessibility in mind as part of Apple's DNA.
"Accessibility is something that is never-ending," Herrlinger says. "It isn't something where you just do it once, check that box and then move on to do other things."
And it's a dedication that isn't going unnoticed by the blind community. On July 4, Apple was the recipient of the American Council of the Blind's Robert S. Bray Award for the company's strides in accessibility and continued dedication to inclusion-based innovation for blind users.
The company, for example, made the first touchscreen device accessible to the blind via VoiceOver. Recent announcements of Siri coming to Mac this fall, and of newer innovations like a magnifying-glass feature for low-vision users, have continued the promise of improving the Apple experience for those who are blind and low vision.

"The fact that we take the time to innovate in these ways is something new and different," Herrlinger says. "It was not the expected thing in the tech community."
Often, the success of such innovations depends on the input of the community — and employees like Castor provide irreplaceable first-hand insight into the tech experience for blind individuals.

The most recent example of community-driven innovation can be found on the Apple Watch. During a meeting, Herrlinger explains, a person who sees could easily peer down at their watch to keep an eye on the clock. A person who is blind, however, hasn't had a way to tell time without VoiceOver.
After confronting the conundrum, Apple solved the issue by making a feature that tells time through vibrations. The addition, Herrlinger says, is coming to watchOS 3 this fall.
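As an illustration of how time could be conveyed through vibration alone, here is one made-up encoding; Apple has not published the actual scheme, so this is purely a sketch of the idea: long taps count hours, short taps count ten-minute blocks.

```python
def taps_for_time(hour, minute):
    """Encode a time as a vibration pattern: one long tap per hour on a
    12-hour clock, then one short tap per ten minutes.
    A hypothetical scheme, not Apple's actual encoding."""
    hours = hour % 12 or 12   # midnight/noon read as twelve taps
    return ["long"] * hours + ["short"] * (minute // 10)

print(taps_for_time(3, 25))  # ['long', 'long', 'long', 'short', 'short']
```

A wearer feeling three long taps and two short ones would read the time as roughly 3:20.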

High-tech meets low-tech

Castor says her own success — and her career — hinges on two things: technology and Braille. That may sound strange to many people, even to some who are blind and visually impaired. Braille and new tech are often depicted as at odds with one another, with Braille literacy rates decreasing as the presence of tech increases.
But many activists argue that Braille literacy is the key to employment and a stable livelihood for blind individuals. More than 70% of blind people lack employment, and of those who are employed, an estimated 80% have something in common: they read Braille.
For Castor, Braille is crucial to her innovative work at Apple — and she insists tech is complementary to Braille, not a replacement.

"I use a Braille display every time I write a piece of code," she says. "Braille allows me to know what the code feels like."
In coding, she uses a combination of Nemeth Braille — or "math Braille" — and Alphabetic Braille. Castor even says that with the heavy presence of tech in her life, she still prefers to read meeting agendas in Braille.
"I can see grammar. I can see punctuation. I can see how things are spelled and how things are written out," she says.
The technologies that Apple creates support her love of Braille, too — there are various modifications, like Braille displays that can plug into devices, to help her code and communicate. But Castor also often forgoes Braille displays, solely using VoiceOver to navigate her devices and read screens.
That autonomy of choice in accessibility, Apple says, is intentional. The company believes that the ability to choose — to have several tools at a user's disposal, whenever they want them — is key to its accessibility values.

Giving back to the community

Last week, Castor attended a conference hosted by the National Federation of the Blind, where she gave a speech telling her story. She says the impact that Apple has had on the blind community was extremely clear as soon as she stepped into the conference hall — just by listening to what was going on around her.
"When I walk through the convention, I hear VoiceOver everywhere," she says. "Being able to give back through something that so many people use is amazing."
Castor was recently able to use her presence and perspective at Apple to give back to a part of the community she's especially passionate about — the next generation of engineers.
She was a driving force behind accessibility on Apple's soon-to-be-released Swift Playgrounds, an intro-to-coding program geared toward children. She's been working to make the program accessible to blind children, who have been waiting a long time for the tool, she says.
"I would constantly get Facebook messages from so many parents of blind children, saying, 'My child wants to code so badly. Do you know of a way that they can do that?'" Castor says. "Now, when it's released, I can say, 'Absolutely, absolutely they can start coding.'"
Castor says working on Swift Playgrounds has been an empowering experience, and her team has deeply valued her perspective on the VoiceOver experience for blind users.
She says the task-based, interactive app would have made a massive impact on her as a child. The program is, after all, a guided way of taking tech and figuring out what makes it tick — a virtual version of the hands-on curiosity adults instilled in her as a child.

"It will allow children to dive into code," she says of the program. "They can use Swift Playgrounds right away out of the box; no modifications. Just turn on VoiceOver and be able to start coding."
As someone who was always encouraged to challenge expectations, Castor says she has one simple message for the next generation of blind coders, like the children who will sit down with Swift Playgrounds in the fall.
"Blindness does not define you," she says. "It's part of who you are as a person, as a characteristic — but it does not define you or what you can do in life."


Friday, August 12, 2016

Virtual reality programs could help paralyzed patients walk again

Utilizing both virtual reality and brain-interface systems, researchers were able to help patients regain some motion.



Results from a recent experiment have demonstrated that special brain-machine interfaces, when used in addition to exoskeletons and virtual reality, could very well help paralyzed patients regain movement after spinal cord injuries. In short, virtual reality physical therapy could eventually help patients to walk again.
The system and its resulting experiments, conducted by Duke University neuroscientist Miguel Nicolelis and his colleagues, succeeded in partially restoring muscle control and sensation in patients’ lower limbs. This was possible after the patients were put through an aggressive training regimen with brain-controlled robotics and VR technology. This level of recovery is unprecedented in patients with long-term paralysis, according to the research, which restored some sensation to eight patients previously diagnosed with spinal cord injuries.
The training programs utilized brain-machine interfaces (BMIs) in conjunction with virtual reality tech to establish communication between the brain and a computer, giving patients control over external devices using only their minds. Virtual reality helped patients visualize the brain-controlled movements and build mind-body awareness. With a virtual representation of their own bodies, they could better connect what they saw in VR to their physical movements, making the training regimen more successful.
Patients regained varying amounts of sensation, ranging from some feeling and muscle control to an astonishing ability to move in a 32-year-old woman who had been paralyzed for 13 years. She was able to move her legs on her own with her body weight supported by a harness after just 13 months of treatment.
While these experiments aren't the first to test treatments like this one, they are certainly some of the more promising ones. The researchers are looking forward to testing out additional treatment plans in the future on more recently-injured patients to see how things change.

Thursday, August 4, 2016

McGill University researchers develop navigation app for Canadians living with vision loss


McGill University researchers have developed an app to guide people living with vision loss as they navigate their daily life.
Autour initially began testing in Montreal, and has now expanded to include cities across Canada thanks to funding from the Canadian Internet Registration Authority’s Community Investment program. CIRA is dedicating a total of $1 million to 25 Canadian projects.
“Our lab has long taken an approach of focusing on technology projects with a social impact, so our experience with spatial audio and mobile interaction led us naturally to work with the visually impaired community,” said Jeremy Cooperstock, one of the McGill University researchers. “We have truly enjoyed the experience of working with members of the community, and believe that the valuable input we received makes Autour the most user-friendly app available today – a go-to piece of technology that will improve the lives of the people who use it.”
The app overlays GPS, Google Maps, public transit and other data to provide audible instructions and descriptions that help guide users’ movements, and it calls out the places and services around them as they travel. The expanded Autour app is available to iPhone users via the App Store.
Cities that work with Autour include Halifax, Quebec City, Greater Montréal, Sherbrooke, Ottawa-Gatineau, Toronto/GTA, Hamilton, Waterloo, Thunder Bay, Winnipeg, Saskatoon, Regina, Calgary, Greater Vancouver, and Victoria.
This story was originally published on BetaKit.

Wednesday, July 27, 2016

This Robotic Crawler Helps Babies at Risk for Cerebral Palsy

Robotics and AI often get a bad rap for the whole destroyer-of-the-human-race thing. But when you watch a motorized machine help a baby crawl, you can’t help but feel like robots aren’t so bad after all. And that’s exactly the kind of machine that researchers at the University of Oklahoma built.
Specifically, the Self-Initiated Prone Progression Crawler (SIPPC) is designed to mitigate neurological damage caused by cerebral palsy at an early age. Cerebral palsy refers to a number of neurological disorders that occur during pregnancy, infancy, or early childhood. Infants at risk can suffer from severe loss of motor skills and, sometimes, intellectual capabilities. Although children usually aren’t diagnosed with cerebral palsy until their first birthday, aiding movement in those crucial early months can help children at risk to develop motor and cognitive skills.
So researchers designed a motorized scooter that helps infants around two to eight months old crawl. Additionally, an EEG cap monitors brain activity during these exercises, while mounted cameras capture movement 20 times a second to create a 3D graph of the child’s crawling.
The centerpiece of this whole robotic operation, however, is the machine-learning algorithm, which analyzes the infant’s movements and anticipates what the child is trying to do. The crawler then kicks in some motorized assistance to help the kiddo go.
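That predict-then-assist loop can be sketched as follows; the article doesn't detail the actual SIPPC algorithm, so the intent classifier below is a crude stand-in for the machine-learning model:

```python
class Motor:
    """Stub for the crawler's drive system; records assist commands."""
    def __init__(self):
        self.commands = []
    def push(self, direction, gain):
        self.commands.append((direction, gain))

def predict_intent(kinematics):
    # Stand-in classifier: in the real system, a machine-learning model
    # trained on the cameras' 3D motion capture plays this role.
    dx = kinematics["dx"]
    if abs(dx) < 0.01:
        return None                 # no clear attempt: give no assistance
    return "forward" if dx > 0 else "backward"

def assist_step(kinematics, motor):
    # One control cycle: classify the infant's attempted movement,
    # then add partial motor support in that direction.
    intent = predict_intent(kinematics)
    if intent is not None:
        motor.push(intent, gain=0.5)  # infant initiates, robot amplifies
    return intent

motor = Motor()
assist_step({"dx": 0.05}, motor)  # forward attempt: assisted
assist_step({"dx": 0.0}, motor)   # no attempt: no assistance
print(motor.commands)             # [('forward', 0.5)]
```

The key design point is that the motor only amplifies movements the infant initiates, rather than moving the child passively.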
The device featured in the above video is actually the third iteration of the robot’s design. Since receiving funding from the National Science Foundation in 2012, the project has seen a series of successes leading to the current study of 56 newborns. Unfortunately, the device is still in its early stages and can’t be used by families at home. But the researchers hope that won’t be the case for long.

ALS ice bucket challenge leads to real-life genetics discovery


You couldn't go on social media in 2014 without seeing a new video of a friend, celebrity or tech star dumping a bucket of ice water over his or her head to raise money for research into the degenerative neurological disorder ALS, short for amyotrophic lateral sclerosis.

Now, it looks like that "ice bucket challenge" produced some very real scientific results, according to the ALS Association.
A new Nature Genetics study — funded by money raised through the ice bucket challenge — details the discovery of a new gene associated with ALS.
The gene, named NEK1, appears to be one of the most common genes associated with the disease and may be a good option for future gene therapy, the new study suggests.
“Global collaboration among scientists, which was really made possible by ALS Ice Bucket Challenge donations, led to this important discovery,” John Landers, a co-author of the new study, said in a statement.
“It is a prime example of the success that can come from the combined efforts of so many people, all dedicated to finding the causes of ALS. This kind of collaborative study is, more and more, where the field is headed.”
Scientists found the gene by searching the genomes of more than 1,000 ALS families. Researchers also independently found the gene in a Dutch population, the ALS Association said.

The new study is part of Project MinE, a gene sequencing effort looking at the genomes of about 15,000 people with ALS around the world that's funded by donations raised through the ice bucket challenge. 
“The discovery of NEK1 highlights the value of ‘big data’ in ALS research,” ALS Association scientist Lucie Bruijn said in the statement. “The sophisticated gene analysis that led to this finding was only possible because of the large number of ALS samples available." 
"The ALS Ice Bucket Challenge enabled The ALS Association to invest in Project MinE’s work to create large biorepositories of ALS biosamples that are designed to allow exactly this kind of research and to produce exactly this kind of result.”
In total, the challenge raised about $115 million, with about $77 million of that going to research. The viral hit also produced some amazing videos. 
Stars like Chris Pratt and Justin Timberlake got in on the ice bucket action, and even tech giants like Bill Gates and Mark Zuckerberg took the plunge for ALS research.

This August, the ALS Association is asking for donations as part of its "Every Drop Adds Up" campaign. The new campaign asks contributors to talk about their commitment to fighting ALS.

Tuesday, July 12, 2016

Furenexo’s SoundSense is a simple, open-source gadget that helps deaf people stay aware of their surroundings


People with deafness have plenty of ways to navigate everyday situations as if they had no disability at all, but there are still situations that present dangers unique to them — not being able to hear a smoke alarm or gunshot, for instance. SoundSense is a small wearable device that listens for noises that might require immediate attention and alerts the user when it detects one.
“There’s really been an absence of innovation in technology for disabilities over the last decade or even decades,” said Brian Goral, co-founder and CEO of Furenexo, the company behind SoundSense. We talked a few weeks before today’s launch. “What we’re looking to do is bring technology that’s taken for granted, things like cell phones and driverless cars, and apply that to the disability space.”
This first device is a small and simple one for a reason — the company is bootstrapped and has to rely on Kickstarter for the funds to make the SoundSense. They’re also looking for grants from non-profit entities and perhaps government funds.
But really, the company has self-limited on purpose: the idea is to make something practical and cheap that almost anyone can use. Even a person with perfect hearing could wear one of these while walking around with headphones on.
There isn’t much to say about the device — it really is simple. The microphone passes its signal to a microchip, which watches for sudden increases in volume, and when it hears one, the whole doodad vibrates and its LEDs flash. The battery lasts about a day, and recharges over USB.
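That spike-watching logic amounts to comparing each sample against a running average of recent loudness. A rough sketch, with an invented threshold and window size (the actual SoundSense firmware isn't published):

```python
from collections import deque

def spike_detector(threshold=4.0, window=50):
    """Return a checker that flags samples much louder than the recent
    average: a rough stand-in for the SoundSense chip's logic."""
    recent = deque(maxlen=window)   # rolling buffer of ambient loudness
    def check(amplitude):
        baseline = sum(recent) / len(recent) if recent else 0.0
        recent.append(amplitude)
        # Alert when the sample towers over the ambient baseline.
        return baseline > 0 and amplitude > threshold * baseline
    return check

check = spike_detector()
quiet = [check(a) for a in [1.0] * 50]  # steady ambient noise: no alerts
alarm = check(20.0)                     # sudden loud spike: alert
print(any(quiet), alarm)                # False True
```

Comparing against a rolling baseline, rather than a fixed volume cutoff, is what lets a device like this work both on a quiet street and in a noisy gym.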
“It’s not anything deeply profound, like it’s going to revolutionize disability,” Goral said. “But for a person who wants to go jogging, or staying in a city they’ve never been to, just having that extra confidence and awareness that they’re going to know if something’s going wrong.”
Because it’s not a medical device, it can be sold immediately without any kind of FDA approval. And because it’s so cheap (Kickstarter versions will be $25, but the cost can be dropped even further with scale), it can be sold at drug stores or even given away by, for example, a community center catering to deaf people. Not only that, but the schematics for the device are free to download for anyone who wants to tweak them and make their own version.
For testing, guidance, and other benefits of partnership, Furenexo is working with non-profits like Helen Keller Services, which specializes in helping out people who are deaf, blind, or both.
But the SoundSense isn’t the only thing they plan on doing — just the first.
“You start to look at other challenges,” Goral said. “Like, if you’re a pedestrian in a wheelchair, and you’re on Google Maps — you might want a plug-in that avoids complex intersections, or takes you to accessible entrances instead of a generic spot on a map. Or a geofence for Alzheimer’s patients that sends you a text if they’re outside a certain area.”
The company is working on a wrist-mounted pad that the visually impaired can use to type braille, and there are plans for a more comprehensive haptic feedback armpiece that can give simple signals when they approach things like obstacles or other people.
Furenexo also hopes to create a community online that connects people with disabilities to people who want to address them. A person with paralysis might explain the difficulties of navigating the web without using her arms or legs, and a curious engineer might propose a solution or prototype a device.
The SoundSense Kickstarter is live now, so snatch up a device if you think it might be useful to you or someone you know.

Wednesday, June 8, 2016

Magnusmode partners with ROM to create digital museum guides for people with autism



As part of National Access Awareness Week, Toronto-based Magnusmode has partnered with the Royal Ontario Museum and Easter Seals Canada to launch digital museum guides for visitors with autism.
The guides are built off of the company’s Magnuscards platform, which provides step-by-step illustrated guides for people with cognitive special needs to navigate daily tasks like doing laundry. Its web and mobile apps use digital guides, called Card Decks, that let users follow instructions from Magnus, the app’s interactive character.
“Museums have a key role to play in helping remove the isolation that can exist for many people with autism,” said Nadia Hamilton, CEO and founder of Magnusmode. Hamilton was inspired to found Magnusmode through her experience growing up with her brother, Troy, who has autism. “We chose to work with the ROM, a world-renowned institution, as our founding museum partner. With this proven model, we can now begin to collaborate with museums across Canada and around the world with tools that enable museums to educate, engage, and inspire all members of the community.”
The ROM MagnusCards feature two decks: one guides and prepares the visitor for what to expect when entering the ROM, while the other functions as an educational scavenger hunt through the ROM’s James and Louise Temerty Galleries of the Age of Dinosaurs.

Thursday, May 19, 2016

Harvard engineers designed a 'soft wearable robot'

The flexible suit is aimed at patients with limited mobility.



A team of engineers from Harvard University's Wyss Institute for Biologically Inspired Engineering has moved one step closer to a consumer version of a soft, assistive exosuit that could help patients with lower-limb disabilities walk again. The Wyss Institute announced today that the university is collaborating with ReWalk Robotics to bring its wearable robotic suit to market.
The soft exosuit was designed by Dr. Conor Walsh, who also happens to be the founder of the Harvard Biodesign Lab, along with a team of roboticists, mechanical and biomechanical engineers, software engineers and apparel designers. What really makes the Wyss exosuit stand out from other exoskeletons and robotic suits is its form-fitting, fabric-based design. Instead of a heavy, rigid frame, the exosuit uses small but powerful actuators tucked into the belt to assist the wearer's legs in a more natural way. While the setup might not be powerful enough to fight off Xenomorphs, it is a much more elegant solution for stroke, MS and elderly patients who still have partial mobility but need additional assistance.
"Ultimately this agreement paves the way for this technology to make its way to patients," Walsh said of his team's partnership with ReWalk. According to Harvard, the exosuit's development has already had a lasting impact beyond those medical applications as well. As the announcement puts it, the team's work has "been the catalyst for entirely new forms of functional textiles, flexible power systems and control strategies that integrate the suit and its wearer in ways that mimic the natural biomechanics of the human musculoskeletal system."

Friday, May 13, 2016

An Entrepreneur Makes Speech a Reality for All



Smartstones, Inc., a company taking great leaps in improving communication technology, is today releasing a new product that will directly improve millions of lives. GlobalMindED is proudly committed to supporting the innovative efforts of companies like Smartstones, whose products and services increase access, equity and opportunity for people of all backgrounds and abilities. Smartstones’ new development impacts not only those with communication challenges and their families, but also serves to democratize education and the exchange of information.

Since 2013, the existing application :prose™ has given nonverbal people the ability to “swipe to speak,” “tap to speak,” and “gesture to speak” with iOS devices and their own Touch™ device. :prose is the de-facto standard for nonverbal sensory communication. Today, Smartstones has taken their work even further, in the form of breakthrough “think to speak” technology. Using brainwave commands and a wireless connected EEG headset from Emotiv, the technology can send a “thought message” to anyone via push notification.

“There are hundreds of millions of people in the world affected by various communication challenges, including autism, ALS/Lou Gehrig’s disease, cerebral palsy, Parkinson’s disease, strokes, brain and spinal cord injuries—and many are looking for greater independence, social inclusiveness, self-expression and to be understood,” says Smartstones Founder and CEO Andreas Forsland. “Communication is the most important part of the human experience, and we want to ensure every person has the ability to express their most basic needs.” With 2.3 million nonverbal autistic children in the USA alone, and over 70 million on the autism spectrum worldwide, everyone has a unique need when it comes to a communication interface.
Andreas came up with this revolutionary idea when tragedy befell his own mother. His mother, then 70 years old, developed pneumonia which eventually led to renal failure; the doctors put her on life support, and she spent several weeks in intensive care, unable to move or speak. Serving as the main point of contact between her, her doctors, lawyers, and long-distance family and friends inspired him to find—and when he couldn’t find one, to create—a tool that would allow anyone to receive and transmit brief, but absolutely vital, messages when communication is otherwise impossible. Thankfully, his mother recovered; Andreas continued his quest, all the while building on his experience as a coach and mentor to entrepreneurs. Andreas attended Startup Weekend just over a year later to present his solution.

How It Works:
For people with Autism Spectrum Disorder and other communication challenges, communicating verbally with others can be difficult or impossible. Traditional Alternative and Augmentative Communication (AAC) apps use pictogram-based approaches which, while versatile, can take months or even years to fully learn. :prose works by translating sensor data such as touch, motion, and now thought gestures, into spoken phrases. The integration of EEG technology makes :prose accessible to even more people, including those with motor-skill inhibiting conditions, who may find touch gestures difficult to perform reliably.
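At its core, this kind of sensory communication maps a small vocabulary of sensor events to spoken phrases. A minimal sketch, with invented event names and phrases (this is not Smartstones' code):

```python
# Hypothetical event-to-phrase table; a real AAC app would make this
# user-configurable and route each phrase to a text-to-speech engine.
PHRASES = {
    ("touch", "swipe_up"): "I'm hungry.",
    ("touch", "tap_twice"): "Yes.",
    ("motion", "shake"): "I need help.",
    ("thought", "push"): "Hello.",   # EEG 'mental command' from the headset
}

def speak(sensor, gesture, tts=print):
    phrase = PHRASES.get((sensor, gesture))
    if phrase is None:
        return None        # unrecognized gesture: stay silent
    tts(phrase)            # stand-in for the device's speech output
    return phrase

speak("thought", "push")   # prints: Hello.
```

Because the table is keyed on the sensor as well as the gesture, the same vocabulary of phrases can be reached by touch, motion, or EEG input, whichever the user can perform reliably.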

Recently, Smartstones tested the technology alongside PathPoint, an organization that provides services to over 2,000 individuals with developmental, psychiatric, and physical disabilities.

“People have been trying to get neural devices to speak for decades,” said Gil Trevino, Assistive Technology Manager at PathPoint. “I’m happy to share that Smartstones has done it. One of our patients with severe disabilities is able to control :prose with her mental commands. Within minutes she was speaking several phrases aloud, compared to years of training with other technologies. She became a whole new person when she heard herself speak.”

Smartstones has an official partnership with Emotiv to provide the wearable EEG headset. “Emotiv is incredibly proud to be involved in the :prose program and to be providing the wearable EEG component,” said Nuri Djavit, Emotiv Chief Marketing Officer. “From the start, our mission has been to advance our understanding of the human brain and to find ways to improve the human condition—especially for those facing adversity.”

The app is available for $29.99 (50% off during the month of April in recognition of National Autism Awareness Month) and integrates with a range of devices. GlobalMindED is delighted to connect with anyone opening opportunities for different populations, and we applaud Andreas for his groundbreaking innovations.