
Montreal startup Stradigi’s AI game teaches people sign language

Responsibly applied artificial intelligence (AI) has the potential to solve some of the world’s toughest challenges. One needn’t look further for evidence than the winners of this week’s IBM Watson AI XPRIZE wildcard round, which included a Montreal startup — Aifred Health — developing a model that helps clinicians choose personalized patient treatment programs. In related news, just this past Sunday, Google subsidiary DeepMind unveiled AlphaFold, an AI system that can predict protein folding more accurately than any system before it.
Accessibility is another burgeoning area of what’s been coined “AI for good” research, and one that Montreal startup Stradigi AI is committed to advancing with a new tool for the deaf and hearing impaired. At the NeurIPS 2018 conference in Montreal this week, the four-year-old startup — which cofounders Carolina Bessega and Jaime Camacaro pivoted from software development to AI research in 2016 — demoed a game that uses computer vision to help people learn American Sign Language (ASL).
The ASL Alphabet Game, as it’s self-descriptively called, was produced in partnership with the Deaf Anglo Literacy Center (DALC), a Montreal organization that seeks to provide basic literacy and communication skills to the deaf. For every person at the conference who took it for a whirl — 900 in all by Friday — Stradigi donated $10 to DALC.
“There are so many people asking how AI can contribute positively [to society],” Bessega told VentureBeat in an interview at NeurIPS. “We thought this would be a good way to show how it could really make a difference.”
It’s sort of like Simon Says, albeit played only with the hands and entirely in sign language. Players earn points for forming the 26 letters of the ASL alphabet, which convolutional neural networks trained on a collection of more than 100,000 images classify in near real time. (Convolutional neural nets are a class of deep neural networks commonly applied to analyzing visual imagery.)
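To make the setup concrete, here is a minimal, illustrative sketch in PyTorch of a convolutional network that maps a cropped hand image to one of 26 letter classes. Stradigi hasn’t published its architecture, so the layer sizes, the 64x64 input resolution, and the letter mapping below are assumptions made purely for demonstration.

```python
# Illustrative only: a small convolutional classifier for the 26 ASL
# alphabet letters. This is NOT Stradigi's model; layer sizes and the
# 64x64 input resolution are assumptions made for this sketch.
import torch
import torch.nn as nn

class ASLLetterClassifier(nn.Module):
    def __init__(self, num_classes: int = 26):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                              # 64x64 -> 32x32
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 16x16
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                      # global average pooling
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: a batch of cropped hand images, shape (N, 3, 64, 64)
        return self.classifier(self.features(x).flatten(1))

model = ASLLetterClassifier()
logits = model(torch.randn(1, 3, 64, 64))          # one dummy webcam crop
print(chr(ord("A") + int(logits.argmax(dim=1))))   # a letter guess (random weights here)
```

In the actual game, a network like this would of course only be useful after training on the 100,000-plus labeled images mentioned above.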
Videos from which the images were collected were chosen carefully to minimize bias. As Bessega, who has a Ph.D. in fundamental physics and previously served as program coordinator at the Venezuelan Science and Technology Ministry, explained, a neural net trained on an unrepresentative dataset would have had trouble recognizing signs made by people of certain ethnic groups, or by people with long fingers or large jewelry.
Pose estimation is performed by a three-stage CNN that tracks the joint locations of a player’s hands as they’re positioned in front of a webcam, producing heat maps with increasingly refined estimates of where those joints sit. Cropped hand images are then passed along to a second neural network, which classifies the letter being signed.
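Pieced together from that description, the end-to-end inference step might look roughly like the sketch below. The pose_net and letter_net callables, the heat-map peak-picking, and the crop margin are all hypothetical stand-ins; Stradigi hasn’t released its pipeline, so this illustrates the crop-then-classify idea rather than the company’s code.

```python
# Hypothetical sketch of a two-stage pipeline: pose estimation -> hand crop -> letter classification.
import torch
import torch.nn.functional as F

def classify_frame(frame: torch.Tensor, pose_net, letter_net, margin: int = 16) -> str:
    """frame: one webcam image as a (3, H, W) tensor with values in [0, 1]."""
    # Stage 1 (hypothetical pose_net): per-joint heat maps over a coarser grid.
    heatmaps = pose_net(frame.unsqueeze(0))[0]          # shape (num_joints, Hh, Hw)
    num_joints, hh, hw = heatmaps.shape
    # Take each heat map's peak as the estimated joint location.
    flat_idx = heatmaps.reshape(num_joints, -1).argmax(dim=1)
    ys, xs = flat_idx // hw, flat_idx % hw
    # Scale joint coordinates back up to the original frame resolution.
    _, H, W = frame.shape
    ys, xs = ys * H // hh, xs * W // hw
    # Crop a box around the detected hand, with a small margin (an assumption).
    top = max(int(ys.min()) - margin, 0)
    bottom = min(int(ys.max()) + margin, H)
    left = max(int(xs.min()) - margin, 0)
    right = min(int(xs.max()) + margin, W)
    crop = frame[:, top:bottom, left:right].unsqueeze(0)
    crop = F.interpolate(crop, size=(64, 64), mode="bilinear", align_corners=False)
    # Stage 2 (hypothetical letter_net): classify the cropped hand as A-Z.
    logits = letter_net(crop)
    return chr(ord("A") + int(logits.argmax(dim=1)))
```

Keeping detection and classification separate means the classifier only ever sees a tightly cropped hand, which narrows its job considerably.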
The AI system is 99.03 percent accurate at identifying ASL letters, Bessega said. That’s about half a percentage point better than Google’s pretrained Inception V4 managed in the company’s tests (98.50 percent), and nearly two percentage points better than the popular Xception model (97.37 percent).
The whole stack took about a month and a half to develop, Bessega said, with the participation of Stradigi AI’s entire workforce — not only its team of 30 researchers, but its graphic designers and software engineers, too.
“It was mandatory for everybody,” she explained, “but they were enthusiastic about it.”
 
Bessega made it clear that Stradigi’s core focus will remain custom AI applications for enterprise clients — the company has delivered more than 15 bespoke solutions in the past year, she said — but nonprofit and philanthropic work will become a larger part of its project slate going forward.
In that way, Stradigi is following in the footsteps of tech giants like Microsoft, which committed $25 million to its AI for Accessibility program in May with the goal of “[helping] people with disabilities with work, life, and human connections.” Google subsidiary DeepMind, meanwhile, is using AI to generate closed captions for deaf users. And chipmaker Intel recently supported Wheelie, a startup working on a wheelchair that enables riders to navigate with facial expressions, through its AI for Social Good initiative.
“We’re very much a commercial entity, but our goal is to keep at least one ‘AI for good’ project in parallel we’re working on,” Bessega said.
One of those projects is an AI system that might one day aid patients prone to debilitating seizures by providing early warning. The dream, Bessega said, is to develop a wearable device that alerts wearers at least five minutes before an episode occurs.
“We’re working with a Montreal hospital [on it],” she said. “It would allow these patients to lead a normal life.”
Source: VentureBeat