Deep North’s school security system uses AI and cameras to detect threats

Deep North (formerly VMAXX), a Silicon Valley startup with offices in China and Sweden, hopes to leverage artificial intelligence (AI) to prevent violence and “other safety issues” facing schools. It today announced a program that’ll offer a select number of institutions the opportunity to field test its threat-detecting object recognition and computer vision technology.
It’s already working with school districts and universities in Texas, Florida, Massachusetts, and California, and has the backing of U.S. Congressman Pete Sessions (R-TX). “AI represents one of the few viable ways to make schools safe, and does it in a way that is more affordable than any other,” he said in a statement.
Not unlike Amazon Web Services’ Rekognition, IBM’s Watson Visual Recognition, and Microsoft’s Azure Face API, Deep North’s platform applies an intelligent layer to conventional, off-the-shelf security cameras (with resolutions as low as 320p), analyzing footage as it comes in. It monitors, detects, and interprets people’s in-frame behaviors and movements across settings, and identifies objects — e.g., unattended bags or objects that look like weapons — that might pose a danger to students and staff.
School administrators receive alerts when a potential threat is identified.
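In broad strokes, the pipeline Deep North describes resembles a standard detection-and-alert loop: pull frames from existing cameras, run them through a pretrained object detector, and notify staff when a flagged class appears with high confidence. The sketch below illustrates that general pattern only; it is not Deep North’s implementation. The camera URL is a placeholder, and the flagged COCO classes (e.g., “backpack”) stand in for the custom-trained weapon detector a real deployment would need.

```python
# A minimal sketch of the detect-and-alert pattern described above.
# NOT Deep North's code; classes, threshold, and alert hook are illustrative.
import cv2
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

# Subset of torchvision's COCO label ids; a real system would use a model
# trained on weapon imagery, since COCO has no "gun" class.
COCO = {27: "backpack", 31: "handbag", 33: "suitcase", 49: "knife"}
FLAGGED = {"backpack", "handbag", "suitcase", "knife"}

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def scan_frame(frame, threshold=0.8):
    """Return (class, score) pairs for flagged detections in one BGR frame."""
    tensor = to_tensor(cv2.cvtColor(frame, cv2.COLOR_2RGB if False else cv2.COLOR_BGR2RGB))
    with torch.no_grad():
        out = model([tensor])[0]
    hits = []
    for label, score in zip(out["labels"].tolist(), out["scores"].tolist()):
        name = COCO.get(label)
        if name in FLAGGED and score >= threshold:
            hits.append((name, score))
    return hits

cap = cv2.VideoCapture("rtsp://camera.example/stream")  # hypothetical feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for name, score in scan_frame(frame):
        print(f"ALERT: possible {name} ({score:.0%})")  # stand-in for a real alert
```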
The patent-pending tech, which was originally engineered for brick-and-mortar retail, leverages cross-camera tracking that can scan crowds and monitor “areas of special concern” such as entrances, exits, and gathering areas. Deep North claims its technology doesn’t share any personally identifiable information of students or faculty (thanks to a numeric hashtag system based on physical characteristics), and said it can also be used to prevent abductions, “improve facilities layouts” and infrastructure, and manage foot traffic.
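Deep North hasn’t published how its “numeric hashtag system” works, but the general idea of tracking people via an opaque numeric tag rather than stored personal data can be sketched: derive an appearance embedding from a re-identification model, then hash it into a number that means nothing outside the deployment. Everything below (the salt, the quantization step, the tag width) is an assumption for illustration; a production system would match embeddings by similarity rather than exact hashing.

```python
# Illustrative only: one way to turn an appearance embedding into an anonymous
# numeric tag so the same individual can be followed across cameras without
# storing names or faces. Deep North's actual scheme is unpublished.
import hashlib
import numpy as np

SALT = b"per-deployment-secret"  # hypothetical; stops tags linking across sites

def numeric_tag(embedding: np.ndarray, bits: int = 64) -> int:
    """Map an appearance-embedding vector to a stable, opaque numeric tag."""
    # Coarse quantization absorbs small frame-to-frame jitter. A real system
    # would cluster or match embeddings by similarity instead, since vectors
    # near a quantization boundary can still flip to a different tag.
    quantized = np.round(embedding / 0.25).astype(np.int16).tobytes()
    digest = hashlib.sha256(SALT + quantized).digest()
    return int.from_bytes(digest[: bits // 8], "big")

# Example: two near-identical sightings of one person yield the same tag.
sighting_a = np.array([0.91, -0.52, 1.30, 0.07])
sighting_b = sighting_a + 0.01  # slight jitter from a different camera
assert numeric_tag(sighting_a) == numeric_tag(sighting_b)
```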
“It was both unexpected and eye-opening to see the value our video AI and deep learning expertise could also bring to securing schools,” Michael Adair, Deep North president and CEO, said. “Utilizing our solution, schools are able to automate and amplify the concept of ‘see something, say something’ in a way human security simply can’t match … The ability for a school to improve its safety and security without taking on steep costs, or having to adopt stress-inducing measures such as metal detectors, is no small feat.”
It’s not the first system of its kind. A high school in Eastern China began testing an “intelligent classroom behavior management system” earlier this year, which uses facial recognition to analyze students’ engagement in real time. And a Paris business school is using artificial intelligence and facial analysis supplied by LCA Learning’s Nestor to determine whether students are paying attention in class.
Companies like Shielded Students, meanwhile, hope to pair cameras with integrated microwave radar scanners and computer vision software to identify guns and other hidden weapons in schools.
Unsurprisingly, such systems have their detractors. Critics say there’s little to no public data with which to assess whether AI-driven surveillance systems in schools actually work, and they point out that facial recognition AI is particularly susceptible to bias and false positives.
In July, the ACLU demonstrated that Amazon’s Rekognition could, when calibrated a certain way, misidentify 28 sitting members of Congress as criminals. A study in 2012 showed that facial algorithms from vendor Cognitec performed 5 to 10 percent worse on African-Americans than on Caucasians. And more recently, it was revealed that a system deployed by London’s Metropolitan Police produces as many as 49 false matches for every hit — a precision of roughly 2 percent.
Rick Smith, CEO of Axon, one of the largest suppliers of body cameras in the U.S., said earlier this summer that facial recognition isn’t yet accurate enough for law enforcement applications.
“[They aren’t] where they need to be to be making operational decisions off the facial recognition,” he said. “This is one where we think you don’t want to be premature and end up either where you have technical failures with disastrous outcomes or … there’s some unintended use case where it ends up being unacceptable publicly in terms of long-term use of the technology.”
But Adair expressed confidence in the Deep North system’s accuracy — and its potential to do real good.
“We look forward to expanding our efforts with this program and helping more schools across the country enhance security, mitigate safety risks and better protect their students and faculty for the long run,” he said. “We are proud to be leading the way in providing a behind-the-scenes, software-driven option that can truly make a difference in the near-term as well as the long-term.”
Source: VentureBeat