Edge computing — that is, network architectures in which computation is relegated to smart devices, as opposed to servers in the cloud — is forecast to be a $6.72 billion market by 2022. Its growth will coincide with that of the deep learning chipset market, which some analysts predict will reach $66.3 billion by 2025. There is a reason for the overlap: edge computing is projected to account for roughly three-quarters of the total global AI chipset business in the next six years.
David Schie, a former senior executive at Maxim, Micrel, and Semtech, thinks both markets are ripe for disruption. He — along with WSI, Toshiba, and Arm veterans Robert Barker, Andreas Sibrai, and Cesar Matias — in 2011 cofounded AIStorm, a San Jose-based artificial intelligence (AI) startup that develops chipsets that can directly process data from wearables, handsets, automotive devices, smart speakers, and other internet of things (IoT) devices. Today the startup emerged from stealth with $13.2 million in series A backing from biometrics supplier Egis Technology, imaging sensor company TowerJazz, Meyer Corporation, and Linear Dimensions Semiconductor — all four of which say they plan to integrate the company’s technology into upcoming products.
Schie, who serves as CEO, said the fresh capital will fuel AIStorm’s engineering and go-to-market efforts. “AIStorm’s revolutionary … approach allows implementation of edge solutions in lower-cost analog technologies,” he added.
AIStorm calls its tech “AI-in-Sensor” processing (AIS), and claims it has the potential to eliminate not only the power requirements and cost associated with traditional at-the-edge machine learning implementations, but also the latency. To that end, AIStorm’s patented chip design is capable of 2.5 tera-operations per second (TOPS) and 10 TOPS per watt, performance Schie contends comes at a power draw 5 to 10 times lower than the average GPU-based system’s. Moreover, through use of a technique called switched charge processing, which allows the chip to control the movement of electrons between storage elements, he says the chip is able to further boost efficiency by ingesting and processing data without first digitizing it.
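The two headline figures imply a specific power budget: a chip that delivers 2.5 TOPS at an efficiency of 10 TOPS per watt draws a quarter of a watt at full throughput. A quick sketch of that arithmetic (using only the figures quoted above; everything else is illustrative):

```python
# Illustrative arithmetic based on the quoted claims; not AIStorm data
# beyond the 2.5 TOPS and 10 TOPS/W figures cited in the article.
throughput_tops = 2.5        # claimed peak throughput, tera-operations/sec
efficiency_tops_per_w = 10.0 # claimed efficiency, tera-operations/sec per watt

# Power needed to sustain peak throughput = throughput / efficiency.
power_watts = throughput_tops / efficiency_tops_per_w
print(power_watts)  # 0.25 W at full throughput
```

At 0.25 W, such a part would sit well within the power envelope of wearables and battery-powered cameras, which is consistent with the use cases AIStorm's backers describe below.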
Why’s that last bit important? Consider a security camera pointed at a warehouse. Points of interest — the areas around doors where intruders might enter, for instance — make up only a fraction of the total pixels, so a connected system has to poll the sensor’s image data to try to figure out where to focus. By contrast, AIStorm’s chip lets the sensor itself deal with events, make decisions, and perform analyses.
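The data-volume difference between the two approaches can be sketched in a few lines. This is a conceptual illustration only, not AIStorm's method: it contrasts shipping every pixel downstream with reporting only pixels that changed, the way an event-driven sensor would flag motion near a doorway. The frame size, threshold, and change-detection scheme are all assumptions for the sake of the example.

```python
import numpy as np

prev = np.zeros((480, 640), dtype=np.uint8)  # quiet scene, nothing moving
frame = prev.copy()
frame[200:240, 300:340] = 255                # simulated motion near a door

def polled_pixels(frame):
    # Conventional path: every pixel is digitized and shipped downstream,
    # and the host must search the full frame for anything of interest.
    return frame.size

def event_pixels(frame, prev, threshold=30):
    # Event-driven sketch: report only pixels whose value changed
    # meaningfully since the last frame.
    changed = np.abs(frame.astype(int) - prev.astype(int)) > threshold
    return int(changed.sum())

print(polled_pixels(frame))       # 307200 pixels every frame
print(event_pixels(frame, prev))  # 1600 pixels for this event
```

Here the event-driven path moves roughly 0.5% of the data the polled path does, which is the kind of reduction that makes years-long battery life plausible for an always-on camera.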
“Edge applications must process huge amounts of data generated by sensors,” Egis Technology COO Todd Lin explained. “Digitizing that data takes time, which means that these applications don’t have time to intelligently select data from the sensor data stream, and instead have to collect volumes of data and process it later.”
According to Schie, those advantages — along with the AIStorm chipset’s programmable architecture and compatibility with popular abstraction layers, like Google’s TensorFlow — could enable biometric authentication on devices like smartwatches and augmented reality glasses, or cameras with battery lives of years instead of weeks or months.
“It makes a ton of sense to combine the sensor with the imager and skip the costly digitization process,” said Dr. Avi Strum, senior vice president and general manager of TowerJazz’s sensors business. “For our customers, this will open up new possibilities in smart, event-driven operation and high-speed processing at the edge.”
AIStorm taped out its chip this month and plans to ship production orders next year. In addition to its Silicon Valley headquarters, the company has offices in Phoenix, Arizona, and Graz, Austria.