Dubbed Eyeriss, the chip developed by MIT researchers is designed to run neural networks, an approach sometimes branded deep learning.
Right now, such networks are complex and are mostly run on high-power GPUs.
You might also want to process locally for privacy reasons.
And, of course, onboard neural networks would be useful to battery-powered autonomous robots.
The networks can swell to enormous proportions, MIT said.
Although they outperform more conventional algorithms on many visual-processing tasks, they require much greater computational resources.
Data enters and is divided among the nodes in the bottom layer. Each node manipulates the data it receives and passes the results on to nodes in the next layer, and the result emerges from the final layer.
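As a rough illustration of that layered flow (not the chip's actual arithmetic), a feedforward network can be sketched as each node computing a weighted sum of its inputs and applying a nonlinearity. All names, sizes, and weight values below are invented for the example:

```python
# Minimal feedforward network sketch: data enters the bottom layer,
# each layer's outputs feed the next layer, and the final layer's
# output is the result. Weights are fixed toy values, not trained.

def relu(x):
    # Common nonlinearity: clamp negative values to zero.
    return [max(0.0, v) for v in x]

def layer(inputs, weights, biases):
    # One output per node: a weighted sum of all inputs plus a bias.
    return [
        sum(w * x for w, x in zip(node_w, inputs)) + b
        for node_w, b in zip(weights, biases)
    ]

def forward(data, layers):
    # Pass activations upward through every layer in turn.
    activations = data
    for weights, biases in layers:
        activations = relu(layer(activations, weights, biases))
    return activations

# Toy 2-layer network: 3 inputs -> 2 hidden nodes -> 1 output.
toy_layers = [
    ([[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]], [0.0, 0.1]),
    ([[1.0, -1.0]], [0.0]),
]
print(forward([1.0, 2.0, 3.0], toy_layers))
```

Real networks differ mainly in scale: many more layers and nodes, and weights learned from training data rather than hard-coded.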
The chip tries to reduce repetition in processing by efficiently breaking down tasks for execution among the 168 cores.
The circuitry can be reconfigured for different types of neural networks, and compression helps preserve bandwidth.
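One generic way compression preserves bandwidth in neural-network hardware is by not moving zeros: activation data is often sparse, so encoding runs of zeros shrinks the traffic between cores and memory. The run-length scheme below is a simplified sketch of that idea, not Eyeriss's documented format:

```python
# Run-length encoding sketch for sparse data: replace each run of
# zeros with a single (0, run_length) marker so fewer words cross
# the memory bus. A generic illustration, not the chip's format.

def rle_compress(values):
    out = []
    i = 0
    while i < len(values):
        if values[i] == 0:
            run = 0
            while i < len(values) and values[i] == 0:
                run += 1
                i += 1
            out.append((0, run))        # a run of `run` zeros
        else:
            out.append((values[i], 1))  # nonzero value, run of one
            i += 1
    return out

def rle_decompress(pairs):
    out = []
    for value, run in pairs:
        out.extend([value] * run)
    return out

data = [7, 0, 0, 0, 0, 3, 0, 0, 5]
packed = rle_compress(data)
assert rle_decompress(packed) == data  # lossless round trip
print(packed)  # fewer entries than the original nine values
```

The sparser the data, the bigger the saving, which is why this style of compression pairs well with networks whose nonlinearities output many exact zeros.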
Nvidia at CES demonstrated self-driving cars that retrieved data from servers to identify objects or obstructions on a street.
The researchers haven't said whether the chips will reach commercial devices.
Besides Intel and Qualcomm, chip companies like Movidius are trying to bring AI capabilities to mobile devices.
The research was partially funded by the Defense Advanced Research Projects Agency (DARPA).
source: www.techworm.net