Imagination creates neural network accelerator for custom processors

Imagination Technologies will make it easy for chip designers to embed neural network accelerators in their chips to dramatically speed up artificial intelligence processing.

London-based Imagination will introduce a design that chip makers can use in their own chips, starting sometime next year. The component, which can be embedded within a larger chip design, will be useful for neural network processing in applications such as mobile devices, surveillance equipment, automotive tech, and consumer electronics. Neural networks have made rapid advances in areas such as pattern recognition, fueling an explosion of artificial intelligence applications.

Chris Longstaff, senior director of product and technology marketing for PowerVR at Imagination, said in an interview with VentureBeat that the PowerVR 2NX Neural Network Accelerator delivers twice the performance and uses half the bandwidth of its nearest competitor. He also said it is eight times more powerful than rival chips known as digital signal processors.

“AI processing will happen on the edge because of power, bandwidth, performance, reliability, security, and latency,” Longstaff said. “An autonomous car can’t rely on a 5G, 4G, or 3G signal to a data center. It’s just not going to work. We think neural network accelerators will become ubiquitous as a hardware block.”

The accelerator will support multiple operating systems, including Linux and Android. It can be used in system-on-chip designs (which combine different components on a single chip) while delivering high performance and low power consumption.

Neural network accelerators (NNAs) could become a new class of processors, likely to be as significant as central processing units (CPUs) and graphics processing units (GPUs), Longstaff said. Imagination already offers both CPUs and GPUs, and now it will add NNAs as well.

The tech could be used for things like photography enhancement and predictive text enhancement in mobile devices; feature detection and eye tracking in augmented reality and virtual reality headsets; pedestrian detection and driver alertness monitoring in automotive safety systems; facial recognition and crowd behavior analysis in smart surveillance; online fraud detection, content advice, and predictive UX; speech recognition and response in virtual assistants; and collision avoidance and subject tracking in drones.

If a drone flies at 150 miles per hour, it needs enough processing intelligence to know when to avoid an obstacle. With NNA processing, it could avoid an object with less than a meter’s notice. And Longstaff said it might take an NNA two seconds to search through 1,000 photos for a pattern, while it could take a GPU up to 60 seconds.
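To put those figures in perspective, here is a rough back-of-envelope sketch in Python of the reaction-time budget implied by the drone example. The 150 mph speed and one-meter avoidance distance come from the article; the round-trip latency figures used for comparison are illustrative assumptions, not numbers from Imagination.

```python
# Rough time-budget calculation for the drone example above.
# Speed (150 mph) and avoidance distance (1 m) are from the article;
# the latency figures below are illustrative assumptions for comparison.

MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def reaction_budget_ms(speed_mph: float, distance_m: float) -> float:
    """Milliseconds available to detect and react before covering distance_m."""
    speed_mps = speed_mph * MPH_TO_MPS
    return distance_m / speed_mps * 1000.0

budget = reaction_budget_ms(150, 1.0)
print(f"Reaction budget at 150 mph with 1 m of notice: {budget:.1f} ms")  # ~14.9 ms

# Illustrative (assumed) end-to-end latencies for where the inference runs.
assumed_latency_ms = {
    "round trip to a remote data center": 50.0,
    "on-device NNA inference": 5.0,
}
for path, latency_ms in assumed_latency_ms.items():
    verdict = "fits" if latency_ms < budget else "misses"
    print(f"{path}: {latency_ms:.0f} ms {verdict} the {budget:.0f} ms budget")
```

Under those assumptions, only on-device inference fits within the window, which is the point of Longstaff's argument that edge AI cannot depend on a cellular link to a data center.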

In the Embedded Vision Developer Survey conducted by the Embedded Vision Alliance, 79 percent of chip-maker respondents said they were already using or planning to use neural networks to perform computer vision functions in their products or services.

Jeff Bier, founder of the Embedded Vision Alliance, said in a statement, “Numerous system and application developers are adopting deep neural network algorithms to bring new perceptual capabilities to their products. In many cases, a key challenge is providing sufficient processing performance for these demanding algorithms while meeting strict product cost and power consumption constraints. Specialized processors like the PowerVR 2NX NNA, designed specifically for neural network algorithms, will enable deployment of these powerful algorithms in many new applications.”

In other news, Imagination also introduced two new cost-sensitive GPUs, the PowerVR Series9XE and Series9XM. Apple used Imagination's graphics technology in its smartphones in the past, but it has stopped doing so and is now creating its own graphics components.

 
