SAN FRANCISCO–(BUSINESS WIRE)–
BRAINCHIP DEMONSTRATES COMPANY’S EVENT-BASED AI NEURAL PROCESSOR AT EMBEDDED VISION SUMMIT
BrainChip Holdings Ltd. (ASX: BRN), a leading provider of ultra-low power, high-performance AI technology, today announced its participation in the 2020 Embedded Vision Summit Virtual Conference, taking place September 15–25, 2020, on Tuesdays and Thursdays from 9 a.m. to 2 p.m. PDT. BrainChip will exhibit Akida™, the Company's next-generation artificial intelligence (AI) Edge technology. The ultra-low power, event-domain neural processor is capable of inference and incremental learning, and supports many of today's standard neural networks.
The company will run multiple demonstrations of its Akida Neural Processor technology, which is implemented in the Neuromorphic System-on-Chip (NSoC), an advanced neural networking processor that brings artificial intelligence to the edge in a way that existing technologies cannot. The solution is high-performance, small, and ultra-low power, and it enables a wide array of edge capabilities. The Akida NSoC represents a new breed of neural processing device for Edge AI devices and systems. Comparisons to leading DNN accelerator devices show significantly better images/second/watt on industry-standard benchmarks with MobileNet, MobileNet-SSD, and Keyword Spotting, while maintaining excellent accuracy.
The Akida NSoC is designed for use as a stand-alone embedded accelerator or as a co-processor. It includes interfaces for ADAS sensors, audio sensors, and other IoT sensors, as well as high-speed data interfaces such as PCI-Express, USB, I2S, and I3C. An on-chip MPU controls the configuration of the Akida Neuron Fabric as well as off-chip communication of metadata. The Akida NSoC is a scalable solution: built-in serial chip-to-chip connectivity allows up to 64 devices to be arrayed in a single solution.
“While we would love the opportunity to showcase our Akida solution in person to attendees of the Embedded Vision Summit, presenting from our virtual booth actually enables us to demonstrate Akida’s abilities to a potentially larger audience joining from home,” said Louis DiNardo, BrainChip CEO. “We’re eager to show attendees how Akida helps organizations create ultra-low power chips with our licensed intellectual property and high-performance, power-efficient systems with our integrated circuit. Akida provides the ability to incrementally learn on-chip without the need to retrain in the Cloud. This ability makes us uniquely positioned to help deliver the next generation of AI at the Edge. It’s an exhibit you definitely don’t want to miss.”
The Embedded Vision Summit, the premier conference for innovators adding computer vision and AI to products, will be held as a virtual, online event on September 15–25, 2020, on Tuesdays and Thursdays from 9 a.m. to 2 p.m. PDT. The Summit is the only event focused exclusively on deploying computer vision and visual AI, attracting a global audience of companies developing vision-enabled products. The 2020 Summit will feature more than 100 presentations and dozens of exhibitors and technology demonstrations. For the latest updates on the Embedded Vision Summit, follow @EmbVisionSummit on Twitter or visit https://www.embeddedvisionsummit.com.
About BrainChip Holdings Ltd (ASX: BRN)
BrainChip is a global technology company producing a groundbreaking neuromorphic processor that brings artificial intelligence to the edge in a way that is beyond the capabilities of other products. The chip is high-performance, small, and ultra-low power, and it enables a wide array of edge capabilities that include on-chip training, learning, and inference. The event-based neural network processor is inspired by the spiking nature of the human brain and is implemented in an industry-standard digital process. By mimicking brain processing, BrainChip has pioneered a processing architecture, called Akida™, that is both scalable and flexible enough to address the requirements of edge devices. At the edge, sensor inputs are analyzed at the point of acquisition rather than through transmission to a cloud data center. Akida is designed to provide a complete ultra-low power and fast AI Edge Network for vision, audio, olfactory, and smart transducer applications. The reduction in system latency provides faster response and a more power-efficient system that can reduce the large carbon footprint of data centers.
Additional information is available at https://www.brainchipinc.com
Mark Smith, email@example.com