At the recent Embedded Vision Summit, Expedera chief scientist and co-founder Sharad Chole detailed LittleNPU, our new AI processing approach for always-sensing smartphones, security cameras, doorbells, and other consumer devices. Always-sensing cameras persistently sample and…
Can Compute-In-Memory Bring New Benefits To Artificial Intelligence Inference?
Compute-in-memory (CIM) is not necessarily an Artificial Intelligence (AI) solution; rather, it is a memory management solution. CIM could bring advantages to AI processing by speeding up the multiplication operation at the heart of AI…
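To make that workload concrete, the sketch below shows the multiply-accumulate loop that dominates neural-network inference, the operation compute-in-memory aims to speed up. It is plain Python for illustration only; the function name, shapes, and values are hypothetical and do not represent any CIM hardware or Expedera product.

```python
# Illustrative sketch only: the multiply-accumulate (MAC) pattern that
# dominates neural-network inference and that compute-in-memory targets.
# Function name, shapes, and values are hypothetical examples.

def matvec_mac(weights, activations):
    """Matrix-vector multiply written as explicit MAC loops."""
    outputs = []
    for row in weights:                      # one output per weight row
        acc = 0.0
        for w, a in zip(row, activations):   # each step: one multiply, one add
            acc += w * a
        outputs.append(acc)
    return outputs

# Example: a tiny 2x3 layer
weights = [[0.1, -0.2, 0.3],
           [0.5,  0.0, -0.1]]
activations = [1.0, 2.0, 3.0]
print(matvec_mac(weights, activations))  # approximately [0.6, 0.2]
```

In a conventional design, every one of those multiplies requires weights to be fetched from memory first; compute-in-memory approaches aim to perform the multiply where the weights are stored, which is why the article frames CIM as a memory management solution rather than an AI solution in itself.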
Expedera Expands Global Reach with New Regional Design Centers and Chinese Language Website
Highlights:
• Expedera opens design centers in Taipei and Shanghai
• Chinese language website now available

Santa Clara, California, June 21, 2022 - Expedera Inc, a leading provider of scalable Deep Learning Accelerator (DLA) semiconductor intellectual property…
Measuring NPU Performance
There is a lot more to understanding the capabilities of an AI engine than TOPS per watt. A rather arbitrary measure of the number of operations an engine performs per second per unit of power, this metric…
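Since the excerpt calls the metric rather arbitrary, it may help to see how little arithmetic is behind it. The sketch below is a hypothetical back-of-the-envelope calculation; the function name and figures are made up for illustration and are not measurements of any NPU.

```python
# Back-of-the-envelope TOPS/W arithmetic. The numbers below are invented
# for illustration, not measurements of any real engine.

def tops_per_watt(ops_per_second: float, power_watts: float) -> float:
    """Tera-operations per second divided by power draw in watts."""
    return (ops_per_second / 1e12) / power_watts

# A hypothetical engine sustaining 8e12 operations per second at 2 W:
print(tops_per_watt(8e12, 2.0))  # 4.0 TOPS/W
```

As the excerpt argues, that single ratio on its own says little about what an AI engine can actually do.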
Expedera Hears You: Wearables Need Low Power AI
In the short time since emerging from stealth mode, Expedera has become known as the Artificial Intelligence (AI) Inference IP company that delivers the best performance per watt and per area. Our Origin product…