The Coral USB Accelerator brings powerful ML inferencing capabilities to existing Linux systems.
Featuring the Edge TPU, a small ASIC designed and built by Google, the USB Accelerator provides high-performance ML inferencing at low power over a USB 3.0 interface. For example, it can execute state-of-the-art mobile vision models such as MobileNet v2 at 100+ fps. This lets you add fast ML inferencing to your embedded devices in a power-efficient and privacy-preserving way.
Models are developed in TensorFlow Lite and then compiled to run on the USB Accelerator.
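Once a model has been compiled for the Edge TPU, it is loaded through the standard TensorFlow Lite interpreter with the Edge TPU delegate. A minimal Python sketch, assuming the `tflite_runtime` package and the `libedgetpu` shared library are installed; the model filename is a placeholder:

```python
# Sketch: running an Edge TPU-compiled TFLite model from Python.
# "model_edgetpu.tflite" is a placeholder for a model already
# processed by the Edge TPU Compiler.

def make_interpreter(model_path):
    # Imported inside the function so the sketch can be read without
    # the Edge TPU runtime installed on the current machine.
    from tflite_runtime.interpreter import Interpreter, load_delegate
    return Interpreter(
        model_path=model_path,
        experimental_delegates=[load_delegate("libedgetpu.so.1")],
    )

def run_inference(interpreter, input_data):
    # Standard TFLite invocation: set the input tensor, invoke,
    # and read back the output tensor.
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], input_data)
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])
```

Apart from the delegate argument, this is the ordinary TensorFlow Lite Python workflow, so existing TFLite code needs only a small change to target the accelerator.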
Edge TPU Key Benefits
- High speed TensorFlow Lite inferencing
- Low power
- Small footprint
Coral by Google is a platform for building devices with local AI
Features
- Google Edge TPU ML accelerator coprocessor
- USB 3.0 Type-C socket
- Supports Debian Linux on host CPU
- Models are built using TensorFlow: MobileNet and Inception architectures are fully supported, though custom architectures are also possible
Specs
- Edge TPU ML accelerator: ASIC designed by Google that provides high performance ML inferencing for TensorFlow Lite models
- Arm 32-bit Cortex-M0+ Microprocessor (MCU): Up to 32 MHz, 16 KB Flash memory with ECC, 2 KB RAM
- Connections: USB 3.1 Gen 1 port (SuperSpeed, 5 Gbit/s); the included cable is USB Type-C to Type-A
| Attribute | Value |
| --- | --- |
| Supplier | Unconfirmed |
| Export classification (ECCN) | EAR99 |
| Lifecycle status | Active |
| HTS code | 8471.80.90.00 |
| Automotive | Unknown |
| PPAP | Unknown |