We wrapped up the tutorial by discussing some of my thoughts, feedback, and suggestions for using the Coral USB Accelerator (be sure to refer to them first if you have any questions). After initialisation (which normally happens automatically after a couple of minutes), the sticks can be added to your VM's properties. This example takes a camera feed and tracks each uniquely identified object, assigning each one a persistent ID.
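To make the "persistent ID" idea concrete, here is a minimal, hedged sketch of one common way such tracking is done: centroid tracking, where each new detection is matched to the nearest previously seen object. This is an illustrative pure-Python implementation, not the code from the example itself; the class name and the `max_distance` parameter are assumptions.

```python
import math
from itertools import count

class CentroidTracker:
    """Assigns a persistent ID to each detection by matching each new
    centroid to the nearest centroid tracked in the previous frame."""

    def __init__(self, max_distance=50.0):
        self.next_id = count()        # generator of fresh object IDs
        self.objects = {}             # id -> (x, y) centroid
        self.max_distance = max_distance

    def update(self, centroids):
        assigned = {}
        unmatched = dict(self.objects)
        for cx, cy in centroids:
            # greedily match to the closest existing object, if close enough
            best_id, best_dist = None, self.max_distance
            for oid, (ox, oy) in unmatched.items():
                d = math.hypot(cx - ox, cy - oy)
                if d < best_dist:
                    best_id, best_dist = oid, d
            if best_id is None:
                best_id = next(self.next_id)   # new object gets a new ID
            else:
                del unmatched[best_id]         # each object matched at most once
            assigned[best_id] = (cx, cy)
        self.objects = assigned
        return assigned

tracker = CentroidTracker()
frame1 = tracker.update([(10, 10), (100, 100)])   # IDs 0 and 1 created
frame2 = tracker.update([(12, 11), (98, 103)])    # same objects, same IDs
print(frame2)   # {0: (12, 11), 1: (98, 103)}
```

A real pipeline would feed this class the bounding-box centroids produced by each detection pass over the camera feed.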
You can find examples of using this for image classification and object detection in the google-coral/tflite repository. Coral is a division of Google that helps create intelligent devices built on artificial intelligence. In this article, you will read more about Google Coral and how it enables on-device edge AI with its TPU (Tensor Processing Unit) computing capabilities. After that, we learned how to run the example demo scripts included in the Edge TPU library download. However, most edge AI devices can also operate offline (built-in storage, robust auto-rebooting capabilities).
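A typical classification demo of this kind boils down to: run inference, then report the highest-scoring classes above a threshold. Here is a hedged sketch of just that post-processing step in plain Python; the labels, scores, and the `top_k` function name are hypothetical, not taken from the repository.

```python
def top_k(scores, labels, k=3, threshold=0.1):
    """Return the k highest-scoring (label, score) pairs above a threshold,
    mirroring the final step of a typical classification demo script."""
    ranked = sorted(zip(labels, scores), key=lambda p: p[1], reverse=True)
    return [(label, score) for label, score in ranked[:k] if score >= threshold]

# hypothetical output scores for a 4-class model
labels = ["cat", "dog", "parrot", "fish"]
scores = [0.05, 0.72, 0.15, 0.08]
print(top_k(scores, labels))   # [('dog', 0.72), ('parrot', 0.15)]
```

In the actual examples, `scores` would come from the model's output tensor after inference on the Edge TPU.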
It's built on top of the TensorFlow Lite C++ API and abstracts away much of the code required to handle input and output tensors. In addition, AI inferencing on low-power devices allows Edge AI hardware to power large-scale AI solutions. So yeah… it's not only doable, but with the example schematics provided by Coral and Framework, it's about 15 minutes of work.
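To give a sense of what "handling input and output tensors" involves, here is a hedged sketch of the quantization arithmetic that quantized Edge TPU models typically require and that the higher-level API hides: mapping real values into the uint8 range on the way in, and back out on the way out. The `scale` and `zero_point` values below are hypothetical; in practice they come from each tensor's quantization metadata.

```python
def quantize(values, scale, zero_point):
    """Map real-valued inputs into the uint8 range a quantized model expects."""
    return [max(0, min(255, round(v / scale) + zero_point)) for v in values]

def dequantize(values, scale, zero_point):
    """Map raw uint8 model outputs back to real-valued scores."""
    return [(v - zero_point) * scale for v in values]

# hypothetical quantization parameters for one tensor
scale, zero_point = 0.00390625, 0          # 1/256, zero point at 0
raw = quantize([0.5, 1.0], scale, zero_point)   # [128, 255] (1.0 clamps to 255)
print(dequantize(raw, scale, zero_point))       # [0.5, 0.99609375]
```

The abstraction is welcome precisely because every model can carry different scale and zero-point values per tensor, and getting this boilerplate wrong silently corrupts results.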
This system allows predictions about each transformer's performance and future reliability, and ultimately about the system as a whole. The most popular use cases for Coral TPUs are computer vision and visual deep learning on the edge. The examples directory contains subdirectories for images and models, along with a selection of Python scripts. A few weeks ago, Google released "Coral", a super-fast, "no internet required" development board and USB accelerator that enables deep learning practitioners to deploy their models "on the edge" and "closer to the data".