Installing AI in an embedded system creates new "knowledge".
e-AI is a solution that turns your information into value.
The evolution of artificial intelligence (AI) technologies such as machine learning and deep learning has been remarkable in recent years, and their range of application is rapidly expanding from the cloud market, centered on the IT field, into the embedded system market; service robots are one example.
Embedded devices equipped with AI may become necessary in the future for service robots, which must make judgments and perform control according to a wide variety of situations. Development of AI-equipped embedded devices is also expected to accelerate beyond service robots, extending to services, and the devices associated with them, that require interaction with people. Against this background, Renesas' "e-AI" solution implements artificial intelligence technology in embedded devices.
As the first step of this solution, we are introducing new functions for implementing deep learning results in endpoint embedded devices, provided as plug-ins for "e² studio", the open-source Eclipse-based integrated development environment.
| Tool | Description |
| --- | --- |
| 1. e-AI translator | Converts a trained AI network from Caffe or TensorFlow, open-source machine learning / deep learning frameworks, into the MCU/MPU development environment. |
| 2. e-AI checker | Based on the translator's output, calculates the ROM/RAM footprint and the inference execution time, referring to the specifications of the selected MCU/MPU. |
| 3. e-AI importer | Connects a new AI framework specialized for embedded systems, enabling real-time performance and resource-saving design, to the MCU/MPU development environment. |
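The translator's core job, turning trained network weights into C source that an MCU/MPU toolchain can compile, can be sketched in a few lines. This is an illustrative sketch only: the array name, float layout, and generated-code style below are assumptions for the example, not the actual output format of the e-AI translator.

```python
# Illustrative sketch: convert a trained layer's weight matrix into a C array
# definition, the kind of source a model-to-MCU translator generates.
# The array name, dtype (32-bit float), and flattening order are assumptions,
# not Renesas' actual format.

def weights_to_c_array(name, weights):
    """Flatten a weight matrix (list of rows) into a C float array definition."""
    flat = [w for row in weights for w in row]
    body = ", ".join(f"{w:.6f}f" for w in flat)
    return f"const float {name}[{len(flat)}] = {{ {body} }};"

# Example: a tiny 2x3 dense-layer weight matrix
w = [[0.1, -0.2, 0.3],
     [0.4, 0.5, -0.6]]
print(weights_to_c_array("dense1_weights", w))
```

Generating constant arrays like this lets the weights live in ROM/flash rather than RAM, which matters on resource-constrained MCUs.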
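The checker's ROM/RAM figures can be approximated from the network's parameter counts and activation buffer sizes. A minimal sketch for a fully connected network follows; the 4-bytes-per-parameter cost and the buffer-reuse scheme are assumptions for illustration, not the actual e-AI checker calculation, which also accounts for the specific MCU/MPU selected.

```python
# Illustrative sketch: estimate the ROM (weights/biases) and RAM (activation
# buffers) needed by a small fully connected network, the kind of figure a
# footprint checker reports. 32-bit float storage is an assumption.

BYTES_PER_PARAM = 4  # assume 32-bit float weights and activations

def estimate_footprint(layer_sizes):
    """layer_sizes: neuron counts per layer, e.g. [64, 32, 10]."""
    # ROM: weights plus one bias per output neuron for each layer pair
    rom = sum((n_in + 1) * n_out * BYTES_PER_PARAM
              for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))
    # RAM: assume only two adjacent activation buffers are live at once,
    # so the peak is the largest input+output pair
    ram = max(a + b for a, b in zip(layer_sizes, layer_sizes[1:])) * BYTES_PER_PARAM
    return rom, ram

rom, ram = estimate_footprint([64, 32, 10])
print(f"ROM: {rom} bytes, RAM: {ram} bytes")  # ROM: 9640 bytes, RAM: 384 bytes
```

Estimates like this, computed before any hardware is touched, are what allow a designer to check early whether a trained network fits the selected device.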