Steven Lee
Technical Director
Published: December 17, 2019

We hear a lot about artificial intelligence (AI) in IoT and its vast potential for streamlining processes and reducing costs. Simply by deploying a connected system that monitors equipment, devices, and processes with data analytics, maintenance costs can potentially be reduced by up to 69% (according to data from MicroAI™). Such a system can notify users when equipment actually requires servicing or warn them of an impending failure. So what additional value would AI provide, and is it more complex?

In an IoT-connected world, data from assets is monitored in real time to analyze device performance and catch possible failures. Assessing a device's health requires a vast amount of data to establish a “normal” baseline, and the operator must set thresholds (if they are known) that trigger alerts when a reading falls outside that range. As the connected system expands, “big data” inundates an organization’s resources, which must cope not only with the sheer volume of data but also with the speed at which it arrives. And if there is more than one type of device, the data can arrive in a variety of forms, such as text, video, audio, or SMS.
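As a rough illustration of that baseline-and-threshold pattern (not MicroAI™'s actual algorithm), a minimal health check might summarize historical readings into a normal range and flag anything outside it. The sensor values and the ±3-standard-deviation band below are assumptions for the sketch:

```python
# Minimal sketch: build a "normal" baseline from historical readings,
# then flag any new reading that falls outside operator-set thresholds.
import statistics

def build_baseline(historical_readings):
    """Summarize historical sensor data into a simple normal range."""
    mean = statistics.mean(historical_readings)
    stdev = statistics.pstdev(historical_readings)
    # Operators often widen the band; +/- 3 standard deviations is a common choice.
    return mean - 3 * stdev, mean + 3 * stdev

def check_reading(value, low, high):
    """Return True if the reading is outside the normal range (an alert)."""
    return value < low or value > high

history = [70.1, 69.8, 70.4, 70.0, 69.9, 70.2]   # e.g., bearing temperature in degrees C
low, high = build_baseline(history)

if check_reading(88.5, low, high):
    print("ALERT: reading outside normal operating range")
```

In practice the baseline would be built from far more data, which is exactly why storing and moving all of it becomes the burden described above.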

What if some of that processing could be done at the device node itself, establishing normal behavior locally and only raising a signal, alert, or notification when an outlier event occurs? Instead of a complex “learn to do it all” AI, what if we took a more surgical approach? This is where smaller AI containers such as MicroAI™ come into play.

These small AI algorithms can be easily embedded into standard MCUs and trained on what “normal” data looks like, while continuing to learn during operation. The processing can then be done directly at the connected device node or at the edge of the system. There may be some tradeoffs in system power, but these can be balanced against the frequency of data aggregation and processing.
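To make the idea concrete (this is a generic sketch, not the MicroAI™ implementation), an on-device detector can learn “normal” with running statistics such as Welford's online algorithm, so it never needs to store raw history and only raises a flag on outliers. The class name, z-score threshold, and sample values are assumptions:

```python
# Hypothetical on-device sketch: learn "normal" behavior with running
# statistics (Welford's online algorithm) and flag outliers locally.
class OnlineAnomalyDetector:
    def __init__(self, z_threshold=4.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # running sum of squared deviations
        self.z_threshold = z_threshold

    def update(self, x):
        """Fold a new sample into the running mean/variance."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_outlier(self, x):
        """True when x deviates strongly from the learned baseline."""
        if self.n < 30:          # still establishing the baseline
            return False
        variance = self.m2 / (self.n - 1)
        std = variance ** 0.5 or 1e-9
        return abs(x - self.mean) / std > self.z_threshold

detector = OnlineAnomalyDetector()
for sample in [70.0 + 0.1 * (i % 5) for i in range(200)]:  # warm up on normal data
    detector.update(sample)

print(detector.is_outlier(70.2))   # False: within the learned range
print(detector.is_outlier(95.0))   # True: flagged as an outlier
```

A detector like this costs a few floating-point operations per sample, which is why the main tradeoff is how often the device samples and evaluates, not raw compute.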

Furthermore, when data is processed locally at the edge through edge computing, data connectivity costs and the volume of data transmitted are much lower, because the system sends out only the alerts and alarms rather than shipping all the raw sensor data to the cloud for processing. The cost savings compound quickly once you consider how much data a single asset can generate.

For example, a single IoT sensor could pull five data points per second. If the asset has 10 sensors placed across it and each value represents, conservatively, 10 bytes, over the course of one hour that single asset would generate 1.8MB (5 values x 10 sensors x 10 bytes x 3,600 seconds = 1,800,000 bytes).
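The same arithmetic, written out and scaled up to a facility (the asset count and sampling assumptions are the ones stated above):

```python
# Data generated by one asset per hour, using the assumptions above.
readings_per_second = 5
sensors_per_asset = 10
bytes_per_reading = 10
seconds_per_hour = 3_600

bytes_per_hour = (readings_per_second * sensors_per_asset
                  * bytes_per_reading * seconds_per_hour)
print(bytes_per_hour)              # 1,800,000 bytes, i.e. about 1.8 MB per asset per hour

# Scale up to 100 assets running around the clock for a year.
assets = 100
hours_per_year = 24 * 365
bytes_per_year = bytes_per_hour * hours_per_year * assets
print(bytes_per_year / 1e12)       # about 1.58 TB across the facility per year
```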

Now imagine that you have 100 assets across the facility. At that rate they would generate more than 1.5TB of data over the course of a year (1.8MB x 24 hours x 365 days x 100 assets ≈ 1.58TB). This is just one of the cost savings that comes from computing at the edge. MicroAI™ lets organizations focus on their core competencies rather than on manually processing this data, by putting the data to work for the organization. For example, when MicroAI™ determines that there is a sign of failure within a machine, the machine can call an API to open its own work order for a maintenance technician to inspect it.
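That last step might look something like the sketch below. The endpoint, payload fields, and asset name are illustrative placeholders, not a documented MicroAI™ API:

```python
# Hypothetical sketch: when the on-device model flags a likely failure,
# the asset opens its own maintenance work order over a REST API.
import json
import urllib.request

def open_work_order(asset_id, reason):
    """POST a work order to a (placeholder) maintenance system."""
    payload = json.dumps({
        "asset_id": asset_id,
        "priority": "high",
        "description": f"Edge AI flagged a potential failure: {reason}",
    }).encode("utf-8")
    request = urllib.request.Request(
        "https://maintenance.example.com/api/work-orders",   # placeholder URL
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example trigger (in practice driven by the on-device anomaly check):
# open_work_order("press-07", "vibration outside learned baseline")
```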

IoT with Edge AI Platform

One application that truly benefits from this approach is a remote system that uses a cellular network as the gateway to push data to the cloud. Combining MicroAI™ with an NB-IoT/CAT-M connected system takes IoT autonomy a step into the future.

Visit https://renesas.micro.ai/register to register for a preview of the IoT with Edge AI platform and learn more about the standard dashboard and features.
