Artificial intelligence (AI) is a system that attempts to mimic human behavior; more specifically, an electrical and/or mechanical entity that responds to input the way a human would. The best tangible example of this is voice recognition, where the system needs to understand colloquial terms, abbreviations, and pronouns, as well as standard words, in order to respond as if you were talking to your best friend. The crux of AI is taking input from something like a sensor or combination of sensors and determining an appropriate response based on a goal. For example, the goal of a home security system is to protect the home. It must determine whether the input from the vibration and sound sensors matches that of a breaking window and, if so, trip the alarm and notify the authorities. It is trying to match what you would do if you were on the couch when the window shattered: you would hear it, recognize it, and call emergency services.
Machine learning (ML) is the ability of a system to improve itself through repeated use. The idea is that it can use the data it collects to get better over time. ML grew out of artificial intelligence research: researchers needed a way to improve a system's responses to input without frequent manual updates, letting the system instead update and improve on its own. ML is typically implemented as a computer algorithm and is used to develop solutions like speech recognition.
An artificial neural network (ANN) is an implementation of machine learning, albeit a very advanced one with many layers. Unlike simpler ML, which takes input and makes a decision in a single flow, an ANN has many nodes that each contribute to the decision based on the data. Changing the behavior of one node can impact the other nodes as well. This creates a more complex structure that more closely resembles a human brain.
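To make the node-and-layer idea concrete, here is a minimal sketch of a tiny feedforward network in C++. The layer sizes, weights, and sigmoid activation are illustrative assumptions rather than anything from a real product; a real network would learn its weights from training data.

```cpp
#include <array>
#include <cmath>
#include <cstdio>

// Minimal feedforward ANN sketch: 3 inputs -> 4 hidden nodes -> 1 output.
// All weights and sizes are illustrative; a real network learns them from data.
constexpr int kInputs = 3, kHidden = 4;

float sigmoid(float x) { return 1.0f / (1.0f + std::exp(-x)); }

float forward(const std::array<float, kInputs>& input,
              const float hidden_w[kHidden][kInputs],
              const float hidden_b[kHidden],
              const float out_w[kHidden], float out_b) {
  std::array<float, kHidden> hidden{};
  // Each hidden node combines every input; changing one node's weights
  // changes what flows into the output node as well.
  for (int h = 0; h < kHidden; ++h) {
    float sum = hidden_b[h];
    for (int i = 0; i < kInputs; ++i) sum += hidden_w[h][i] * input[i];
    hidden[h] = sigmoid(sum);
  }
  float out = out_b;
  for (int h = 0; h < kHidden; ++h) out += out_w[h] * hidden[h];
  return sigmoid(out);  // e.g. probability that the input matches a pattern
}

int main() {
  const float hidden_w[kHidden][kInputs] = {
      {0.5f, -0.2f, 0.1f}, {-0.3f, 0.8f, 0.4f},
      {0.2f, 0.2f, -0.6f}, {0.7f, -0.1f, 0.3f}};
  const float hidden_b[kHidden] = {0.1f, -0.2f, 0.0f, 0.05f};
  const float out_w[kHidden] = {0.6f, -0.4f, 0.3f, 0.5f};
  std::printf("output = %.3f\n",
              forward({0.9f, 0.1f, 0.4f}, hidden_w, hidden_b, out_w, 0.0f));
}
```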
One of the most highly anticipated developments in the IoT is the infusion of Artificial Intelligence and Machine Learning. By making IoT devices trainable, actionable, and capable of extracting information and learning from the environment, they become more contextually aware and ultimately more useful in a variety of ways.
The IoT has many layers where AI/ML can be implemented. Each layer can make different decisions and offers different value propositions. The bottom has the least amount of data to work with and typically only has local jurisdiction. As you go up the layers, there is more data to compute, and the decisions become larger and have more of an impact on the system. The closer to the top you are, the longer it takes for data to get there, for the decision to be passed back down, and for the end-user of the network to see the reaction. For example, you don’t want voice recognition to always go all the way to the cloud and take seconds to come back; you want to compute it locally so that you get a fast response and thus a good user experience.
Starting at the bottom, the very end edge of AI/ML in the IoT consists of the simplest end devices: small, low-power sensors, simple smart home devices like lightbulbs, low-end thermostats, and more. Typically, they utilize a microcontroller like a Cortex-M class device and a very slimmed-down AI system to make the best decision based on the data they are sensing. This allows thermostats or door and window sensors to understand the environment they are in and make the best decision, as opposed to being calibrated at the factory and potentially making the wrong decision if deployed in a unique environment. This can enhance the user experience and allow the device maker and end-user to get everything they need out of the device.
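As a rough sketch of how such a slimmed-down end node might adapt to its surroundings instead of relying on factory calibration, the example below keeps a running estimate of the ambient baseline and only raises an alert on readings that deviate strongly from it. The read_sensor() function, the adaptation rate, and the 4-sigma trip rule are hypothetical placeholders.

```cpp
#include <cmath>
#include <cstdio>

// Sketch of an adaptive end node (e.g. a vibration or window sensor) that
// learns its ambient baseline instead of using a fixed factory calibration.
// read_sensor() and the 4-sigma trip rule are hypothetical placeholders.
struct AdaptiveDetector {
  float mean = 0.0f;
  float var = 1.0f;
  float alpha = 0.01f;  // how quickly the baseline adapts to the environment

  bool update(float sample) {
    float deviation = sample - mean;
    bool anomaly = std::fabs(deviation) > 4.0f * std::sqrt(var);
    if (!anomaly) {
      // Only fold "normal" readings into the learned baseline.
      mean += alpha * deviation;
      var += alpha * (deviation * deviation - var);
    }
    return anomaly;
  }
};

// Placeholder for the real sensor driver call on the MCU.
float read_sensor(int t) { return (t == 500) ? 25.0f : 0.1f * (t % 7); }

int main() {
  AdaptiveDetector detector;
  for (int t = 0; t < 1000; ++t) {
    if (detector.update(read_sensor(t))) {
      std::printf("t=%d: reading far outside learned baseline, raise alert\n", t);
    }
  }
}
```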
AI/ML at this level is typically still on-premises within the IoT network but has moved to application processors like the Cortex-A class. These can compute more data and take input from all the other end nodes and sensors. Here you can make decisions based on the entire household or building instead of sending the data to the cloud. You can have the system look at data from both the smart HVAC system and the smart lighting system to ensure that if the users have left the building, the lights get turned off and HVAC output is reduced to save power. This is where you first start bridging systems together, taking in all of their inputs before deciding what to do.
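A toy sketch of that kind of bridging might look like the following, where hypothetical occupancy, lighting, and HVAC inputs are combined into one decision on the local gateway. The structures, thresholds, and setback policy are assumptions for illustration, not an actual home-automation API.

```cpp
#include <cstdio>

// Toy gateway policy: combine inputs from several subsystems before acting.
// The structures and thresholds below are illustrative assumptions.
struct BuildingState {
  bool occupied;      // fused from motion / door sensors
  bool lights_on;     // reported by the lighting system
  float hvac_output;  // current HVAC output, 0.0 - 1.0
};

struct Actions {
  bool turn_lights_off;
  float new_hvac_output;
};

Actions decide(const BuildingState& s) {
  Actions a{false, s.hvac_output};
  if (!s.occupied) {
    if (s.lights_on) a.turn_lights_off = true;           // nobody home: lights off
    if (s.hvac_output > 0.3f) a.new_hvac_output = 0.3f;  // set back HVAC to save power
  }
  return a;
}

int main() {
  BuildingState state{/*occupied=*/false, /*lights_on=*/true, /*hvac_output=*/0.8f};
  Actions a = decide(state);
  std::printf("lights off: %s, hvac output: %.1f\n",
              a.turn_lights_off ? "yes" : "no", a.new_hvac_output);
}
```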
At the top edge, you are now dealing with a more regional set of data and decision-making that could impact a neighborhood, town, or city. Even more powerful processors are used here to handle the increase in data. Municipalities and cities typically operate at this layer to monitor resource use such as electricity, parking spaces, and more.
The cloud is the be-all and end-all for data; here it can seem like you have unlimited computational power, which isn’t far from the truth. Typically, even if a decision was not made at the cloud level, the data and decision are at least sent there for analysis to determine whether it was the right decision; if not, the system can improve. Here you can bridge many connected systems and manage entire fleets of assets. This layer is typically used by large entities like retail store owners, industrial players with multi-stage processes, hospitality chains, and more.
Edge device sensors can generate vast quantities of raw data and therefore occupy large amounts of bandwidth. AI end nodes can pre-process data to help reduce bandwidth usage (a short sketch of this idea follows these benefits).
Specialized AI modeling software creates models that are used by small application MCUs, avoiding the complicated coding typically required to detect subtle differences in raw data.
AI adds functional benefits and capabilities without adding to the memory footprint or MCU requirements, since code size tends to be reduced. Local processing also reduces current consumption, as communications are reduced.
With reduced datasets transferring to the cloud, bad actors have less data on which to engage in hacking activity. Smaller datasets enabled by AI/ML also help neutralize the ability of hackers to identify data patterns.
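To illustrate the bandwidth point above, the sketch below condenses a window of raw samples into a few summary features and transmits only those instead of the full stream. The window size, feature choice, and transmit() placeholder are assumptions made for the example.

```cpp
#include <array>
#include <cmath>
#include <cstdio>

// Sketch of edge pre-processing: condense a window of raw samples into a
// handful of features before transmitting. Window size, feature choice, and
// transmit() are illustrative assumptions.
constexpr int kWindow = 256;

struct Features {
  float mean, rms, peak;
};

Features extract(const std::array<float, kWindow>& samples) {
  float sum = 0.0f, sum_sq = 0.0f, peak = 0.0f;
  for (float s : samples) {
    sum += s;
    sum_sq += s * s;
    peak = std::fmax(peak, std::fabs(s));
  }
  return {sum / kWindow, std::sqrt(sum_sq / kWindow), peak};
}

// Placeholder for the radio stack: send 3 floats instead of 256 raw samples.
void transmit(const Features& f) {
  std::printf("tx mean=%.3f rms=%.3f peak=%.3f (12 bytes vs %zu bytes raw)\n",
              f.mean, f.rms, f.peak, kWindow * sizeof(float));
}

int main() {
  std::array<float, kWindow> window{};
  for (int i = 0; i < kWindow; ++i) window[i] = std::sin(0.1f * i);  // fake sensor data
  transmit(extract(window));
}
```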
Based in San Jose, California, Edge Impulse is the leading development platform for embedded machine learning, free for developers, and used by over 1,000 enterprises worldwide.
Based in Beaverton, Oregon, SensiML pioneers software tools simplifying the development of TinyML code for IoT sensor applications with a unique approach to data collection and labeling.
The Thunderboard Sense 2 offers developers a compact, feature-packed reference design. The kit was designed to provide a fast path to develop and prototype IoT products such as wireless sensor nodes. Its broad range of sensors makes it an ideal platform to illustrate the power of Artificial Intelligence in a small IoT device. Thunderboard Sense 2 is fully supported by the industry-leading Simplicity Studio tool suite and comes complete with a fully supported on-board J-Link debugger.