AWS Partner Network (APN) Blog
Real-time data monitoring with AWS IoT Core and Imply Data
By Avinash Upadhyaya, Sr. IoT Product Manager – AWS
By Rohan Joshi, Director of Product Management – Imply Data
There is an increasing demand for real-time monitoring of data due to rapid development of Industry 4.0, connected vehicles, and use cases like asset tracking. Industries such as automotive, energy, aerospace, and manufacturing send millions of daily messages from Internet of Things (IoT) devices to cloud services and require real-time monitoring of this telemetry data.
Unlocking real-time insights from IoT telemetry data has led to improved visibility, automated decision-making, and reduced costs for numerous initiatives. However, as industrial implementations scale, it becomes challenging to ingest and monitor data in real time due to architectural limitations on latency, concurrency, and scale. Finding real-time insights from IoT telemetry data can feel daunting, but the returns in business value and impact can be considerable.
In this blog post, we will explain the benefits of sending IoT device data from Amazon Web Services (AWS) IoT Core to Imply’s real-time monitoring and observability system. We will also walk you through a step-by-step approach of using AWS IoT Core and Imply to build a real-time analytics application to help operators and other end users monitor, diagnose, and resolve production issues.
Imply Polaris
Imply Polaris is a fully managed version of Apache Druid, a time-series storage and analytics engine capable of sub-second queries across data at scale. The combination of fast, highly concurrent queries over real-time streaming data with historical context gives IoT applications immediate access to all of the data needed to drive actionable insights. This solution pairs Polaris with AWS IoT Core, a fully managed service that connects billions of IoT devices to the cloud and routes trillions of messages from those devices to AWS and third-party cloud services. It also uses the IoT Rules engine, a feature of AWS IoT Core that enables you to process, decode, and filter IoT messages at scale using SQL-like statements. The Rules engine also lets you define rule actions to route your IoT device data to other AWS and external services, and to any number of HTTP endpoints.
Benefits of sending IoT device data to Imply’s Apache Druid system
Users send IoT device data to Imply for an easy-to-use way to run fast analytics and fast aggregations of metrics emitted from sensors. This is common across a multitude of industries. Examples include:
- Networking and Telecommunications: Router and firewall sensor companies use Imply to monitor, visualize, and trigger alerts on telemetry data without the pre-processing and other computations that normally increase operational complexity.
- Industrial: Manufacturers use Imply to ingest, monitor, and analyze global-, regional-, and plant-level aggregated metrics from industrial sensors. These companies also use Imply to ensure real-time operational visibility of metric data and to monitor granular data as products are manufactured. Additionally, we are seeing manufacturing companies ingest data into Imply's platform and train AI/ML models for predictive maintenance.
- Logistics: Logistics and asset-tracking companies use Imply to track distributed asset fleets in real time across multiple use cases. This minimizes downtime by flagging issues before they degrade operations. Additionally, monitoring the data in Imply enables logistics teams to analyze productivity by fleet size and visualize common trends across a global footprint.
Solution overview
In this solution, you will learn how to route data from your IoT devices to Imply via AWS IoT Core. Imply’s high concurrency, ability to provide sub-second queries across data at scale, and ability to stream real-time data with historical context allows IoT applications to have immediate access to all the data needed to drive actionable insights.
Prerequisites
To set up the solution described in this blog post, you will need:
- An AWS account in an AWS Region that supports AWS IoT Core and Amazon Kinesis Data Streams.
- Permissions to create IoT rules, AWS Identity and Access Management (IAM) roles and policies, and access to AWS CloudShell.
- A Raspberry Pi or any IoT client/device that is connected to AWS IoT Core.
- An Imply Polaris account.
Figure 1: Imply Polaris on AWS Architecture
Step 1: Connect your IoT devices to AWS IoT Core
In this step, you will connect your IoT devices to AWS IoT Core by following the steps shown in AWS IoT Core's developer guide. This step requires you to set up your IoT policies, download the IoT certificates, create IoT Things, and attach your IoT Things and policies to the IoT certificates.
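If you prefer to script this provisioning flow, the same steps can be sketched with boto3. This is a minimal illustration, not a production setup: the thing name, policy name, and the policy document's topic scope are all assumptions you should adapt to your own account.

```python
import json


def build_device_policy(topic_arn_prefix: str) -> str:
    """Return an illustrative IoT policy document (JSON string) that lets a
    device connect and publish to its telemetry topics. Tighten the Resource
    values for production use."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [
            {"Effect": "Allow", "Action": "iot:Connect", "Resource": "*"},
            {"Effect": "Allow", "Action": "iot:Publish",
             "Resource": f"{topic_arn_prefix}/telemetry/*"},
        ],
    })


def provision_device(thing_name: str, policy_name: str, topic_arn_prefix: str):
    """Create a Thing, a certificate, and a policy, then attach them —
    mirroring the console steps in the developer guide."""
    import boto3  # requires AWS credentials with iot:* provisioning permissions

    iot = boto3.client("iot")
    iot.create_thing(thingName=thing_name)
    # Create and activate a certificate/key pair for the device.
    cert = iot.create_keys_and_certificate(setAsActive=True)
    iot.create_policy(policyName=policy_name,
                      policyDocument=build_device_policy(topic_arn_prefix))
    # Attach the policy and the Thing to the certificate.
    iot.attach_policy(policyName=policy_name, target=cert["certificateArn"])
    iot.attach_thing_principal(thingName=thing_name,
                               principal=cert["certificateArn"])
    return cert  # contains certificatePem and keyPair to store on the device
```

Save the returned certificate and private key securely on the device; they are shown only once.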
Step 2: Publish MQTT messages from your IoT device to AWS IoT Core
After completing Step 1, you can start publishing MQTT messages from your IoT devices to AWS IoT Core. You can use the AWS IoT MQTT client to subscribe to and view the MQTT messages in AWS IoT Core's management console. For a step-by-step demo of Steps 1 and 2, you can also refer to this video.
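On the device side, publishing could look like the following sketch using the AWS IoT Device SDK v2 for Python (`awsiotsdk`). The topic scheme and payload fields are illustrative assumptions; substitute your own endpoint and certificate paths from Step 1.

```python
import json
import time


def build_telemetry_payload(device_id: str, temperature_c: float) -> str:
    """Serialize one telemetry reading; the field names are illustrative."""
    return json.dumps({
        "deviceId": device_id,
        "temperatureC": temperature_c,
        "timestamp": int(time.time() * 1000),
    })


def publish_reading(endpoint, cert_path, key_path, ca_path,
                    device_id, temperature_c):
    """Connect over mutual TLS and publish one message (pip install awsiotsdk)."""
    from awscrt import mqtt
    from awsiot import mqtt_connection_builder

    connection = mqtt_connection_builder.mtls_from_path(
        endpoint=endpoint,          # e.g. "xxxx-ats.iot.us-east-1.amazonaws.com"
        cert_filepath=cert_path,
        pri_key_filepath=key_path,
        ca_filepath=ca_path,
        client_id=device_id,
    )
    connection.connect().result()
    publish_future, _ = connection.publish(
        topic=f"devices/{device_id}/telemetry",
        payload=build_telemetry_payload(device_id, temperature_c),
        qos=mqtt.QoS.AT_LEAST_ONCE,
    )
    publish_future.result()  # wait for the broker's PUBACK
    connection.disconnect().result()
```

Subscribing to `devices/+/telemetry` in the AWS IoT MQTT test client should then show these messages arriving.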
Step 3: Set up an IoT rule and IoT rule action to route your IoT messages to Amazon Kinesis Data Streams or to Apache Kafka
Once AWS IoT Core's MQTT broker is receiving the messages published by your IoT device, you can use the Rules engine feature to filter, process, and decode the IoT device data and to route the data to downstream services like Amazon Kinesis Data Streams or Amazon Managed Streaming for Apache Kafka (Amazon MSK).
To set up an IoT rule, follow the instructions in the AWS IoT Core developer guide. After creating your IoT rule, create an Amazon Kinesis rule action to route your IoT data to Amazon Kinesis Data Streams by following the instructions in the AWS IoT Core developer guide. Alternatively, you can route your IoT device data to Amazon MSK or to a self-managed Apache Kafka service using the Apache Kafka rule action on the IoT rules console page.
As a third option, you can route your IoT data to an Imply HTTPS endpoint using the HTTP rule action. Once you route your IoT data to Kinesis or Apache Kafka, you are ready to stream the data to Imply's Apache Druid system. For a hands-on demo on how to use IoT rules and rule actions, refer to this video – How to Get Started with Rules Engine for AWS IoT Core.
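The Kinesis rule action described above can also be created programmatically. The sketch below assumes the topic scheme from the earlier publishing example and a pre-existing IAM role that allows `kinesis:PutRecord`; the rule name, stream name, and SQL statement are illustrative.

```python
def build_kinesis_rule_payload(stream_name: str, role_arn: str) -> dict:
    """Build a topic rule payload that selects telemetry messages and
    fans them out to a Kinesis data stream."""
    return {
        # SQL-like statement: select every field from matching telemetry
        # topics, and extract the device id from the topic path.
        "sql": "SELECT *, topic(2) AS deviceId FROM 'devices/+/telemetry'",
        "awsIotSqlVersion": "2016-03-23",
        "actions": [{
            "kinesis": {
                "streamName": stream_name,
                "roleArn": role_arn,            # must allow kinesis:PutRecord
                "partitionKey": "${topic(2)}",  # shard records by device id
            }
        }],
        "ruleDisabled": False,
    }


def create_rule(rule_name: str, stream_name: str, role_arn: str):
    """Register the topic rule with AWS IoT Core."""
    import boto3  # requires credentials with iot:CreateTopicRule

    boto3.client("iot").create_topic_rule(
        ruleName=rule_name,
        topicRulePayload=build_kinesis_rule_payload(stream_name, role_arn),
    )
```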
Step 4: Route IoT device data from Kinesis Data Streams to Imply's Apache Druid system
After your IoT data has been routed to a Kinesis stream, and after you have set up a Polaris account with Imply, you can create a connection from Polaris to Amazon Kinesis to ingest data from your stream into Imply Polaris.
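Before wiring up the Polaris connection, it can be useful to confirm that records are actually landing in the stream. One way to spot-check, sketched with boto3 (reading only the first shard for simplicity):

```python
import json


def decode_records(records) -> list:
    """Parse the JSON payloads out of raw Kinesis record dicts."""
    return [json.loads(r["Data"]) for r in records]


def read_latest(stream_name: str, limit: int = 10) -> list:
    """Fetch and decode the newest records from the stream's first shard."""
    import boto3  # requires credentials with kinesis:DescribeStream / Get*

    kinesis = boto3.client("kinesis")
    shard_id = kinesis.describe_stream(
        StreamName=stream_name
    )["StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="LATEST",
    )["ShardIterator"]
    records = kinesis.get_records(ShardIterator=iterator, Limit=limit)["Records"]
    return decode_records(records)
```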
Similarly, if your IoT data is being streamed to Apache Kafka, you can follow the instructions in the Polaris developer guide to ingest data from either Amazon MSK or a self-managed Apache Kafka cluster into Imply Polaris.
If you would rather push your IoT data directly into Imply Polaris via an HTTP endpoint, you can follow this guide to create a connector and push event data to Polaris by API.
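For the push option, sending events boils down to POSTing newline-delimited JSON to your Polaris push connection with an API token. The sketch below is an assumption-heavy illustration: the URL shape, organization name, connection name, and bearer-token auth should all be checked against the Polaris push API guide for your account.

```python
import json


def to_jsonl(events: list) -> str:
    """Serialize events as newline-delimited JSON, the shape push
    ingestion endpoints typically accept."""
    return "\n".join(json.dumps(e) for e in events)


def push_events(org: str, connection: str, token: str, events: list):
    """Push a batch of events to a (hypothetical) Polaris push endpoint."""
    import requests  # pip install requests

    resp = requests.post(
        # URL shape is an assumption; confirm in the Polaris docs.
        f"https://{org}.api.imply.io/v1/events/{connection}",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        data=to_jsonl(events),
        timeout=10,
    )
    resp.raise_for_status()
```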
Step 5: Build a monitoring dashboard in Imply
Now that you have some data being streamed into a table in Imply Polaris, you can quickly build some visualizations and a dashboard to give you real-time visibility into your assets.
First, create a data cube, which you can use to explore and visualize the data in the table that contains the event data from your asset. A data cube allows you to explore your data and visualize patterns instantly at a high level of granularity. Once you are satisfied with your first visualization, add it to a dashboard.
You can reference the same data cube you created earlier to add more visualization tiles to your dashboard. Now that you have created your first dashboard with real-time visibility, you can take this a step further by building a simple standalone application around these visualizations.
Figure 2: Imply Polaris on AWS Dashboard
Step 6: Embed visualizations with React and Express
Imply provides a set of simple tools to embed the IoT event dashboards and visualizations you created earlier directly into your application via an iframe.
The Embedding Visualizations with React and Express guide walks you through creating a simple web application that shows a different embedded visualization depending on the user.
While this example is simple, it highlights the possibilities of how you can use Imply’s embedding features to quickly integrate critical IoT metrics and visualizations into your application workflow.
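At its core, the embedding pattern is just rendering an iframe whose `src` is a short-lived, user-specific link that your backend generates (as the React and Express guide describes). As a minimal, framework-agnostic sketch in Python, with the embed URL treated as a placeholder your server would produce:

```python
def build_embed_iframe(embed_url: str, width: int = 800, height: int = 600) -> str:
    """Return the iframe tag a server-side template would render for an
    embedded Imply visualization. embed_url is a placeholder for the
    signed, user-specific link your backend generates."""
    return (
        f'<iframe src="{embed_url}" width="{width}" height="{height}" '
        f'frameborder="0" allowfullscreen></iframe>'
    )
```

In the React and Express version, the Express backend signs the link per user and the React component renders the resulting iframe.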
Cleaning up
To avoid future charges for AWS IoT Core, Amazon Kinesis Data Streams, and Imply Polaris, make sure you delete the resources you created above.
Conclusion
The above example shows how you can use AWS IoT Core and Imply together to ingest, monitor, and analyze real-time telemetry data from your IoT devices in a fully managed Apache Druid service, without worrying about provisioning or managing any cloud infrastructure. This solution can be applied to build real-time streaming and monitoring solutions across multiple industries for use cases like triggering alerts, operational monitoring and visibility, and predictive maintenance. As a next step, you can deploy this solution in your account and test it to understand how AWS IoT Core and Imply Polaris can together meet your industry use case.
Imply Data – AWS Partner Spotlight
Imply Data is a real-time analytics solution, based on the Apache Druid database, for storing, querying, and visualizing event-driven data.
Contact Imply Data | Partner Overview | AWS Marketplace
About the Authors
Avinash Upadhyaya is Senior Product Manager for AWS IoT Core where he is responsible for defining product strategy, roadmap prioritization, pricing, and go-to-market strategy of AWS cloud’s foundational IoT software service.
Rohan Joshi is a Director of Product Management at Imply Data where he is responsible for helping customers ingest, monitor, and interact with their data stored in Apache Druid.