The Internet of Things on AWS – Official Blog
Collecting, organizing, monitoring, and analyzing industrial data at scale using AWS IoT SiteWise (Part 3)
Post by Asim Kumar Sasmal, Senior Data Architect in the IoT Global Specialty Practice of AWS Professional Services and Saras Kaul, Senior Product Manager of AWS IoT SiteWise.
[Before reading this post, read Part 1 and Part 2 in the series.]
In Part 1 of this series, you learned how to model and ingest data from industrial sites in a secure, cost-effective, and reliable manner using AWS IoT SiteWise (in preview).
In Part 2, you learned how to monitor key measurements and metrics of your assets in near-real time using SiteWise Monitor, a new capability of AWS IoT SiteWise.
In Part 3 (this post), you learn how to:
- Subscribe to the AWS IoT SiteWise modeled data via the AWS IoT Core rules engine
- Enable condition monitoring and send notifications or alerts using AWS IoT Events in near-real time
- Enable Business Intelligence (BI) reporting on historical data using Amazon QuickSight
About AWS IoT Events
AWS IoT Events is a fully managed service that makes it easy to detect and respond to events from IoT sensors and applications. It lets you monitor your equipment or device fleets for failures or changes in operation. It can also trigger actions when such events occur. For more information, see Getting Started with the AWS IoT Events Console.
About Amazon QuickSight
Amazon QuickSight is a fast business analytics service that you can use to:
- Build visualizations
- Perform ad hoc analysis
- Get business insights quickly from your data in a self-serve fashion
As a fully managed and hosted service, there are no servers or client software to manage. With Amazon QuickSight, you can easily create and publish interactive dashboards that include built-in ML Insights. Access the dashboards from any device and embed them into your custom applications, portals, and websites. With pay-per-session pricing, Amazon QuickSight lets you give everyone access to the data they need, while only paying for what you use.
Solution Overview
AWS IoT SiteWise can publish asset data to AWS IoT Core via the MQTT publish/subscribe message broker, so that you can interact with your asset data using other AWS services. With the AWS IoT Core rules engine, you can conditionally route your data to other services. For this post, you ingest AWS IoT SiteWise modeled data into an AWS IoT Analytics Channel and transform the JSON messages slightly (mainly to flatten them) for querying from the AWS IoT Analytics Datastore. You then set up two AWS IoT Analytics Datasets: one feeds an AWS IoT Events detector model for condition monitoring of key aggregated metrics of your equipment, and the other is for BI reporting using Amazon QuickSight.
The following diagram illustrates the high-level end-to-end solution described in this multi-part post and shows the AWS services involved.
Walkthrough
There are six sections in this walkthrough:
- Setting up an AWS IoT Analytics Channel, Pipeline, and Datastore
- Setting up an AWS IoT Core rule to ingest SiteWise modeled data into AWS IoT Analytics
- Setting up an AWS IoT Events detector model for condition monitoring
- Setting up an AWS IoT Analytics dataset content delivery to the AWS IoT Events detector model
- Setting up an AWS IoT Analytics dataset to visualize in Amazon QuickSight
- Visualizing the AWS IoT Analytics dataset in Amazon QuickSight
Prerequisites
- Use the prerequisites from Part 1.
- You have an Amazon SNS topic named iote_send_email_sns_topic to receive email notifications. Make a note of the SNS topic ARN; you use it later.
- You have an IAM role named iote_equip_temp_role with trust relationships for iotevents.amazonaws.com, iotanalytics.amazonaws.com, and iot.amazonaws.com. The role also has an IAM policy with the following permission on the SNS topic:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sns:Publish",
      "Resource": "arn:aws:sns:us-west-2:<AWS-Account-ID>:iote_send_email_sns_topic"
    }
  ]
}
Setting up an AWS IoT Analytics Channel, Pipeline, and Datastore
Following are the steps to set up an AWS IoT Analytics Channel, Pipeline, and Datastore:
- Create an IAM policy named sitewise_blogpost_iota with the following AWS IoT Analytics permissions and attach it to the existing IAM role iote_equip_temp_role:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "iotanalytics:BatchPutMessage",
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}
- Create an AWS Lambda function named sitewise_blogpost_transform_function with the following Python 3.7 code:
import json
import logging
import sys
import time

# Configure logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)
streamHandler = logging.StreamHandler(stream=sys.stdout)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
streamHandler.setFormatter(formatter)
logger.addHandler(streamHandler)

def lambda_handler(event, context):
    logger.info("event: {}".format(event))
    print(json.dumps(event, indent=2))
    transformed = []
    for e in event:
        logger.info("e: {}".format(json.dumps(e, indent=2)))
        swtype = e['type']
        asset_id = e['payload']['assetId']
        property_id = e['payload']['propertyId']
        logger.info("type: {} asset_id: {} property_id: {}".format(swtype, asset_id, property_id))
        row = {"type": swtype, "asset_id": asset_id, "property_id": property_id}
        for v in e['payload']['values']:
            logger.info("v: {}".format(v))
            time_in_seconds = int(time.time())
            if 'timestamp' in v and 'timeInSeconds' in v['timestamp']:
                logger.debug("timeInSeconds in payload")
                time_in_seconds = v['timestamp']['timeInSeconds']
            value = ""
            valuetype = ""  # initialized so the row is well formed even for an unknown value type
            if 'doubleValue' in v['value']:
                logger.debug("doubleValue in payload")
                value = v['value']['doubleValue']
                valuetype = "double"
            if 'integerValue' in v['value']:
                logger.debug("integerValue in payload")
                value = v['value']['integerValue']
                valuetype = "integer"
            if 'booleanValue' in v['value']:
                logger.debug("booleanValue in payload")
                value = v['value']['booleanValue']
                valuetype = "boolean"
            if 'stringValue' in v['value']:
                logger.debug("stringValue in payload")
                value = v['value']['stringValue']
                valuetype = "string"
            quality = ""
            if 'quality' in v:
                logger.debug("quality in payload")
                quality = v['quality']
            row['timestamp'] = time_in_seconds
            row['quality'] = quality
            row['value'] = value
            row['valuetype'] = valuetype
            logger.debug("row: {}".format(row))
            transformed.append(dict(row))  # copy the row so each value yields its own record
    logger.info("transformed: {}\n".format(json.dumps(transformed, indent=2)))
    return transformed
- Make sure that you grant the AWS IoT Analytics service permission to invoke the Lambda function above, using the following AWS CLI command:
aws lambda add-permission --function-name 'sitewise_blogpost_transform_function' --region 'us-west-2' --statement-id 1234 --principal iotanalytics.amazonaws.com --action lambda:InvokeFunction --profile default
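Before wiring up the pipeline, you may want to sanity-check the flattening logic locally. The sketch below runs a condensed version of the Lambda's transform against an illustrative SiteWise property-value notification (the asset and property IDs and values are placeholders, not taken from this walkthrough):

```python
import json
import time

# Illustrative payload shaped like a SiteWise property-value notification.
sample_event = [{
    "type": "PropertyValueUpdate",
    "payload": {
        "assetId": "4be96ade-55ec-40fd-b2ed-277bdcb83a4e",
        "propertyId": "912e4cc3-7ceb-4b1f-951e-ffacc618f7dc",
        "values": [{
            "timestamp": {"timeInSeconds": 1574642608, "offsetInNanos": 0},
            "quality": "GOOD",
            "value": {"doubleValue": 68.4},
        }],
    },
}]

def flatten(event):
    """Condensed local version of the pipeline Lambda's flattening step."""
    transformed = []
    for e in event:
        for v in e["payload"]["values"]:
            # Pick whichever typed value field is present.
            value_key, value = next(
                (k, v["value"][k])
                for k in ("doubleValue", "integerValue", "booleanValue", "stringValue")
                if k in v["value"])
            transformed.append({
                "type": e["type"],
                "asset_id": e["payload"]["assetId"],
                "property_id": e["payload"]["propertyId"],
                "timestamp": v["timestamp"].get("timeInSeconds", int(time.time())),
                "quality": v.get("quality", ""),
                "value": value,
                "valuetype": value_key.replace("Value", ""),
            })
    return transformed

print(json.dumps(flatten(sample_event), indent=2))
```

Each nested value becomes one flat row, which is what makes the later Datastore SQL queries straightforward.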
- Create an AWS IoT Analytics Channel named sitewise_blogpost_channel with Service-managed Amazon S3 bucket as the storage type (default). You can also choose your own Amazon S3 buckets as the storage type.
aws iotanalytics create-channel --cli-input-json file://mychannel.json --region 'us-west-2' --profile default
The file mychannel.json contains the following code:

{
  "channelName": "sitewise_blogpost_channel"
}
Sample run output:
{
  "channelArn": "arn:aws:iotanalytics:us-west-2:<AWS-Account-ID>:channel/sitewise_blogpost_channel",
  "channelName": "sitewise_blogpost_channel",
  "retentionPeriod": {
    "unlimited": true
  }
}
- Create an AWS IoT Analytics Datastore named sitewise_blogpost_datastore with Service-managed Amazon S3 bucket as the storage type (default). You can also choose your own Amazon S3 bucket as the storage type.
aws iotanalytics create-datastore --cli-input-json file://mydatastore.json --region 'us-west-2' --profile default
The file mydatastore.json contains the following code:

{
  "datastoreName": "sitewise_blogpost_datastore"
}
Sample run output:
{
  "datastoreName": "sitewise_blogpost_datastore",
  "datastoreArn": "arn:aws:iotanalytics:us-west-2:<AWS-Account-ID>:datastore/sitewise_blogpost_datastore",
  "retentionPeriod": {
    "unlimited": true
  }
}
- Create an AWS IoT Analytics Pipeline named sitewise_blogpost_pipeline with the pipeline source as sitewise_blogpost_channel and the pipeline output as sitewise_blogpost_datastore.
aws iotanalytics create-pipeline --cli-input-json file://mypipeline.json --region 'us-west-2' --profile default
The file mypipeline.json contains the following code:

{
  "pipelineName": "sitewise_blogpost_pipeline",
  "pipelineActivities": [
    {
      "channel": {
        "name": "mychannelactivity",
        "channelName": "sitewise_blogpost_channel",
        "next": "mylambdaactivity"
      }
    },
    {
      "lambda": {
        "name": "mylambdaactivity",
        "lambdaName": "sitewise_blogpost_transform_function",
        "batchSize": 10,
        "next": "mydatastoreactivity"
      }
    },
    {
      "datastore": {
        "name": "mydatastoreactivity",
        "datastoreName": "sitewise_blogpost_datastore"
      }
    }
  ]
}
Sample run output:
{
  "pipelineArn": "arn:aws:iotanalytics:us-west-2:<AWS-Account-ID>:pipeline/sitewise_blogpost_pipeline",
  "pipelineName": "sitewise_blogpost_pipeline"
}
Note that the batch size for the Lambda invocation is set to 10; you can increase it up to 1000 based on your latency SLA and the velocity of your streaming data.
Setting up an AWS IoT Core rule to ingest SiteWise modeled data into AWS IoT Analytics
Create an AWS IoT Core rule that sends messages to the AWS IoT Analytics Channel that you created earlier. In rule.json, replace the IAM role ARN with the ARN of iote_equip_temp_role.
aws iot create-topic-rule --rule-name sitewise_blogpost_rule_for_iota --topic-rule-payload file://rule.json --region 'us-west-2' --profile default
The file rule.json
contains the following code:
{
"sql": "SELECT * FROM '$aws/sitewise/asset-models/+/assets/+/properties/+'",
"ruleDisabled": false,
"awsIotSqlVersion": "2016-03-23",
"actions": [
{
"iotAnalytics": {
"channelName": "sitewise_blogpost_channel",
"roleArn": "arn:aws:iam::<AWS-ACCOUNT-ID>:role/iote_equip_temp_role"
}
}
]
}
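The rule's SQL statement subscribes with three single-level `+` wildcards, so it captures property-value notifications for every asset model, asset, and property. A small sketch of MQTT topic-filter matching (a hypothetical helper, not part of the walkthrough) illustrates which topics the filter captures:

```python
# Hypothetical helper illustrating MQTT topic-filter semantics:
# '+' matches exactly one topic level, '#' matches all remaining levels.
def topic_matches(topic_filter, topic):
    fparts, tparts = topic_filter.split("/"), topic.split("/")
    for f, t in zip(fparts, tparts):
        if f == "#":
            return True
        if f != "+" and f != t:
            return False
    # '+' requires the same number of levels in filter and topic.
    return len(fparts) == len(tparts)

FILTER = "$aws/sitewise/asset-models/+/assets/+/properties/+"

# Matches a property-value notification for any model/asset/property:
print(topic_matches(FILTER, "$aws/sitewise/asset-models/m1/assets/a1/properties/p1"))  # True
# Does not match topics at a different depth:
print(topic_matches(FILTER, "$aws/sitewise/asset-models/m1/assets/a1"))  # False
```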
After the rule is created, navigate to AWS IoT Analytics and create a test dataset that selects a sample of 10 records from sitewise_blogpost_datastore. This verifies that the data is processed all the way through and is available for querying with an AWS IoT Analytics Dataset SQL query.
Setting up an AWS IoT Events detector model for condition monitoring
Now, to monitor the average equipment temperature over the last five minutes for the Unit 2 PLC of all three Wind Turbines, create an AWS IoT Events detector model named iote_equip_temp_detector_model.
The detector model has two states, Good and Critical. When the average equipment temperature climbs above the threshold of 70ºF (a manufacturer's specification) three times in a row, the detector model instance for the corresponding Wind Turbine's Unit 2 PLC switches its state from Good to Critical.
Similarly, the detector model instance switches its state from Critical to Good when the average equipment temperature drops below 70ºF three times in a row. Each state transition sends an email notification via Amazon SNS to your operations team to perform the necessary actions.
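The counter behavior described above can be sketched in plain Python. This is a simplification of the actual AWS IoT Events evaluation semantics (the real model's transition conditions differ slightly, as noted later), and the readings below are illustrative:

```python
THRESHOLD = 70.0  # manufacturer's specification, in degrees F

def run_detector(readings):
    """Simulate the Good/Critical counter logic for one detector instance."""
    state, counter, transitions = "Good", 0, []
    for avg in readings:
        if state == "Good":
            # Count consecutive readings above threshold; any dip resets.
            counter = counter + 1 if avg > THRESHOLD else 0
            if counter >= 3:
                state = "Critical"
                transitions.append("to_Critical")
        else:  # Critical
            # Count down on consecutive readings at or below threshold.
            if avg <= THRESHOLD:
                counter -= 1
            if counter <= 0:
                state = "Good"
                transitions.append("to_Good")
    return state, transitions

# Three hot readings in a row trip the alarm:
print(run_detector([72.0, 75.0, 71.0]))
# A cool reading in between resets the count:
print(run_detector([72.0, 68.0, 72.0]))
```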
Use the following steps to create the detector model.
- Create an AWS IoT Events input named iote_equip_temp:
aws iotevents create-input --cli-input-json file://iote_equip_temp_input.json --region 'us-west-2' --profile default
The file iote_equip_temp_input.json contains the following code:

{
  "inputName": "iote_equip_temp",
  "inputDescription": "IoT Events Equipment Temperature Monitoring",
  "inputDefinition": {
    "attributes": [
      { "jsonPath": "name" },
      { "jsonPath": "avg_value" }
    ]
  }
}
Sample run output:
{
  "inputConfiguration": {
    "status": "ACTIVE",
    "inputArn": "arn:aws:iotevents:us-west-2:<AWS-ACCOUNT-ID>:input/iote_equip_temp",
    "lastUpdateTime": 1574882105.027,
    "creationTime": 1574882105.027,
    "inputName": "iote_equip_temp",
    "inputDescription": "IoT Events Equipment Temperature Monitoring"
  }
}
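If you want to test the input before the dataset content delivery is in place, you can push messages matching its attributes with the BatchPutMessage API. The sketch below is a hypothetical helper that shapes rows into message entries; the boto3 call is shown as a comment because it requires live credentials:

```python
import json

def build_messages(rows):
    """Shape dataset rows ({"name": ..., "avg_value": ...}) as
    BatchPutMessage entries for the iote_equip_temp input."""
    return [
        {
            "messageId": str(i),
            "inputName": "iote_equip_temp",
            "payload": json.dumps(row).encode("utf-8"),
        }
        for i, row in enumerate(rows)
    ]

# With boto3 configured, a manual test could look like (assumption, not run here):
#   import boto3
#   boto3.client("iotevents-data", region_name="us-west-2").batch_put_message(
#       messages=build_messages([{"name": "-WindTurbine1-Unit2PLC-EquipmentTemperature",
#                                 "avg_value": 72.5}]))
```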
- Create an AWS IoT Events detector model named iote_equip_temp_detector_model:
aws iotevents create-detector-model --cli-input-json file://iote_equip_temp_detector_model.json --region 'us-west-2' --profile default
The file iote_equip_temp_detector_model.json contains the following code (remember to replace the IAM role ARN for iote_equip_temp_role and the SNS topic ARN for iote_send_email_sns_topic):

{
  "detectorModelName": "iote_equip_temp_detector_model",
  "detectorModelDescription": "AWS IoT Events Equipment Temperature Monitoring Detector Model",
  "detectorModelDefinition": {
    "states": [
      {
        "stateName": "Critical",
        "onInput": {
          "events": [
            {
              "eventName": "DecrementcriticalCounter",
              "condition": "convert(Decimal,$input.iote_equip_temp.avg_value) <= 70",
              "actions": [
                {
                  "setVariable": {
                    "variableName": "criticalCounter",
                    "value": "$variable.criticalCounter - 1"
                  }
                }
              ]
            }
          ],
          "transitionEvents": [
            {
              "eventName": "to_Good",
              "condition": "$variable.criticalCounter <= 1",
              "actions": [
                {
                  "sns": {
                    "targetArn": "arn:aws:sns:us-west-2:<AWS-ACCOUNT-ID>:iote_send_email_sns_topic"
                  }
                }
              ],
              "nextState": "Good"
            }
          ]
        },
        "onEnter": { "events": [] },
        "onExit": { "events": [] }
      },
      {
        "stateName": "Good",
        "onInput": {
          "events": [
            {
              "eventName": "IncrementcriticalCounter",
              "condition": "convert(Decimal,$input.iote_equip_temp.avg_value) > 70",
              "actions": [
                {
                  "setVariable": {
                    "variableName": "criticalCounter",
                    "value": "$variable.criticalCounter + 1"
                  }
                }
              ]
            },
            {
              "eventName": "ResetcriticalCounter",
              "condition": "convert(Decimal,$input.iote_equip_temp.avg_value) <= 70",
              "actions": [
                {
                  "setVariable": {
                    "variableName": "criticalCounter",
                    "value": "0"
                  }
                }
              ]
            }
          ],
          "transitionEvents": [
            {
              "eventName": "to_Critical",
              "condition": "$variable.criticalCounter >= 2",
              "actions": [
                {
                  "sns": {
                    "targetArn": "arn:aws:sns:us-west-2:<AWS-ACCOUNT-ID>:iote_send_email_sns_topic"
                  }
                }
              ],
              "nextState": "Critical"
            }
          ]
        },
        "onEnter": {
          "events": [
            {
              "eventName": "Initialization",
              "condition": "true",
              "actions": [
                {
                  "setVariable": {
                    "variableName": "criticalCounter",
                    "value": "0"
                  }
                }
              ]
            }
          ]
        },
        "onExit": { "events": [] }
      }
    ],
    "initialStateName": "Good"
  },
  "roleArn": "arn:aws:iam::<AWS-ACCOUNT-ID>:role/iote_equip_temp_role",
  "key": "name"
}
Sample run output:
{
  "detectorModelConfiguration": {
    "status": "ACTIVATING",
    "detectorModelDescription": "AWS IoT Events Equipment Temperature Monitoring Detector Model",
    "lastUpdateTime": 1574882287.039,
    "roleArn": "arn:aws:iam::<AWS-ACCOUNT-ID>:role/iote_equip_temp_role",
    "creationTime": 1574882287.039,
    "detectorModelArn": "arn:aws:iotevents:us-west-2:<AWS-ACCOUNT-ID>:detectorModel/iote_equip_temp_detector_model",
    "key": "name",
    "detectorModelName": "iote_equip_temp_detector_model",
    "detectorModelVersion": "1"
  }
}
If you are wondering why the condition for the "to_Critical" transition event is "$variable.criticalCounter >= 2" rather than ">= 3", this is due to a current limitation in AWS IoT Events.
- In the AWS IoT Events console, choose Detector models and select iote_equip_temp_detector_model. Choose Edit and verify the detector model that you just created in the AWS CLI.
- Create a new IAM policy or modify the existing policy of your IAM role iote_equip_temp_role with the following permissions on the AWS IoT Events input (iote_equip_temp) that you created earlier:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iotevents:BatchPutMessage",
      "Resource": [
        "arn:aws:iotevents:us-west-2:<AWS-ACCOUNT-ID>:input/iote_equip_temp"
      ]
    }
  ]
}
Remember to replace the ARN in the policy with that of your AWS IoT Events input iote_equip_temp, noted earlier.
Setting up an AWS IoT Analytics dataset content delivery to the AWS IoT Events detector model
Now that you have created your AWS IoT Events detector model, create an AWS IoT Analytics Dataset named iote_equip_temp_dataset. The content delivery target is the AWS IoT Events detector model input – iote_equip_temp.
Remember to replace the IAM role ARN as well as the asset_id and property_id values for your environment. To get the asset_id and property_id, use the list-assets API to list all your assets and then describe-asset to retrieve each asset's details. The describe-asset API also returns the human-readable name of each asset property.
aws iotanalytics create-dataset --cli-input-json file://mydataset.json --region 'us-west-2' --profile default
The file mydataset.json
contains the following code:
{
"datasetName": "iote_equip_temp_dataset",
"actions": [
{
"actionName": "myaction",
"queryAction": {
"sqlQuery": "SELECT Replace(Replace(asset_property_name, '/', '-'), ' ') name, avg_value FROM (SELECT CASE WHEN asset_id = '4be96ade-55ec-40fd-b2ed-277bdcb83a4e' and property_id = '912e4cc3-7ceb-4b1f-951e-ffacc618f7dc' THEN '/Wind Turbine 1/Unit 2 PLC/Equipment Temperature' WHEN asset_id = '6390c711-86ea-4d58-a97a-3b52f43388aa' and property_id = '912e4cc3-7ceb-4b1f-951e-ffacc618f7dc' THEN '/Wind Turbine 2/Unit 2 PLC/Equipment Temperature' WHEN asset_id = 'a5642995-b599-4449-a0db-ea5de7e074af' and property_id = '912e4cc3-7ceb-4b1f-951e-ffacc618f7dc' THEN '/Wind Turbine 3/Unit 2 PLC/Equipment Temperature' ELSE 'unknown' END asset_property_name, Avg(Cast(value AS DOUBLE)) AS avg_value FROM sitewise_blogpost_datastore WHERE From_unixtime(timestamp) > current_timestamp - interval '5' minute AND asset_id IN ( '4be96ade-55ec-40fd-b2ed-277bdcb83a4e', '6390c711-86ea-4d58-a97a-3b52f43388aa', 'a5642995-b599-4449-a0db-ea5de7e074af' ) AND property_id IN ( '912e4cc3-7ceb-4b1f-951e-ffacc618f7dc') GROUP BY 1)temp"
}
}
],
"contentDeliveryRules": [
{
"destination": {
"iotEventsDestinationConfiguration": {
"inputName": "iote_equip_temp",
"roleArn": "arn:aws:iam::<AWS-ACCOUNT-ID>:role/iote_equip_temp_role"
}
}
}
],
"triggers": [
{
"schedule": {
"expression": "cron(0/5 * * * ? *)"
}
}
]
}
Sample run output:
{
"datasetName": "iote_equip_temp_dataset",
"datasetArn": "arn:aws:iotanalytics:us-west-2:<AWS-ACCOUNT-ID>:dataset/iote_equip_temp_dataset"
}
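Note the Replace(Replace(asset_property_name, '/', '-'), ' ') expression in the dataset SQL: the outer two-argument Replace removes all spaces, after slashes have become hyphens. The resulting string is what keys each detector model instance. A quick Python equivalent shows the shape of the name:

```python
def normalize(asset_property_name):
    """Python equivalent of Replace(Replace(asset_property_name, '/', '-'), ' ')
    in the dataset SQL: slashes become hyphens, spaces are removed."""
    return asset_property_name.replace("/", "-").replace(" ", "")

print(normalize("/Wind Turbine 1/Unit 2 PLC/Equipment Temperature"))
# -WindTurbine1-Unit2PLC-EquipmentTemperature
```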
After the dataset is created, either wait five minutes for the dataset to run as scheduled or run it manually from the AWS CLI as follows:
aws iotanalytics create-dataset-content --dataset-name "iote_equip_temp_dataset" --region 'us-west-2' --profile default
Sample run output:
{
"versionId": "f2ade0be-092f-4686-a4aa-8ecbc967cb5c"
}
Wait for the content to be created by running the following command. The state should show as SUCCEEDED for your dataset content to be available on Amazon S3.
aws iotanalytics get-dataset-content --dataset-name "iote_equip_temp_dataset" --region 'us-west-2' --profile default
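The same wait can be scripted. The sketch below polls until the content is ready; the client is passed in as a parameter so the logic can be exercised without live credentials, and the real boto3 call (an assumption about your environment) is shown as a comment:

```python
import time

def wait_for_content(client, dataset_name, timeout_s=300, poll_s=10):
    """Poll GetDatasetContent until the latest content SUCCEEDED;
    return its data URI."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        resp = client.get_dataset_content(datasetName=dataset_name)
        state = resp["status"]["state"]
        if state == "SUCCEEDED":
            return resp["entries"][0]["dataURI"]
        if state == "FAILED":
            raise RuntimeError("dataset content creation failed")
        time.sleep(poll_s)
    raise TimeoutError("dataset content not ready within {}s".format(timeout_s))

# With boto3 installed and configured, it would be used as:
#   import boto3
#   uri = wait_for_content(boto3.client("iotanalytics", region_name="us-west-2"),
#                          "iote_equip_temp_dataset")
```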
Sample run output:
{
"status": {
"state": "SUCCEEDED"
},
"timestamp": 1574642608.767,
"entries": [
{
"dataURI": "https://aws-iot-analytics-dataset-cb3a5eef-4a9f-423c-8fd7-beb9ee6210d5xxxxxxxxxxxxx"
}
]
}
Your AWS IoT Analytics Dataset results are sent to the AWS IoT Events detector model.
In the AWS IoT Events console, choose Detector models, then iote_equip_temp_detector_model, to see the three detector model instances – one for the Unit 2 PLC of each of the three Wind Turbines.
After the AWS IoT Analytics dataset has run a few times as scheduled, all three Unit 2 PLCs from the three Wind Turbines transition from the Good to the Critical state, as shown below:
You should receive an email similar to the one below, indicating the Critical status for the Unit 2 PLC.
Setting up an AWS IoT Analytics Dataset to visualize in Amazon QuickSight
One of your requirements for an end-to-end use case is to visualize the lowest-grain Equipment Temperature readings for all three Wind Turbines for the last hour in Amazon QuickSight.
Create an AWS IoT Analytics Dataset named quicksight_equip_temp_dataset producing the lowest-grain equipment temperature readings for the last hour for all three Wind Turbines as follows.
Remember to replace the asset_id and property_id values for your environment. To get them, use the list-assets API to list all your assets and then describe-asset to retrieve each asset's details. The describe-asset API also returns the human-readable name of each asset property.
aws iotanalytics create-dataset --cli-input-json file://mydataset2.json --region 'us-west-2' --profile default
The file mydataset2.json
contains the following code:
{
"datasetName": "quicksight_equip_temp_dataset",
"actions": [
{
"actionName": "myaction",
"queryAction": {
"sqlQuery": "SELECT timestamp, equip_temp, Replace(Replace(asset_property_name, '/', '-'), ' ') name FROM (SELECT CASE WHEN asset_id = '4be96ade-55ec-40fd-b2ed-277bdcb83a4e' and property_id = '912e4cc3-7ceb-4b1f-951e-ffacc618f7dc' THEN '/Wind Turbine 1/Unit 2 PLC/Equipment Temperature' WHEN asset_id = '6390c711-86ea-4d58-a97a-3b52f43388aa' and property_id = '912e4cc3-7ceb-4b1f-951e-ffacc618f7dc' THEN '/Wind Turbine 2/Unit 2 PLC/Equipment Temperature' WHEN asset_id = 'a5642995-b599-4449-a0db-ea5de7e074af' and property_id = '912e4cc3-7ceb-4b1f-951e-ffacc618f7dc' THEN '/Wind Turbine 3/Unit 2 PLC/Equipment Temperature' ELSE 'unknown' END asset_property_name, From_unixtime(timestamp) AS timestamp, Cast(value AS DOUBLE) AS equip_temp FROM sitewise_blogpost_datastore WHERE From_unixtime(timestamp) > current_timestamp - interval '1' hour AND asset_id IN ( '4be96ade-55ec-40fd-b2ed-277bdcb83a4e', '6390c711-86ea-4d58-a97a-3b52f43388aa', 'a5642995-b599-4449-a0db-ea5de7e074af' ) AND property_id IN ( '912e4cc3-7ceb-4b1f-951e-ffacc618f7dc'))temp"
}
}
],
"triggers": [
{
"schedule": {
"expression": "cron(0 * * * ? *)"
}
}
]
}
Sample run output:
{
"datasetName": "quicksight_equip_temp_dataset",
"datasetArn": "arn:aws:iotanalytics:us-west-2:<AWS-ACCOUNT-ID>:dataset/quicksight_equip_temp_dataset"
}
As explained earlier, after the AWS IoT Analytics dataset is created, either wait for the dataset to run as scheduled (in this case, every hour) or run it manually from the AWS CLI.
Visualizing the AWS IoT Analytics Dataset in Amazon QuickSight
AWS IoT Analytics provides direct integration with Amazon QuickSight. To visualize the AWS IoT Analytics dataset quicksight_equip_temp_dataset, see Visualizing AWS IoT Analytics Data with QuickSight.
After you create an Amazon QuickSight Dataset for your AWS IoT Analytics Dataset (quicksight_equip_temp_dataset), you can create a sample analysis.
Summary
In Part 3 of this multi-part series, you subscribed to the AWS IoT SiteWise modeled data via the AWS IoT Core rules engine, enabled condition monitoring and sent notifications or alerts using AWS IoT Events in near-real time, and finally enabled Business Intelligence (BI) reporting on historical data using Amazon QuickSight.
This multi-part series described a secure, cost-effective, and reliable field-to-cloud solution. You learned how to:
- Ingest all your data from hundreds of industrial sites that have tens of thousands of PLCs and sensors
- Visualize key measurements and metrics for the devices, processes, and equipment in near-real time
- Enable condition monitoring and send notifications and alerts in near-real time to take actions when needed
- Enable Business Intelligence (BI) reporting on historical data
Hopefully, you have found this post informative and the proposed solution walkthrough helpful. As always, AWS welcomes feedback. Please submit comments or questions below.