Organize and Process IoT Data Quickly Using AWS IoT Analytics

November 16, 2022

TABLE OF CONTENTS

1. Introduction
2. Creation of Channel
3. Creation of Data Store
4. Creation of Data Pipeline
5. Creation of Dataset
6. Data Ingestion and Testing
7. Conclusion
8. About CloudThat
9. FAQs

 

Introduction

IoT device use has recently increased in applications such as agriculture, smartwatches, smart buildings, IoT retail shops, object tracking, and many more. These Internet of Things devices generate a large amount of data, which is subsequently transported to the cloud to be analyzed. 

Because IoT data is generally unorganized and difficult to evaluate, experts must first format it before beginning the analytics process. AWS IoT Analytics will enable you to convert unstructured data to structured data and then analyze it.

This blog will show you how to create a dataset with AWS IoT Analytics. To build a dataset, we first need a channel, a data store, and a data pipeline.


Creation of Channel

  1. Log in to the AWS Console, type AWS IoT Analytics into the search box, and select the IoT Analytics service. In the IoT Analytics console, choose Channels and click the Create Channel button in the upper right corner.
  2. Enter testing_channel as the Channel name, then select Service managed storage as the Storage type, which means AWS IoT Analytics will manage the volumes on the user’s behalf. Data will be stored in S3 in the background. Then, click Next.
  3. Under the Ingesting messages from IoT Core option, enter testing/# as the MQTT topic filter so the channel receives messages published to that topic.
  4. Click Create New in the IAM Role option, enter testing_channel_role as the role name, and then click Create Role. Click Next and finally choose the Create Channel option; the channel will be created in a short time.
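The console steps above can also be scripted with the AWS SDK for Python. The sketch below only builds the CreateChannel request parameters; actually sending them assumes a boto3 `iotanalytics` client with valid credentials, plus the IAM role and IoT Core rule wiring done in the console. The name `testing_channel` simply mirrors the walkthrough.

```python
import json


def build_channel_request(channel_name="testing_channel"):
    """Parameters for boto3's iotanalytics create_channel call.
    serviceManagedS3 means AWS IoT Analytics manages the S3
    storage on our behalf, as chosen in the console."""
    return {
        "channelName": channel_name,
        "channelStorage": {"serviceManagedS3": {}},
        "retentionPeriod": {"unlimited": True},
    }


# With credentials in place, you would send the request like this:
#   import boto3
#   boto3.client("iotanalytics").create_channel(**build_channel_request())
print(json.dumps(build_channel_request(), indent=2))
```

Keeping the request as a plain dict makes it easy to reuse the same parameters across environments or in infrastructure tests.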


Creation of Data Store

  1. Choose Data Stores in the AWS IoT Analytics console, then click the Create Data Store button in the top right corner.
  2. Enter testing_datastore as the Datastore ID, then click Next.
  3. Select Service managed storage as the Storage type, click Next, and choose JSON as the data format.
  4. Click Next, leave Custom data partitioning at its default, then click Next.
  5. Finally, review and click Create Data Store.
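As with the channel, the data store can be created programmatically. This sketch only builds the CreateDatastore parameters, assuming a boto3 `iotanalytics` client would send them; `testing_datastore` mirrors the console walkthrough.

```python
def build_datastore_request(datastore_name="testing_datastore"):
    """Parameters for boto3's iotanalytics create_datastore call:
    service-managed S3 storage and JSON file format, matching the
    console choices above."""
    return {
        "datastoreName": datastore_name,
        "datastoreStorage": {"serviceManagedS3": {}},
        "fileFormatConfiguration": {"jsonConfiguration": {}},
    }


# To actually create it:
#   boto3.client("iotanalytics").create_datastore(**build_datastore_request())
print(build_datastore_request()["datastoreName"])
```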


Creation of Data Pipeline

  1. In the AWS IoT Analytics console, select Pipelines, then click Create pipeline in the top right corner and name the pipeline testing_pipeline.
  2. Choose testing_channel as the pipeline source and testing_datastore as the pipeline output.
  3. In Infer message attributes, enter Unit as the attribute name, deg C as the value, and String as the data type, then click Next.

Under Enrich, transform, and filter messages, we can add pipeline activities that transform the data, for example with a Lambda function or a math formula, or that add and remove data points. For now, leave it at its default and click Next. Finally, review the pipeline and click the Create Pipeline option. The pipeline will be created in a short amount of time.
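The pipeline wiring described above can be sketched as a CreatePipeline request. Activities form a chain from the channel to the datastore; the filter activity here is an illustrative extra of the kind of transform step mentioned above, not part of the default console flow, and sending the request assumes a boto3 `iotanalytics` client.

```python
def build_pipeline_request(pipeline_name="testing_pipeline",
                           channel_name="testing_channel",
                           datastore_name="testing_datastore"):
    """Parameters for boto3's iotanalytics create_pipeline call.
    Messages are read from the channel, pass through an optional
    filter step, and land in the datastore."""
    return {
        "pipelineName": pipeline_name,
        "pipelineActivities": [
            # Entry point: read messages arriving on the channel.
            {"channel": {"name": "read_channel",
                         "channelName": channel_name,
                         "next": "keep_valid"}},
            # Illustrative filter: drop physically implausible readings.
            {"filter": {"name": "keep_valid",
                        "filter": "temperature > -40 AND temperature < 85",
                        "next": "write_datastore"}},
            # Terminal activity: persist messages to the datastore.
            {"datastore": {"name": "write_datastore",
                           "datastoreName": datastore_name}},
        ],
    }


print(len(build_pipeline_request()["pipelineActivities"]))
```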


Creation of Dataset

  1. In the AWS IoT Analytics console, select Datasets, then click Create dataset in the top right corner.
  2. On the next page, choose the SQL datasets option and name the dataset testing_dataset.
  3. Choose testing_datastore as the datastore source, then click Next. The AWS IoT Analytics dataset queries data from the datastore using SQL.
  4. Enter SELECT * FROM testing_datastore LIMIT 100 in the Author SQL query field and click Next. Leave the Data selection filter options at their defaults and click Next.
  5. Leave the Set query schedule options at their defaults and click Next. Leave the Configure the results of your dataset options at their defaults and click Next.
  6. Configuring dataset content delivery rules lets you store the dataset results in S3; for now, leave this option at its default and click Next.
  7. Finally, review the dataset and click the Create dataset button. The dataset will be available in a short period.
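The dataset steps above reduce to a CreateDataset request with a single SQL action. As before, this is only a parameter-building sketch; submitting it assumes a boto3 `iotanalytics` client, and the action name `query_all` is a hypothetical label not taken from the console.

```python
def build_dataset_request(dataset_name="testing_dataset",
                          datastore_name="testing_datastore"):
    """Parameters for boto3's iotanalytics create_dataset call:
    one SQL action running the same query entered in the
    console's Author SQL query field."""
    return {
        "datasetName": dataset_name,
        "actions": [{
            "actionName": "query_all",
            "queryAction": {
                # Queries the datastore created earlier.
                "sqlQuery": f"SELECT * FROM {datastore_name} LIMIT 100",
            },
        }],
    }


# boto3.client("iotanalytics").create_dataset(**build_dataset_request())
print(build_dataset_request()["actions"][0]["queryAction"]["sqlQuery"])
```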


Data Ingestion and Testing

  1. To ingest IoT data into AWS IoT Analytics, navigate to AWS IoT Core, open the MQTT test client, and choose the Publish to a topic option.
  2. Enter testing/121 as the topic name and {"temperature": 27, "humidity": 60} as the message payload, then click Publish. Similarly, publish messages with different values: {"temperature": 28, "humidity": 65}, {"temperature": 26, "humidity": 63}, {"temperature": 22, "humidity": 62}, {"temperature": 21, "humidity": 61}, {"temperature": 25, "humidity": 65}.
  3. Then navigate to testing_dataset in IoT Analytics, click the Run now option in the top right corner, and click the Content option.
  4. Finally, click the created dataset name to view its contents.
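Instead of clicking through the MQTT test client, the same sample readings can be published from a script. The sketch below builds one publish-call parameter set per reading; actually sending them assumes a boto3 `iot-data` client and credentials. Note the payloads must be valid JSON with straight quotes, or the pipeline receives unparseable messages.

```python
import json

# The same sample readings published from the MQTT test client above.
READINGS = [
    {"temperature": 27, "humidity": 60},
    {"temperature": 28, "humidity": 65},
    {"temperature": 26, "humidity": 63},
    {"temperature": 22, "humidity": 62},
    {"temperature": 21, "humidity": 61},
    {"temperature": 25, "humidity": 65},
]


def build_publish_calls(topic="testing/121"):
    """One keyword-argument dict per reading for boto3's iot-data
    publish call, serialized as JSON payloads."""
    return [{"topic": topic, "qos": 0, "payload": json.dumps(r)}
            for r in READINGS]


# With credentials in place:
#   client = boto3.client("iot-data")
#   for call in build_publish_calls():
#       client.publish(**call)
print(len(build_publish_calls()))
```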


Conclusion

Thus, we have seen how to create a channel, pipeline, data store, and dataset in AWS IoT Analytics. Using AWS IoT Analytics for IoT applications saves the time otherwise spent structuring raw device data, and the resulting datasets can then be visualized using Amazon QuickSight.

About CloudThat

CloudThat is an official AWS (Amazon Web Services) Advanced Consulting Partner and Training Partner and a Microsoft Gold Partner, helping people develop knowledge of the cloud and helping their businesses aim for higher goals using best-in-industry cloud computing practices and expertise. We are on a mission to build a robust cloud computing ecosystem by disseminating knowledge on technological intricacies within the cloud space. Our blogs, webinars, case studies, and white papers enable all the stakeholders in the cloud computing sphere.

Drop a query if you have any questions regarding IoT Analytics, and I will get back to you quickly.

To get started, go through our Consultancy page and Managed Services Package, CloudThat's offerings.

FAQs

  1. What are the steps involved in AWS IoT Analytics?

Creation of a channel, a data store, a pipeline, and a dataset.

  2. What are the applications of AWS IoT Analytics?

Smart agriculture, predictive maintenance, proactive supply replenishment, and process-efficiency evaluation.

