Friday, November 22, 2024

Optimize industrial operations through predictive maintenance using Amazon Monitron, AWS IoT TwinMaker, and Amazon Bedrock


Introduction

Smart buildings and factories have hundreds or thousands of sensors continuously collecting operational data and system health information. These buildings increase efficiency and lower operating costs because the monitoring and collected data allow operations to shift from reacting to unplanned failures to a predictive maintenance approach.

Operations managers and technicians in industrial environments (such as manufacturing production lines, warehouses, and industrial plants) want to reduce site downtime. Sensors and the measurements they collect are valuable tools for predicting maintenance needs; without context, however, the additional information may cloud the big picture. Maintenance teams that focus on a single sensor’s measurements may miss meaningful associations between readings that might otherwise appear unrelated. Instead, a single view that displays assets in spatial context and consolidates measurements from a group of sensors simplifies failure resolution and enhances predictive maintenance programs.

Our previous blog (Generate actionable insights for predictive maintenance management with Amazon Monitron and Amazon Kinesis) introduced a solution to ingest Amazon Monitron insights (Artificial Intelligence (AI)/Machine Learning (ML) predictions derived from the measurements) into a shop floor or work order system. In this second blog, we discuss contextual predictive maintenance with Amazon Monitron through integrations with AWS IoT TwinMaker to create a three-dimensional (3D), spatial visualization of your telemetry. We also introduce an Amazon Bedrock-powered natural language chatbot to access relevant maintenance documentation and measurement insights.

Use cases overview

Using AWS IoT TwinMaker and Matterport, an operations manager can take advantage of a 3D visualization of their facility to monitor their equipment status. With the AWS IoT TwinMaker and Matterport integration, developers can now leverage Matterport’s technology to combine existing data from multiple sources with real-world data to create a fully integrated digital twin. Presenting information in a visual context improves an operator’s understanding and helps to highlight hot spots, which can reduce resolution times.

Our solution uses AWS IoT TwinMaker and Matterport:

  • AWS IoT TwinMaker helps developers create digital twins of real-world systems by providing the following fully-managed features: 1/ access to data from diverse sources; 2/ create entities to virtually represent physical systems, define their relationships, and connect them to data sources; and 3/ combine existing 3D visual models with real-world data to compose an interactive 3D view of your physical environment.
  • Matterport provides options to capture and scan real-world environments, and create immersive 3D models (also known as Matterport spaces). AWS IoT TwinMaker supports Matterport integration so that you can import your Matterport spaces into your AWS IoT TwinMaker scenes. AWS customers can now access Matterport directly from the AWS Marketplace.

Solution Overview

Complete the following steps to create an AWS IoT TwinMaker workspace and connect it to a Matterport space. You will then associate the sensor locations tagged in Matterport with AWS IoT TwinMaker entities. You will use an AWS Lambda function to create an AWS IoT TwinMaker custom data connector. This data connector will associate the entities with the Monitron sensor insights stored in an Amazon Simple Storage Service (Amazon S3) data lake. Lastly, you will visualize your Monitron predictions in spatial 3D using the AWS IoT Application Kit. In this blog, we provide a detailed explanation of section “2. Digital twin – 3D Spatial Visualization” starting with the architecture in Figure 1.

Figure 1: High-level solution architecture

Prerequisites

  • An active GitHub account and login credentials.
  • An AWS account with an AWS user.
  • AWS IAM Identity Center (successor to AWS Single Sign-On) deployed in the us-east-1 (N. Virginia) or eu-west-1 (Ireland) Region.
  • Amazon Monitron (sensors and gateway, see Getting Started with Amazon Monitron).
  • A smartphone that uses either iOS (Requires iOS 14.0.0 or later) or Android (version 8.0 or later) and has the Monitron mobile app (iTunes or Google Play).
  • An enterprise-level Matterport account and license, which are necessary for the AWS IoT TwinMaker integration. For more information, see the instructions in the AWS IoT TwinMaker Matterport integration guide. If necessary, contact your Matterport account representative for assistance. If you don’t have an account representative you can use the Contact us form on the Matterport and AWS IoT TwinMaker page.

Note: Be sure that all deployed AWS resources are in the same AWS Region. In addition, all AWS Management Console links in this post open in the us-east-1 Region; if you plan to use another Region, you might need to switch Regions after following a console link.

Configure Monitron’s data export and create an Extract, Transform, and Load (ETL) pipeline

Follow the instructions in Part 1 of this blog series to build an IoT data lake from Amazon Monitron’s data.

Refer to Understanding the data export schema for the Monitron schema definition.

Note: Any live data export enabled after April 4, 2023 streams data following the Kinesis Data Streams v2 schema. If you have an existing data export that was enabled before this date, the schema follows the v1 format. We recommend using the v2 schema for this solution.

Data lake connection properties

Record the following details from your data lake. This information will be needed in subsequent steps:

  • The Amazon S3 bucket name where data is stored.
  • The AWS Glue data catalog database name.
  • The AWS Glue data catalog table name.

Create the AWS IoT TwinMaker data connector

This section describes a sample AWS IoT TwinMaker custom data connector that connects your digital twins to the data in your data lake. You don’t need to migrate data prior to using AWS IoT TwinMaker. This data connector consists of two Lambda functions that AWS IoT TwinMaker invokes to access your data lake:

  • The TWINMAKER_SCHEMA_INITIALIZATION function is used to read the schema of the data source.
  • The TWINMAKER_DATA_READER function is used to read the data.

Note: All code referenced in this blog is available in this GitHub repository.

Create an IAM role for Lambda

Create an AWS Identity and Access Management (IAM) role that can be assumed by Lambda. The same IAM role will be used by both Lambda functions. Add this IAM policy to the role.
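The linked policy supplies the data-lake permissions (Athena, AWS Glue, and Amazon S3 access); the role itself also needs the standard Lambda trust relationship so the service can assume it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```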

Create an AWS IoT TwinMaker schema initialization function using Lambda

This section provides sample code for the Lambda function to retrieve the data lake schema. This allows AWS IoT TwinMaker to identify the different types of data available in the data source.

  • Function name: TWINMAKER_SCHEMA_INITIALIZATION
  • Runtime: Python 3.10 or newer
  • Architecture: arm64 (recommended)
  • Timeout: 1 min 30 sec

Lambda function source code
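The full function source is in the GitHub repository linked above. As a minimal sketch of the approach, the function below reads the AWS Glue table definition for the data lake and returns each column to AWS IoT TwinMaker as a property. The Glue-to-TwinMaker type mapping and the decision to expose every column as a time-series property are illustrative assumptions; adjust them to your Monitron export schema.

```python
import os

# Illustrative mapping from Glue column types to TwinMaker scalar types.
GLUE_TO_TWINMAKER = {
    "double": "DOUBLE",
    "float": "DOUBLE",
    "bigint": "LONG",
    "int": "INTEGER",
    "boolean": "BOOLEAN",
    "string": "STRING",
}

def glue_to_twinmaker_type(glue_type: str) -> str:
    """Translate a Glue column type into a TwinMaker type, defaulting to STRING."""
    return GLUE_TO_TWINMAKER.get(glue_type.lower(), "STRING")

def lambda_handler(event, context):
    """Return the data lake's schema to AWS IoT TwinMaker."""
    import boto3  # imported lazily so the helper above stays SDK-free

    glue = boto3.client("glue")
    table = glue.get_table(
        DatabaseName=os.environ["ATHENA_DATABASE"],
        Name=os.environ["ATHENA_TABLE"],
    )
    columns = table["Table"]["StorageDescriptor"]["Columns"]

    # Expose every column as a time-series property (simplifying assumption).
    properties = {
        col["Name"]: {
            "definition": {
                "dataType": {"type": glue_to_twinmaker_type(col["Type"])},
                "isTimeSeries": True,
            }
        }
        for col in columns
    }
    return {"properties": properties}
```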

Configure the Lambda function environment variables with the data lake connection properties:

  • ATHENA_DATABASE: the AWS Glue data catalog database name
  • ATHENA_TABLE: the AWS Glue data catalog table name
  • ATHENA_QUERY_BUCKET: s3:///AthenaQuery/

Create an AWS IoT TwinMaker data reader function using Lambda

This section provides sample code for the Lambda function that will be used to query data from the data lake based on the request it receives from AWS IoT TwinMaker.

  • Lambda function name: TWINMAKER_DATA_READER
  • Runtime: Python 3.10 or newer
  • Architecture: arm64 (recommended)
  • Timeout: 1 min 30 sec

Lambda function source code
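The full function source is in the GitHub repository linked above. The sketch below shows the general flow: build an Athena query from the request, run it, and translate the results for AWS IoT TwinMaker. The column names in the query (`assetname`, `timestamp`) and the way the asset name is pulled from the event are illustrative assumptions; match them to your Monitron v2 export schema and to the TwinMaker data connector interface.

```python
import os
import time

def build_query(table: str, asset_name: str, start_iso: str, end_iso: str) -> str:
    """Build the Athena SQL for one asset's measurements in a time window.

    Column names here are illustrative; align them with your export schema.
    """
    return (
        f"SELECT * FROM {table} "
        f"WHERE assetname = '{asset_name}' "
        f"AND timestamp BETWEEN '{start_iso}' AND '{end_iso}' "
        f"ORDER BY timestamp"
    )

def lambda_handler(event, context):
    """Query the data lake and return measurement values to AWS IoT TwinMaker."""
    import boto3  # imported lazily so build_query stays SDK-free

    athena = boto3.client("athena")
    query = build_query(
        os.environ["ATHENA_TABLE"],
        event["properties"]["assetName"]["value"]["stringValue"],  # illustrative
        event["startTime"],
        event["endTime"],
    )
    execution_id = athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": os.environ["ATHENA_DATABASE"]},
        ResultConfiguration={"OutputLocation": os.environ["ATHENA_QUERY_BUCKET"]},
    )["QueryExecutionId"]

    # Poll until the query completes (a production function should bound this).
    while True:
        status = athena.get_query_execution(QueryExecutionId=execution_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(0.5)

    rows = athena.get_query_results(QueryExecutionId=execution_id)["ResultSet"]["Rows"]
    # Map rows into the propertyValues response shape defined by the AWS IoT
    # TwinMaker data connector interface (mapping omitted here for brevity).
    return {"propertyValues": [], "nextToken": None}
```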

Configure the Lambda function environment variables with the data lake connection properties:

  • ATHENA_DATABASE: the AWS Glue data catalog database name
  • ATHENA_TABLE: the AWS Glue data catalog table name
  • ATHENA_QUERY_BUCKET: s3:///AthenaQuery/

Create an AWS IoT TwinMaker component and entities to ingest the stream data

If you do not already have an AWS IoT TwinMaker workspace, follow the instructions outlined in the Create a workspace procedure. The workspace is the container for all the resources that will be created for the digital twin.

To set up your AWS IoT TwinMaker workspace:

  1. Go to the AWS IoT TwinMaker console.
  2. Choose Create workspace.
    • Enter a name for your workspace.
    • Select Create an Amazon S3 bucket.
    • Select Auto-generate a new role for the Execution Role dropdown.
  3. Choose Skip to review and create.
  4. Choose Create workspace.

Figure 2: Create Workspace in AWS IoT TwinMaker

In order to ingest the stream data from your IoT data lake, create your own AWS IoT TwinMaker component. For more information, see Using and creating component types.

Use the following sample JSON to create a component that grants AWS IoT TwinMaker access to query data from the data lake:

  1. Open your AWS IoT TwinMaker workspace.
  2. Choose Component types in the Navigation pane.
  3. Choose Create Component Type.
  4. Copy the file from the GitHub repository and paste it into the Request portion of the screen. This auto-completes all the fields in this screen.
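The exact request body is the file in the GitHub repository. If you want to see its general shape first, a TwinMaker component type request follows this pattern (the component type ID and function names come from this walkthrough; the Region and account ID placeholders are yours to substitute):

```json
{
  "componentTypeId": "com.example.monitron.data",
  "description": "Connects Monitron measurements stored in the S3 data lake",
  "functions": {
    "schemaInitializer": {
      "implementedBy": {
        "lambda": { "arn": "arn:aws:lambda:REGION:ACCOUNT_ID:function:TWINMAKER_SCHEMA_INITIALIZATION" },
        "isNative": false
      }
    },
    "dataReader": {
      "implementedBy": {
        "lambda": { "arn": "arn:aws:lambda:REGION:ACCOUNT_ID:function:TWINMAKER_DATA_READER" },
        "isNative": false
      }
    }
  }
}
```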

After creating the components, configure the AWS IoT TwinMaker execution role to invoke the Lambda functions that query the Amazon S3 data through Athena.

  1. Open the AWS IoT TwinMaker console and open the Workspaces area.
  2. Choose the workspace you just created.
  3. Identify the execution role used by the workspace.
    • Figure 3: Identify the Execution role
  4. Open the IAM Console.
  5. Choose Policies and then Create Policy.
  6. Choose JSON and then paste this code from GitHub into the window. Replace the placeholder values in the policy with your own.
  7. Choose Next.
  8. On the Review and create page, enter DigitalTwin-TwinMakerLambda as the name.
  9. Choose Create Policy.
  10. Expand the Roles menu.
  11. Search for twinmaker-workspace-YOUR_WORKSPACE_NAME-UNIQUEID and select it.
  12. Expand Add permissions and then Attach policies.
    • Figure 4: Attach policies
  13. Search for DigitalTwin-TwinMakerLambda and select it.
  14. Choose Add permissions.
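The GitHub policy referenced in step 6 grants the workspace role permission to invoke the two connector functions. A minimal sketch of what it contains (substitute your Region, account ID, and function names):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "lambda:InvokeFunction",
      "Resource": [
        "arn:aws:lambda:REGION:ACCOUNT_ID:function:TWINMAKER_SCHEMA_INITIALIZATION",
        "arn:aws:lambda:REGION:ACCOUNT_ID:function:TWINMAKER_DATA_READER"
      ]
    }
  ]
}
```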

Entities are digital representations of the elements in a digital twin that capture the capabilities of that element. This element can be a piece of physical equipment or a process. Entities have components associated with them. These components provide data and context for the associated entity. You can create entities by choosing the component type which was created (for more information, see Create your first entity).

  1. Go to the AWS IoT TwinMaker Console.
  2. Open your workspace.
  3. In the Navigation pane, choose Entities.
  4. Choose Create, and then select Create entity.
  5. Choose Create entity to confirm.
    • Figure 5: Create Entity
  6. Select the entity you just created and choose Add Component.
  7. Enter MonitronData as the name.
  8. Select com.example.monitron.data as the type.
  9. Choose Add Component.
  10. Ensure the entity status changes to Active.
    • Figure 6: Add Component properties
  11. Once the entity is Active, select the MonitronData component. You should see a list of the available properties.

Create 3D visualizations of your physical environment for the digital twin

Once you have created the entities in AWS IoT TwinMaker, associate a Matterport tag with them (for more information about using Matterport, read Matterport’s documentation on AWS IoT TwinMaker and Matterport). Follow the documentation AWS IoT TwinMaker Matterport integration to link your Matterport space to AWS IoT TwinMaker.

Import Matterport spaces into AWS IoT TwinMaker scenes

Select the connected Matterport account to add Matterport scans to your scene. Use the following procedure to import your Matterport scan and tags:

  1. Log in to the AWS IoT TwinMaker console.
  2. Create a new AWS IoT TwinMaker scene, or open an existing one, where you want to use a Matterport space.
  3. Once the scene has opened, navigate to Settings.
  4. In Settings, under 3rd party resources, find the Connection name and enter the secret you created in the procedure from Store your Matterport credentials in AWS Secrets Manager.
  5. Next, expand the Matterport space dropdown list and choose your Matterport space.
    • Figure 7: Import Matterport Space
  6. After you have imported Matterport tags, the Update tags button appears. Update your Matterport tags in AWS IoT TwinMaker so that they always reflect the most recent changes in your Matterport account.
  7. Select a tag in the scene. You can associate your entity and component to this tag (follow the user guide for instructions, Add model shader augmented UI widgets to your scene).
    • Figure 8: Associate tag to entity

View your Matterport space in your AWS IoT TwinMaker Grafana dashboard

Once the Matterport space is imported into an AWS IoT TwinMaker scene, you can view that scene with the Matterport space in your Amazon Managed Grafana dashboard. If you have already configured Amazon Managed Grafana with AWS IoT TwinMaker, you can open the Grafana dashboard to view your scene with the imported Matterport space.

If you have not configured AWS IoT TwinMaker with Amazon Managed Grafana yet, complete the Grafana integration process first. You have two choices when integrating AWS IoT TwinMaker with Grafana: you can use a self-managed Grafana instance, or you can use Amazon Managed Grafana.

See the following documentation to learn more about the Grafana options and integration process:

View your Matterport space in your AWS IoT TwinMaker web application

View your scene with the Matterport space in your AWS IoT Application Kit web application. See the following documentation to learn more about using the AWS IoT Application Kit:

Figure 9: Digital Twin data dashboard with 3D visualization through Matterport

Figure 9 displays the data dashboard with 3D visualization through Matterport Space in an AWS IoT web application. The data collected from Amazon Monitron is presented on the dashboard including alarm status. In addition, the sensor location and status are displayed in the Matterport 3D visualization. These visualizations can help the onsite team identify a problem location directly from the dashboard.

Looking forward: accessing knowledge through a GenAI chatbot using Amazon Bedrock

Along with the telemetry ingestion, your organization may have hundreds or thousands of pages of standard operating procedures, manuals, and related documentation. During a maintenance event, valuable time can be lost searching for and identifying the right guidance. In our third blog, we will demonstrate how the value of your existing knowledge base can be unlocked using generative artificial intelligence (GenAI) and interfaces like chatbots. We will also discuss using Amazon Bedrock to make the knowledge base more readily accessible and to include insights from near-real-time and historic measurements, and maintenance predictions from Amazon Monitron.

Figure 10: Digital Twin with 3D visualization through Matterport including AI assistant

Conclusion

In this blog, we outlined a solution that uses the AWS IoT TwinMaker service to connect data from Amazon Monitron and create a consolidated view of the telemetry data, visualized in a 3D representation on a Matterport space. Monitron provides predictive maintenance guidance, and AWS IoT TwinMaker allows you to visualize that data in a 3D space. This solution presents the data in a contextual manner, helping to improve operational response and maintenance. The immersive visualization of the digital twin can also improve communication and knowledge transfer within your operations team by leveraging a realistic representation, allowing the team to more quickly identify issues and find resolutions.

Our final blog in this series – Build Predictive Digital Twins with Amazon Monitron, AWS IoT TwinMaker and Amazon Bedrock, Part 3: Accessing Knowledge through GenAI Chatbot – extends the digital twin with generative artificial intelligence (GenAI) interfaces (such as chatbots) to make the information more readily accessible.

About the Author

Garry Galinsky is a Principal Solutions Architect at Amazon Web Services. He has played a pivotal role in developing solutions for electric vehicle (EV) charging, robotics command and control, industrial telemetry visualization, and practical applications of generative artificial intelligence (AI). LinkedIn.

Yibo Liang is an Industry Specialist Solutions Architect supporting Engineering, Construction and Real Estate industry on AWS. He has supported industrial customers and partners in digital innovation working across AWS IoT and AI/ML. Yibo has a keen interest in IoT, data analytics, and Digital Twins.
