
How to synchronize AWS IoT SiteWise assets and data across AWS accounts


Introduction

As industrial and manufacturing companies embark on their digital transformation journey, they look to advanced technologies for increased efficiency, productivity, quality control, flexibility, cost reduction, supply chain optimization, and competitive advantage in a rapidly evolving digital era. AWS customers in the manufacturing and industrial space increasingly use AWS IoT SiteWise to modernize their industrial data strategy and unlock the full potential of their operational technology. AWS IoT SiteWise empowers you to efficiently collect, store, organize, and monitor data from industrial equipment at scale. It also enables you to derive actionable insights, optimize operations, and drive innovation through data-driven decisions.

The journey often begins with a Proof of Value (PoV) case study in a development environment. This approach gives you an opportunity to explore how data collection and asset modeling with a solution that includes AWS IoT SiteWise can help. As you become comfortable with the solution, you can promote it from staging into a production environment and scale to more assets or facilities over time. This blog post provides an overview of the architecture and sample code to migrate AWS IoT SiteWise assets and data from one deployment to another, while ensuring data integrity and minimizing operational overhead.

Getting started with AWS IoT SiteWise

During the PoV phase, you establish data ingestion pipelines to stream near real-time sensor data from on-premises data historians or OPC UA servers into AWS IoT SiteWise. You can create asset models that digitally represent your industrial equipment to capture the asset hierarchy and critical metadata within a single facility or across multiple facilities. AWS IoT SiteWise provides API operations to help you import asset model data (metadata) in bulk from diverse systems, such as process historians, into AWS IoT SiteWise at scale. Additionally, you can define common industrial key performance indicators (KPIs) using the built-in library of operators and functions available in AWS IoT SiteWise. You can also create custom metrics that are triggered by equipment data on arrival or computed at user-defined intervals.
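For illustration, the following is a minimal boto3 sketch (not part of the original walkthrough) of an asset model that defines one raw measurement and a one-hour average metric. The model name, property names, unit, and window interval are assumptions; adapt them to your own equipment.

import boto3

sitewise = boto3.client('iotsitewise')

# Hypothetical asset model: a raw temperature measurement plus a metric that
# computes its hourly average over a tumbling window.
response = sitewise.create_asset_model(
  assetModelName='DemoPumpModel',  # placeholder name
  assetModelProperties=[
    {
      'name': 'Temperature',
      'dataType': 'DOUBLE',
      'unit': 'Celsius',
      'type': {'measurement': {}},
    },
    {
      'name': 'AvgTemperature',
      'dataType': 'DOUBLE',
      'unit': 'Celsius',
      'type': {
        'metric': {
          'expression': 'avg(temp)',
          # The variable references the measurement defined above by name;
          # verify this referencing style against the CreateAssetModel
          # documentation for your SDK version.
          'variables': [{'name': 'temp', 'value': {'propertyId': 'Temperature'}}],
          'window': {'tumbling': {'interval': '1h'}},
        }
      },
    },
  ],
)
print(response['assetModelId'], response['assetModelStatus']['state'])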

Setting up multiple non-production environments on a factory floor can be challenging due to legacy networking and strict regulations associated with the plant floor, in addition to delays in hardware procurement. Many customers transition the same hardware from non-production to production by designating and certifying it for production use after validation completes.

To accelerate and streamline the deployment process, you need a well-defined approach to migrate your AWS IoT SiteWise resources (assets, hierarchies, metrics, transforms, time series, and metadata) between AWS accounts as part of your standard DevOps practices.

AWS IoT SiteWise stores data across storage tiers that can support training machine learning (ML) models or historical data analysis in production. In this blog post, we outline how to migrate asset models, asset hierarchies, and historical time series data from the development environment to the staging and production environments hosted on AWS.

Solution Walkthrough

Let’s begin by discussing the technical aspects of migrating AWS IoT SiteWise resources and data between AWS accounts. We provide a step-by-step guide on how to export and import asset models and hierarchies using IoT SiteWise APIs. We also discuss how to transfer historical time series data using Amazon Simple Storage Service (Amazon S3) and the AWS IoT SiteWise BatchPutAssetPropertyValue API operation.

By following this approach, you can promote your AWS IoT SiteWise setup and data through the development lifecycle as you scale your industrial IoT applications into production. The following is an overview of the process:

  1.  AWS IoT SiteWise metadata transfer:
    1.  Export AWS IoT SiteWise models and assets from one AWS account (development account) by running a bulk export job. You can use filters to export the models and/or assets.
    2.  Import the exported models and/or assets into a second AWS account (staging account) by running a bulk import job. The import files must follow the AWS IoT SiteWise metadata transfer job schema.
  2. AWS IoT SiteWise telemetry data transfer:
    1. Use the following API operations to migrate telemetry data across accounts:
      1. BatchGetAssetPropertyValueHistory retrieves historical telemetry data from the development account.
      2. CreateBulkImportJob ingests the retrieved telemetry data into the staging account.

The data migration steps in our solution make the following assumptions:

  1. The staging account does not already have AWS IoT SiteWise assets or asset models configured with the same names or hierarchies as the development account.
  2. You will replicate the AWS IoT SiteWise metadata from the development account to the staging account.
  3. You will move the AWS IoT SiteWise telemetry data from the development account to the staging account.

1: Migrate AWS IoT SiteWise models and assets across AWS accounts

Figure 1: Architecture to migrate AWS IoT SiteWise metadata across AWS accounts


AWS IoT SiteWise supports bulk operations with assets and models. You can use the metadata bulk operations to:

  1.  Export AWS IoT SiteWise models and assets from the development account by running a bulk export job. You can choose what to export when you configure this job. For more information, see Export metadata examples.
    1.  Export all assets and asset models, optionally filtered.
    2. Export only assets, optionally filtered.
    3. Export only asset models, optionally filtered.
  2. Import AWS IoT SiteWise models and assets into the staging account by running a bulk import job. Similar to the export job, you can choose what to import; a boto3 sketch of both jobs follows this list. For more information, see Import metadata examples.
    1. The import files must follow a specific format. For more information, see AWS IoT SiteWise metadata transfer job schema.
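The following is a hedged boto3 sketch of both jobs, assuming the metadata bulk operations are driven through the AWS IoT TwinMaker CreateMetadataTransferJob API as described in the bulk operations documentation. The profile names, bucket, export file name, and asset model ID are placeholders; check the metadata transfer job schema for the exact filter and location formats.

import boto3

# Separate sessions for the two accounts; the profile names are placeholders.
dev_tm = boto3.Session(profile_name='dev').client('iottwinmaker')
staging_tm = boto3.Session(profile_name='staging').client('iottwinmaker')

# Export from AWS IoT SiteWise (development account) into an S3 bucket that
# the staging account can read. Omit the filters to export everything.
export_job = dev_tm.create_metadata_transfer_job(
  metadataTransferJobId='sitewise-export-to-staging',
  sources=[{
    'type': 'iotsitewise',
    'iotSiteWiseConfiguration': {
      'filters': [{'filterByAssetModel': {'assetModelId': '<asset-model-id>'}}],
    },
  }],
  destination={
    'type': 's3',
    's3Configuration': {'location': 'arn:aws:s3:::example-transfer-bucket'},
  },
)

# Import the exported metadata file into AWS IoT SiteWise (staging account).
import_job = staging_tm.create_metadata_transfer_job(
  metadataTransferJobId='sitewise-import-from-dev',
  sources=[{
    'type': 's3',
    's3Configuration': {'location': 'arn:aws:s3:::example-transfer-bucket/<export-file>.json'},
  }],
  destination={'type': 'iotsitewise'},
)
print(export_job['metadataTransferJobId'], import_job['metadataTransferJobId'])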

2: Migrate AWS IoT SiteWise telemetry data across AWS accounts

AWS IoT SiteWise supports ingesting high-volume historical data with the CreateBulkImportJob API operation, which you can use to migrate telemetry data from the development account to the staging account.

Figure 2: Architecture to migrate AWS IoT SiteWise telemetry data across AWS accounts


2.1 Retrieve data from the development account using BatchGetAssetPropertyValueHistory

AWS IoT SiteWise provides data and SQL API operations to retrieve telemetry results. You can use the export file produced by the bulk export job in step 1 to get the list of AWS IoT SiteWise asset IDs and property IDs to query with the BatchGetAssetPropertyValueHistory API operation. The following sample code retrieves data for the last two days:

import boto3
import csv
import time
import uuid
"""
Connect to the IoT SiteWise API and define the assets and properties 
to retrieve data for.
"""
sitewise = boto3.client('iotsitewise')
# Limited to 10 asset IDs/property IDs/entry IDs per API call
asset_ids = ['a1','a2','a3'] 
property_ids = ['b1','b2','b3']

"""
Get the start and end timestamps for the date range of historical data
to retrieve. Currently set to the last 2 days.
""" 
# Convert current time to Unix timestamp (seconds since epoch)
end_time = int(time.time()) 
# Start date 2 days ago
start_time = end_time - 2*24*60*60
"""
Generate a list of entries to retrieve property value history.
Loops through the asset_ids and property_ids lists, zipping them 
together to generate a unique entry for each asset-property pair.
Each entry contains a UUID for the entryId, the corresponding 
assetId and propertyId, and the start and end timestamps for 
the date range of historical data.
"""
entries = []
for asset_id, property_id in zip(asset_ids, property_ids):
  entry = {
    'entryId': str(uuid.uuid4()),
    'assetId': asset_id, 
    'propertyId': property_id,
    'startDate': start_time,
    'endDate': end_time,
    'qualities': [ "GOOD" ],
  }
  entries.append(entry)
"""
Generate entries dictionary to map entry IDs to the full entry data 
for retrieving property values by entry ID.
"""
entries_dict = {entry['entryId']: entry for entry in entries}
"""
The snippet below retrieves asset property value history from AWS IoT SiteWise using the
`batch_get_asset_property_value_history` API call. The retrieved data is then
processed and written to a CSV file named 'values.csv'.
The script handles pagination by using the `nextToken` parameter to fetch
subsequent pages of data. Once all data has been retrieved, the script
exits the loop and closes the CSV file.
"""
token = None
with open('values.csv', 'w', newline='') as f:
  writer = csv.writer(f)
  while True:
    """
    Make API call, passing entries and token if on subsequent call.
    """
    if not token:
      property_history = sitewise.batch_get_asset_property_value_history(
          entries=entries
      )
    else:
      property_history = sitewise.batch_get_asset_property_value_history(
          entries=entries,
          nextToken=token
      )
    """
    Process success entries, extracting values into a list of dicts.
    """
    for entry in property_history['successEntries']:
        entry_id = entry['entryId']
        asset_id = entries_dict[entry_id]['assetId']
        property_id = entries_dict[entry_id]['propertyId']
        for history_values in entry['assetPropertyValueHistory']:
          value_dict = history_values.get('value')
          values_dict = {
            'ASSET_ID': asset_id,
            'PROPERTY_ID': property_id,
            'DATA_TYPE': str(list(value_dict.keys())[0]).upper().replace("VALUE", ""),
            'TIMESTAMP_SECONDS': history_values['timestamp']['timeInSeconds'],
            'TIMESTAMP_NANO_OFFSET': history_values['timestamp']['offsetInNanos'],
            'QUALITY': 'GOOD',
            'VALUE': value_dict[list(value_dict.keys())[0]],
          }
          writer.writerow(list(values_dict.values()))
    """
    Check for next token and break when pagination is complete.
    """  
    if 'nextToken' in property_history:
      token = property_history['nextToken']
    else:
      break

2.2 Ingest data to the staging account using CreateBulkImportJob

Use the values.csv file to import data into AWS IoT SiteWise with the CreateBulkImportJob API operation. Define the following parameters when you create the import job; a hedged boto3 sketch follows the list. For a code sample, see CreateBulkImportJob in the AWS documentation.

  1. Replace the adaptive-ingestion-flag with true or false. For this exercise, set the value to true.
    1. By setting the value to true, the bulk import job does the following:
      1. Ingests new data into AWS IoT SiteWise.
      2. Calculates metrics and transforms, and supports notifications for data with a timestamp that is within seven days.
    2. If you set the value to false, the bulk import job ingests historical data into AWS IoT SiteWise.
  2. Replace the delete-files-after-import-flag with true to delete the data from the Amazon S3 data bucket after it is ingested into AWS IoT SiteWise warm tier storage. For more information, see Create a bulk import job (AWS CLI).
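Putting these parameters together, here is a minimal boto3 sketch of the import call against the staging account. The job name, bucket, key, role ARN, and error-report prefix are placeholders; the column order matches the values.csv file written in step 2.1.

import boto3

# Client for the staging account; the profile name is a placeholder.
sitewise = boto3.Session(profile_name='staging').client('iotsitewise')

response = sitewise.create_bulk_import_job(
  jobName='sitewise-telemetry-import',
  # IAM role that allows AWS IoT SiteWise to read from the data bucket and
  # write to the error report location (placeholder ARN).
  jobRoleArn='arn:aws:iam::<staging-account-id>:role/<bulk-import-role>',
  files=[{'bucket': 'example-transfer-bucket', 'key': 'values.csv'}],
  errorReportLocation={'bucket': 'example-transfer-bucket', 'prefix': 'errors/'},
  jobConfiguration={
    'fileFormat': {
      'csv': {
        'columnNames': [
          'ASSET_ID', 'PROPERTY_ID', 'DATA_TYPE',
          'TIMESTAMP_SECONDS', 'TIMESTAMP_NANO_OFFSET',
          'QUALITY', 'VALUE',
        ],
      },
    },
  },
  adaptiveIngestion=True,        # the adaptive-ingestion-flag described above
  deleteFilesAfterImport=True,   # the delete-files-after-import-flag described above
)
print(response['jobId'], response['jobStatus'])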

Clean Up

After you validate the results in the staging account, you can delete the data from the development account using the AWS IoT SiteWise DeleteAsset and DeleteAssetModel API operations. Alternatively, you can keep the development account and its historical data for further development and testing activities.
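As an illustration, a minimal cleanup sketch against the development account might look like the following. The IDs are placeholders; child assets must be disassociated or deleted before their parents, and an asset model can only be deleted once none of its assets remain.

import boto3

# Client for the development account; the profile name is a placeholder.
sitewise = boto3.Session(profile_name='dev').client('iotsitewise')

asset_ids = ['<dev-asset-id>']              # delete child assets before parents
asset_model_ids = ['<dev-asset-model-id>']  # deletable only once their assets are gone

for asset_id in asset_ids:
  sitewise.delete_asset(assetId=asset_id)
  # Wait until the asset is fully deleted before removing its model.
  sitewise.get_waiter('asset_not_exists').wait(assetId=asset_id)

for asset_model_id in asset_model_ids:
  sitewise.delete_asset_model(assetModelId=asset_model_id)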

Conclusion

In this blog post, we addressed the challenge industrial customers face when scaling their AWS IoT SiteWise deployments from a PoV to production across multiple plants and production lines, and how AWS IoT SiteWise addresses that challenge. Migrating metadata (such as asset models and asset/enterprise hierarchies) along with historical telemetry data between AWS accounts ensures consistent data context and supports promoting industrial IoT assets and data through the development lifecycle. For additional details, see Bulk operations with assets and models.

Author biographies


Joyson Neville Lewis

Joyson Neville Lewis is a Sr. IoT Data Architect with AWS Professional Services. Joyson worked as a software/data engineer before diving into the conversational AI and Industrial IoT space. He helps AWS customers materialize their AI visions using voice assistant, chatbot, and IoT solutions.


Anish Kunduru

Anish Kunduru is an IoT Data Architect with AWS Professional Services. Anish leverages his background in stream processing, R&D, and Industrial IoT to help AWS customers scale prototypes to production-ready software.


Ashok Padmanabhan

Ashok Padmanabhan is a Sr. IoT Data Architect with AWS Professional Services. Ashok primarily works with Manufacturing and Automotive to design and build Industry 4.0 solutions.


Avik Ghosh

Avik is a Senior Product Manager on the AWS Industrial IoT team, focusing on the AWS IoT SiteWise service. With over 18 years of experience in technology innovation and product delivery, he specializes in Industrial IoT, MES, Historian, and large-scale Industry 4.0 solutions. Avik contributes to the conceptualization, research, definition, and validation of AWS IoT service offerings.
