- Flow
- Semiconductor Film Thickness Measurement System - FilmCheck AS300
- Edge Computing Hardware
- Azure Cloud service used
- Development Environment
- Prerequisites
- Digital Twins Definition Language
- Data transmission format
- The path of the Log & Report & Image in Azure Blob
- The file structure in the Github repository
- Step-by-step Guide
- 0. If the prerequisite for the environment is to use Azure Ubuntu VM
- 1. Preparing the operating environment
- 2. Set Environment Variable
- 3. Create a resource group
- 4. Deploying Azure Digital Twin
- 5. Create Azure IoT Hub and Create IoT Device
- 6. Build Azure 3D Scenes Studio
- 7. Setting up notifications to be sent through Microsoft Teams
- 8. Database environment setup and configuration
- 9. Deploy the Azure Container Apps
- Upgrade the Azure CLI
- Create an Azure Container Registry
- List the container registries under the current subscription
- Log in to the Azure Container Registry
- Create a Docker Image
- Check Azure container registry from Browser
- Create a Container Apps Environments
- Get default Domain
- Deploy your image to Azure Container App
- 10. Deploying the Frontend to Azure Static Web Apps
- 11. Launching the program on the Edge
- 12. Launch the program for FilmCheck AS300
- 13. Confirming the results from Azure Digital Twins 3D Scenes Studio
- 14. Confirming the results from the frontend
- 15. Confirming the results from the Microsoft HoloLens
- 16. Analysis and Learning of Film Thickness Data
- Reference
- Contributor
- License
This project contains a sample for working with Azure Digital Twins:
- A building scenario sample written in Vue, Node.js, and Python. The sample can be used to set up and run a full end-to-end scenario with Azure Digital Twins.
- The designed flow and architecture are shown below:
- Upload the corresponding status and measurement results of the Semiconductor Film Thickness Measurement System (FilmCheck AS300) in DTDL format to Azure Digital Twins.
- Send the status and measurement results of the Semiconductor Film Thickness Measurement System (FilmCheck AS300) to the edge for analysis and serialization, and then transmit them to Azure IoT Hub.
- After analyzing the relevant data received from Azure IoT Hub, the program deployed in the Azure Container App writes it into the corresponding property in Azure Digital Twin.
- Retrieve the corresponding properties from Azure Digital Twins and display them in Azure 3D Scenes Studio.
- Use Microsoft HoloLens to understand the current status and measurement results of the Semiconductor Film Thickness Measurement System (FilmCheck AS300) through the dashboard deployed in Azure Static Web App.
- When an error occurs in the Semiconductor Film Thickness Measurement System (FilmCheck AS300), send an alert message to the administrator group in real-time via Microsoft Teams.
- Upload the logs and reports received from the Semiconductor Film Thickness Measurement System (FilmCheck AS300) on Edge to Azure Blob. When Azure Blob receives the upload event of the report, trigger the Azure Machine Learning pipeline to perform data analysis and generate results, then upload them to Azure Blob and Job Metrics/Images.
- Keyword:
- Azure IoT Hub, Docker, Azure Container Registry, Azure Container Apps, Azure Digital Twins, Digital Twins Definition Language (DTDL), Azure 3D Scenes Studio, Azure Digital Twins Explorer, Azure App Service, Microsoft HoloLens, Creating a 3D model of a physical object, Babylon.js, Azure Machine Learning, Azure Static Web App, Vue, Azure storage account, Azure Database for PostgreSQL, GitHub, Azure Active Directory and Microsoft Teams.
Spectral reflectometry (SR) is a method of characterizing unknown properties of a sample by measuring the reflection of electromagnetic radiation. The reflected light is collected by a spectrometer and converted into spectral signals. After theoretical analysis and spectral fitting, the parameters of the sample can be obtained. In our system, it can measure film thickness, (n, k) values, and optionally TSV (through-silicon via) depth.
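To illustrate the principle (not FilmCheck's proprietary fitting algorithm): for a transparent film at normal incidence, thickness can be estimated from two adjacent reflectance maxima λ₁ > λ₂ as d = λ₁λ₂ / (2n(λ₁ − λ₂)). A minimal sketch, where the wavelengths and refractive index below are made-up example values:

```python
def thickness_from_maxima(lambda1_nm: float, lambda2_nm: float, n: float) -> float:
    """Estimate film thickness (nm) from two adjacent interference maxima
    at normal incidence: d = l1 * l2 / (2 * n * (l1 - l2))."""
    if lambda1_nm <= lambda2_nm:
        raise ValueError("lambda1 must be the longer wavelength")
    return (lambda1_nm * lambda2_nm) / (2.0 * n * (lambda1_nm - lambda2_nm))

# Example values only: SiO2 (n ~ 1.46) with maxima observed at 600 nm and 550 nm
d = thickness_from_maxima(600.0, 550.0, 1.46)
```

Real instruments fit the full spectrum against a film-stack model rather than using two maxima, which is why the specification sheet can reach 200 Å at the thin end.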
- Features
- Extra film thickness range up to 150 µm
- Fast alignment with high reliability R - Θ stage
- Precise measurement location for patterned film
- Generate map recipe in seconds
- Double check measurement location by preview color camera
- Auto focusing for precise measurement
- Specifications
- Oxide, Nitride, PR, PI film on Si, GaAs, glass, metals substrate
- Excellent for patterned film with auto mapping
- Mapping size up to 300 mm wafer
- Absolute reflectivity measurements
- Film thickness range: 200 Å ~ 150 µm
- Repeatability: < 10 Å @ 5000 Å Oxide on Si substrate
- Measurement Spot Size: φ40 µm (available others)
- Wafer size compatibility: 4”, 6”, 8”, 12”
- SEMI S2 compliance
- Information
- Azure IoT Hub
- Azure Event Hub
- Azure Active Directory
- Azure Digital Twins
- Azure Digital Twins Explorer
- 3D Scenes Studio for Azure Digital Twins
- Azure Container Registry
- Azure Container Apps
- Azure Machine Learning Studio
- Azure Static Web Apps
- Azure Storage Account
- Azure Database for PostgreSQL
Application | Development Environment - Hardware / OS | Language |
---|---|---|
Edge | NVIDIA Jetson AGX Xavier | Python 3.6 |
Cloud | MacBook Pro M1 Max (macOS Ventura 13.0.1); an Azure Ubuntu VM can also be used to perform the following operations | Node v18.13.0 (LTS) |
- Sign in to the Azure portal
- Install the Azure CLI
- Sign in with Azure CLI
- Install the Visual Studio Code
- Install the Docker CLI
- Install the pgAdmin
- Sign in to GitHub
- Install Node.js
- Install Git and Sign in to GitHub
The DTDL model is defined in the following two sections:
- Lab
{
"@id": "dtmi:itri:Lab;1",
"@type": "Interface",
"@context": "dtmi:dtdl:context;2",
"displayName": "Lab",
"contents": [
{
"@type": "Relationship",
"name": "contains",
"properties": [
{
"@type": "Property",
"name": "targetModel",
"schema": "string"
}
],
"target": "dtmi:itri:cms:filmcheck;1"
}
]
}
- Semiconductor Film Thickness Measurement System (FilmCheck AS300)
Type Error | Machine Status | Color | Annotation |
---|---|---|---|
-1 | Shutdown | Default Color | The program for FilmCheck AS300 has been closed. |
0 | Idle | Green | The program for FilmCheck AS300 has been opened. |
1 | inOperation | Orange | The program for FilmCheck AS300 is currently performing wafer inspection. |
2 | Error | Red | An error occurred during wafer inspection in the program for FilmCheck AS300. |
{
"@context": [
"dtmi:dtdl:context;2"
],
"@id": "dtmi:itri:cms:filmcheck;1",
"@type": "Interface",
"displayName": "FilmCheck Device",
"description": "FilmCheck Device",
"contents": [
{
"@type": "Property",
"name": "statusDate",
"displayName": "status date",
"description": "status date",
"schema": "string"
},
{
"@type": "Property",
"name": "statusTime",
"displayName": "status time",
"description": "status time",
"schema": "string"
},
{
"@type": "Property",
"name": "typeError",
"displayName": "status type",
"description": "status type",
"schema": "integer"
},
{
"@type": "Property",
"name": "statusMessage",
"displayName": "status message",
"description": "status message",
"schema": "string"
},
{
"@type": "Property",
"name": "statisticsDate",
"displayName": "statistics date",
"description": "statistics date",
"schema": "string"
},
{
"@type": "Property",
"name": "statisticsTime",
"displayName": "statistics time",
"description": "statistics time",
"schema": "string"
},
{
"@type": "Property",
"name": "statisticsStandardDeviation",
"displayName": "statistics standard deviation",
"description": "statistics standard deviation",
"schema": "double"
},
{
"@type": "Property",
"name": "statisticsAverage",
"displayName": "statistics average",
"description": "statistics average",
"schema": "double"
},
{
"@type": "Property",
"name": "statisticsUniformity",
"displayName": "statistics uniformity",
"description": "statistics uniformity",
"schema": "double"
},
{
"@type": "Property",
"name": "statisticsMax",
"displayName": "statistics max",
"description": "statistics max",
"schema": "double"
},
{
"@type": "Property",
"name": "statisticsMin",
"displayName": "statistics min",
"description": "statistics min",
"schema": "double"
},
{
"@type": "Property",
"name": "statisticsMaxMin",
"displayName": "statistics max min",
"description": "statistics max min",
"schema": "double"
}
]
}
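Values written into the twin must use only the property names and schemas declared in the interface above. A small illustrative validator (not part of the repository) that mirrors the DTDL schema table:

```python
# Property names and schemas mirrored from the FilmCheck DTDL interface above.
FILMCHECK_SCHEMA = {
    "statusDate": str, "statusTime": str, "typeError": int, "statusMessage": str,
    "statisticsDate": str, "statisticsTime": str,
    "statisticsStandardDeviation": float, "statisticsAverage": float,
    "statisticsUniformity": float, "statisticsMax": float,
    "statisticsMin": float, "statisticsMaxMin": float,
}

def validate_properties(props: dict) -> list[str]:
    """Return a list of problems; an empty list means props conform to the model."""
    errors = []
    for name, value in props.items():
        expected = FILMCHECK_SCHEMA.get(name)
        if expected is None:
            errors.append(f"unknown property: {name}")
        elif not isinstance(value, expected):
            errors.append(f"{name}: expected {expected.__name__}")
    return errors
```

Catching a mistyped property name here is cheaper than debugging a rejected patch after deployment.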
-
Transferring data from the Semiconductor Film Thickness Measurement System to the Edge
- Shutdown
{ "date": "2023/02/23", "time": "13:43:34", "message": "Shutdown: The program is preparing to close." }
- Idle
{ "date": "2023/02/23", "time": "13:43:34", "message": "Idle: The program is already open but wafer inspection has not yet started." }
- inOperation
{ "date": "2023/02/23", "time": "13:43:34", "message": "inOperation: The wafer inspection is in progress." }
- Error
{ "date": "2023/02/23", "time": "13:43:34", "message": "Error: OpenSensor() : Sensor usb device list failed or no sensor present : 0x1" }
-
Transferring data from the Edge to Azure IoT Hub
Type Error | Machine Status (type) | Color | Annotation |
---|---|---|---|
-1 | Shutdown | Default Color | The program for FilmCheck AS300 has been closed. |
0 | Idle | Green | The program for FilmCheck AS300 has been opened. |
1 | inOperation | Orange | The program for FilmCheck AS300 is currently performing wafer inspection. |
2 | Error | Red | An error occurred during wafer inspection in the program for FilmCheck AS300. |
{
"status": {
"date": "2023/01/11",
"time": "09:18:50",
"type": "Error",
"message": "-999 Failed to load the configuration file."
},
"statistics": {
"date": "2023/01/11",
"time": "09:18:50",
"standardDeviation": 769.93,
"average": 54954.8,
"uniformity": 98.599,
"max": 54954.8,
"min": 54954.8,
"maxMin": 54954.8,
"fileName": "20230220-011029.csv",
}
}
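The `statistics` block above can be derived from raw per-site thickness readings on the edge. A sketch; note that the uniformity formula here is one common range-based definition and is an assumption, not necessarily the instrument's own:

```python
import statistics

def summarize_thickness(samples_angstrom: list[float]) -> dict:
    """Build the 'statistics' fields of the edge payload from raw
    per-site thickness readings (Å)."""
    avg = statistics.fmean(samples_angstrom)
    mx, mn = max(samples_angstrom), min(samples_angstrom)
    return {
        "standardDeviation": statistics.pstdev(samples_angstrom),
        "average": avg,
        # Assumed range-based definition: 100 * (1 - (max - min) / (2 * mean))
        "uniformity": (1 - (mx - mn) / (2 * avg)) * 100,
        "max": mx,
        "min": mn,
        "maxMin": mx - mn,
    }
```

The resulting dict can be merged with `date`, `time`, and `fileName` fields to form the payload shown above.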
Type | Name | Azure Blob |
---|---|---|
Log | L20230218.log | adt3dstorageaccount/adt/Log |
Report | 20230220-011029.csv | adt3dstorageaccount/adt/Report |
Image | 20230220-011029.png | adt3dstorageaccount/adt/Image |
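The folder-per-type convention in the table can be expressed as a small helper; the upload itself would use the `azure-storage-blob` package (shown only as comments, since it needs a live connection string):

```python
from pathlib import Path

# Folder-per-type layout from the table above (inside the "adt" container).
BLOB_FOLDERS = {".log": "Log", ".csv": "Report", ".png": "Image"}

def blob_path(local_file: str) -> str:
    """Map a local file to its blob name, e.g. 'L20230218.log' -> 'Log/L20230218.log'."""
    p = Path(local_file)
    folder = BLOB_FOLDERS[p.suffix.lower()]
    return f"{folder}/{p.name}"

# Upload sketch with azure-storage-blob (needs BLOB_CONNECTION_STRING from .env):
# from azure.storage.blob import BlobServiceClient
# svc = BlobServiceClient.from_connection_string(conn_str)
# with open(local_file, "rb") as fh:
#     svc.get_blob_client("adt", blob_path(local_file)).upload_blob(fh)
```

This mirrors the layout the Azure Machine Learning trigger expects when a report lands in `adt/Report`.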
├── 3D-Scenes
│ └── semi-v1.glb
├── AzureContainerApp
│ ├── Blob_Trigger
│ │ ├── config
│ │ │ └── config.json
│ │ ├── docker-manifests
│ │ │ └── Dockerfile
│ │ ├── index.js
│ │ ├── migrations
│ │ │ └── 20230219235350-create-log.js
│ │ ├── models
│ │ │ ├── index.js
│ │ │ └── log.js
│ │ ├── package-lock.json
│ │ ├── package.json
│ │ └── tokenService.js
│ ├── DB_Ops
│ │ ├── config
│ │ │ └── config.json
│ │ ├── docker-manifests
│ │ │ └── Dockerfile
│ │ ├── index.js
│ │ ├── migrations
│ │ │ ├── 20230219235350-create-log.js
│ │ │ ├── 20230220012714-create-report.js
│ │ │ └── 20230309235001-create-status.js
│ │ ├── models
│ │ │ ├── index.js
│ │ │ ├── log.js
│ │ │ ├── report.js
│ │ │ └── status.js
│ │ ├── package-lock.json
│ │ └── package.json
│ ├── IoTHub_To_ADT_Notify
│ │ ├── adtService.js
│ │ ├── config
│ │ │ └── config.json
│ │ ├── docker-manifests
│ │ │ └── Dockerfile
│ │ ├── index.js
│ │ ├── migrations
│ │ │ ├── 20230220012714-create-report.js
│ │ │ └── 20230309234114-create-status.js
│ │ ├── models
│ │ │ ├── index.js
│ │ │ ├── report.js
│ │ │ └── status.js
│ │ ├── package-lock.json
│ │ └── package.json
│ └── Query_ADT
│ ├── adtService.js
│ ├── docker-manifests
│ │ └── Dockerfile
│ ├── index.js
│ ├── package-lock.json
│ └── package.json
├── AzureML
│ ├── pipeline-python
│ │ └── go.py
│ └── run_pipeline.ipynb
├── DM
│ └── FilmChek_AS300.png
├── DTDL
│ ├── 3D-FilmCheck-v1.json
│ └── Lab.json
├── Edge
│ ├── Datas
│ │ ├── L20230301-Error.log
│ │ ├── L20230301-Idle.log
│ │ ├── L20230301-Shutdown.log
│ │ ├── L20230301-inOperation.log
│ │ └── L20230301.log
│ ├── Excel
│ │ └── 20230305-144311.xlsx
│ ├── Log
│ ├── Report
│ │ └── 20230313-140020.csv
│ ├── edge.py
│ ├── publisher-test
│ │ ├── pubLog.py
│ │ └── pubReport.py
│ ├── requirements.txt
│ └── uploadFileToBlob
│ ├── Log
│ │ └── L20230301.log
│ ├── Report
│ │ └── 20230313-140020.csv
│ ├── app.py
│ └── requirements.txt
├── Frontend
│ ├── README.md
│ ├── babel.config.js
│ ├── jsconfig.json
│ ├── package-lock.json
│ ├── package.json
│ ├── public
│ │ ├── assets
│ │ │ └── semi-v1.glb
│ │ ├── favicon.ico
│ │ ├── index.html
│ │ └── itri-64.ico
│ ├── src
│ │ ├── App.vue
│ │ ├── assets
│ │ │ └── logo.png
│ │ ├── components
│ │ │ ├── InfoModal.vue
│ │ │ └── SpinnerModal.vue
│ │ ├── main.js
│ │ ├── router
│ │ │ └── index.js
│ │ ├── scenes
│ │ │ └── Scene.js
│ │ ├── store
│ │ │ └── index.js
│ │ └── views
│ │ └── IndoorView.vue
│ └── vue.config.js
├── README.md
└── Script
└── install-docker.sh
-
Log in to Azure Ubuntu VM
ssh Account@Azure-Ubuntu-VM-IP
-
Download the zip file of the source code
git clone https://github.com/ArcherHuang/Azure-Digital-Twins-End-To-End-Sample.git
-
Install Docker Engine
-
Install Buildx
sudo apt-get install qemu qemu-user-static qemu-user
mkdir -p ~/.docker/cli-plugins
sudo docker version --format '{{.Server.Experimental}}' | grep -q 'true' && DOCKER_BUILDKIT=1 || DOCKER_BUILDKIT=0
curl -SL https://github.com/docker/buildx/releases/download/v0.6.1/buildx-v0.6.1.linux-amd64 -o ~/.docker/cli-plugins/docker-buildx
chmod a+x ~/.docker/cli-plugins/docker-buildx
docker buildx install
docker buildx version
-
Install Azure CLI
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
az --version
-
Sign in to Azure via AZ CLI
az login
-
Get subscription ID
az account subscription list
-
Switch to the subscription to be used
az account set --subscription "SUBSCRIPTION-ID"
-
Install Node.js
curl -s https://deb.nodesource.com/setup_16.x | sudo bash
sudo apt install nodejs -y
node -v
-
Commands for macOS and Ubuntu environment
export RESOURCE_GROUP="adt-3d-rg"
export REGION="japaneast"
export ADT_NAME="adt-3d"
export BLOB_NAME="adt3dstorageaccount"
export BLOB_CONTAINER_NAME="adt"
export ACR_NAME="containerforacr"
export ACA_NAME_HUB="adt3dhub"
export ACA_NAME_QUERY="adt3dquery"
export ACA_NAME_BLOB="blobtrigger"
export ACA_NAME_DB_Ops="dbops"
export ACA_ENVIRONMENT_HUB="env-region-containerapps-hub"
export ACA_ENVIRONMENT_QUERY="env-region-containerapps-query"
export ACA_ENVIRONMENT_BLOB="env-region-containerapps-blob"
export ACA_ENVIRONMENT_DB_OPS="env-region-containerapps-db-ops"
export IOT_HUB_NAME="adt-3d"
export IOT_DEVICE_NAME="FilmCheck01"
export IOT_HUB_CONSUMER_GROUP_CONTAINER_APP="containerapp"
export IOT_HUB_CONSUMER_GROUP_STATIC_WEB_APP="forStaticWeb"
export DB_PG_NAME="semiconductor"
export DB_SKU="GP_Gen5_2"
export DB_USER_NAME="azureuser"
export DB_USER_PASSWORD="Pa~w@0rD"
echo $RESOURCE_GROUP
echo $REGION
echo $ADT_NAME
echo $BLOB_NAME
echo $BLOB_CONTAINER_NAME
echo $ACA_NAME_HUB
echo $ACA_NAME_QUERY
echo $ACA_NAME_BLOB
echo $ACA_NAME_DB_Ops
echo $ACA_ENVIRONMENT_HUB
echo $ACA_ENVIRONMENT_QUERY
echo $ACA_ENVIRONMENT_BLOB
echo $ACA_ENVIRONMENT_DB_OPS
echo $IOT_HUB_NAME
echo $IOT_DEVICE_NAME
echo $IOT_HUB_CONSUMER_GROUP_CONTAINER_APP
echo $IOT_HUB_CONSUMER_GROUP_STATIC_WEB_APP
echo $DB_PG_NAME
echo $DB_SKU
echo $DB_USER_NAME
echo $DB_USER_PASSWORD
-
Create an Azure Digital Twin
-
Get Azure Digital Twin Host URL
- Commands for macOS and Ubuntu environment
export ADT_Host_Name=`az dt show --dt-name $ADT_NAME --resource-group $RESOURCE_GROUP --query hostName`
echo $ADT_Host_Name
-
Get User Principal Name
az ad user list
-
Set parameter
- Commands for macOS and Ubuntu environment
export USER_PRINCIPAL_NAME="Change it to the userPrincipalName obtained in the previous step."
echo $USER_PRINCIPAL_NAME
- Commands for macOS and Ubuntu environment
-
Set Role Assignment
-
Check Role Assignment
-
Upload DTDL Model
-
Create the Twins
-
Lab
-
Semiconductor Film Thickness Measurement System (FilmCheck AS300)
-
-
Initial Property
-
Commands for macOS and Ubuntu environment
az dt twin update -n $ADT_NAME --twin-id FilmCheck01 -g $RESOURCE_GROUP --json-patch '[
  {"op":"add", "path":"/statusDate", "value": "0"},
  {"op":"add", "path":"/statusTime", "value": "0"},
  {"op":"add", "path":"/typeError", "value": -1},
  {"op":"add", "path":"/statusMessage", "value": "0"},
  {"op":"add", "path":"/statisticsDate", "value": "0"},
  {"op":"add", "path":"/statisticsTime", "value": "0"},
  {"op":"add", "path":"/statisticsStandardDeviation", "value": 0.0},
  {"op":"add", "path":"/statisticsAverage", "value": 0.0},
  {"op":"add", "path":"/statisticsUniformity", "value": 0.0},
  {"op":"add", "path":"/statisticsMax", "value": 0.0},
  {"op":"add", "path":"/statisticsMin", "value": 0.0},
  {"op":"add", "path":"/statisticsMaxMin", "value": 0.0}
]'
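The same json-patch shape can be generated programmatically instead of hand-writing it. A hedged sketch; the Azure SDK's `DigitalTwinsClient.update_digital_twin` accepts this patch list, shown here only as a comment since it needs live credentials:

```python
def to_patch(props: dict) -> list[dict]:
    """Convert {name: value} into the JSON Patch list of 'add' operations
    expected by `az dt twin update --json-patch`."""
    return [{"op": "add", "path": f"/{name}", "value": value}
            for name, value in props.items()]

# Applying it with the Python SDK (azure-digitaltwins-core, azure-identity):
# from azure.identity import DefaultAzureCredential
# from azure.digitaltwins.core import DigitalTwinsClient
# client = DigitalTwinsClient("https://<ADT_Host_Name>", DefaultAzureCredential())
# client.update_digital_twin("FilmCheck01", to_patch({"typeError": -1}))
```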
-
-
Create Relationship
-
List Relationship
-
Check from Azure Digital Twins Explorer
-
Create Azure IoT Hub
-
Create IoT Device
-
Create 2 event hub consumer groups
-
Commands for macOS and Ubuntu environment
az iot hub consumer-group create --hub-name $IOT_HUB_NAME --name $IOT_HUB_CONSUMER_GROUP_CONTAINER_APP --resource-group $RESOURCE_GROUP
az iot hub consumer-group create --hub-name $IOT_HUB_NAME --name $IOT_HUB_CONSUMER_GROUP_STATIC_WEB_APP --resource-group $RESOURCE_GROUP
-
-
Get the `FilmCheck01` IoT Device Connection String
-
Create Storage Accounts & Container
-
Create Storage Account
-
Create Container
-
Change access level
-
Add Storage Blob Data Owner
-
Retrieve the `subscriptionId` from the results of `az account subscription list`
-
Set parameter
- Commands for macOS and Ubuntu environment
export SUBSCRIPTION_ID="Enter the subscriptionId obtained in the previous step."
echo $SUBSCRIPTION_ID
- Commands for macOS and Ubuntu environment
-
Create a role assignment.
-
-
-
Upload 3D Scenes to Studio & Create Twin
-
Enable Cross-Origin Resource Sharing (CORS) for storage account
-
3D Scenes Studio
- Click the `3D Scenes` button
- Click on the pencil icon and configure the instance and storage container details
- Add a new 3D scene
- After the file upload is completed, relevant information can also be viewed in the created blob.
- Once the file is uploaded, you'll see it listed back on the main screen of 3D Scenes Studio
- Select the scene to open and view it. The scene will open in Build mode
- Create a scene element for the `Red` color of the tri-color light
  - Select the floor in the scene visualization. This will bring up the possible element actions. Select `+ Create new element`
  - For Display name, enter `Display Red Status`. Under Elements, select Floors.
  - Select `Visual rules` > `Add Rule`
  - Enter a Display name of `Check Red Status`. Leave the Property expression on `Single property` and open the property dropdown list. It contains the names of all the properties on the primary twin for the Floor element. Select `typeError`. Then, select `Add condition`.
  - Add the `typeError` check condition
- Create a scene element for the `Orange` color of the tri-color light
  - Select the floor in the scene visualization. This will bring up the possible element actions. Select `+ Create new element`
  - For Display name, enter `Display Orange Status`. Under Elements, select Floors.
  - Select `Visual rules` > `Add Rule`
  - Enter a Display name of `Check Orange Status`. Leave the Property expression on `Single property` and open the property dropdown list. It contains the names of all the properties on the primary twin for the Floor element. Select `typeError`. Then, select `Add condition`.
  - Add the `typeError` check condition
- Create a scene element for the `Green` color of the tri-color light
  - Select the floor in the scene visualization. This will bring up the possible element actions. Select `+ Create new element`
  - For Display name, enter `Display Green Status`. Under Elements, select Floors.
  - Select `Visual rules` > `Add Rule`
  - Enter a Display name of `Check Green Status`. Leave the Property expression on `Single property` and open the property dropdown list. It contains the names of all the properties on the primary twin for the Floor element. Select `typeError`. Then, select `Add condition`.
  - Add the `typeError` check condition
- Create a new element to display Statistics
  - Select the floor in the scene visualization. This will bring up the possible element actions. Select `+ Create new element`
  - For Display name, enter `Display Statistics`. Under Elements, select Floors.
  - Add the `Status` widget
  - Add the `Standard Deviation` widget
  - Add the `Average` widget
  - Add the `Uniformity` widget
  - Add the `Max` widget
  - Add the `Min` widget
  - Add the `Max Min` widget
- View scene
- Click the
-
-
Enter a unique and identifiable name in the `Team name` (`團隊名稱`) field, and click `Next` (`下一步`) when finished.
-
Click the `Connectors` (`連接器`) button in the `...` menu at the top-right corner. (The `Connectors` option may take a moment to appear; please wait.)
-
Click the `Connectors` button in the `...` menu at the top-right corner
-
Enter a unique and identifiable name in the `Incoming Webhook` field, and click `Create` (`建立`) when finished.
-
After creation is completed, copy the `URL` and then click the `Done` (`完成`) button
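Once the webhook URL is saved, the containerized app posts alerts to it when an error occurs. A minimal sketch of building such a payload; the simple `{"text": ...}` body is the basic message form a Teams Incoming Webhook accepts, and `WEBHOOK_URL` is a placeholder for the URL copied above:

```python
import json

def build_alert(device: str, message: str) -> str:
    """Serialize a minimal Teams incoming-webhook payload as JSON."""
    return json.dumps({"text": f"[{device}] {message}"})

# Posting it with the standard library (WEBHOOK_URL is the URL copied above):
# import urllib.request
# req = urllib.request.Request(
#     WEBHOOK_URL,
#     data=build_alert("FilmCheck01", "Error: OpenSensor() failed").encode(),
#     headers={"Content-Type": "application/json"})
# urllib.request.urlopen(req)
```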
-
Create a server
-
Allow access to Azure Services
-
Get the local IP address.
-
Configure a server-based firewall rule
-
Create a database
- Open the `pgAdmin` software
- Right-click the `Servers` entry > click `Register` > click `Server...`
- Enter a recognizable name in the `Name` field
- Enter the relevant information on the `Connection` tab
  - In the `Host name` field, enter the `Server name` obtained from the `Get the server name and the admin username` section.
  - In the `Username` field, enter the `Admin username` obtained from the `Get the server name and the admin username` section.
  - In the `Password` field, enter the password you have set.
  - After entering the above information, click the `Save` button.
  - If login fails, please reset your password.
- Establish the `FilmCheck` database
-
Create the Log and Report Table
-
Switch working directory
cd ./AzureContainerApp/DB_Ops
-
Install required packages
npm i
-
Modify the `development` section of the `config/config.json` file (username, password, database, host)

{
  "development": {
    "username": "Please modify it with the information obtained earlier.",
    "password": "Please modify it with the information obtained earlier.",
    "database": "Please modify it with the information obtained earlier.",
    "host": "Please modify it with the information obtained earlier.",
    "dialect": "postgres",
    "ssl": true,
    "dialectOptions": { "ssl": { "require": true } }
  }
}
-
Create the Tables
npx sequelize db:migrate --env development
-
Open Query Tool
-
Check the Log Table
SELECT * FROM public."Logs"
-
Check the Report Table
SELECT * FROM public."Reports"
-
Check the Status Table
SELECT * FROM public."Statuses"
-
- Upgrade the Azure CLI
az upgrade
az config set auto-upgrade.enable=yes
az extension add --name containerapp --upgrade
-
Create an Azure Container Registry
-
List the container registries under the current subscription
-
Log in to the Azure Container Registry
-
Commands for macOS environment
az acr login --name $ACR_NAME
-
Commands for Ubuntu environment
az acr credential show --name $ACR_NAME --query passwords[0].value --output tsv
sudo az acr login --name $ACR_NAME
Username: containerforacr
Password: Please enter the result obtained from the command 'az acr credential show'
-
-
Switch working directory (IoT Hub To ADT)
cd ../IoTHub_To_ADT_Notify
-
Modify the `development` section of the `config/config.json` file (username, password, database, host)

{
  "development": {
    "username": "Please modify it with the information obtained earlier.",
    "password": "Please modify it with the information obtained earlier.",
    "database": "Please modify it with the information obtained earlier.",
    "host": "Please modify it with the information obtained earlier.",
    "dialect": "postgres",
    "ssl": true,
    "dialectOptions": { "ssl": { "require": true } }
  }
}
-
Create a Docker Image (IoT Hub To ADT)
-
Create a Container Apps Environments (IoT Hub To ADT)
-
Get default Domain (IoT Hub To ADT)
-
Set parameters
-
Get Build-in Event Hub-compatible endpoint
-
Commands for macOS and Ubuntu environment
az iot hub connection-string show --hub-name $IOT_HUB_NAME --default-eventhub --resource-group $RESOURCE_GROUP --query connectionString
export EVENTHUB_CONNECTION_STRING="Enter the connectionString obtained in the previous step."
echo $EVENTHUB_CONNECTION_STRING
-
-
Get Azure Digital Twin Host URL
- Commands for macOS and Ubuntu environment
az dt show --dt-name $ADT_NAME --resource-group $RESOURCE_GROUP --query hostName
export ADT_Host_Name="Enter the Azure Digital Twin Host URL obtained in the previous step."
echo $ADT_Host_Name
-
-
Confirm parameters
echo $ADT_NAME
echo $IOT_HUB_CONSUMER_GROUP_CONTAINER_APP
echo $EVENTHUB_CONNECTION_STRING
echo $ADT_Host_Name
echo $WEBHOOK_URL
-
Deploy the image to Azure Container App (IoT Hub To ADT)
-
Commands for macOS and Ubuntu environment
az containerapp create \
  --name $ACA_NAME_HUB \
  --resource-group $RESOURCE_GROUP \
  --environment $ACA_ENVIRONMENT_HUB \
  --registry-server $ACR_NAME.azurecr.io \
  --image $ACR_NAME.azurecr.io/hub-to-adt-notify:0.1 \
  --env-vars "EVENTHUB_NAME=secretref:eventhub-name" "CONSUMER_GROUP_NAME=secretref:consumer-group-name" "EVENTHUB_CONNECTION_STRING=secretref:eventhub-connection-string" "ADT_Host_Name=secretref:adt-host-name" "WEBHOOK_URL=secretref:webhook-url" \
  --secrets "eventhub-name=$ADT_NAME" "consumer-group-name=$IOT_HUB_CONSUMER_GROUP_CONTAINER_APP" "eventhub-connection-string=$EVENTHUB_CONNECTION_STRING" "adt-host-name=$ADT_Host_Name" "webhook-url=$WEBHOOK_URL" \
  --min-replicas 1 \
  --max-replicas 1 \
  --ingress 'internal' \
  --target-port 3000 \
  --query properties.configuration.ingress.fqdn

"adt3dhub.internal.proudsky-415d04fa.japaneast.azurecontainerapps.io"
-
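For reference, the translation this container performs from an edge status message to Azure Digital Twins patch operations can be sketched in Python (the deployed IoTHub_To_ADT_Notify app is Node.js; this is an illustrative re-expression, not the deployed code), using the status-to-`typeError` mapping from the tables earlier:

```python
# Status-string to typeError mapping from the machine-status tables above.
TYPE_ERROR = {"Shutdown": -1, "Idle": 0, "inOperation": 1, "Error": 2}

def status_to_patch(msg: dict) -> list[dict]:
    """Translate an edge 'status' object into ADT json-patch operations
    targeting the FilmCheck twin's status properties."""
    return [
        {"op": "add", "path": "/statusDate", "value": msg["date"]},
        {"op": "add", "path": "/statusTime", "value": msg["time"]},
        {"op": "add", "path": "/typeError", "value": TYPE_ERROR[msg["type"]]},
        {"op": "add", "path": "/statusMessage", "value": msg["message"]},
    ]
```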
-
Set role assignment
-
Enable System assigned
-
Add role assignment
-
Click on the created
adt3dhub
in theContainer App
on theAzure portal
> Click onIdentity
in the left-hand menu > Click onAzure role assignments
-
Click on the
+ Add role assignment (Preview)
button- Enter the relevant information
- Select
Subscription
in theScope
field. - Select the desired subscription in the
Subscription
field. - Select
Azure Digital Twins Data Owner
in theRole
field. - After entering the above information, click on the
Save
button.
- Select
- Enter the relevant information
-
-
-
Switch working directory (Query ADT)
cd ../Query_ADT
-
Create a Container Apps Environments (Query ADT)
-
Get default Domain (Query ADT)
-
Create the Query ADT Docker Image (Query ADT)
-
Confirm parameters
-
Deploy the Query ADT image to Azure Container App (Query ADT)
- Please remember the URL returned after executing the command.
-
Commands for macOS and Ubuntu environment
az containerapp create \
  --name $ACA_NAME_QUERY \
  --resource-group $RESOURCE_GROUP \
  --environment $ACA_ENVIRONMENT_QUERY \
  --registry-server $ACR_NAME.azurecr.io \
  --image $ACR_NAME.azurecr.io/query-adt:0.2 \
  --env-vars "ADT_Host_Name=secretref:adt-host-name" \
  --secrets "adt-host-name=$ADT_Host_Name" \
  --min-replicas 1 \
  --max-replicas 1 \
  --ingress 'external' \
  --target-port 80 \
  --query properties.configuration.ingress.fqdn

"adt3dquery.kindpebble-e066fde2.japaneast.azurecontainerapps.io"
-
- Please remember the URL returned after executing the command.
-
Set parameter
- Commands for macOS and Ubuntu environment
export QUERY_ADT_URL="Enter the URL obtained in the previous step."
echo $QUERY_ADT_URL
- Commands for macOS and Ubuntu environment
-
Set role assignment (Query ADT)
-
Enable System assigned
-
Add role assignment
-
Click on the created
adt3dquery
in theContainer App
on theAzure portal
> Click onIdentity
in the left-hand menu > Click onAzure role assignments
-
Click on the
+ Add role assignment (Preview)
button- Enter the relevant information
- Select
Subscription
in theScope
field. - Select the desired subscription in the
Subscription
field. - Select
Azure Digital Twins Data Owner
in theRole
field. - After entering the above information, click on the
Save
button.
- Select
- Enter the relevant information
-
-
-
Test (Query ADT)
https://adt3dquery.kindpebble-e066fde2.japaneast.azurecontainerapps.io/hello
https://adt3dquery.kindpebble-e066fde2.japaneast.azurecontainerapps.io/adt/query/statistics
- Switch working directory (DB Ops)
cd ../DB_Ops
-
Modify the `development` section of the `config/config.json` file (username, password, database, host)

{
  "development": {
    "username": "Please modify it with the information obtained earlier.",
    "password": "Please modify it with the information obtained earlier.",
    "database": "Please modify it with the information obtained earlier.",
    "host": "Please modify it with the information obtained earlier.",
    "dialect": "postgres",
    "ssl": true,
    "dialectOptions": { "ssl": { "require": true } }
  }
}
-
Create the Query ADT Docker Image (DB Ops)
-
Create a Container Apps Environments (DB Ops)
-
Deploy the Query ADT image to Azure Container App (DB Ops)
- Please remember the URL returned after executing the command.
-
Commands for macOS and Ubuntu environment
az containerapp create \
  --name $ACA_NAME_DB_Ops \
  --resource-group $RESOURCE_GROUP \
  --environment $ACA_ENVIRONMENT_DB_OPS \
  --registry-server $ACR_NAME.azurecr.io \
  --image $ACR_NAME.azurecr.io/db-ops:0.1 \
  --min-replicas 1 \
  --max-replicas 1 \
  --ingress 'external' \
  --target-port 80 \
  --query properties.configuration.ingress.fqdn

"dbops.victoriousground-728f9a8f.japaneast.azurecontainerapps.io"
-
- Please remember the URL returned after executing the command.
-
Set parameter
- Commands for macOS and Ubuntu environment
export DB_OPS_API_URL="Enter the URL obtained in the previous step."
echo $DB_OPS_API_URL
- Commands for macOS and Ubuntu environment
-
Test (DB Ops)
- https://DB_OPS_API_URL/logs
- https://DB_OPS_API_URL/reports
- https://DB_OPS_API_URL/status/latest
- https://DB_OPS_API_URL/reports/latest
-
Deploy files from the
./Azure-Digital-Twins-End-To-End-Sample/Frontend
folder to GitHub. -
Click the
Settings
> Click theSecrets and variables
> Click theActions
> Click theNew repository secret
-
   Add the following secrets

   - Add `VUE_APP_EVENTHUB_NAME`
     - Enter `VUE_APP_EVENTHUB_NAME` in the Name field.
     - Enter the string displayed by `echo $ADT_NAME` in the `Secret` field.
     - After entering the above information, click the `Add secret` button.
   - Add `VUE_APP_CONSUMER_GROUP_NAME`
     - Enter `VUE_APP_CONSUMER_GROUP_NAME` in the Name field.
     - Enter the string displayed by `echo $IOT_HUB_CONSUMER_GROUP_STATIC_WEB_APP` in the `Secret` field.
     - After entering the above information, click the `Add secret` button.
   - Add `VUE_APP_EVENTHUB_CONNECTION_STRING`
     - Get the built-in Event Hub-compatible endpoint:
       az iot hub connection-string show --hub-name $IOT_HUB_NAME --default-eventhub --resource-group $RESOURCE_GROUP --query connectionString
     - Enter `VUE_APP_EVENTHUB_CONNECTION_STRING` in the Name field.
     - Enter the result displayed in the previous step in the `Secret` field.
     - After entering the above information, click the `Add secret` button.
   - Add `VUE_APP_API_URL` (ACA - DB Ops)
     - Enter `VUE_APP_API_URL` in the Name field.
     - Enter the string displayed by `echo $DB_OPS_API_URL` in the `Secret` field.
     - After entering the above information, click the `Add secret` button.
   - Add `VUE_APP_BLOB_NAME`
     - Enter `VUE_APP_BLOB_NAME` in the Name field.
     - Enter the string displayed by `echo $BLOB_NAME` in the `Secret` field.
     - After entering the above information, click the `Add secret` button.
   - Add `VUE_APP_QUERY_ADT_URL` (ACA - Query ADT)
     - Enter `VUE_APP_QUERY_ADT_URL` in the Name field.
     - Enter the string displayed by `echo $QUERY_ADT_URL` in the `Secret` field.
     - After entering the above information, click the `Add secret` button.
-
-
- Enter `Static Web App` in the search box of the Azure Portal. > Select the search result for `Static Web Apps`.
- Modify `./ADT_Web/tree/main/.github/workflows/azure-static-web-apps-Random-Number.yml` and commit the changes (you can see this file in the GitHub repository). Add the `env:` section shown below (please align the indentation):

  ```yaml
  jobs:
    build_and_deploy_job:
      if: ...
      runs-on: ubuntu-latest
      name: Build and Deploy Job
      steps:
        - uses: actions/checkout@v2
          with:
            submodules: true
        - name: Build And Deploy
          id: builddeploy
          uses: Azure/static-web-apps-deploy@v1
          with:
            ...
          env:
            VUE_APP_EVENTHUB_NAME: ${{ secrets.VUE_APP_EVENTHUB_NAME }}
            VUE_APP_CONSUMER_GROUP_NAME: ${{ secrets.VUE_APP_CONSUMER_GROUP_NAME }}
            VUE_APP_EVENTHUB_CONNECTION_STRING: ${{ secrets.VUE_APP_EVENTHUB_CONNECTION_STRING }}
            VUE_APP_API_URL: ${{ secrets.VUE_APP_API_URL }}
            VUE_APP_BLOB_NAME: ${{ secrets.VUE_APP_BLOB_NAME }}
            VUE_APP_QUERY_ADT_URL: ${{ secrets.VUE_APP_QUERY_ADT_URL }}
  ```
- Download the zip file of the source code to the edge and unzip it.
  - Please refer to the earlier instructions on how to download the zip file of the source code.
- Install the Mosquitto MQTT broker on the Ubuntu edge:

  ```shell
  sudo apt update
  sudo apt install -y mosquitto
  ```

- Change directory:

  ```shell
  cd ./Edge
  ```

- Install the Python packages with pip and requirements.txt:

  ```shell
  pip3 install -r requirements.txt
  ```

- Get the FilmCheck01 IoT device connection string:

  ```shell
  az iot hub device-identity connection-string show --hub-name $IOT_HUB_NAME --device-id $IOT_DEVICE_NAME --output table --resource-group $RESOURCE_GROUP
  ```
- Get the connection string for the blob.
- Get the container name for the blob.
- Create a `.env` file and add the following information:

  ```shell
  IOT_HUB_DEVICE_CONNECTION_STRING="Enter the result obtained from the Get FilmCheck01 IoT Device Connection String step."
  BLOB_CONNECTION_STRING="Enter the result obtained from the Get the Connection string for the blob step."
  BLOB_CONTAINER_NAME="Enter the result obtained from the Get the container name for the blob step."
  ```

- Run the code:

  ```shell
  python3 edge.py
  ```

- After receiving log information from FilmCheck AS300, the edge stores it locally and uploads it to Azure Blob at 5:00 every day. Similarly, upon receiving report information from FilmCheck AS300, the edge stores it locally and uploads it to Azure Blob in real time.
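The daily 5:00 log upload described above amounts to a simple sleep-until-target loop. A hedged sketch of the scheduling arithmetic (the actual logic lives in `edge.py`; this helper is only an illustration):

```python
from datetime import datetime, timedelta

def seconds_until(now: datetime, hour: int = 5) -> float:
    """Seconds from `now` until the next occurrence of `hour`:00 local time."""
    target = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)  # already past today's slot; wait for tomorrow
    return (target - now).total_seconds()

print(seconds_until(datetime(2024, 1, 1, 4, 0)))  # → 3600.0
print(seconds_until(datetime(2024, 1, 1, 6, 0)))  # → 82800.0
```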
- Enter `Azure Machine Learning` in the search box above and click on `Azure Machine Learning` in the search results.
- Click on the `+ Create` button in the upper left corner. > Then click on `New Workspace`.
- After verifying the relevant information, click on `Create` in the lower left corner.
- After the creation is completed, please click on `Go to resource`.
- Click on `Compute` on the left side to create a compute resource.
- After selecting the desired `CPU` or `GPU` specifications, click on `Create` below.
- Create a notebook.
- Click on the `Python 3.8 - AzureML` execution environment in the upper right corner (a green light indicates it is running).
- Confirm that the `Compute instance` is running (a green light indicates it is running).
- The relevant code:
  - Execute the `安裝套件與確認 SDK 版本` (install packages and confirm the SDK version) code block.
  - Execute the `配置工作環境` (configure the working environment) code block.
  - Execute the `連接已有的 Blob` (connect to the existing Blob) code block.
  - Execute the `建立運算叢集` (create a compute cluster) code block.
  - Execute the `設定執行環境的 config` (set the run environment config) code block.
  - Execute the `建立 Python 腳本步驟` (create the Python script step) code block.
  - Execute the `部署成 pipeline 與儲存 pipeline ID` (deploy as a pipeline and save the pipeline ID) code block.
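The pipeline endpoint saved in the last block is later invoked over REST by the Blob Trigger container. In the AzureML v1 REST API, a published pipeline is started by POSTing a JSON body whose `ExperimentName` names the experiment to run under, optionally with `ParameterAssignments`. A hedged sketch of building that request body (the parameter name `report_path` is a made-up placeholder, not from this sample):

```python
import json
from typing import Optional

def build_pipeline_request(experiment_name: str, parameters: Optional[dict] = None) -> str:
    """Build the JSON body for triggering an AzureML published pipeline endpoint."""
    body = {"ExperimentName": experiment_name}
    if parameters:
        body["ParameterAssignments"] = parameters
    return json.dumps(body)

print(build_pipeline_request("blob-trigger-experiment", {"report_path": "reports/latest.csv"}))
```

The body would be sent with an `Authorization: Bearer <token>` header to the endpoint URL shown on the pipeline endpoint page.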
- Create an `AzureML Pipeline` that is triggered when a report file is uploaded to `Blob`.
  - Click on the `Pipelines` button on the left-hand side. > Click on the `Pipeline endpoints` button at the top. > Click on the `blob-trigger-pipeline`.
  - Set the parameters.
- Switch the working directory (Blob Trigger):

  ```shell
  cd ../Blob_Trigger
  ```

- Modify the content of `development` in the `config/config.json` file (username, password, database):

  ```json
  {
    "development": {
      "username": "Please modify it with the information obtained earlier.",
      "password": "Please modify it with the information obtained earlier.",
      "database": "Please modify it with the information obtained earlier.",
      "host": "Please modify it with the information obtained earlier.",
      "dialect": "postgres",
      "ssl": true,
      "dialectOptions": {
        "ssl": {
          "require": true
        }
      }
    }
  }
  ```
- Create a Blob Trigger Container Apps environment (Blob Trigger).
- Get the default domain (Blob Trigger).
- Create the Blob Trigger Docker image (Blob Trigger).
- Deploy the Blob Trigger image to Azure Container Apps (Blob Trigger).
  - Commands for macOS and Ubuntu environment

    ```shell
    az containerapp create \
      --name $ACA_NAME_BLOB \
      --resource-group $RESOURCE_GROUP \
      --environment $ACA_ENVIRONMENT_BLOB \
      --registry-server $ACR_NAME.azurecr.io \
      --image $ACR_NAME.azurecr.io/blob-trigger:0.1 \
      --env-vars "EXPERIMENT_NAME=secretref:experiment-name" "TRIGGER_ML_ENDPOINT=secretref:trigger-ml-endpoint" \
      --secrets "experiment-name=$EXPERIMENT_NAME" "trigger-ml-endpoint=$TRIGGER_ML_ENDPOINT" \
      --min-replicas 1 \
      --max-replicas 1 \
      --ingress 'external' \
      --target-port 80 \
      --query properties.configuration.ingress.fqdn
    ```

    Example output:

    ```
    "blobtrigger.ashysky-edb6ddb2.japaneast.azurecontainerapps.io"
    ```
- Note the URL obtained after the Azure Container App for the Blob Trigger is created.
- Set `Azure role assignments` in the `Azure Container App` where the `Blob Trigger` was created. (ACA-blobtrigger)
- Set up the webhook to be triggered when a blob is added.
  - Click on `Events` on the left side. > Then click on `More Options`. > Click on `Web Hook`.
. -
Enter relevant information.
- In the
Name
field, please enter a unique and identifiable name.- This example uses
blob-event-subscription
.
- This example uses
- In the
System Topic Name
field, please enter a unique and identifiable name.- This example uses
blob-event-topic
.
- This example uses
- In the
Filter to Event Types
field, please selectBlob Created
andBlob Deleted
. - In the
Endpoint Type
field, please selectWeb Hook
. - In the
Endpoint
field, please enter the URL obtained from the previous sectionURL obtained after Azure Container App for Blob Trigger is created
. - After completing the above information, please click on the
Create
button at the bottom left corner.
- In the
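When the `Web Hook` subscription above is created, Event Grid first sends a `SubscriptionValidationEvent` handshake that the endpoint must echo back; only then do `Blob Created`/`Blob Deleted` events flow. A minimal sketch of that handshake logic (the sample's Blob Trigger container implements its own handler; this only illustrates the protocol):

```python
import json

VALIDATION_EVENT = "Microsoft.EventGrid.SubscriptionValidationEvent"

def handle_event_grid(body: bytes):
    """Return the validation response dict for a handshake request,
    or None when the payload is ordinary Blob events."""
    for event in json.loads(body):
        if event.get("eventType") == VALIDATION_EVENT:
            return {"validationResponse": event["data"]["validationCode"]}
    return None

handshake = json.dumps([{
    "eventType": VALIDATION_EVENT,
    "data": {"validationCode": "1234-abcd"},
}]).encode()
print(handle_event_grid(handshake))  # → {'validationResponse': '1234-abcd'}
```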
- Upload a csv file through the edge to the created Blob to confirm that the pipeline has been triggered.
- Check the metrics and image.
- Verify that the result file added by the pipeline earlier has been added to the `Image` folder in the created Blob.
- https://learn.microsoft.com/en-us/azure/digital-twins/quickstart-3d-scenes-studio
- https://learn.microsoft.com/en-us/azure/digital-twins/how-to-use-3d-scenes-studio#use-custom-advanced-expressions
- https://learn.microsoft.com/en-us/rest/api/azure-digitaltwins/#data-plane
- Huang, Cheng-Chuan
- Huang, Jui-Feng
- Luo, Jun-Wei
- Wei, Hsiang-Chun
- Wei, Shao-Chun
Examples of Azure Digital Twins is licensed under the MIT License.