ADX Kusto plug-in for Azure Digital Twins history

Azure Digital Twins is the perfect solution for modeling your (Industrial) IoT metaverse.

Each ‘actor’ in the real world (devices, machines, places, buildings, environments) can be represented in the digital world.

Azure Digital Twins shows the current situation of those actors, complete with relationships and additional business rules.

These rules can also trigger cascading changes through the graph when parents and siblings are influenced by child property changes.

See a previous blog post on working with Azure Digital Twins and representing the environment in a 3D visualization.

But this solution has one weak spot: what about the historical twin property changes caused by the business rules?

I already demonstrated how to store historical raw telemetry as the cold path.

We would like to store the Twin data next to it.

Fortunately, Azure Digital Twins also supports storing historical data in Azure Data Explorer:

This historical data can even be queried together with Azure Digital Twin graph data.

So, the output of an ADT graph query can be joined with a Kusto query!

Let’s check out how this works.

This post is part two of a series of posts about Azure Digital Twins:

  1. Extending the AZ-220 Digital Twins hands-on lab with 3D visualization
  2. ADX Kusto plug-in for Azure Digital Twins history
  3. Exploring Azure Digital Twins Graph history

This post is also the fifth part of my Azure Data Explorer blog series.

I assume you have set up an Azure Digital Twins environment and Azure Data Explorer already.

I reuse the environment I demonstrated during the Flight Into Azure IoT event:

You can check out the video and find out how it’s created and connected.

For this solution, I manipulated the Twins using a number of Azure Functions.

These functions, listening to events coming from the ADT environment, work in conjunction with event hubs (the actual endpoints for ADT routing):

The graph itself shows a number of twins representing two airplanes each with multiple sensors:

The Data history pane explains that historical twin data will be stored in Azure Data Explorer:

It also tells us it used an Event Hub.

This is because the historical data makes use of the same ADT routing mechanism (with endpoints pointing to Event Hubs), so we need to create a separate Event Hub upfront.

Event Hub

So, I created this extra Event Hub next to the other already existing Event Hubs.

I named it ‘adt-history-eh’:

As always, I keep the partition count the same as the number of partitions of the IoT Hub.

I also added an extra consumer group named ‘adt’:

Note: Because this Event Hub is only used by this connection between ADT and ADX, it’s tempting to use $default. I personally try to avoid using this $default because this is the same consumer group selected by default by many diagnostics tools (IoT Explorer, CLI tools, Visual Studio Code). Technically, by running a diagnosis, you could influence the official flow (multiple consumers reusing the same $default consumer group).

Data History Wizard

Once that Event Hub is created, let’s run the Azure Digital Twins data history wizard.

This wizard is very powerful because it works together with Azure Active Directory so that security is set up correctly.

First, we need to confirm we want to turn the system-assigned managed identity on:

Confirm with yes.

Next, provide the details of the Event Hub we just created, including the consumer group:

Then, the details of the Azure Data Explorer cluster and database must be provided:

If you have multiple Azure Digital Twins environments, you could use separate tables or just one table.

You can provide the name of an already constructed table (see the documentation for the table schema).

I leave the name empty so the wizard will create a table for me.
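For reference, the table the wizard generates follows the documented data history schema. A sketch of what that table definition looks like (treat this as an illustration based on the documentation; the wizard creates it for you, so you normally never run this yourself):

```kusto
// Sketch of the twin property history table, based on the documented schema.
// The relationship columns are part of the documented schema but are not
// used in the property-change examples later in this post.
.create table adt_dh_TechDays2022_neu_adt_northeurope (
    TimeStamp: datetime,        // when ADT processed the update
    SourceTimeStamp: datetime,  // optional source timestamp of the update
    ServiceId: string,          // the ADT instance that emitted the change
    Id: string,                 // twin Id
    ModelId: string,            // DTMI of the twin's model
    Key: string,                // name of the patched property
    Value: dynamic,             // new property value (cast before use)
    RelationshipId: string,     // filled for relationship events
    RelationshipTarget: string
)
```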

After that, permissions must be granted for AAD roles so Azure Data Explorer can access the ADT twin graph:

I just accepted all three permissions and it worked well for me 🙂

Finally, you are able to create the connection:

You see information about the Event Hub and Azure Data Explorer.

The summary also explains that you can check the quality of the connection using the metrics and logs.

Here, you see how you can check success, latency, and failures:

The connection is set up:

When we check the Azure Data Explorer database using the query explorer, we can see the new table is created:

The name of the Digital Twin History table is ‘adt_dh_TechDays2022_neu_adt_northeurope’, as seen in the setup.

Notice the generic structure.

Each Twin property change will be represented as a key and value pair.

The value column is of type dynamic so you need to cast the value afterward!
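For example, a minimal sketch of such a cast, filtering on the ‘enginePressureRatio’ property used later in this post:

```kusto
// The Value column is dynamic, so cast it before doing numeric work.
adt_dh_TechDays2022_neu_adt_northeurope
| where Key == "enginePressureRatio"
| extend ValueAsDouble = todouble(Value)
| project TimeStamp, Id, ValueAsDouble
```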

If the sensors are transmitting telemetry, historical data should be captured by now.

Historical twin data in Azure Data Explorer

Now it’s time to flex your Kusto skills and explore the twin’s history.

Because Azure Data Explorer normally ingests in batches and I want to see near-real-time ingestion, I enable streaming ingestion:

.alter table adt_dh_TechDays2022_neu_adt_northeurope policy streamingingestion enable

Let’s check some incoming messages:

adt_dh_TechDays2022_neu_adt_northeurope
| take 100

We ‘take’ 100 random rows from the table with historical values.

The result looks like this:

Each row is a property change of a twin with a certain Twin Id and the Model Id in place when the change occurred.

This table now collects the outcome of all Twin changes, including the business rules triggered by both Twin telemetry and Twin changes.

It seems multiple rows have the same timestamp. This seems to be related to the fact that these twin properties were all patched at the same time.
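One way to check this assumption is to group rows per twin and timestamp; rows sharing a timestamp for the same twin were patched together. A sketch using standard KQL aggregation:

```kusto
// Find twin updates where multiple properties share one timestamp.
adt_dh_TechDays2022_neu_adt_northeurope
| summarize PatchedProperties = count(), Keys = make_set(Key) by Id, TimeStamp
| where PatchedProperties > 1
```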

To test this, I added some extra twins:

I updated the call sign for the new airplane.

I also updated two airliner fields in one go.

When I query for the latest rows (using ‘top X by FIELD desc’), three new rows are seen:
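Concretely, such a query looks like this (here taking the three most recent rows, ordered by the TimeStamp column):

```kusto
// Show the most recent property changes first.
adt_dh_TechDays2022_neu_adt_northeurope
| top 3 by TimeStamp desc
```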

Yes, the two airliner rows have indeed the same timestamp.

Notice that the creation of twins does not appear in the historical data.

The Service Id represents the Azure Digital Twins service. So technically, multiple ADT services can reuse the same table.
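If multiple ADT services do share one table, the ServiceId column can be used to keep them apart. A sketch (the exact ServiceId value shown here is an assumption; check a sample row from your own table first):

```kusto
// Filter history rows to a single ADT instance.
// "TechDays2022-neu-adt" is a hypothetical ServiceId value.
adt_dh_TechDays2022_neu_adt_northeurope
| where ServiceId == "TechDays2022-neu-adt"
| take 10
```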

Notice the same Azure Data Explorer database was already used for storing cold data coming from the IoT Hub.

This makes it possible to double-check the business rules by comparing them with the original raw data.
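A sketch of such a double-check, comparing averaged twin history with averaged raw telemetry per minute. Note that ‘rawtelemetry’, ‘EnqueuedTime’, and the ‘telemetry.enginePressureRatio’ path are hypothetical names; substitute the cold-path table and columns from your own setup:

```kusto
// Compare twin history against raw cold-path telemetry, binned per minute.
// 'rawtelemetry' and its columns are hypothetical; adjust to your own table.
let twinHistory = adt_dh_TechDays2022_neu_adt_northeurope
    | where Key == "enginePressureRatio"
    | summarize twinAvg = avg(todouble(Value)) by bin(TimeStamp, 1m);
let rawData = rawtelemetry
    | summarize rawAvg = avg(todouble(telemetry.enginePressureRatio)) by bin(EnqueuedTime, 1m);
twinHistory
| join kind=inner rawData on $left.TimeStamp == $right.EnqueuedTime
| project TimeStamp, twinAvg, rawAvg, diff = twinAvg - rawAvg
```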

Azure Digital Twins query plugin

This historical view is a great solution for getting insights into Azure Digital Twins over time.

It even gets better when we combine querying the historical data with querying the ADT graph!

Yes, in Azure Data Explorer we can access the current graph using this plugin:

let adtEndpoint = 'https://TechDays2022-neu-adt.api.neu.digitaltwins.azure.net';
let query = ```SELECT DT FROM DIGITALTWINS DT```;
evaluate azure_digital_twins_query_request(adtEndpoint, query)

We need to provide both the Azure Digital Twins service endpoint (including the HTTPS part) and the query we want to execute.

You probably need to fix the query at some points due to a few minor exceptions, but it works as expected.

This query returns all current twins in your Azure Digital Twins environment:

These are listed as JSON text and that is OK.

The Kusto Query Language can cope with that:

let adtEndpoint = 'https://TechDays2022-neu-adt.api.neu.digitaltwins.azure.net';
let query = ```SELECT T.$dtId as tid, T.deviceId FROM DIGITALTWINS T```;
evaluate azure_digital_twins_query_request(adtEndpoint, query)

Here, we ask for both the twin Id and the optional device Id, part of that JSON response.

We get two columns.

Let’s try something else: filtering the ADT query.

We are only interested in twins with a certain deviceId and implementing a certain model:

let adtEndpoint = 'https://TechDays2022-neu-adt.api.neu.digitaltwins.azure.net';
let query = ```SELECT DT FROM DIGITALTWINS DT WHERE deviceId = 'plane01sim' AND IS_OF_MODEL(DT , 'dtmi:com:adt:flightintoiot:enginedatasensor;5')```;
evaluate azure_digital_twins_query_request(adtEndpoint, query)

Note: The backticks look a bit strange but this is fine. Commas are not an issue anymore:

This results in a single twin:

If we only want to know the twin Id, change the SELECT a little by asking for the twin Id alias:

let adtEndpoint = 'https://TechDays2022-neu-adt.api.neu.digitaltwins.azure.net';
let query = ```SELECT DT.$dtId as ID FROM DIGITALTWINS DT WHERE deviceId = 'plane01sim' AND IS_OF_MODEL(DT , 'dtmi:com:adt:flightintoiot:enginedatasensor;5')```;
evaluate azure_digital_twins_query_request(adtEndpoint, query)

This results in that single value:

No historical graph data?

The only challenging thing I can think of is that we are missing ADT graph changes over time.

We compare historical data with the most recent graph.

I would like to compare historical data with the graph, available at the same time as the data was generated.

If you need this, I expect you need to create an extra table in Azure Data Explorer with the models (Model ID) and timestamps when made available in the Azure Digital Twins graphs…

Combining historical data with twin data

We are now able to combine these two queries.

KQL supports joining the two queries:

let adtEndpoint = 'https://TechDays2022-neu-adt.api.neu.digitaltwins.azure.net';
let query = ```SELECT DT.$dtId as TID FROM DIGITALTWINS DT WHERE deviceId = 'plane01sim' AND IS_OF_MODEL(DT , 'dtmi:com:adt:flightintoiot:enginedatasensor;5')```;
evaluate azure_digital_twins_query_request(adtEndpoint, query)
| extend Id = tostring(TID) 
| join kind=inner (adt_dh_TechDays2022_neu_adt_northeurope) on Id
| where Key == "enginePressureRatio"
| where TimeStamp > ago(30m)
| extend val_double = todouble(Value)
| render timechart with (ycolumns = val_double)

First, we query for the twin that implements the engine model of plane01sim.

We only want to know the twin Id (TID); because this id is actually of type ‘dynamic’, it needs to be cast to a string, which we call Id.

Now we can join it with the historical ‘enginePressureRatio’ for the last thirty minutes.

The value field is also of type ‘dynamic’, so we need to cast it to a ‘double’.

Finally, we can render the most recent engine pressure ratio values in a chart:

As you can see, the result is a nice render in the query editor.

This query, including the Azure Digital Twins plugin, can also be used in Azure Data Explorer Dashboards:

Azure Data Explorer is now turned into a very flexible historian for Azure Digital Twins.

Although I have not tested this, I expect these ADT-related queries can be used in Managed Grafana too.

Conclusion

In this post, we have experienced how easy it is to set up the connection between Azure Digital Twins and Azure Data Explorer for historical data.

We have seen we can query the raw historical data and we can join it with the outcome of an Azure Digital Twins graph query.

Finally, we have seen how this data can be turned into historical dashboards.

Now, your IIoT metaverse suddenly has a historical conscience.

9 thoughts on “ADX Kusto plug-in for Azure Digital Twins history”

  1. Thanks for sharing. I have two questions:

    – There are anomaly detectors built-in to Stream Analytics. Why not using that and having an Event Hub and a Function?
    – There is a direct connection possibility from IoT Hub to DT service. Why a Function in between?

    Best!

    1. Hello Emin, Azure IoT Hub integration with Azure Digital Twins needs to be done with an Azure Function. Only Azure Functions can update twin properties and push telemetry changes to ADT (due to the required AAD rights for updating the twin graph). Check my previous blog post and try the lab to see how. The same goes for Stream Analytics output: ingestion of that output (here, anomaly detections are demonstrated) also needs to be done using an Azure Function.

Comments are closed.