Exploring Azure Digital Twins Graph history

As seen in previous posts, Azure Digital Twins shows the current state of any twin in the Azure Digital Twins graph.

This current state is a combination of multiple items:

  • twin name and model
  • twin properties updates based on telemetry and/or business rules related to that telemetry
  • twin properties updates based on predictions towards the future
  • twin properties updates based on historical data

Azure Digital Twins also offers the ability to store historical twin property data. That historical data is made available in Azure Data Explorer via the data history connection, as explained in my previous post.

There, the historical twin data can also be combined with twin graph knowledge using the ADX Kusto plugin for querying the ADT graph.

Still, that accompanying plugin is missing something: we have no historical knowledge of the graph itself, including the models and how these changed over time!

If we want this information, we need to be creative.

Can we add the missing pieces?

Let’s check out how we can explore the ADT graph history…

This post is part three of a series of posts about Azure Digital Twins:

  1. Extending the AZ-220 Digital Twins hands-on lab with 3D visualization
  2. ADX Kusto plug-in for Azure Digital Twins history
  3. Exploring Azure Digital Twins Graph history


No, we are not able to query the historical twin graph by simply adding a timestamp to the graph query. Constructing and querying that historical graph data is neither offered out of the box nor simple to achieve.

Perhaps the underlying information can be reconstructed if we can collect the following:

  • Twin lifecycle updates (creation and deletion of twins)
  • Twin relationship updates (creation, updates, and deletion)
  • Model version imports in a separate table

With this extra information, it should be possible to reconstruct a historical view of the models and of how the graph looked at a certain point in time.

It’s not perfect but it is a good start…

Below, we look at collecting that data into the same Azure Data Explorer database where the digital twins' historical property data is already placed.

Note: I leave it to the reader’s imagination how to make use of this new information using Kusto queries.

Twin lifecycle changes and Twin relationship changes

We are in luck regarding the first two bullet points. Azure Digital Twins routing supports events based on either Digital Twins Lifecycle Notification or Digital Twins Relationship Change Notification.

A Digital Twins Lifecycle Notification means any digital twin create or delete operation.

In addition, a Digital Twins Relationship Change Notification means any digital twin relationship change.

Using basic event routing, we can honor the first two wishes:

Because these notifications are routed over the same event stream and dispatcher, we can output them the same way as we output twin telemetry and twin property changes. Here, we send them to an Event Hub and pick them up with an Azure Function.

Azure Data Explorer is capable of ingesting Event Hub messages directly (complete with message property mapping) but, as we will see soon, the notifications are of type EventData and both the body and the application/user properties contain important data.

Because Azure Data Explorer has no understanding of application properties, we need to add an extra Azure Function as a message converter between Azure Digital Twins and Azure Data Explorer.

Here are some examples of events generated by Azure Digital Twins when I manipulated the graph by adding a new digital twin named ‘AzureLifecycle’:

I already registered a route and endpoint to an Event Hub for these kinds of lifecycle updates:

Note: as seen below, this route can be even more optimized when both types of notifications (lifecycle and relationship) are combined.

The notification events are picked up by an Azure Function.

This is the logging of the Azure Function I wrote, which adds the EventData message information to the log:

2023-01-03T08:33:16Z   [Information]   cloudEvents:id - bb15f094-5374-4124-973f-58a1609e9c9c
2023-01-03T08:33:16Z   [Information]   cloudEvents:source - techdays2022-neu-adt.api.neu.digitaltwins.azure.net
2023-01-03T08:33:16Z   [Information]   cloudEvents:specversion - 1.0
2023-01-03T08:33:16Z   [Information]   cloudEvents:type - Microsoft.DigitalTwins.Twin.Create
2023-01-03T08:33:16Z   [Information]   cloudEvents:time - 2023-01-03T08:33:16.0061990Z
2023-01-03T08:33:16Z   [Information]   cloudEvents:subject - AzureLifecycle
2023-01-03T08:33:16Z   [Information]   cloudEvents:traceparent - 00-9243523fdcdace0559e80bc2d7ea5afc-0ef39c748a07ed58-01
2023-01-03T08:33:16Z   [Information]   CorrelationId - bdd71dec-ff3c-400e-98f2-22e8c1630455
2023-01-03T08:33:16Z   [Information]   ContentType - application/json
2023-01-03T08:33:16Z   [Information]   Body received: {"$dtId":"AzureLifecycle","$etag":"W/\"4d6b6014-e565-43ab-b918-0da88665d23e\"","$metadata":{"$model":"dtmi:com:adt:flightintoint:airliner;5"}}

Notice the events are indeed a combination of a JSON message body and application properties.

The actual body is:

  {
    "$dtId": "AzureLifecycle",
    "$etag": "W/\"4d6b6014-e565-43ab-b918-0da88665d23e\"",
    "$metadata": {
      "$model": "dtmi:com:adt:flightintoint:airliner;5"
    }
  }

This message tells us:

  • The ADT environment is ‘techdays2022-neu-adt.api.neu.digitaltwins.azure.net’ (cloudEvents:source)
  • The cloud event type is ‘Microsoft.DigitalTwins.Twin.Create’ (cloudEvents:type)
  • The change was executed at 2023-01-03T08:33:16.0061990Z in UTC timezone (cloudEvents:time)
  • The Id of the twin is ‘AzureLifecycle’ (both shown as cloudEvents:subject and as $dtId in the body)
  • The twin is created using model ‘dtmi:com:adt:flightintoint:airliner;5’ ($model in the body)

Note: I recommend taking ‘cloudEvents:specversion’ into account too. This is historical data, so this version will probably change at some point in the future.

When I delete the same digital twin, I get almost the same message:

2023-01-03T09:14:37Z   [Information]   cloudEvents:id - 06a4110a-f623-483d-9162-49ae831229fc
2023-01-03T09:14:37Z   [Information]   cloudEvents:source - techdays2022-neu-adt.api.neu.digitaltwins.azure.net
2023-01-03T09:14:37Z   [Information]   cloudEvents:specversion - 1.0
2023-01-03T09:14:37Z   [Information]   cloudEvents:type - Microsoft.DigitalTwins.Twin.Delete
2023-01-03T09:14:37Z   [Information]   cloudEvents:time - 2023-01-03T09:14:36.5425185Z
2023-01-03T09:14:37Z   [Information]   cloudEvents:subject - AzureLifecycle
2023-01-03T09:14:37Z   [Information]   cloudEvents:traceparent - 00-f7379bae8667b899ad48324573d31ddf-c9adf3480a538fbd-01
2023-01-03T09:14:37Z   [Information]   CorrelationId - 033344d4-5098-40e0-bd67-2f67e7151bbc
2023-01-03T09:14:37Z   [Information]   ContentType - application/json
2023-01-03T09:14:37Z   [Information]   Body received: {"$dtId":"AzureLifecycle","$etag":"W/\"e16d81b9-5117-478e-9d09-2911aedf3f3d\"","$metadata":{"$model":"dtmi:com:adt:flightintoint:airliner;5"}}

This time, the ‘cloudEvents:type’ of notification is ‘Microsoft.DigitalTwins.Twin.Delete’.

So, we get the full history of the lifecycle of digital twins.

Now, another function listens to relationship change notifications, taken from another ADT route:

Here, I created a relationship between the airliner and an airplane named ‘ALC-42’ (this airplane twin was also added after I added the airliner twin):

This results in this graph:

Adding this relationship to the twin graph results in this logging:

2023-01-03T09:20:15Z   [Information]   cloudEvents:id - 8cf3f733-78f2-4b75-8a1f-d8f535d5b540
2023-01-03T09:20:15Z   [Information]   cloudEvents:source - techdays2022-neu-adt.api.neu.digitaltwins.azure.net
2023-01-03T09:20:15Z   [Information]   cloudEvents:specversion - 1.0
2023-01-03T09:20:15Z   [Information]   cloudEvents:type - Microsoft.DigitalTwins.Relationship.Create
2023-01-03T09:20:15Z   [Information]   cloudEvents:time - 2023-01-03T09:20:11.8930484Z
2023-01-03T09:20:15Z   [Information]   cloudEvents:subject - AzureLifecycle/relationships/28b4b6c7-0197-45d9-af06-be65c7584e04
2023-01-03T09:20:15Z   [Information]   cloudEvents:traceparent - 00-e49937cfa07129654ea8310cffed86b6-0792d165f3fa5ba3-01
2023-01-03T09:20:15Z   [Information]   CorrelationId - f952f439-64c7-4725-9fde-0e8a0f116070
2023-01-03T09:20:15Z   [Information]   ContentType - application/json
2023-01-03T09:20:15Z   [Information]   Body received: {"$relationshipId":"28b4b6c7-0197-45d9-af06-be65c7584e04","$etag":"W/\"d6a805f6-bd2b-4976-b7e9-6331e365141d\"","$sourceId":"AzureLifecycle","$relationshipName":"rel_has_airplanes","$targetId":"ALC-42"}

The application properties are almost identical to those in the lifecycle notifications.

The Id of the relationship looks a bit strange due to the GUID in the name (cloudEvents:subject).

Only the body differs:


Two digital twins are involved in any relationship so both are mentioned in the body as source (airliner) and target (airplane).

Now that we know what the ADT notification messages look like, we can try to ingest them into an extra table in Azure Data Explorer:

Again, we need an Azure Function to convert the messages.

In this example, I used a second Event Hub as Azure Function output so I can ingest it with an Azure Data Explorer database data connection.

Note: Azure Data Explorer also supports ingestion via programming SDKs (like the C# SDK) but this solution gives me more insight into what is happening.

Now, because both routes expose similarly formatted output messages (except for the body JSON), we can store all kinds of notification messages in the same Azure Data Explorer table:

.create table adtNotificationTable (TimeStamp: datetime, SourceTimeStamp: datetime, ServiceId: string, Id: string, Type: string, Body:dynamic)

.create table adtNotificationTable ingestion json mapping "adtnotificationmapping"
    '['
        '{"Column": "TimeStamp", "Properties": {"Path": "$.TimeStamp"}},'
        '{"Column": "SourceTimeStamp", "Properties": {"Path": "$.SourceTimeStamp"}},'
        '{"Column": "ServiceId", "Properties": {"Path": "$.ServiceId"}},'
        '{"Column": "Id", "Properties": {"Path": "$.Id"}},'
        '{"Column": "Type", "Properties": {"Path": "$.Type"}},'
        '{"Column": "Body", "Properties": {"Path": "$.Body"}}'
    ']'

.alter table adtNotificationTable policy streamingingestion enable

This table and accompanying JSON mapping are used in this ADX database data connection:

The data connection listens to messages coming from this ‘adtnotificationeventhub’ Event Hub and ingests them into the adtNotificationTable using the ‘adtnotificationmapping’ mapping.

Note: I started with two separate ADT routes, two different Event Hub endpoints, and two different Azure Functions. This is to separate Twin Lifecycle notifications and Relationship Update notifications. In hindsight, this could have been combined in one single ADT route supporting all five message types in the filter.

The only thing missing now is the conversion logic in the Azure Function, so the ADT notifications can be picked up by Azure Data Explorer, including the application properties:

internal static partial class TwinRelationshipChangeToAdxFunction
{
    [FunctionName("TwinRelationshipChangeToAdxFunction")]
    public static async Task Run(
        [EventHubTrigger(
            "twinrelationshipchangeeventhub", // hypothetical hub name; the actual name was not shown
            ConsumerGroup = "fa",
            Connection = "EventHubTwinRelationshipChangeEventHubString")]
        EventData[] events,
        [EventHub(
            "adtnotificationeventhub",
            Connection = "EventHubAdtNotificationEventHubString")]
        IAsyncCollector<string> outputEvents,
        ILogger log)
    {
        log.LogInformation($"Executing: {events.Length} events...");

        try
        {
            // Loop through the list of events
            foreach (EventData eventData in events)
            {
                // Flatten the application properties and body into one object
                var adtNotification = new AdtNotification();
                adtNotification.TimeStamp = DateTime.UtcNow;
                adtNotification.SourceTimeStamp = Convert.ToDateTime(eventData.Properties["cloudEvents:time"]);
                adtNotification.ServiceId = eventData.Properties["cloudEvents:source"].ToString();
                adtNotification.Id = eventData.Properties["cloudEvents:subject"].ToString();
                adtNotification.Type = eventData.Properties["cloudEvents:type"].ToString();
                adtNotification.Body = Encoding.UTF8.GetString(eventData.Body.Array);

                // Forward the converted message to the output Event Hub
                await outputEvents.AddAsync(JsonConvert.SerializeObject(adtNotification));

                log.LogInformation($"Message serialized and sent");
            }
        }
        catch (Exception ex)
        {
            log.LogInformation($"Exception: {ex.Message}");
        }
    }
}

public class AdtNotification
{
    public DateTime TimeStamp { get; set; }
    public DateTime SourceTimeStamp { get; set; }
    public string Id { get; set; }
    public string ServiceId { get; set; }
    public string Type { get; set; }
    public string Body { get; set; }
}

Once this Azure Function is active and the airplane twin is removed, two rows are added to the table because the relationship is obsolete too:

adtNotificationTable
| order by TimeStamp asc

The result is:

Yes, we see historical information about both the lifecycle of twins and their relations!
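As a quick sanity check, a simple Kusto query (a sketch against the adtNotificationTable created above) shows which notification types have been ingested so far:

```kusto
adtNotificationTable
// Count the ingested notifications per cloud event type
| summarize NotificationCount = count() by Type
| order by NotificationCount desc
```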

When I recreate the same relationship between the airliner and the airplane, these create notifications are stored as well:

I can access the body JSON elements of the create notification, even though the body is stored as a dynamic field:

adtNotificationTable
| where Type == 'Microsoft.DigitalTwins.Relationship.Create'
| project SourceTimeStamp, Id, Source = Body["$sourceId"], Target = Body["$targetId"]

The result gives me both the source twin and the target twin as separate values:

Notice that I have to filter on the Type column because each type has a different body format, possibly with different elements.

With this, we have a historical lifecycle overview of all ADT graph elements.
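As a starting point, here is a hedged sketch of a Kusto query that reconstructs which twins existed at a chosen point in time; the timestamp is an arbitrary example and the query assumes twin Ids are not deleted and later reused with a different meaning:

```kusto
// Pick the moment in time we want to reconstruct (example value)
let pointInTime = datetime(2023-01-03T10:00:00Z);
adtNotificationTable
| where SourceTimeStamp <= pointInTime
| where Type in ('Microsoft.DigitalTwins.Twin.Create', 'Microsoft.DigitalTwins.Twin.Delete')
// Keep only the latest lifecycle event per twin Id
| summarize arg_max(SourceTimeStamp, Type, Body) by Id
// A twin exists at that moment if its latest event is a create
| where Type == 'Microsoft.DigitalTwins.Twin.Create'
| project Id, Model = Body["$metadata"]["$model"], SourceTimeStamp
```

The same pattern, applied to the relationship notification types, reconstructs the edges of the graph at that moment.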

Model version imports

The last part to implement is storing the actual DTDL interface models.

I can imagine models being uploaded to Azure Digital Twins either by hand or by code.

The Azure Digital Twins environment has no automatic way of exposing uploaded models to other environments.

But the same models should be added to Azure Data Explorer too so you have a historical overview.

Here, I demonstrate how to add them to the Azure Data Explorer database by hand, using the Azure Data Explorer web portal.

There, multiple ways of ingesting data are offered and this is the simplest solution for a simple problem 🙂

I go for ‘Ingest data from a local file’:

This opens a wizard dialog where we can add one or more files. Content from those files is stored in an existing or new table:

I chose to let a new table be created.

Next, I need to select one or more files. I start with one file, the Airliner model:

There is one obstacle. I want to upload the whole JSON content of the file as a single field into the database table!

There is a simple solution for this: set the Nested Levels to zero and a single dynamic column is created:

Finally, data ingestion can start now:

A table with a single column is created and a single row is ingested.

If we query the table, we see the content.

It’s the DTDL model:

Let’s add the other models too.

Run the same wizard a second time, this time using the option to select an existing table:

Four other models will be ingested:

We use the same number of nesting levels:

Note: I could have reused the same table mapping.

The ingestion is straightforward, four new rows are added to the already existing table:

Finally, we now have five rows in the same table describing the individual models:

And we are able to query this structure too, including elements with an asterisk in the name:

The models are imported now.
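To give an impression of how the ingested models could be queried, here is a sketch; the table name ‘adtModelTable’ and the auto-generated column name ‘Column1’ are assumptions, so check the names the ingestion wizard actually generated for you:

```kusto
adtModelTable
// Extract the DTMI and display name from the dynamic DTDL column
| extend ModelId = tostring(Column1["@id"]), DisplayName = tostring(Column1["displayName"])
| project ModelId, DisplayName
```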

In a more professional setup, you would automate this import every time a model is added to the Azure Digital Twins environment.


We are now able to extend the original historical Digital Twins property data export with historical Digital Twins lifecycle information, historical Digital Twins relationship information, and the actual Digital Twins models.

To query this historical digital twins graph information, you need to start writing your own logic.

It’s not as ideal as querying the most recent Digital Twins graph using the Kusto plugin, but at least you have all the information you need to recreate that historical experience.

