Microsoft is still using the Baltimore certificate for its IoT services. This certificate is used for TLS communication with the IoT Hub and other IoT-related Azure services.
This is done by choice, to give users more time to migrate their devices if they are not (yet) able to support the new DigiCert Global Root G2 certificate automatically.
Starting next month, Microsoft will begin the migration.
Check out this timeline:
This has been postponed a number of times already. It’s now time to act!
If your Azure IoT devices and Azure IoT Edge devices are running Ubuntu, here are some pointers to test them.
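As a first quick check, here is a minimal sketch (with a hypothetical hub name) that performs a TLS handshake against your IoT Hub using the local trust store of the device. If the handshake succeeds, the CA certificates installed on that Ubuntu device can validate the chain the hub currently presents; repeat the test once your hub has been migrated to the DigiCert Global Root G2 chain:

```python
# Minimal sketch: handshake against the IoT Hub using the device's local trust store.
# The host name is a hypothetical placeholder; replace it with your own hub.
import socket
import ssl

HOST = "my-iot-hub.azure-devices.net"  # hypothetical hub name
PORT = 8883  # MQTT over TLS; port 443 works for the HTTPS surface as well

context = ssl.create_default_context()  # uses the system CA bundle on Ubuntu
with socket.create_connection((HOST, PORT)) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()
        issuer = dict(item[0] for item in cert["issuer"])
        print("Handshake OK, server certificate issued by:", issuer.get("organizationName"))
        print("Valid until:", cert["notAfter"])
```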
The Azure IoT Hub is your cloud gateway for ingesting telemetry.
The IoT Hub cannot persist incoming messages, so these must be forwarded to other Azure services.
Traditionally, the messages are exposed over an Event Hub-compatible endpoint.
More recently, IoT Hub message routing was added, where specific Azure services can be connected as endpoints:
At this moment we can define:
The built-in endpoint (to keep the original way of distributing messages)
Event Hub
Service Bus (Topics/Queues)
Storage account, blob storage (perfect for cheap cold storage)
Lately, a native endpoint for CosmosDB has been made available.
This takes away the pain of having to set up extra resources between the IoT Hub and CosmosDB just to transport messages from one resource to another; until now, this was mostly done using a Stream Analytics job or custom Azure Functions.
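For comparison, this is the kind of glue code the native endpoint now makes redundant. Below is a minimal sketch of such a custom Azure Function in Python, triggered by the IoT Hub's Event Hub-compatible endpoint and copying each message into CosmosDB (the function.json binding is omitted and the database and container names are hypothetical):

```python
# Minimal sketch: Event Hub-triggered Azure Function that forwards IoT Hub
# messages to CosmosDB. Database and container names are hypothetical.
import json
import os
import uuid

import azure.functions as func
from azure.cosmos import CosmosClient

# Created once per function host; the URL and key come from the app settings.
cosmos = CosmosClient(os.environ["COSMOS_URL"], os.environ["COSMOS_KEY"])
container = cosmos.get_database_client("telemetry").get_container_client("messages")


def main(event: func.EventHubEvent):
    # Each incoming IoT Hub message becomes one CosmosDB document.
    document = json.loads(event.get_body().decode("utf-8"))
    document["id"] = str(uuid.uuid4())  # CosmosDB requires an 'id' property
    container.upsert_item(document)
```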
Most of you Azure IoT developers are connecting devices to the Azure cloud using the Azure IoT Device SDKs.
Using these SDKs, you can connect a device to the cloud in an easy and secure way with your favorite programming language like C#, C, Java, Python, or Node.js.
This is the recommended approach because it offers a convenient and optimized way to support all Azure IoT Hub features like the device twin, direct methods, and cloud-to-device messages. It takes away a lot of code wiring so you can focus on functionality.
Still, in a few instances, like working with very constrained devices, there could be a need for bare MQTT support:
MQTT is the de facto standard for stateful communication in the IoT world (by the way, bare AMQP is offered too).
Let’s see how the Azure IoT Hub supports bare MQTT.
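Before diving into the details, here is a minimal sketch of the idea, assuming paho-mqtt (1.x constructor), a self-generated SAS token, and hypothetical hub, device, and key values:

```python
# Minimal bare-MQTT sketch against the IoT Hub with paho-mqtt (1.x constructor).
# Hub name, device id, and device key are hypothetical placeholders.
import base64
import hashlib
import hmac
import json
import time
import urllib.parse

import paho.mqtt.client as mqtt

HUB = "my-iot-hub.azure-devices.net"      # hypothetical
DEVICE_ID = "my-device"                   # hypothetical
DEVICE_KEY = "base64-encoded-device-key"  # hypothetical


def generate_sas_token(resource_uri, key, ttl=3600):
    # SAS token: HMAC-SHA256 over '<url-encoded uri>\n<expiry>' with the device key.
    expiry = str(int(time.time()) + ttl)
    to_sign = f"{urllib.parse.quote_plus(resource_uri)}\n{expiry}".encode()
    signature = base64.b64encode(
        hmac.new(base64.b64decode(key), to_sign, hashlib.sha256).digest()
    ).decode()
    return (
        f"SharedAccessSignature sr={urllib.parse.quote_plus(resource_uri)}"
        f"&sig={urllib.parse.quote_plus(signature)}&se={expiry}"
    )


client = mqtt.Client(client_id=DEVICE_ID, protocol=mqtt.MQTTv311)
client.username_pw_set(
    username=f"{HUB}/{DEVICE_ID}/?api-version=2021-04-12",
    password=generate_sas_token(f"{HUB}/devices/{DEVICE_ID}", DEVICE_KEY),
)
client.tls_set()  # TLS is mandatory; this uses the system trust store
client.connect(HUB, port=8883)
client.loop_start()
time.sleep(2)  # crude wait for CONNACK; a real client would use the on_connect callback

# Device-to-cloud telemetry goes to this well-known topic.
info = client.publish(
    f"devices/{DEVICE_ID}/messages/events/",
    json.dumps({"temperature": 21.5}),
    qos=1,
)
info.wait_for_publish()
client.loop_stop()
client.disconnect()
```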
Azure Digital Twins is advertised as a “platform as a service (PaaS) offering that enables the creation of twin graphs based on digital models of entire environments, which could be buildings, factories, farms, energy networks, railways, stadiums, and more—even entire cities”.
This sounds promising but it does not really ring a bell, does it?
Fortunately, besides the excellent documentation, Microsoft provides a great learning path in MS Learn as part of the AZ-220 Azure IoT developer exam preparations.
There, you will learn how Azure Digital Twins offers new opportunities for representing an Internet of Things solution via twin models, twin relations, and a runtime environment.
You finish the learning path with a hands-on lab where you build a model around a cheese factory and ingest sensor telemetry:
In the demo, the telemetry flows through the runtime and ends up in Time Series Insights.
Yes, the learning path is a good start and will prepare you for the exam or the assessment (you need to pass this assessment for a free one-year certification renewal).
On the other hand, many extra features could be added to turn this good start into a great start!
Think about propagating Azure Digital Twins events and twin property changes through the graph and visualizing live updates of twins in a 3D model, complete with alerts.
Let’s check out some of these additional features and see what you need to do to extend the ADT example.
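To give a feeling for the runtime API you will be extending, here is a minimal sketch using the azure-digitaltwins-core package; the instance URL, twin id, and property name are hypothetical placeholders:

```python
# Minimal sketch: query the twin graph and patch a twin property.
# Instance URL, twin id, and property name are hypothetical placeholders.
from azure.digitaltwins.core import DigitalTwinsClient
from azure.identity import DefaultAzureCredential

ADT_URL = "https://my-adt-instance.api.weu.digitaltwins.azure.net"  # hypothetical

client = DigitalTwinsClient(ADT_URL, DefaultAzureCredential())

# List all twins in the graph; the query language resembles SQL.
for twin in client.query_twins("SELECT * FROM digitaltwins"):
    print(twin["$dtId"], twin["$metadata"]["$model"])

# Update a single twin property using a JSON Patch document.
patch = [{"op": "replace", "path": "/temperature", "value": 21.5}]
client.update_digital_twin("cheese-cave-1", patch)  # hypothetical twin id
```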
As seen in my previous post, the IoT Hub routing feature supports message enrichment, both for IoT devices and IoT Edge modules.
Using the routing message enrichments, each incoming message gets extra user properties based on either static values, device twin tags, or device twin desired properties.
Unfortunately, only ten enrichments can be added:
If you want to pass more values, this will not work for you.
It would be great if nested JSON properties would count as one.
Again, unfortunately, only simple types (string, decimal, boolean, date/time, etc.) are supported so this excludes nested JSON (complex types).
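To see what actually arrives on the consuming side, here is a minimal sketch that reads from the Event Hub-compatible endpoint with the azure-eventhub package and prints the enrichments, which surface as user/application properties next to the message body (the connection string and property names are hypothetical):

```python
# Minimal sketch: read messages from the Event Hub-compatible endpoint and
# print the routing enrichments, which arrive as application properties.
# The connection string is a hypothetical placeholder.
from azure.eventhub import EventHubConsumerClient

CONNECTION_STR = "Endpoint=sb://...;EntityPath=..."  # Event Hub-compatible endpoint


def on_event(partition_context, event):
    print("Body:", event.body_as_str())
    print("Enrichments:", event.properties)  # e.g. {b'deviceLocation': b'cheese cave 1'}
    partition_context.update_checkpoint(event)


client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR, consumer_group="$Default"
)
with client:
    client.receive(on_event=on_event, starting_position="-1")  # read from the start
```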
Below, a viable solution to overcome both restrictions, using Azure Stream Analytics, is presented.
Azure Data Explorer (ADX) is a great data exploration tool for IoT developers building a full IoT solution. This could be a perfect target for the cold path.
As seen in my previous blog post, ADX even offers a native connector for the IoT Hub. It is based on the default Event Hub-compatible endpoint offered by this cloud IoT gateway (optionally, the built-in Events endpoint in the routing section of the IoT Hub, or the fallback mechanism).
Most of the documentation regarding this ADX connector follows the ‘happy flow’, where one connector stores incoming IoT telemetry in a single ADX table using static routing.
This is a serious limitation, because most IoT Hubs ingest multiple types of messages, and these will not all fit into that single table.
Luckily, the connector also offers the possibility of routing to other databases:
Here, we will check out this dynamic routing option and see how this provides much more flexibility.
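To give an idea of what dynamic routing means on the device side, here is a minimal sketch with the azure-iot-device package that sets ADX ingestion properties (such as Table, Format, and IngestionMappingReference) as application properties on each message. The connection string, table, and mapping names are hypothetical, so check the ADX data connection documentation for the exact property names your connection supports:

```python
# Minimal sketch: a device message carrying ADX ingestion properties so a
# dynamically routed data connection can pick the table, format, and mapping.
# Connection string, table, and mapping names are hypothetical placeholders.
import json

from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=...;DeviceId=...;SharedAccessKey=..."  # hypothetical

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

msg = Message(json.dumps({"temperature": 21.5, "humidity": 43}))
msg.content_type = "application/json"
msg.content_encoding = "utf-8"
msg.custom_properties["Table"] = "TelemetryRaw"                          # hypothetical table
msg.custom_properties["Format"] = "json"
msg.custom_properties["IngestionMappingReference"] = "TelemetryMapping"  # hypothetical mapping

client.send_message(msg)
client.shutdown()
```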
In my previous blog post, I showed how to use the Azure IoT device lifecycle events.
These events are emitted by the Azure IoT Hub as routing events, next to the regular incoming device-to-cloud messages. Routing these device lifecycle events makes it possible to both persist and react to the behavior of devices (creation, deletion, connected, disconnected) and to registration changes (device identity twin changes).
This was first demonstrated using a regular direct internet-connected device.
The Azure IoT Hub can register thousands of devices. To manage them at scale, several kinds of tooling are made available.
First, the Device Twin of each device can store extra context (type, brand, vendor, version, location, features, etc.) in the Tags section.
In the Tags section of the device twin, you are free to add a number of (sub) nodes to this JSON document section:
The tags are never readable by the device itself; they are used to query all devices in your IoT Hub and make subsets.
You can query all devices for specific features, both as an Azure portal user and as a programmer.
In the Azure portal, this ability to query is most visible when you open the list of (edge) devices:
This shows a new section where we can enter a SQL-ish query starting with “SELECT FROM devices WHERE”:
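The same queries can be run from code. Below is a minimal sketch assuming the azure-iot-hub service SDK; the connection string, device id, and tag values are hypothetical:

```python
# Minimal sketch: tag a device twin and query on that tag with the service SDK.
# Connection string, device id, and tag values are hypothetical placeholders.
from azure.iot.hub import IoTHubRegistryManager
from azure.iot.hub.models import QuerySpecification, Twin

CONNECTION_STRING = "HostName=...;SharedAccessKeyName=...;SharedAccessKey=..."  # hypothetical

registry_manager = IoTHubRegistryManager.from_connection_string(CONNECTION_STRING)

# Add context to the Tags section of the device twin.
twin = registry_manager.get_twin("my-device")  # hypothetical device id
patch = Twin(tags={"location": {"building": "B42", "floor": 2}})
registry_manager.update_twin("my-device", patch, twin.etag)

# Query all devices with that tag, using the same SQL-ish syntax as the portal.
query = QuerySpecification(
    query="SELECT * FROM devices WHERE tags.location.building = 'B42'"
)
for device in registry_manager.query_iot_hub(query).items:
    print(device.device_id, device.tags)
```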
In the past, I wrote a blog post about the Azure IoT Device SDK. That example was written in C#.
Last year, I noticed an increased number of questions coming from Python users trying to connect devices to the cloud. Luckily, the driving force behind this is the growing community of ML developers using Python. They are increasingly involved in IoT projects.
That’s why I pulled some samples together into one demonstration showing the capabilities of Azure IoT Hub-connected devices.
We will see how device-to-cloud messages are sent from the device to your IoT Hub. And we will see several ways of cloud-to-device communication so we can enforce actions on the device.
Update: recently, I added a second Python script with an individual Device Provisioning Service enrollment based on a symmetric key. The example is exactly the same; you only need to provide different variables for this provisioning.
This introduction will get you started within a moment.
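As a taste of the demonstration, here is a minimal sketch with a hypothetical connection string that sends device-to-cloud telemetry and handles cloud-to-device messages using the azure-iot-device package:

```python
# Minimal sketch: device-to-cloud telemetry plus a cloud-to-device message handler.
# The connection string is a hypothetical placeholder.
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=...;DeviceId=...;SharedAccessKey=..."  # hypothetical

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)


def on_message(message):
    # Cloud-to-device messages arrive here, e.g. to enforce an action on the device.
    print("C2D message received:", message.data)


client.on_message_received = on_message
client.connect()

try:
    while True:
        # One device-to-cloud message every ten seconds.
        msg = Message(json.dumps({"temperature": 21.5}))
        msg.content_type = "application/json"
        msg.content_encoding = "utf-8"
        client.send_message(msg)
        time.sleep(10)
except KeyboardInterrupt:
    client.shutdown()
```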