Get on board of a Flight into IoT 2 – Long-Haul – Microsoft Tech Days UK

Update: The recordings are done, and it was a blast! Check out the recording here.

After the very successful Flight into Azure IoT from last year, a new flight will take off on November 24, 2022: A Flight into IoT 2, the Long-Haul:

We have a lot of upgrades: a new route, new telemetry, new Azure services, etc.

This is an online event with live recordings. Please register here for free.

Continue reading “Get on board of a Flight into IoT 2 – Long-Haul – Microsoft Tech Days UK”

How to cope with IoT Hub enrichment restrictions

As seen in my previous post, the IoT Hub routing feature supports message enrichment, both for IoT devices and IoT Edge modules.

Using the routing message enrichments, each incoming message gets extra user properties based on either static values, device twin tags, or device twin desired properties.

Unfortunately, only ten enrichments can be added:

If you want to pass more values, this will not work for you.

It would be great if nested JSON properties would count as one.

Again, unfortunately, only simple types (string, decimal, boolean, date/time, etc.) are supported, so this excludes nested JSON (complex types).

Below, a viable solution that overcomes both restrictions using Azure Stream Analytics is presented.

Let’s see how this works out.
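
As a taste of the solution, here is a minimal sketch of such a Stream Analytics query. It joins the incoming device stream with reference data (for example, a JSON or CSV file in blob storage) keyed on the device ID; the input, output, and column names are just illustrative:

    -- Minimal sketch: every matching reference data column becomes extra
    -- context on the output, without the ten-enrichments limit.
    SELECT
        stream.deviceId,
        stream.temperature,
        refdata.plantName,
        refdata.location,
        refdata.maintenanceContact
    INTO
        [enriched-output]
    FROM
        [iothub-input] stream
    JOIN
        [device-reference-data] refdata
        ON stream.deviceId = refdata.deviceId

Note that, unlike stream-to-stream joins, a reference data join needs no temporal (DATEDIFF) condition.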

Continue reading “How to cope with IoT Hub enrichment restrictions”

Adding context using EventData properties in Azure Stream Analytics

The Azure IoT Hub is the preferred Azure cloud gateway for IoT devices.

The format of messages sent by devices to the IoT Hub is described in the EventData class.

There, a regular message is split into three data parts:

  1. The actual message body, the data inside
  2. System properties, context added by the system (like IoT Hub context)
  3. Application (aka User) properties, context added by the user

These properties are key-value pairs. Unlike the message body, which has to be decoded before it is accessible, they are not encoded.

So, we see that the original messages can get context along the way.

As a developer using Azure IoT, you can add application/user context at two levels:

We can add application properties to the device message. This is normally done on the device when the message is constructed.

The Azure IoT Hub also supports message enrichment. These enrichments are also added as application/user properties, as seen in my previous blog post.

There, I showed how to read the properties using an Azure Function.

As seen in that post, consuming message context is a bit of a challenge.

Here, we dive into this using Azure Stream Analytics.
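
To give an idea of where this is going, a minimal sketch: Stream Analytics exposes application/user properties through the GetMetadataPropertyValue function, and IoT Hub system context through the IoTHub record of each event. The input alias and property names below are illustrative:

    -- Reads a user property and IoT Hub system context next to the body.
    SELECT
        msg.temperature,
        GetMetadataPropertyValue(msg, '[User].[messageType]') AS messageType,
        msg.IoTHub.ConnectionDeviceId AS connectionDeviceId
    FROM
        [iothub-input] msg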

Continue reading “Adding context using EventData properties in Azure Stream Analytics”

Creating an Azure Stream Analytics job using VS Code

Azure Stream Analytics is often the centerpiece of our IoT solutions.

It acts like a rule engine where data streams from multiple sources can be combined and even enriched with static reference data.

Azure Stream Analytics does not come cheap if you only want to do some simple aggregations; for that, Azure Functions can probably help you out.

But when it comes to more elaborate rules using multiple inputs, multiple outputs, time windowing, custom functions, Machine Learning integration, and many more capabilities, Azure Stream Analytics should be your first choice:

Normally, I demonstrate Azure Stream Analytics using the Azure Portal.

There, it offers me a convenient browser experience where I can show how inputs, outputs, and user-defined functions are created. I can also copy/paste a (basic) query and demonstrate how it can be tested and run in a simple manner:

For people new to Azure Stream Analytics, this is a perfect starting point.

Still, this is for demonstration purposes only!

The Azure portal lacks (professional) capabilities like source control/versioning, more advanced user-defined functions, and diagnostics.

If you plan to use Azure Stream Analytics in your projects, please consider starting with the VS Code project template.

In this blog, we will see how to start with the Visual Studio Code project for Stream Analytics.
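
Such a project mainly revolves around an .asaql file holding the query, plus JSON files describing the job configuration, inputs, and outputs. If I remember correctly, the scaffolded query is little more than a pass-through, something like:

    -- The generated starting point: route everything from input to output.
    SELECT
        *
    INTO
        [YourOutputAlias]
    FROM
        [YourInputAlias]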

Continue reading “Creating an Azure Stream Analytics job using VS Code”

Decompressing Azure IoT messages using Azure Stream Analytics

Azure IoT Edge messages are bound to a maximum size limit of 256KB. Each message sent is divided into chunks of 4KB. The metering of an IoT Hub is based on these chunks.

So, if a message has a size of 10KB, it is counted as three separate messages (10 / 4, rounded up).

Note: the chunk size of the IoT Hub free tier is smaller, just 0.5KB.

In a recent project, we were sending messages of more than 70KB. That means almost twenty chunks (70 / 4, rounded up, is 18), or even more in some cases.

This is technically just fine, but we did not feel comfortable about it:

  • This means rapidly going through the IoT Hub quota of daily messages
  • Messages were sent over a metered network

So this could become quite some investment in traffic and chunks.

I checked whether it is possible to limit this message size; there are several solutions.

But first, let’s see how to start compressing messages.
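
On the receiving side, by the way, Stream Analytics can take care of the decompression for you: an input can be configured with an event compression type of GZip or Deflate (an input setting, not query syntax), after which the query reads the payload as if it was never compressed. A sketch with illustrative names:

    -- The input [compressed-iothub-input] is configured with event
    -- compression type 'GZip'; the job decompresses before the query runs.
    SELECT
        deviceId,
        temperature
    INTO
        [decompressed-output]
    FROM
        [compressed-iothub-input]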

Continue reading “Decompressing Azure IoT messages using Azure Stream Analytics”

Cloud IoT dashboards using Grafana with Azure IoT

Azure IoT offers a great solution for connecting IoT devices to the cloud and communicating with them in a secure, two-way fashion: D2C and C2D.

Once you start ingesting telemetry, you probably, at some point, want to represent the data in some kind of dashboard.

This can be a custom dashboard, which gives you the most flexible way to represent the data; I have shown how to do this with Blazor. Or you could choose Power BI, a well-known and productive tool already used by many Data Scientists.

Recently, our team invested some time in building dashboards using Grafana.

With Grafana you can create, explore and share all of your data through beautiful, flexible dashboards.

Azure supports Grafana in various ways in the Azure Marketplace.

For this blog post, I selected the official Grafana template, which is hosted in a single VM:

The telemetry is ingested by an IoT Hub and sent to a SQL Azure database using Azure Stream Analytics.
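
The Stream Analytics part of that pipeline can be as small as a pass-through from the IoT Hub input to the SQL output; a minimal sketch, with illustrative aliases and telemetry fields:

    -- Moves telemetry from the IoT Hub input into the SQL Azure table
    -- that Grafana queries; columns must match the table definition.
    SELECT
        deviceId,
        temperature,
        humidity,
        System.Timestamp() AS measuredAt
    INTO
        [sql-output]
    FROM
        [iothub-input]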

As you will see, this is quite an elaborate solution due to all the Azure resources being used.

Still, the solution is quite straightforward and certainly interesting if you are already familiar with Grafana.

Continue reading “Cloud IoT dashboards using Grafana with Azure IoT”

Azure Stream Analytics anomaly detection on the edge

Back in 2018, the Azure Stream Analytics team announced Anomaly detection for Azure Stream Analytics. And, it was also supported on the Azure IoT Edge. In November 2018, I was allowed to demonstrate this at the SPS IPC Drives in Nuremberg:

Since then, anomaly detection has become a first-class citizen of Azure Stream Analytics.

Azure Stream Analytics Anomaly Detection is able to ‘automatically’ detect spikes, dips, and trends in a stream of values. It is based on math; all you need to do is specify how many values you expect and how sensitive the detection must be. In the end it is, in fact, Machine Learning, so it is a prediction with a certain confidence…
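
A minimal sketch of what this looks like in the query language, assuming a stream with a numeric temperature field (aliases are illustrative):

    -- Scores each value for spikes and dips with 95% confidence,
    -- looking back over a sliding history of 120 events.
    WITH anomalies AS (
        SELECT
            temperature,
            AnomalyDetection_SpikeAndDip(temperature, 95, 120, 'spikesanddips')
                OVER (LIMIT DURATION(second, 120)) AS scores
        FROM [telemetry-input]
    )
    SELECT
        temperature,
        CAST(GetRecordPropertyValue(scores, 'Score') AS FLOAT) AS score,
        CAST(GetRecordPropertyValue(scores, 'IsAnomaly') AS BIGINT) AS isAnomaly
    INTO [anomaly-output]
    FROM anomalies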

And it’s just part of the Stream Analytics query language. So on the edge, we can deploy a Stream Analytics module (the engine of Stream Analytics). This is a fixed module from Microsoft.

All you have to do is feed it a query, inputs, outputs, and user-defined functions if available:

We can define all this in the cloud, inside a Stream Analytics job, and we can even test it there. Those parts are then packed as a blob and put in blob storage. The Stream Analytics module on the edge can then download and run it.

The inputs and outputs of the Stream Analytics job can be attached to the normal Azure IoT Edge routing mechanism.

Let’s see how this is set up:

Continue reading “Azure Stream Analytics anomaly detection on the edge”

Turn Jetson Nano vision into insights

Recently I got my hands on an Nvidia Jetson Nano toolkit. This wonderful device runs Ubuntu and is capable of supporting Azure IoT Edge:

The heart of this device, just beneath the heat sink, is this board with a 128-core Maxwell GPU:

That is a lot of GPU compute power for just 99 dollars.

I found this hands-on lab from Paul DeCarlo. This simple-to-follow lab brings vision and object recognition to the cloud using Azure IoT Edge.

Let’s see how Azure Stream Analytics turns object recognition into insights in the cloud.
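
To sketch the direction: once the detected objects arrive in the cloud as messages, a small windowed aggregation turns them into insights. Assuming each detection message carries a label field (names are illustrative):

    -- Counts detections per object label in tumbling windows of 30 seconds.
    SELECT
        label,
        COUNT(*) AS detections,
        System.Timestamp() AS windowEnd
    INTO
        [insights-output]
    FROM
        [object-detection-input]
    GROUP BY
        label,
        TumblingWindow(second, 30)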

Continue reading “Turn Jetson Nano vision into insights”

One Azure IoT accelerator to rule them all

The family of Azure IoT resources is very diverse. If you know what you are doing and have developers available, you can have a great time with the many PaaS cloud resources.

If you have devices that need internet connectivity but you have no developers, you can check out IoT Central, the SaaS IoT solution.

Recently, Microsoft announced a very powerful integration with other leading IoT platforms like SAP Leonardo and PTC ThingWorx. Both can connect directly with the Azure IoT Hub, the cloud gateway. This opens a broad range of integration opportunities.

And last but not least, you can start with prebuilt verticals: Azure IoT accelerators, formerly known as Azure IoT suites. If you have developers available but do not want to start from scratch, check them out. You can deploy a typical accelerator in 15 minutes to see how it behaves. And the smart thing is, all the code behind the logic is available for free on GitHub.

The best-known accelerators are:

  • Remote Monitoring (version two is based on microservices)
  • Connected Factory (supports the OPC UA protocol)
  • Predictive Maintenance

But there are also third-party accelerators.

If you are a developer or architect, it’s time well spent checking them out!

Remote monitoring

The Remote Monitoring accelerator is a good starting point; it has a lot of out-of-the-box features:

In one of our current projects, we were looking for a rule engine. And while playing with the demo of the Remote Monitoring Accelerator, we stumbled on one.

The picture shown above does not really help to explain how this rule engine works, but you can try to read about it or check out the code on GitHub.

The features of this rules engine are both simple and powerful:

  • Define rules for alarms or even actions as JSON files in blob storage
  • Bind rules to groups of devices (defined as CSV file in blob storage)
  • Rules can react to ‘instant’ messages using JavaScript comparisons
  • Rules can react to time-window aggregations using JavaScript comparisons

And the best feature is that the rules engine is based on Azure Stream Analytics. Therefore it’s modular, and it can be separated and reused completely in your own solution.
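
To give an impression, a time-window rule from that last bullet boils down to a Stream Analytics pattern like the sketch below; the field names and threshold are illustrative, not the accelerator’s actual generated query:

    -- Raises an alarm when the average temperature of a device exceeds
    -- a threshold within a five-minute tumbling window.
    SELECT
        deviceId,
        AVG(temperature) AS avgTemperature,
        System.Timestamp() AS windowEnd
    INTO
        [alarms-output]
    FROM
        [telemetry-input]
    GROUP BY
        deviceId,
        TumblingWindow(minute, 5)
    HAVING
        AVG(temperature) > 80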

In this blog, we will see how it’s done.

Continue reading “One Azure IoT accelerator to rule them all”

Compare previous and current message in Stream Analytics

Last week I was testing the temporary storage in IoT Edge. I was interested in the stability, so I wanted to know whether messages were missing or maybe even arriving twice.

I have this heartbeat module, which produces a counter, so I am able to generate messages that can be checked as a sequence.

One way is to check this using your eyes 🙂

But this can be seen as a more generic issue: comparing two subsequent messages. So I was thinking about Azure Stream Analytics; this should be the perfect tool for the job.

Let’s check out how we can compare subsequent messages using Stream Analytics.
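
As a preview, the heart of the solution is the LAG function, which gives access to the previous event in the stream. A minimal sketch, assuming the heartbeat message has a counter field (aliases are illustrative):

    -- Pairs each counter with its predecessor and keeps only the messages
    -- that do not follow the previous one by exactly 1 (missing/duplicate).
    WITH sequenced AS (
        SELECT
            counter,
            LAG(counter) OVER (LIMIT DURATION(minute, 10)) AS previousCounter
        FROM [heartbeat-input]
    )
    SELECT
        counter,
        previousCounter
    INTO
        [gaps-output]
    FROM
        sequenced
    WHERE
        counter != previousCounter + 1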

Continue reading “Compare previous and current message in Stream Analytics”