Azure Data Explorer connector for Blob storage (IoT Hub) files

Over the last couple of months, I have written several blog posts regarding Azure Data Explorer ingestion connectors.

I checked out the IoT Hub connector, dynamic mapping, and table update policies.

Until now, everything was based on either the IoT Hub connector or the Event Hub connector:

Next to those two, Azure Data Explorer also supports ingesting complete files using a third connector: Blob Storage ingestion in combination with Event Grid events:

Let’s check out what this connector offers and how to set it up.

As we will find out, it even has a surprise for those of you ingesting IoT Hub messages.

Continue reading “Azure Data Explorer connector for Blob storage (IoT Hub) files”

Extending the AZ-220 Digital Twins hands-on lab with 3D visualization

Azure Digital Twins is advertised as a “platform as a service (PaaS) offering that enables the creation of twin graphs based on digital models of entire environments, which could be buildings, factories, farms, energy networks, railways, stadiums, and more—even entire cities”.

This sounds promising but it does not really ring a bell, does it?

Fortunately, besides the excellent documentation, Microsoft provides a great learning path in MS Learn as part of the AZ-220 Azure IoT developer exam preparations.

There, you will learn how Azure Digital Twins offers new opportunities for representing an Internet of Things solution via twin models, twin relations, and a runtime environment.

You finish the learning path with a hands-on lab where you build a model around a cheese factory and ingest sensor telemetry:

In the demo, the telemetry flows through the runtime and ends up in Time Series Insights.

Yes, the learning path is a good start and will prepare you for the exam or the assessment (you need to pass this assessment for a free one-year certification renewal).

On the other hand, many extra features could be added to turn this good start into a great start!

Think about propagating Azure Digital Twins events and twin property changes through the graph and visualizing live updates of twins in a 3D model, complete with alerts.

Let’s check out some of these additional features and see what you need to do to extend the ADT example.

Continue reading “Extending the AZ-220 Digital Twins hands-on lab with 3D visualization”

Adding context using EventData properties in Azure Stream Analytics

The Azure IoT Hub is the preferred Azure Cloud gateway for IoT Devices.

The format of messages sent by devices to the IoT Hub is described in the EventData class.

There, a regular message is split into three data parts:

  1. The actual message body, the data inside
  2. System properties, context added by the system (like IoT Hub context)
  3. Application (aka User) properties, context added by the user

These properties are key-value pairs. Unlike the message body, which has to be decoded before it is accessible, the properties are not encoded.
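To picture those three parts, here is a simplified sketch as a plain JavaScript object. This is hand-made for illustration only; the property names are assumptions, not the exact SDK types:

```javascript
// Illustrative only: an object shaped roughly like an incoming message.
const eventData = {
  // 1. The body arrives encoded and must be decoded before use
  body: Buffer.from(JSON.stringify({ temperature: 21.5 })),
  // 2. System properties: context added by the system (like IoT Hub context)
  systemProperties: { 'iothub-connection-device-id': 'device42' },
  // 3. Application (user) properties: plain key-value pairs, not encoded
  applicationProperties: { alertLevel: 'high' }
};

// Decoding the body first:
const telemetry = JSON.parse(eventData.body.toString('utf8'));
console.log(telemetry.temperature);                      // 21.5
// The property bags are directly readable:
console.log(eventData.applicationProperties.alertLevel); // high
```

Note how only the body needs a decoding step; both property bags are directly accessible.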

So, we see that the original messages can get context along the way.

As a developer using Azure IoT, you can add application/user context at two levels:

We can add application properties to the device message. This is normally done on the device when the message is constructed.

The Azure IoT Hub also supports message enrichment. These enrichments are also added as application/user properties, as seen in my previous blog post.

There, I showed how to read the properties using an Azure Function.

Consuming message context is a little bit of a challenge as seen in that post.

Here, we dive into this using Azure Stream Analytics.

Continue reading “Adding context using EventData properties in Azure Stream Analytics”

Azure IoT Device lifecycle events (part 1)

The Azure IoT Hub is the main gateway for Azure IoT-related device connectivity.

It offers several useful features to make your IoT developer life easy.

It offers a registry for devices so each device has its own credentials. Each device can ingest device-to-cloud telemetry data into Azure, and the IoT Hub also offers several ways for cloud-to-device communication (e.g. desired properties and direct methods).

The IoT Hub is also capable of reporting any device change event during the lifecycle of that device:

It can generate messages when a device is created or deleted. It also generates messages when the device twin is changed. It even generates a message when the ‘digital twin’ is updated.

Note: The digital twin update is related to Azure IoT Plug and Play. This is out of scope for this post, though an example of the digital twin change event can be seen here.

In this post, we will look at the format of these messages.
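To give a first impression, such a lifecycle notification can be sketched as a plain JavaScript object. The property names below follow the documented pattern for non-telemetry IoT Hub events, but treat the exact shape as an assumption until you inspect a real message:

```javascript
// Illustrative sketch of a device lifecycle event, assembled by hand.
const lifecycleEvent = {
  properties: {
    opType: 'createDeviceIdentity',  // or e.g. 'deleteDeviceIdentity'
    'iothub-message-schema': 'deviceLifecycleNotification',
    deviceId: 'testdevice'
  },
  // The body resembles the device twin at the moment of the event
  body: {
    deviceId: 'testdevice',
    etag: 'AAAAAAAAAAE=',
    properties: { desired: {}, reported: {} }
  }
};

// A consumer could branch on the operation type:
const created = lifecycleEvent.properties.opType === 'createDeviceIdentity';
console.log(created); // true
```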

Continue reading “Azure IoT Device lifecycle events (part 1)”

First look: The Things Network new Azure IoT Hub integration

Right from the start back in 2015, The Things Network supported connecting their LoRaWAN backend platform to other (cloud) platforms.

Connecting to the Azure cloud was no exception to this.

There was no default integration but the support for the MQTT protocol opened this door. Back then, I already blogged about using MQTT to connect. Based on that logic, bridges appeared for both uplink and downlink messages.

Later on, Microsoft also introduced the secure and flexible IoT Central bridge to connect to The Things Network, based on uplink support for webhooks.

Still, even with the new V3 portal of The Things Network, the integration was based on either the Webhook integration or the original V3 MQTT interface.

All these techniques require an extra layer of logic between the two clouds.

But…

Finally, a native Azure IoT Hub integration has been made available:

The main features of the Azure IoT Hub Integration are:

  • Devices only need to be registered in the Azure IoT Hub. The related TTN application device registrations are kept in sync by TTN. Just use IoT Hub Device Twin tags to control your TTN device registration
  • Uplink messages appear both as telemetry and as Device Twin reported property updates
  • Downlink messages are sent using the Device Twin desired properties
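The tag-driven sync and desired-property downlink from the list above can be pictured with a hypothetical Device Twin fragment. The key names below are placeholders for illustration; the actual keys used by the TTN integration are defined in their documentation:

```javascript
// Hypothetical Device Twin fragment — key names are illustrative only.
const twinPatch = {
  tags: {
    // Tags like these would let TTN keep its device registration in sync
    lorawan: { devEui: '70B3D57ED0000000' }
  },
  properties: {
    desired: {
      // Downlink messages are sent via desired properties
      downlink: { frmPayload: 'AQI=', fPort: 1 }
    }
  }
};

console.log(twinPatch.properties.desired.downlink.fPort); // 1
```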

The current documentation is pretty straightforward (although the DPS service logo is shown, it’s not part of the standard solution).

Let’s check out how this integration works and let’s dive deeper into this solution architecture.

Continue reading “First look: The Things Network new Azure IoT Hub integration”

Event Hub alerts, great for detecting drops in connectivity

If we look at the Azure IoT Reference Architecture, we see how streaming data is the heart of the IoT platform:


Data arrives at the IoT Hub and can be routed to any Azure resource using e.g. IoT Hub routing, Stream Analytics jobs, and Event Hubs.

This gives us maximum flexibility to divide the data into three major data streams or storage paths:

  1. Hot path – Event, Alert, Conditions
  2. Warm path – aggregated data, data for reporting
  3. Cold path – the raw data, mainly untouched and available in large quantities; great for data scientists

But there is an often unseen, even ignored fourth stream. And that is the stream of data for monitoring.

In my earlier blogs, I wrote several times about how heartbeats and watchdogs can tell the story about the quality of the data. And I have shown how e.g. Azure Functions and notification services can improve the insights into the quality of the communication.

Today I want to add a little gem to that list.

Let’s check out Monitoring Alerts in Event Hubs. It’s part of the overall Monitoring service in Azure.

Continue reading “Event Hub alerts, great for detecting drops in connectivity”

Introduction to the IoT Edge SDK V1, part 4: IoTHub routing

Update: Recently, Microsoft introduced the new V2 version of their IoT Edge. This blog references the former but still supported V1.

We have already made great progress understanding and using the Azure Gateway SDK.

What do we have right now? We can send telemetry data from multiple ‘offline’ devices and accept commands from the IoT Hub.

The data we send is well-formatted JSON so we are good to go.

But I am a bit worried. While reading all documentation regarding the transformation from Azure Gateway SDK towards the IoT Edge SDK, it is clear that multiple types of messages are sent to the IoT Hub. For example, I can imagine that a Stream Analytics module generates other data.

And let’s look at a more ‘close to earth’ example. The gateway itself is a potential device too! But I do not want to mix data coming from the gateway and from sensor devices.

Of course, we recently got the ability to route messages based on the message content. But what about using the properties? This keeps the message content clean.

Will this be working?

Continue reading “Introduction to the IoT Edge SDK V1, part 4: IoTHub routing”

Triggering Microsoft Flow using an Azure Function

For a recent Azure IoT workshop, we were looking for a nice ending to the Azure IoT pipeline. Because after introducing IoT Hubs, Stream Analytics, Event Hubs, and Azure Functions, we wanted to show the real power of IoT.

Normally, we show a bunch of charts in Power BI. You have seen it in my previous blogs; it’s so powerful. But this time, we wanted to do something else, something snappy.

Well, we came up with Microsoft Flow. At first, I was a bit skeptical. Will this help us to ‘sell’ Azure IoT?

Well, see for yourself… Here is an example on how to send, conditionally, an email to an email address you provide.

Continue reading “Triggering Microsoft Flow using an Azure Function”

Closing the Windows IoT Core feedback loop using Azure Functions

Windows IoT Core is my preferred solution for the proof of concepts I build. The IoT stack is both easy and powerful, and it’s a good choice to build real-world solutions on too.

Getting telemetry into the cloud using Microsoft Azure IoT Hub is also easy. And in my previous blog, I showed that adding live charts for BI only takes a couple of minutes.

There is one other thing that is very typical of the IoT Hub: sending commands back to devices. I use Azure Functions for that purpose.

In this blog, I will show you how to make use of this new, cheap and handy feature in Azure.

Update: Azure Functions is still in preview. I fixed some blocking issues in this blog due to current changes in this Azure resource (and this is a good thing).

Continue reading “Closing the Windows IoT Core feedback loop using Azure Functions”

Introduction to Tessel combined with Azure EventHub

Last month, during the Azure Bootcamp, I was introduced to programming the Tessel.

This device is a fairly cheap IoT programming board and, for me, the most remarkable feature was that it runs on JavaScript.


Yes, everybody with a basic understanding of JavaScript can start programming the Tessel.

My main goal for that day was putting some effort into Azure Event Hubs, but first I did the ‘hello world’ of the Tessel, called Blinky.js:

// Import the interface to Tessel hardware
var tessel = require('tessel');
// Set the led pins as outputs with initial states
// Truthy initial state sets the pin high
// Falsy sets it low.
var led1 = tessel.led[0].output(1);
var led2 = tessel.led[1].output(0);
setInterval(function () {
  console.log("I'm blinking! (Press CTRL + C to stop)");
  // Toggle the led states
  led1.toggle();
  led2.toggle();
}, 100);

It was not that intuitive to get this working (I had to install Node.js, etc.) but the convenience of JavaScript is very nice. So just follow the scripted examples.

Then I tried out and combined two other hands-on labs:

These HOLs helped me to explore how I can send messages to the Event Hub. And I was surprised how easy it was to send JSON data into the Event Hub. And now I can imagine how the Event Hub is just a VERY BIG bucket into which Azure MessageBus messages are off-loaded. And I can imagine how the data can be investigated by StreamInsight for a couple of days before the data gets stale and is deleted.

So finally, just for fun, I came up with this continuous probing and sending:

var https = require('https');
var crypto = require('crypto');
var climatelib = require('climate-si7020');
var tessel = require('tessel');
var climate = climatelib.use(tessel.port['A']);

climate.on('ready', function () {
  console.log('Connected to si7005');
});

climate.on('error', function(err) {
  console.log('error connecting module', err);
});

// Event Hubs parameters
console.log('----------------------------------------------------------------------');
console.log('Please ensure that you have modified the values for: namespace, hubname, partitionKey, eventHubAccessKeyName');
console.log('Please ensure that you have created a Shared Access Signature Token and you are using it in the code')
console.log('----------------------------------------------------------------------');
console.log('');

var namespace = 'SvdvEh-ns';
var hubname ='svdveh';
var deviceName = 'mytessel';
var eventHubAccessKeyName = 'SendRights';
var createdSAS = 'SharedAccessSignature sr=[...]';

console.log('Namespace: ' + namespace);
console.log('hubname: ' + hubname);
console.log('publisher: ' + deviceName);
console.log('eventHubAccessKeyName: ' + eventHubAccessKeyName);
console.log('SAS Token: ' + createdSAS);
console.log('----------------------------------------------------------------------');
console.log('');

var SendItem = function(payloadToSend){
  // Send the request to the Event Hub
  var options = {
    hostname: namespace + '.servicebus.windows.net',
    port: 443,
    path: '/' + hubname + '/publishers/' + deviceName + '/messages',
    method: 'POST',
    headers: {
      'Authorization': createdSAS,
      'Content-Length': payloadToSend.length,
      'Content-Type': 'application/atom+xml;type=entry;charset=utf-8'
    }
  };

  var req = https.request(options, function(res) {
    console.log('----------------------------------------------------------------------');
    console.log("statusCode: ", res.statusCode);
    console.log('----------------------------------------------------------------------');
    console.log('');
    res.on('data', function(d) {
      process.stdout.write(d);
    });
  });

  req.on('error', function(e) {
    console.log('error');
    console.error(e);
  });

  req.write(payloadToSend);
  req.end();
}

setImmediate(function loop () {
  climate.readTemperature('f', function (err, temp) {
    climate.readHumidity(function (err, humid) {
      console.log('Degrees:', temp.toFixed(4) + 'F', 'Humidity:', humid.toFixed(4) + '%RH');
      var payload = '{\"Temperature\":\"'+temp.toFixed(4)+'\",\"Humidity\":\"'+humid.toFixed(4)+'\"}';
      SendItem(payload);
      setTimeout(loop, 5000);
    });
  });
});

This code was enough to see how the EventHub was filled with data. I had no time to check out StreamInsight. But it seems very easy to write some SQL-like code to get the right information out of the hubs.

The biggest problem was the administration of all the namespaces, keys, connection strings and secrets of both the MessageBus and the EventHub. Sometimes I had to cut/copy/paste parts of connection strings but that was not that intuitive.

And the Tessel? It was real fun to play with. And the complete humidity/temperature sensor was easy to plug in (I had to fix the code because it was referencing another module type). But somehow I missed the spider web of wires and components and the accidental reboots of the Raspberry Pi 🙂

A special mention goes to the free Service Bus Explorer. It was very exciting to see messages coming in live. Very useful for debugging too.