Azure Data Explorer connector for Blob storage (IoT Hub) files

Over the last couple of months, I have written several blog posts regarding Azure Data Explorer ingestion connectors.

I checked out the IoT Hub connector, dynamic mapping, and table update policies.

Until now, all of this was based on two connectors: the IoT Hub connector and the Event Hub connector.

Next to those two, Azure Data Explorer also supports ingesting complete files using a third connector: Blob Storage ingestion in combination with Event Grid events.

Let’s check out what this connector offers and how to set it up.

As we will find out, it even has a surprise for those of you ingesting IoT Hub messages.

Continue reading “Azure Data Explorer connector for Blob storage (IoT Hub) files”

Azure Time Series Insights introduction

Just this week, I was part of the Microsoft Tech Days: Flight into IoT event.

With a whole team of MVPs, we all explained different parts of Azure IoT using a simulation of an airplane flight from London to Budapest.

I myself talked about the pros and cons of Azure Time Series Insights:

Because I only had twenty minutes to explain what TSI is and to demonstrate how it works, I had to skip some topics.

In this blog, I give an overview of what I demonstrated, plus some extra goodies and in-depth information, because luckily there is no time limit on a blog 🙂

Continue reading “Azure Time Series Insights introduction”

Get a bundle of support files from your Azure IoT Edge for remote diagnostics

Azure IoT Edge makes it possible to run extensive logic within your factory, building, vehicle, etc. while it’s connected to the cloud.

This way, we can monitor the underlying sensors and protocols and measure what’s happening on the Edge. We are even capable of making predictions, both on the Edge and in the Cloud, of what is going to happen based on the current measurements and the data received over the last second, minute, hour, etc.

On a meta-level, Azure IoT Edge also comes with features for monitoring the edge device itself. Think about monitoring metrics and logging.

These features are mostly centered around the edge modules and runtime. Because the edge logic is sandboxed, this is fine.

Still, we want to be able to go beyond the logic we deployed.

It would be nice if we could break out of that sandbox and get some information about the Docker/Moby environment, the IoT Edge runtime daemon, the network, etc.

This is actually offered!

Azure IoT Edge offers a so-called ‘support bundle’.

It is just a bundle of files (e.g. logs) taken from various sources on the edge device, made available so you can support your edge device.

It contains:

  • Module logs
  • IoT Edge security manager logs
  • Container engine logs
  • ‘iotedge check’ JSON output
  • Other useful debug information

It’s even possible to retrieve these files ‘over-the-air’. This makes remote diagnostics possible for all your Azure IoT Edge devices!
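To give an idea of what that over-the-air retrieval could look like: the IoT Edge agent exposes a direct method for uploading the support bundle to a blob container, which a back-end application can invoke. The snippet below is a minimal sketch using the Microsoft.Azure.Devices service SDK; the method name, the payload fields, and of course the connection string, device id, and SAS URL are assumptions/placeholders that you should verify against the official documentation.

```csharp
// Minimal sketch, assuming the 'UploadSupportBundle' direct method exposed by the
// $edgeAgent module; verify the method name and payload fields in the documentation.
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;

class SupportBundleRequester
{
    public static async Task RequestAsync()
    {
        // Placeholder IoT Hub service connection string.
        var serviceClient = ServiceClient.CreateFromConnectionString("<iothub-service-connectionstring>");

        var method = new CloudToDeviceMethod("UploadSupportBundle");
        method.SetPayloadJson(@"{
            ""schemaVersion"": ""1.0"",
            ""sasUrl"": ""<sas-url-of-a-blob-container>"",
            ""since"": ""2d"",
            ""edgeRuntimeOnly"": false
        }");

        // The edge device collects the bundle and uploads it to the blob container
        // behind the SAS URL; the response only tells whether the request was accepted.
        var response = await serviceClient.InvokeDeviceMethodAsync("<device-id>", "$edgeAgent", method);
        Console.WriteLine($"Status: {response.Status}, payload: {response.GetPayloadAsJson()}");
    }
}
```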

Let’s take a closer look at this.

Continue reading “Get a bundle of support files from your Azure IoT Edge for remote diagnostics”

Uploading an image to Azure Blob storage

Adding an image to Azure Blob storage is a basic skill for developers. But working with images is not something developers do on a regular basis.

If you are writing hardware-independent code, .NET Core does not make it any simpler; a lot of the examples found on the internet are written for the regular .NET Framework.

Image handling used to be bound to a lot of OS-related features, so Microsoft had to teach .NET Core some GDI tricks.

Here is a simple example of how to construct and send images, created in memory, to Azure Blob storage.
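As a teaser of what the full post covers, here is a minimal sketch along those lines, assuming the System.Drawing.Common and Azure.Storage.Blobs packages (the original post may use an older storage SDK); the connection string, container name, and blob name are placeholders.

```csharp
// Minimal sketch: create an image in memory and upload it to Azure Blob storage.
// Assumes the System.Drawing.Common and Azure.Storage.Blobs NuGet packages;
// the original post may target a different (older) storage SDK.
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class ImageUploader
{
    public static async Task UploadGeneratedImageAsync()
    {
        using var bitmap = new Bitmap(320, 200);
        using (var graphics = Graphics.FromImage(bitmap))
        {
            graphics.Clear(Color.White);
            graphics.DrawString("Hello blob", new Font("Arial", 16), Brushes.Black, 10, 10);
        }

        // Save the bitmap to a MemoryStream instead of a file on disk.
        using var stream = new MemoryStream();
        bitmap.Save(stream, ImageFormat.Png);
        stream.Position = 0;

        // Placeholder connection string and container name.
        var container = new BlobContainerClient("<storage-connectionstring>", "images");
        await container.CreateIfNotExistsAsync();
        await container.GetBlobClient("generated.png").UploadAsync(stream, overwrite: true);
    }
}
```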

Continue reading “Uploading an image to Azure Blob storage”

Azure IoT Edge Blob module posts BlockBlobs blocks dosed in Storage

Last year, I already wrote a blog about the Azure Blob Storage module for IoT Edge. Back then it was still in preview, but by now it is generally available.

The module still provides the same functionality: you can read and write to blob storage with the same SDK and programming model you use for handling blobs in Azure cloud storage.

There are some limitations regarding the API surface of the Blob module (e.g. no support for leasing blobs), but there are also extra features.

The most interesting feature is:

It enables you to automatically upload data to Azure from your local block blob storage using deviceToCloudUpload properties

Yes, you can configure the blob storage module running on your IoT Edge device to automatically upload blobs to the cloud. This is a great data pump!

Microsoft enumerates some advantages in its documentation. For me, this is the ideal way to move low-priority raw data to the cloud in a cheap but reliable way, without much effort.
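To give an impression, a sketch of what the deviceToCloudUploadProperties section in the module twin’s desired properties could look like is shown below; the property names follow my reading of the documentation and should be verified there, and the connection string and container names are placeholders.

```json
{
  "deviceToCloudUploadProperties": {
    "uploadOn": true,
    "uploadOrder": "OldestFirst",
    "cloudStorageConnectionString": "<azure-storage-connectionstring>",
    "storageContainersForUpload": {
      "localcontainer": {
        "target": "cloudcontainer"
      }
    },
    "deleteAfterUpload": false
  }
}
```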

I was especially interested in the BlockBlob synchronization:

The module is uploading a blob and the internet connection goes away; when the connectivity is back again, it uploads only the remaining blocks and not the whole blob.

This is potentially the most efficient solution, especially for large files.

Let’s check out how this works.

Continue reading “Azure IoT Edge Blob module posts BlockBlobs blocks dosed in Storage”

Azure IoTHub routing revisited, Blob Storage Endpoints

Recently, Microsoft added some extra features to the IoTHub routing abilities:

  1. Support for routing using the message body (see the sketch after this list)
  2. Support for Blob Storage as endpoint
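A note on the first feature (my own sketch, not taken from the post): routing queries on the message body, such as $body.temperature > 25, only work when the hub can interpret the payload, so the device should send UTF-8 encoded JSON and say so explicitly. Assuming the Microsoft.Azure.Devices.Client device SDK, that could look like this:

```csharp
// Minimal sketch, assuming the Microsoft.Azure.Devices.Client device SDK.
// Body-based routing requires the payload to be UTF-8 encoded JSON with the
// content type and encoding set on the message, as shown below.
using System.Globalization;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class TelemetrySender
{
    public static async Task SendAsync(DeviceClient deviceClient, double temperature)
    {
        var json = $"{{\"temperature\": {temperature.ToString(CultureInfo.InvariantCulture)}}}";

        using var message = new Message(Encoding.UTF8.GetBytes(json))
        {
            ContentType = "application/json",
            ContentEncoding = "utf-8",
        };

        await deviceClient.SendEventAsync(message);
    }
}
```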

In this blog, we will look at both features using the Visual Studio 2017 extension called the IoT Hub Connected Service, which has also been updated.

But first, let’s look at the new Blob Storage endpoint.

Sending telemetry to a Blob Storage container is a simple and efficient way to enable cold path analytics.

Until recently, there were a few ways to do this:

  • Sending IoTHub output to a Stream Analytics job, which writes it to a blob
  • Sending IoTHub output to an Azure Function, which writes it to a blob
  • Making use of the IoT Hub’s ability to receive blobs

The first two ways require extra Azure resources, so additional costs are involved. The third one is only useful in very specific circumstances.

The new Blob Storage endpoint is a simple but very powerful way of making cold path analytics available, right from the IoTHub.

Continue reading “Azure IoTHub routing revisited, Blob Storage Endpoints”

Iot Hub supports uploading files, perfect for Cognitive Services vision

Usually, when I reference the Azure IoT Hub, you can expect that some telemetry is involved. But the Azure IoT Hub is evolving. A new feature has been added: it is now possible to upload blobs.

Why would we want to upload blobs? Well, I think it’s great to have the possibility to upload files full of telemetry in one go.

But sometimes the telemetry is not a bunch of values but more like a big bag of bytes 🙂 In the example below, we will upload an image and pass it on to Microsoft Cognitive Services (previously known as Project Oxford) for visual analysis.
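As an appetizer for the device side, a sketch of the upload call could look like the snippet below, assuming the classic Microsoft.Azure.Devices.Client device SDK with its UploadToBlobAsync method; the connection string and file name are placeholders, and the full example in the post goes on to feed the image to Cognitive Services.

```csharp
// Minimal sketch, assuming the classic Microsoft.Azure.Devices.Client device SDK,
// where UploadToBlobAsync pushes a file to the storage account associated with the
// IoT Hub. The connection string and file name are placeholders.
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class ImageUploadSample
{
    public static async Task UploadAsync()
    {
        var deviceClient = DeviceClient.CreateFromConnectionString("<device-connectionstring>");

        using var fileStream = File.OpenRead("snapshot.jpg");

        // The blob ends up in the container configured on the IoT Hub, from where it
        // can be picked up and passed on to Cognitive Services for visual analysis.
        await deviceClient.UploadToBlobAsync("snapshot.jpg", fileStream);
    }
}
```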

Continue reading “Iot Hub supports uploading files, perfect for Cognitive Services vision”

Raw Azure StreamAnalytics output using Azure Storage

StreamAnalytics is a monster. It is capable of making decisions on large amounts of data coming in as a stream. It has input sources and output sinks. And in between, a query is running. This query has an SQL-like structure, something like Select * into [output sink] from [input source].

Because I work a lot with the IoT part of Azure (IoT Hubs, EventHubs, etc.), a StreamAnalytics job is a great addition, handling the continuous stream of telemetry, coming from multiple devices.

StreamAnalytics is less known for usage in a non-Internet of Things environment. You can think of handling credit card payments or mutations in personal files.

In my opinion, StreamAnalytics has one drawback: it takes some time to start up, and it can only be changed (altering the query or changing the sinks or sources) while it is stopped. My biggest issue is that I cannot directly see what telemetry is passed.

But there is a simple solution. A query can output the same telemetry (to be more specific, the data you want to pass to the output sink) to multiple sinks. So if you are not sure which data is passed, you can output it to a second sink.
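For example (with made-up input and output alias names, so purely illustrative), the query could repeat the same SELECT for an extra Azure Storage sink:

```sql
-- Illustrative only: [iothub-input], [regular-output] and [storage-debug-output]
-- are made-up aliases for the job's configured input and outputs.
SELECT *
INTO [regular-output]
FROM [iothub-input]

SELECT *
INTO [storage-debug-output]
FROM [iothub-input]
```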

In this blog, I show you how to output to Azure Storage sinks.

Continue reading “Raw Azure StreamAnalytics output using Azure Storage”

Bulk import of IoTHub devices

This blog post is for hardcore IoTHub users. It’s even a bit boring, at first.

The Azure IoTHub does not accept anonymous telemetry. Telemetry has to be presented by devices that are enabled in the device registry. So you need a list of all your devices, and you have to manage it.

In this post, we start with registering a single device and end with updating multiple devices in bulk.
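To give a first impression of the registry API involved, here is a minimal sketch using the Microsoft.Azure.Devices service SDK; the connection string and device ids are placeholders, and the post itself goes into much more detail (including bulk updates).

```csharp
// Minimal sketch, assuming the Microsoft.Azure.Devices service SDK:
// register one device, then add a batch of devices with a single bulk call.
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;

class DeviceRegistration
{
    public static async Task RegisterAsync()
    {
        var registryManager = RegistryManager.CreateFromConnectionString("<iothub-owner-connectionstring>");

        // Register a single device.
        await registryManager.AddDeviceAsync(new Device("machine-001"));

        // Register multiple devices in one bulk operation.
        var devices = Enumerable.Range(2, 10)
            .Select(i => new Device($"machine-{i:000}"))
            .ToList();

        var result = await registryManager.AddDevices2Async(devices);
        // result.IsSuccessful indicates whether the whole bulk operation succeeded.
    }
}
```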

Continue reading “Bulk import of IoTHub devices”