Until now, this was all based on two connectors: the IoT Hub connector and the Event Hub connector:
Next to those two, Azure Data Explorer also supports ingesting complete files using a third connector: Blob Storage ingestion in combination with Event Grid events:
Let’s check out what this connector offers and how to set it up.
As we will find out, it even has a surprise for those of you ingesting IoT Hub messages.
Because I only had twenty minutes to explain what Time Series Insights (TSI) is and to demonstrate how it works, I had to skip some topics.
In this blog, I give an overview of what I demonstrated and add some extra goodies and in-depth information, because luckily there is no time limit on a blog 🙂
Azure IoT Edge makes it possible to run extensive logic within your factory, building, vehicle, etc. while it’s connected to the cloud.
This way, we can monitor the underlying sensors and protocols and measure what’s happening on the edge. We are even capable of making predictions, both on the edge and in the cloud, of what is going to happen, based on the current measurements and the data received over the last second, minute, hour, etc.
On a meta-level, Azure IoT Edge also comes with features for monitoring the edge device itself. Think about monitoring metrics and logging.
These features are mostly centered around the edge modules and runtime. Because the edge logic is sandboxed, this is fine.
Still, we want to be able to go beyond the logic we deployed.
It would be nice if we would be able to break out of that sandbox and get some information about the Docker/Moby environment, IoT Edge runtime daemon, network, etc.:
This is actually offered!
Azure IoT Edge offers a so-called ‘support bundle’.
It is just a bundle of files (e.g. logs) taken from various sources on the edge device, made available so you can support your edge device.
It contains:
Module logs
IoT Edge security manager logs
Container engine logs
‘iotedge check’ JSON output
Other useful debug information
It’s even possible to retrieve these files ‘over-the-air’. This makes remote diagnostics possible for all your Azure IoT Edge devices!
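Below is a minimal sketch of what such an over-the-air retrieval could look like from the service side, calling the edge agent’s built-in ‘UploadSupportBundle’ direct method with the C# service SDK. The connection string, device id and SAS URL are placeholders, and the payload keys reflect the documented schema at the time of writing, so verify them against the current docs.

```csharp
// A minimal sketch, assuming the Microsoft.Azure.Devices (service) NuGet package.
// The connection string, device id and SAS URL are placeholders.
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;

class SupportBundleRequest
{
    static async Task Main()
    {
        var serviceClient = ServiceClient.CreateFromConnectionString("<iot hub connection string>");

        // The edge agent exposes a built-in direct method which uploads a support
        // bundle to a blob container you provide via a SAS URL.
        var method = new CloudToDeviceMethod("UploadSupportBundle");
        method.SetPayloadJson(@"{
            ""schemaVersion"": ""1.0"",
            ""sasUrl"": ""<blob container SAS url>"",
            ""since"": ""6h"",
            ""edgeRuntimeOnly"": false
        }");

        var result = await serviceClient.InvokeDeviceMethodAsync("<edge device id>", "$edgeAgent", method);
        Console.WriteLine($"Status: {result.Status}, payload: {result.GetPayloadAsJson()}");
    }
}
```

Locally on the device itself, the same bundle can be produced with the ‘iotedge support-bundle’ command.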
Adding an image to the Azure Blob storage is a basic skill for developers. But working with images is not something developers do on a regular basis.
If you are writing hardware-independent code, .Net Core does not make it any simpler; a lot of the examples found on the internet are written for the regular .Net framework.
Image handling used to be bound to a lot of OS-related features, but Microsoft has taught .Net Core some GDI tricks.
Here is a simple example of how to construct and send images, created in memory, to Azure Blob storage.
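A minimal sketch could look like this, assuming the System.Drawing.Common and WindowsAzure.Storage NuGet packages; the storage connection string, container name and blob name are placeholders.

```csharp
// A minimal sketch, assuming the System.Drawing.Common and WindowsAzure.Storage
// NuGet packages. Connection string, container and blob names are placeholders.
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class ImageUploader
{
    static async Task Main()
    {
        // Construct an image in memory
        using (var bitmap = new Bitmap(320, 240))
        using (var graphics = Graphics.FromImage(bitmap))
        using (var stream = new MemoryStream())
        {
            graphics.Clear(Color.White);
            graphics.DrawString("Hello Blob storage", new Font("Arial", 16), Brushes.Black, 10, 10);

            // Save the image to a stream instead of a file on disk
            bitmap.Save(stream, ImageFormat.Png);
            stream.Position = 0;

            // Upload the stream as a block blob
            var account = CloudStorageAccount.Parse("<storage connection string>");
            var container = account.CreateCloudBlobClient().GetContainerReference("images");
            await container.CreateIfNotExistsAsync();

            var blob = container.GetBlockBlobReference("generated.png");
            blob.Properties.ContentType = "image/png";
            await blob.UploadFromStreamAsync(stream);
        }
    }
}
```

Note that on Linux, System.Drawing.Common relies on the native libgdiplus library; that is part of the GDI story mentioned above.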
Already last year, I wrote a blog about the Azure Blob storage for IoT Edge module. Back then, it was still in preview, but now it’s generally available.
The module still provides the same functionality: you can read and write to blob storage with the same SDK and programming model you use for handling blobs in the Azure Cloud Storage:
There are some limitations regarding the API to use for the Blob module (e.g. no support for blob leases), but there are also extra features.
The most interesting feature is:
It enables you to automatically upload data to Azure from your local block blob storage using deviceToCloudUpload properties
Yes, you can configure the blob storage module running on your IoT Edge device to automatically upload blobs to the cloud. This is a great data pump!
Microsoft enumerates some advantages in their documentation. For me this is the ideal way to move raw data with low priority to the cloud in a cheap but reliable way without much effort.
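For reference, this upload behavior is configured through the module twin’s desired properties. A minimal sketch of such a fragment is shown below; the property names follow the documentation at the time of writing, and the connection string and container names are placeholders, so verify them against the current docs.

```json
{
  "deviceToCloudUploadProperties": {
    "uploadOn": true,
    "uploadOrder": "OldestFirst",
    "cloudStorageConnectionString": "<azure storage connection string>",
    "storageContainersForUpload": {
      "localcontainer": {
        "target": "remotecontainer"
      }
    },
    "deleteAfterUpload": false
  }
}
```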
I was especially interested in the BlockBlob synchronization:
The module is uploading blob and internet connection goes away; when the connectivity is back again it uploads only the remaining blocks and not the whole blob.
This is potentially the most efficient solution, especially for large files.
In this blog, we will look at both features using the Visual Studio 2017 extension called the IoT Hub Connected Service, which has also been updated.
But first, let’s look at the new Blob Storage endpoint.
Sending telemetry to a Blob Storage container is a simple and efficient way to enable cold path analytics:
Until recently, there were a few ways to do this:
Sending IoTHub output to a Stream Analytics job, which filled some blob
Sending IoTHub output to an Azure Function, which filled some blob
Making use of the IoT Hub ability to receive blobs
The first two ways require extra Azure resources, so additional costs are involved. The third one is only used in very specific circumstances.
The new Blob Storage endpoint is a simple but very powerful way of making cold path analytics available, right from the IoTHub.
Usually, when I reference the Azure IoT Hub, you can expect that some telemetry is involved. But the Azure IoT Hub is evolving. A new feature has arrived: it is now possible to upload blobs.
Why should we want to upload blobs? Well, I think it’s great to have the possibility to upload files full of telemetry in one upload.
But sometimes the telemetry is not a bunch of values but more like a big bag of bytes 🙂 In the example below, we will upload an image and pass it on to Microsoft Cognitive Services (previously known as Project Oxford) for visual analysis.
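The device side of such an upload could look like the minimal sketch below, using the UploadToBlobAsync convenience call the device SDK offered at the time of writing; the connection string and file name are placeholders, and passing the image on to Cognitive Services is left out here.

```csharp
// A minimal sketch, assuming the Microsoft.Azure.Devices.Client NuGet package.
// The device connection string and file name are placeholders.
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class BlobUploadSample
{
    static async Task Main()
    {
        var deviceClient = DeviceClient.CreateFromConnectionString("<device connection string>");

        // Upload a local image to the storage account associated with the IoT Hub.
        // The blob ends up in a folder named after the device id.
        using (var fileStream = File.OpenRead("photo.jpg"))
        {
            await deviceClient.UploadToBlobAsync("photo.jpg", fileStream);
        }
    }
}
```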
StreamAnalytics is a monster. It is capable of making decisions on large amounts of data arriving as a stream. It has input sources and output sinks. And in between, a query is running. This query has a SQL-like structure, something like Select * into [output sink] from [input source].
Because I work a lot with the IoT part of Azure (IoT Hubs, EventHubs, etc.), a StreamAnalytics job is a great addition, handling the continuous stream of telemetry, coming from multiple devices.
StreamAnalytics is less known for usage in a non-Internet of Things environment. You can think of handling credit card payments or changes in personnel files.
In my opinion, StreamAnalytics has one drawback: it takes some time to start up, and it can only be changed (altering the query or changing sinks or sources) when it is stopped again. My biggest issue is that I cannot directly see what telemetry is passed.
But there is a simple solution. A query can output the same telemetry (to be more specific, the data you want to pass to the output sink) to multiple sinks. So if you are not sure which data is passed, you can output it to a second sink.
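A sketch of such a query, with hypothetical input and sink names, could look like this:

```sql
-- Send the same selection both to the regular sink and to a second 'debug' sink
SELECT * INTO [regular-output] FROM [iothub-input]

SELECT * INTO [debug-blob-output] FROM [iothub-input]
```

The extra sink can be something cheap like a Blob Storage container, so you can inspect the actual data being passed without touching the real output.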
This blog post is for hardcore IoTHub users. It’s even a bit boring, at first.
The Azure IoTHub does not accept anonymous telemetry. Telemetry has to be presented by devices that are registered and enabled. So you need to have a list of all your devices, and you have to manage it.
In this post, we start by registering a single device and end with updating multiple devices in bulk.
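As a first taste, registering a single device from code could look like this minimal sketch, assuming the Microsoft.Azure.Devices (service) NuGet package; the IoT Hub connection string and device id are placeholders.

```csharp
// A minimal sketch, assuming the Microsoft.Azure.Devices (service) NuGet package.
// The IoT Hub connection string and device id are placeholders.
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;

class DeviceRegistration
{
    static async Task Main()
    {
        var registryManager = RegistryManager.CreateFromConnectionString("<iot hub connection string>");

        // Register a single device; the IoT Hub generates symmetric keys for it
        var device = await registryManager.AddDeviceAsync(new Device("my-first-device"));

        Console.WriteLine($"Registered '{device.Id}' with primary key {device.Authentication.SymmetricKey.PrimaryKey}");
    }
}
```

The same RegistryManager also offers bulk variants of these calls, which is what we will use for updating multiple devices at once.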