Azure IoTHub routing revisited, Blob Storage Endpoints

Recently, Microsoft added two extra features to the IoTHub routing capabilities:

  1. Support for routing using the message body
  2. Support for Blob Storage as endpoint

In this blog, we will look at both features using the Visual Studio 2017 extension called the IoT Hub Connected Service, which has also been updated.

But first, let’s look at the new Blob Storage endpoint.

Sending telemetry to a Blob Storage container is a simple and efficient way to enable cold path analytics.

Until recently, there were a few ways to do this:

  • Sending IoTHub output to a Stream Analytics job, which writes to a blob
  • Sending IoTHub output to an Azure Function, which writes to a blob
  • Making use of the IoT Hub’s ability to receive uploaded blobs

The first two ways need extra Azure resources, so additional costs are involved. The third one is only used in very specific circumstances.

The new Blob Storage endpoint is a simple but very powerful way of making cold path analytics available, right from the IoTHub.
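To give an idea of what such a route works with on the device side, here is a minimal sketch using the Python device SDK; the connection string, telemetry values and property name are placeholders. Routing on the message body requires the content type and encoding to be set explicitly, and a route pointing at a Blob Storage endpoint then simply writes the matching messages into the container.

```python
from azure.iot.device import IoTHubDeviceClient, Message
import json

# Device connection string (placeholder).
CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

body = json.dumps({"temperature": 21.5, "humidity": 60})
message = Message(body)
message.content_type = "application/json"      # required for routing on the message body
message.content_encoding = "utf-8"             # required for routing on the message body
message.custom_properties["level"] = "normal"  # still usable in property-based routing rules

client.send_message(message)
client.shutdown()
```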

Continue reading “Azure IoTHub routing revisited, Blob Storage Endpoints”

IoT Hub supports uploading files, perfect for Cognitive Services vision

Usually, when I reference the Azure IoT Hub, you can expect that some telemetry is involved. But the Azure IoT Hub is evolving. A new feature has arrived: it is now possible to upload blobs.

Why would we want to upload blobs? Well, I think it’s great to have the possibility to upload a whole file full of telemetry in one go.

But sometimes the telemetry is not a bunch of values but more like a big bag of bytes 🙂 In the example below, we will upload an image and pass it on to Microsoft Cognitive Services (previously known as Project Oxford) for visual analysis.
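The device-side flow is: ask the IoT Hub for a SAS URI into its associated storage account, upload the blob yourself, and notify the hub about the result. Below is a minimal sketch using the Python device SDK; the connection string and file name are placeholders, and the Cognitive Services call itself would happen on the backend once the blob has landed.

```python
from azure.iot.device import IoTHubDeviceClient
from azure.storage.blob import BlobClient

# Device connection string and local file (placeholders).
CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"
FILE_NAME = "snapshot.jpg"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

# Ask the IoT Hub where to put the blob; it returns a SAS token
# for its associated storage account.
storage_info = client.get_storage_info_for_blob(FILE_NAME)
sas_url = "https://{}/{}/{}{}".format(
    storage_info["hostName"],
    storage_info["containerName"],
    storage_info["blobName"],
    storage_info["sasToken"],
)

# Upload the image directly to blob storage.
with open(FILE_NAME, "rb") as data:
    BlobClient.from_blob_url(sas_url).upload_blob(data, overwrite=True)

# Tell the IoT Hub how the upload went.
client.notify_blob_upload_status(
    storage_info["correlationId"], True, 200, "Upload succeeded"
)
client.shutdown()
```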

Continue reading “IoT Hub supports uploading files, perfect for Cognitive Services vision”

Raw Azure StreamAnalytics output using Azure Storage

StreamAnalytics is a monster. It is capable of making decisions on large amounts of data coming in as a stream. It has input sources and output sinks. And in between, a query is running. This query has an SQL-like structure, something like Select * into [output sink] from [input source].

Because I work a lot with the IoT part of Azure (IoT Hubs, EventHubs, etc.), a StreamAnalytics job is a great addition, handling the continuous stream of telemetry coming from multiple devices.

StreamAnalytics is less well known for use in a non-Internet of Things environment. You can think of handling credit card payments or changes in personnel records.

In my opinion, StreamAnalytics has one drawback: it takes some time to start up, and it can only be changed (altering the query or changing sinks or sources) after it has been stopped again. My biggest issue is that I cannot directly see which telemetry is passed.

But there is a simple solution. A query can output the same telemetry (to be more specific, the data you want to pass to the output sink) to multiple sinks. So if you are not sure which data is passed, you can output it to a second sink.
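As a sketch of that trick: add a second statement to the job query, for example Select * into [rawoutput] from [iothubinput], and point the [rawoutput] sink at a Blob Storage container configured for line-separated JSON. The raw records can then be inspected straight from the container; the connection string and container name below are assumptions.

```python
import json

from azure.storage.blob import ContainerClient

# Storage connection string and container name are placeholders.
container = ContainerClient.from_connection_string(
    "<storage-account-connection-string>", container_name="rawoutput"
)

# With line-separated JSON output, each blob holds one record per line.
for blob in container.list_blobs():
    text = container.download_blob(blob.name).readall().decode("utf-8")
    for line in text.splitlines():
        if line.strip():
            print(json.dumps(json.loads(line), indent=2))
```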

In this blog, I show you how to output to Azure storage sinks.

Continue reading “Raw Azure StreamAnalytics output using Azure Storage”

Bulk import of IoTHub devices

This blog post is for hardcore IoTHub users. It’s even a bit boring, at first.

The Azure IoTHub does not accept anonymous telemetry. Telemetry has to be presented by devices that are registered and enabled. So you need a list of all your devices, and you have to manage it.

In this post, we start by registering a single device and end with updating multiple devices in bulk.
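To make that starting point concrete, here is a minimal sketch of registering a device through the registry, assuming the Python service SDK; the connection string and device ids are placeholders, and the loop at the end is only a crude stand-in for the real bulk import this post works towards.

```python
import base64
import os

from azure.iot.hub import IoTHubRegistryManager

# Hub-owner connection string (placeholder).
IOTHUB_CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=<key>"

registry_manager = IoTHubRegistryManager(IOTHUB_CONNECTION_STRING)

def new_key() -> str:
    # Symmetric device keys are just base64-encoded random bytes.
    return base64.b64encode(os.urandom(32)).decode()

# Register a single device with SAS (symmetric key) authentication.
device = registry_manager.create_device_with_sas(
    device_id="testdevice-001",
    primary_key=new_key(),
    secondary_key=new_key(),
    status="enabled",
)
print(device.device_id, device.status)

# Crude "bulk" variant: register a whole batch one by one.
for i in range(2, 11):
    registry_manager.create_device_with_sas(
        f"testdevice-{i:03}", new_key(), new_key(), "enabled"
    )
```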

Continue reading “Bulk import of IoTHub devices”