Azure IoTHub routing revisited, Blob Storage Endpoints

Recently, Microsoft added some extra features to the IoTHub routing capabilities:

  1. Support for routing using the message body
  2. Support for Blob Storage as endpoint
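Routing on the message body uses IoT Hub's routing query syntax, for example $body.temperature > 25 (this only works when the device sends the message with content type application/json and UTF-8 encoding). As a rough illustration of what such a filter does, here is a minimal Python sketch that evaluates the same condition locally; the function name and threshold are made up for this example:

```python
import json

# Illustrative only: mimic what an IoT Hub routing query such as
#   $body.temperature > 25
# decides for an incoming message. Real body-based routing requires
# the message to be sent with content type 'application/json' and
# content encoding 'utf-8'.
def route_on_body(message_bytes, threshold=25):
    body = json.loads(message_bytes.decode("utf-8"))
    return body.get("temperature", 0) > threshold

hot = route_on_body(b'{"temperature": 30}')   # matches the route
cold = route_on_body(b'{"temperature": 20}')  # does not match
```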

In this blog, we will look at both features using the Visual Studio 2017 extension called the IoT Hub Connected Service, which has also been updated.

But first, let’s look at the new Blob Storage endpoint.

Sending telemetry to a Blob Storage container is a simple and efficient way to enable cold path analytics.

Until recently, there were a few ways to do this:

  • Sending IoTHub output to a Stream Analytics job, which filled some blob
  • Sending IoTHub output to an Azure Function, which filled some blob
  • Making use of the IoT Hub ability to receive blobs

The first two ways require extra Azure resources, so additional costs are involved. The third one is only useful in very specific circumstances.

The new Blob Storage endpoint is a simple but very powerful way of making cold path analytics available, right from the IoTHub.
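The blob endpoint writes batches of messages to blobs whose names follow the pattern {iothub}/{partition}/{YYYY}/{MM}/{DD}/{HH}/{mm}, which makes the files easy to pick up later for cold path processing. As a sketch (the helper name and values are hypothetical), this is roughly how such a blob name is composed:

```python
from datetime import datetime, timezone

# Hypothetical helper reproducing the blob name pattern used by
# IoT Hub blob storage endpoints:
#   {iothub}/{partition}/{YYYY}/{MM}/{DD}/{HH}/{mm}
def blob_name(iothub, partition, when):
    return "{}/{}/{:%Y/%m/%d/%H/%M}".format(iothub, partition, when)

name = blob_name("my-iothub", 0, datetime(2018, 1, 31, 14, 5, tzinfo=timezone.utc))
# name == "my-iothub/0/2018/01/31/14/05"
```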

Continue reading “Azure IoTHub routing revisited, Blob Storage Endpoints”

Azure file storage, writing file blobs to a drive

Azure storage is one of those things you use every day without knowing it. For example, Azure Function Apps use it; in fact, all kinds of (Azure) apps use it as storage. And when we are aware of using it, it is mostly for Blob storage, Table storage, or queues.

More than two years ago, Azure File storage was introduced. At first glance, it looks and feels like regular Blob storage, but there is a little difference: a blob container is called a share. To be more specific, a share can actually be shared using the SMB protocol, so a drive letter can be assigned to the file storage share.
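The share is reachable over SMB at \\<account>.file.core.windows.net\<share>, and on Windows you can map it to a drive letter with net use, passing the storage account name as user (prefixed with AZURE\) and the storage key as password. A small sketch with a hypothetical helper and made-up account/share names:

```python
# Hypothetical helper building the SMB UNC path of an Azure File share.
# On Windows you could then map it to a drive letter, e.g.:
#   net use Z: \\<account>.file.core.windows.net\<share> /u:AZURE\<account> <storage key>
def unc_path(account, share):
    return r"\\{}.file.core.windows.net\{}".format(account, share)

path = unc_path("mystorage", "myshare")
# path == r"\\mystorage.file.core.windows.net\myshare"
```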

Let’s see how this works. If you do not have an Azure storage account available, please create one. Here are the four kinds of storage in an Azure storage account.

[fs01: the four kinds of storage in an Azure storage account]

Continue reading “Azure file storage, writing file blobs to a drive”

Raw Azure StreamAnalytics output using Azure Storage

StreamAnalytics is a monster. It is capable of making decisions on large amounts of data coming in as a stream. It has input sources and output sinks, and in between, a query is running. This query has an SQL-like structure, something like SELECT * INTO [output sink] FROM [input source].

Because I work a lot with the IoT part of Azure (IoT Hubs, EventHubs, etc.), a StreamAnalytics job is a great addition, handling the continuous stream of telemetry, coming from multiple devices.

StreamAnalytics is less known for usage outside an Internet of Things environment. You can think of handling credit card payments or changes in personal records.

In my opinion, StreamAnalytics has one drawback: it takes some time to start up, and it can only be changed (altering the query or changing sinks or sources) when it is stopped first. My biggest issue is that I cannot directly see which telemetry is passed.

But there is a simple solution. A query can output the same telemetry (to be more specific, the data you want to pass to the output sink) to multiple sinks. So if you are not sure which data is passed, you can also output it to a second sink.
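In the query this comes down to repeating the same SELECT … INTO for a second sink. The following sketch composes such a multi-sink query as a string; the input and sink names are made up for this example:

```python
# Illustrative sketch: a Stream Analytics query can write the same
# selection to multiple sinks by repeating SELECT ... INTO.
# Input and sink names below are hypothetical.
def multi_sink_query(input_name, sinks):
    return "\n".join(
        "SELECT * INTO [{}] FROM [{}]".format(sink, input_name)
        for sink in sinks
    )

query = multi_sink_query("telemetry-input", ["servicebus-output", "blobstorage-debug"])
```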

In this blog, I show you how to output to Azure Storage sinks. Continue reading “Raw Azure StreamAnalytics output using Azure Storage”