Deploy an Azure Stream Analytics Job on the IoT Edge

[Update 16-01-2018]: There is a workaround available. Log into the Azure Portal using:

[Update 15-01-2018]: The issue is being fixed asap. I have been notified the dialogs of Azure Stream Analytics will be changed in a couple of days. This blog will remain valid, though.

[Update 13-01-2018]: It seems the IoTEdge-ASA integration is suddenly not available in the Azure Portal. E.g., the Environment Host question shown below is dropped and no Edge Hub input or output is selectable anymore. No idea why… When more information is available, a new update will be posted here.

The second version of the Microsoft IoTEdge solution is now available as Public Preview. In this version, you can run predefined modules like Modbus, build your own modules, deploy Azure Machine Learning modules, deploy Azure Functions and deploy Azure Stream Analytics jobs.

The concept is pretty simple: as always, you create an ASA job and define inputs, outputs and a query. But this time the ASA job will run on a local device:

Microsoft provides documentation here and here explaining how to deploy your ASA modules. Let’s dig a bit deeper into it.

Building a Stream Analytics job

First, let’s define an ASA job:

One setting is important: you have to select “Edge” as the Hosting environment. This makes the ASA job available for deployment to the IoTEdge.

I also selected North Europe as location. Until recently, the IoTEdge option in the IoTHub was limited to certain Azure locations. With the December update, this option is now available to National Clouds.

Once the ASA job is generated, you define an input:

You can only choose the Edge Hub as the Data stream source type.

Note: It’s also possible to add reference data (from a file).

And you have to define an output sink:

Again, only the Edge Hub is selectable as the sink. Remember, your ASA module will receive data from one or more other modules, and the output will be routed to other modules or the IoTHub.

So you have defined your input and output:

It is not possible to actually run the ASA job here. How do we deploy it? We will come to that later on.

Now, let’s add the query:

SELECT *
INTO [iotedgesink]
FROM [iotedgeinput]

This is the most primitive query you can write. I just want to know what data is arriving (hence the ‘star’). You can write fancier queries yourself, but keep in mind there are some limitations at this moment (like AnomalyDetection, UDFs or compression).
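You can go a bit beyond this pass-through. For example, a tumbling-window aggregation per power meter could look like this (a sketch only; the input and output names match the ones defined above, but the 30-second window and the cast of the string ‘Value’ field to a number are my own choices):

```sql
SELECT
    HwId,
    AVG(CAST(Value AS bigint)) AS AverageValue,
    COUNT(*) AS MeasurementCount
INTO [iotedgesink]
FROM [iotedgeinput]
GROUP BY HwId, TumblingWindow(second, 30)
```

This would emit one aggregated message per HwId every 30 seconds instead of forwarding every raw reading.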

Once you have saved the query, we are ready to deploy the module.

Deploying our ASA module to the IoTEdge

So switch over to the IoTHub. Here we see the new option to add IoT Edge devices:

In this screenshot, you can see that I already have added an IoT Edge device to the IoTHub and there are already some modules running.

Select the device and see the current state:

Our portal is currently connected to the device. And we can see a Modbus module is already deployed.

We want to deploy our Azure Stream Analytics job so we start a wizard using the ‘Set Modules’ button.

Here we can define which modules we want to add and remove from the device. And in the next step, we will have to define the routing.

But first, we import the ASA module. Press the ‘import’ button to open the specific dialog:

This will open the dialog:

We have to do two things:

  1. Reference the Stream Analytics job
  2. Select or create some Azure Storage

Why do we need Azure storage? Because we are not actually deploying the ASA job itself! What will be deployed is a public Docker module named microsoft/azureiotedge-azure-stream-analytics. This module gets a reference to the Azure Storage blob in which the actual input, output and query information is stored:
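In the generated deployment, the ASA module entry then looks roughly like this (a sketch; the exact manifest layout and the ASAJobInfo property name may differ between preview versions, and the blob URL is a placeholder):

```json
"EdgeV2-sa": {
  "type": "docker",
  "settings": {
    "image": "microsoft/azureiotedge-azure-stream-analytics",
    "createOptions": ""
  },
  "properties.desired": {
    "ASAJobInfo": "https://<storageaccount>.blob.core.windows.net/<container>/<job-definition>.zip?<sas-token>"
  }
}
```

The generic ASA runtime image downloads the job definition from that SAS-protected blob at startup, so the same public image can run any Edge ASA job.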

So we have two modules now:

But how is the routing done?


We use the following routing (you can fill it in, into the next page of the wizard):

  "routes": {
    "telemetryToAsa": "FROM /messages/modules/msftmodbus/* INTO BrokeredEndpoint(\"/modules/EdgeV2-sa/inputs/iotedgeinput\")",
    "AsaToCloud": "FROM /messages/modules/EdgeV2-sa/* INTO $upstream"
  }
We define two routes:

  1. Telemetry from any outgoing port from the Modbus module (named ‘msftmodbus’) is sent to the ‘iotedgeinput’ input of our ASA module named ‘EdgeV2-sa’
  2. ASA module output (from any sink) is sent ‘upstream’ which is the actual IoTHub

More information regarding routing can be found here.
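Routes can also be more selective than the wildcards used above. A route can target a single named output of a module and filter on message properties; for instance (a sketch; ‘messageType’ is a hypothetical application property that your query would have to set):

```json
"AsaAlertsToCloud": "FROM /messages/modules/EdgeV2-sa/outputs/iotedgesink WHERE messageType = 'alert' INTO $upstream"
```

This way, only a subset of the ASA output would travel to the IoTHub, while the raw telemetry stays on the device.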


Once the ASA module is deployed, we see the arrival of the messages in the Device Explorer:

The telemetry looks like:

29-12-2017 21:15:03> Device: [TestGateway], Data:[[{"DisplayName":"KnobOne","HwId":"PowerMeter-0a:01:01:01:01:01","Address":"40001","Value":"4746","SourceTimestamp":"2017-12-29 20:15:03"}]]
'source': 'ASA'
'OutputName': 'iotedgesink'

It’s interesting to see that we can use these properties to route our ASA messages; we even know from which output the message is coming.

And finally, we see the arrival of the message in an Azure Function which is listening to all messages coming from our IoTHub:

2017-12-29T20:13:33.793 Function started (Id=8f7ffdd4-a5d7-40b4-97fc-c01b38a90ba7)
2017-12-29T20:13:33.793 C# IoT Hub trigger function processed a message: [{"DisplayName":"KnobOne","HwId":"PowerMeter-0a:01:01:01:01:01","Address":"40001","Value":"4751","SourceTimestamp":"2017-12-29 20:13:33"}]
2017-12-29T20:13:33.793 Function completed (Success, Id=8f7ffdd4-a5d7-40b4-97fc-c01b38a90ba7, Duration=0ms)

So we have our telemetry arriving at the Azure portal. It’s not that hard to set things up.

The only annoying thing is that we lose the name of the device which is sending the telemetry. I used the ‘*’ in my query, but the only data we receive in the ASA query is the telemetry itself. This is different from an ASA job hosted in Azure that receives data from an IoTHub.

We can see that in the response from the Modbus module, the ‘HwId’ is submitted. This is not the same as the deviceId, but it can serve as a workaround.
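So instead of the deviceId, a consumer of these messages can pick the ‘HwId’ out of the telemetry body. A minimal sketch of that workaround, using the message format shown above (the function name is my own):

```python
import json

# Sample telemetry as produced by the Modbus module (taken from the output above).
telemetry = ('[{"DisplayName":"KnobOne","HwId":"PowerMeter-0a:01:01:01:01:01",'
             '"Address":"40001","Value":"4746","SourceTimestamp":"2017-12-29 20:15:03"}]')

def extract_hw_ids(payload: str) -> list:
    """Return the HwId of every measurement in a Modbus telemetry message."""
    return [measurement["HwId"] for measurement in json.loads(payload)]

print(extract_hw_ids(telemetry))
# ['PowerMeter-0a:01:01:01:01:01']
```

As long as each physical device gets a unique HwId in the Modbus module configuration, this identifier can stand in for the missing device name.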

Updating the ASA job

I had quite some issues updating an already deployed ASA IoTEdge job. What should work is deploying the same job to another Azure Storage blob, but this was not as reliable as expected. Please keep a close eye on your own modules after updating them.


Deploying an Azure Stream Analytics job to the IoTEdge is quite simple. Both creating the module and setting up the routing are done in a few minutes.

I really look forward to the moment when the ASA limitations are taken away.

