Deploy an Azure StreamAnalytics Job on the IoTEdge

[Update 16-01-2018]: There is a workaround available. Log into the Azure Portal using:

[Update 15-01-2018]: The issue is being fixed as soon as possible. I have been notified that the dialogs of Azure Stream Analytics will be changed in a couple of days. This blog will remain valid, though.

[Update 13-01-2018]: It seems the IoTEdge-ASA integration is suddenly not available in the Azure Portal. E.g. the Environment Host question shown below is dropped and no Edge Hub input or output is selectable anymore. No idea why… When more information is available, a new update will be posted here.

The second version of the Microsoft IoTEdge solution is now available as Public Preview. In this version, you can run predefined modules like Modbus, build your own modules, deploy Azure Machine Learning modules, deploy Azure Functions, and deploy Azure StreamAnalytics jobs.

The concept is pretty simple: as always, you have to create an ASA job and define inputs, outputs, and a query. But this time the ASA job will run on a local device.
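
A minimal sketch of such an edge query, assuming an Edge Hub input alias 'edgehubinput' and an output alias 'edgehuboutput' (both names are hypothetical), could look like this:

  -- Minimal sketch: filter local telemetry on the device itself.
  -- 'edgehubinput' and 'edgehuboutput' are assumed to be configured
  -- as the Edge Hub input and output of the edge job.
  SELECT
      deviceId,
      temperature
  INTO
      edgehuboutput
  FROM
      edgehubinput
  WHERE
      temperature > 30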

Microsoft provides documentation here and here explaining how to deploy your ASA modules. Let’s dig a bit deeper into it. Continue reading “Deploy an Azure StreamAnalytics Job on the IoTEdge”


User Defined Function in Stream Analytics

Azure Stream Analytics provides a great solution for temporal queries of streams of data. The query language is pretty simple, especially if you have a background in SQL queries.

The list of built-in functions is long, ranging from aggregation, analytics, geospatial, record, and scalar functions to the recently introduced anomaly detection.

But what if you want to write your own functions?

Stream Analytics supports three types of custom functions:

  1. user-defined functions (UDF) written in JavaScript
  2. user-defined aggregates (UDA) written in JavaScript
  3. Machine Learning endpoints disguised as functions

In this blog, I will show how easy it is to write and use your own custom logic in a Stream Analytics job. We will look at the user-defined functions.
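
To give an idea of the mechanism, here is a minimal sketch. It assumes a JavaScript UDF named 'toFahrenheit' (the name and logic are made up for illustration, e.g. function (celsius) { return celsius * 9 / 5 + 32; }) has been added to the job; the query then calls it with the mandatory UDF. prefix:

  -- Minimal sketch; the UDF name and the input alias are assumptions.
  SELECT
      deviceId,
      UDF.toFahrenheit(temperature) AS temperatureFahrenheit
  INTO
      output
  FROM
      hubinput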

Continue reading “User Defined Function in Stream Analytics”

Combining Azure StreamAnalytics with Azure Function Sink

There are many ways to collect data from an IoTHub and pass it through Azure Functions:

  • Trigger an Azure Function directly using the IoTHub events or operations monitoring endpoints
  • Trigger an Azure Function using an EventHub or message bus as an endpoint of IoTHub routing
  • Trigger an Azure Function using an EventHub as the output sink of a Stream Analytics job

All these solutions serve their own purpose.

But the last one, using an EventHub, can be pretty annoying. Yes, this is a great approach if you want to use the security policies and/or consumer groups of the EventHub. But otherwise, it brings a lot of administration.

Let’s check out how life becomes much easier with a new piece of Azure functionality, still in public preview, called Azure Function Sink.
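
To set expectations: the query side stays as simple as with any other sink. A minimal sketch, assuming an output alias 'functionoutput' that is configured as an Azure Function output of the job:

  -- Minimal sketch; 'functionoutput' is an assumed output alias
  -- configured as an Azure Function output, 'hubinput' an assumed
  -- IoTHub input alias.
  SELECT
      *
  INTO
      functionoutput
  FROM
      hubinput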

Continue reading “Combining Azure StreamAnalytics with Azure Function Sink”

Azure Stream Analytics built-in geospatial functions

Azure Stream Analytics has had several important updates in the last few months. And these updates make the Azure Stream Analytics job even more powerful.

Microsoft has added the ability to add your own JavaScript functions. And the job is integrated into Visual Studio (2013 and 2015) using advanced tooling.

Working with geo coordinates has always intrigued me. So the third major update is my favorite: built-in geospatial functions.

There are three functions for supporting GeoJSON objects:

  1. CreatePoint. Returns a GeoJSON Point record
  2. CreatePolygon. Returns a GeoJSON Polygon record
  3. CreateLineString. Returns a GeoJSON LineString record

So we have points, lines and areas.
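
Constructing such a record from incoming telemetry is a one-liner. A minimal sketch, where the input alias and the latitude/longitude field names are assumptions:

  -- Minimal sketch; 'hubinput', 'lat' and 'lon' are assumed names.
  SELECT
      deviceId,
      CreatePoint(lat, lon) AS location
  FROM
      hubinput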

And there are several functions for checking out the relation between these records:

  1. ST_WITHIN. Check if a point lies within an area
  2. ST_OVERLAPS. Check if an area overlaps another area
  3. ST_INTERSECTS. Check if two lines intersect
  4. ST_DISTANCE. Calculate the distance between two points in meters

Let’s check this out using Stream Analytics.
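
As a minimal sketch of how these functions combine into a geofence check (the polygon coordinates, input alias and field names are all made up for illustration):

  -- Minimal sketch of a geofence check; all names and coordinates
  -- are assumptions. Note that the polygon must be closed: the
  -- first and the last point are the same.
  SELECT
      deviceId,
      ST_WITHIN(
          CreatePoint(lat, lon),
          CreatePolygon(
              CreatePoint(52.0, 4.0),
              CreatePoint(52.0, 5.0),
              CreatePoint(53.0, 5.0),
              CreatePoint(53.0, 4.0),
              CreatePoint(52.0, 4.0))) AS isInsideArea
  FROM
      hubinput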

Continue reading “Azure Stream Analytics built-in geospatial functions”

The Photon as a weather station, connected to Azure IoT Platform

A few weeks ago, Jon Gallant asked for beta testers for the Azure IoT Platform integration with the Particle platform (Beta Testers Needed for Particle to Azure IoT Integration).

This was great news to me. I have a Photon Azure Starter Kit lying around and I once tried to connect it to the Particle platform.


The kit has potential: it comes with a SparkFun Photon Weather Shield:


And already attached to the shield are:

  • Humidity/Temperature Sensor – HTU21D
  • Barometric Pressure Sensor – MPL3115A2

And it has two RJ-11 connectors for Weather Meters like this one:


But that initial (EventHub?) integration was not quite intuitive and I had more projects to work on. So I moved on.

Now, I had a second chance to make the Photon work!

The first steps are to register your unique Photon device and connect it to the internet (it has a Wi-Fi chip onboard).

If you go to the online IDE, you can write code for your Photon and flash (deploy) it ‘over the air’. This is fun: as long as your Photon is online (wherever it is running), you can reach it using a browser.

The integration tutorial, the blog of Jon Gallant, is very straightforward about setting up the Azure IoT Hub integration. You only need an Azure IoT Hub and a specific access policy. This will get you as far as sending a string from the Photon to the hub.

Update: Another useful tutorial comes from the Particle site and it shows how to send some integers.

But sending a JSON message is less intuitive.

Continue reading “The Photon as a weather station, connected to Azure IoT Platform”

The output sink format of a Stream Analytics job matters!

When you look at examples of Stream Analytics query usage, most are pretty straightforward. They work with simple queries which return single-line output.

For example, a query like:

  SELECT * FROM hubinput timestamp by EventProcessedUtcTime

… will return a line like:

{"cycle":4, "errorcode"=1, "deviceid":"MachineCyclesDemo"}

In an Azure Function, this will arrive as:

2016-12-19T15:15:07.045 Function started (Id=e58ea9ec-fd8d-469e-bd9c-ea027ce2dbb4)
2016-12-19T15:15:07.690 Messages arrived: {"cycle":4, "errorcode":1, "deviceid":"MachineCyclesDemo"}
2016-12-19T15:15:07.690 Function completed (Success, Id=e58ea9ec-fd8d-469e-bd9c-ea027ce2dbb4)

But when the query is a bit more complicated, like grouping with a time interval:

  SELECT
      IoTHub.ConnectionDeviceId as deviceId
  FROM
      hubinput timestamp by EventProcessedUtcTime
  WHERE
      errorCode <> 0
  GROUP BY IoTHub.ConnectionDeviceId, TumblingWindow(Duration(minute, 1))
  HAVING Count(errorCode) > 1

Then the format of the messages returned can be unexpected.
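
As an illustration of what can happen (the payloads below are made up, not taken from a real run): when a window produces multiple results at once, Stream Analytics can batch them into a single message. With the JSON output format of the sink set to ‘array’, both results of the query above would arrive in one message as:

  [{"deviceId":"MachineCyclesDemo"},{"deviceId":"AnotherDevice"}]

while the ‘line separated’ format delivers one JSON object per line.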

Continue reading “The output sink format of a Stream Analytics job matters!”

How to fix your StreamAnalytics metrics monitoring

Today it became clear to me that I was missing something…

Every time I created a new StreamAnalytics job and data arrived, I was staring at this monitoring window saying: “No available data”:



Today I finally decided to check this out. And I found out it’s a simple thing to fix. And it can be a real asset.

Continue reading “How to fix your StreamAnalytics metrics monitoring”