One Azure IoT accelerator to rule them all

The family of Azure IoT resources is very diverse. If you know what you are doing and have developers available, you can have a great time with the many PaaS cloud resources.

If you have devices which need internet connectivity but you have no developers, you can check out IoT Central, the SaaS IoT solution.

Recently, Microsoft announced very powerful integrations with other leading IoT platforms like SAP Leonardo and PTC ThingWorx. Both can connect directly with the Azure IoT Hub, the cloud gateway. This opens a broad range of integration opportunities.

And last but not least, you can start with prebuilt verticals, Azure IoT accelerators, formerly known as Azure IoT Suites. If you have developers available but you do not want to start from scratch, check them out. You can deploy a typical accelerator in 15 minutes to see how it behaves. And the smart thing is, all the code behind the logic is freely available on GitHub.

The best-known accelerators are:

  • Remote Monitoring (version two is based on microservices)
  • Connected Factory (supports the OPC UA protocol)
  • Predictive Maintenance

But there are also third-party accelerators.

If you are a developer or architect, it’s time well spent checking them out!

Remote monitoring

The Remote Monitoring accelerator is a good starting point; it has a lot of out-of-the-box features.

In one of our current projects, we were looking for a rule engine. And while playing with the demo of the Remote Monitoring Accelerator, we stumbled on one.

A picture alone does not really explain how this rule engine works, but you can try to read about it or check out the code on GitHub.

The features of this rules engine are both simple and powerful:

  • Define rules for alarms or even actions as JSON files in blob storage
  • Bind rules to groups of devices (defined as CSV file in blob storage)
  • Rules can react to ‘instant’ messages using JavaScript comparisons
  • Rules can react to time-window aggregations using JavaScript comparisons

And the best feature is that the rules engine is based on Azure Stream Analytics. Therefore it’s modular, and it can be separated and reused completely in your own solution.
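
Just to give an impression of the pattern (a minimal sketch with assumed input names, not the accelerator’s actual query), a Stream Analytics job can join incoming telemetry against rule definitions loaded as reference data from blob storage:

    -- a minimal sketch: 'telemetry' is a stream input, 'rules' is blob reference data
    -- with (at least) deviceId and threshold columns
    SELECT
        T.deviceId,
        T.temperature,
        R.ruleId
    INTO alarms
    FROM telemetry T
    JOIN rules R
        ON T.deviceId = R.deviceId
    WHERE T.temperature > R.threshold

Because the rules live in blob storage, you can change them without touching the job itself.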

In this blog, we will see how it’s done.

Continue reading “One Azure IoT accelerator to rule them all”


Compare previous and current message in Stream Analytics

Last week I was testing the temporary storage in IoT Edge. I was interested in its stability, so I wanted to know if messages were missing or maybe even arriving twice.

I have a heartbeat module which produces an incrementing counter, so I am able to generate messages that can be checked as a sequence.

One way is to check this using your eyes 🙂

But this can be seen as a more generic problem: comparing two subsequent messages. So I was thinking about Azure Stream Analytics; this should be the perfect tool for the job.

Let’s check out how we can compare subsequent messages using Stream Analytics.
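
The key ingredient is the LAG function, which looks back at a previous event within a time window. A minimal sketch (the input name 'heartbeat' and the 'counter' field are assumptions based on my module):

    -- a minimal sketch: compare each counter with the previous one
    SELECT
        counter,
        LAG(counter) OVER (LIMIT DURATION(minute, 5)) AS previousCounter,
        counter - LAG(counter) OVER (LIMIT DURATION(minute, 5)) AS difference
    INTO output
    FROM heartbeat

A difference of 1 means the sequence is intact, 0 suggests a duplicate, and anything larger points at missing messages.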

Continue reading “Compare previous and current message in Stream Analytics”

Deploy an Azure StreamAnalytics Job on the IoTEdge

[Update 16-01-2018]: There is a workaround available. Log into the Azure Portal using: https://portal.azure.com/?Microsoft_Azure_StreamAnalytics_onedge=true

[Update 15-01-2018]: The issue is being fixed as soon as possible. I have been notified that the dialogs of Azure Stream Analytics will be changed in a couple of days. This blog will remain valid, though.

[Update 13-01-2018]: It seems the IoTEdge-ASA integration is suddenly not available in the Azure Portal. E.g. the Environment Host question shown below is dropped and no Edge Hub input or output is selectable anymore. No idea why… When more information is available, a new update will be posted here.

The second version of the Microsoft IoTEdge solution is now available as a Public Preview. In this version, you can run predefined modules like Modbus, build your own modules, deploy Azure Machine Learning modules, deploy Azure Functions and you can deploy Azure StreamAnalytics jobs.

The concept is pretty simple: as always, you have to create an ASA job and define inputs, outputs and a query. But this time the ASA job will run on a local device.
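
The query itself looks just like a cloud query. A minimal sketch (the aliases 'edgeinput' and 'edgeoutput' are assumptions, mapped to Edge Hub routes on the device; the 'machine.temperature' field mimics the simulated temperature sensor module):

    -- a minimal sketch: filter telemetry locally, on the edge device
    SELECT
        machine.temperature AS temperature
    INTO edgeoutput
    FROM edgeinput
    WHERE machine.temperature > 70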

Microsoft provides documentation here and here explaining how to deploy your ASA modules. Let’s dig a bit deeper into it.

Continue reading “Deploy an Azure StreamAnalytics Job on the IoTEdge”

User Defined Function in Stream Analytics

Azure Stream Analytics provides a great solution for temporal queries of streams of data. The query language is pretty simple, especially if you have a background in SQL queries.

The list of built-in functions is long, ranging from aggregation, analytics, geospatial, record and scalar functions to the recently introduced anomaly detection.

But what if you want to write your own functions?

Stream Analytics supports three types of custom functions:

  1. user-defined functions (UDF) written in JavaScript
  2. user-defined aggregates (UDA) written in JavaScript
  3. Machine Learning endpoints disguised as functions

In this blog, I will show how easy it is to write and use your own custom logic in a Stream Analytics job. We will look at the user-defined functions.
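
To give a first taste (a minimal sketch; the name 'toCelsius' is just an example), a JavaScript UDF is a single function which you register in the job and then call from the query as UDF.toCelsius(…):

    // a minimal sketch of a JavaScript UDF, registered in the job as 'toCelsius';
    // called from the query like: SELECT UDF.toCelsius(temperatureF) FROM input
    function main(fahrenheit) {
        return (fahrenheit - 32) * 5 / 9;
    }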

Continue reading “User Defined Function in Stream Analytics”

Combining Azure StreamAnalytics with Azure Function Sink

There are many ways to collect data from an IoTHub and pass it through Azure Functions:

  • Trigger an Azure Function directly by the IoTHub events or operations monitoring
  • Trigger an Azure Function using an EventHub or message bus as endpoints of IoTHub routing
  • Trigger an Azure Function using an EventHub as output sink of a Stream Analytics job

All these solutions serve their own purpose.

But the last one, using an EventHub, can be pretty annoying. Yes, this is a great approach if you want to use the security policies and/or consumer groups of the EventHub. But otherwise, there is a lot of administration.

Let’s check out how life becomes much easier with a new Azure feature, still in public preview, called the Azure Function sink.
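
With this function sink, the query stays completely ordinary; only the output alias points directly at the Azure Function, so the EventHub in between disappears. A minimal sketch (the aliases 'iothubinput' and 'functionsink' are assumptions):

    -- a minimal sketch: aggregate telemetry and push it straight into an Azure Function
    SELECT
        deviceId,
        AVG(temperature) AS averageTemperature
    INTO functionsink
    FROM iothubinput TIMESTAMP BY EventEnqueuedUtcTime
    GROUP BY deviceId, TumblingWindow(minute, 1)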

Continue reading “Combining Azure StreamAnalytics with Azure Function Sink”

Azure Stream Analytics built-in geospatial functions

Azure Stream Analytics has had several important updates in the last few months. And these updates make the Azure Stream Analytics job even more powerful.

Microsoft has added the ability to write your own JavaScript functions. And the job is integrated into Visual Studio (2013 and 2015) using advanced tooling.

Working with geo coordinates has always intrigued me. So the third major update is my favorite: built-in geospatial functions.

There are three functions for supporting GeoJSON objects:

  1. CreatePoint. Returns a GeoJSON Point record
  2. CreatePolygon. Returns a GeoJSON Polygon record
  3. CreateLineString. Returns a GeoJSON LineString record

So we have points, lines and areas.

And there are several functions for checking out the relation between these records:

  1. ST_WITHIN. Check if a point lies within an area
  2. ST_OVERLAPS. Check if an area overlaps another area
  3. ST_INTERSECTS. Check if two lines intersect
  4. ST_DISTANCE. Calculate the distance between two points in meters

Let’s check this out using Stream Analytics.
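
To set the scene (a minimal sketch; the input 'vehicles' and its 'lat'/'lon' fields are assumptions), here is how these functions can be combined to report only devices inside a rectangular area:

    -- a minimal sketch: keep only points inside a polygon and calculate a distance
    SELECT
        deviceId,
        ST_DISTANCE(CreatePoint(lat, lon), CreatePoint(47.64, -122.13)) AS distanceInMeters
    INTO output
    FROM vehicles
    WHERE ST_WITHIN(
        CreatePoint(lat, lon),
        CreatePolygon(
            CreatePoint(47.6, -122.2),
            CreatePoint(47.7, -122.2),
            CreatePoint(47.7, -122.1),
            CreatePoint(47.6, -122.1),
            CreatePoint(47.6, -122.2)))

Note that the first and last point of the polygon must be the same; that is what closes the ring.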

Continue reading “Azure Stream Analytics built-in geospatial functions”

The Photon as a weather station, connected to Azure IoT Platform

A few weeks ago, Jon Gallant asked for beta testers for the Azure IoT Platform integration with the Particle platform (Beta Testers Needed for Particle to Azure IoT Integration).

This was great news to me. I have a Photon Azure Starter Kit lying around and I tried once to connect it to the Particle platform.


The kit has potential: it comes with a SparkFun Photon Weather Shield.


Already attached to the shield are:

  • Humidity/temperature sensor – HTU21D
  • Barometric pressure sensor – MPL3115A2

And it has two RJ-11 connectors for weather meters, such as SparkFun’s wind and rain sensors.


But that initial (EventHub?) integration was not quite intuitive and I had more projects to work on. So I moved on.

Now, I had a second chance to make the Photon work!

The first steps are to register your unique Photon device and connect it to the internet (it has a Wi-Fi chip onboard).

If you go to the online IDE, you can write code for your Photon and flash (deploy) it ‘over the air’. This is fun: as long as your Photon is online (wherever it is running), you can contact it using a browser.

The integration tutorial, the blog of Jon Gallant, is very straightforward regarding making an Azure IoT Hub integration. You only need an Azure IoT Hub and a specific access policy. This will help you in sending a string from the Photon to the hub.

Update: Another useful tutorial comes from the Particle site and it shows how to send some integers.

But sending a JSON message is less intuitive.
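
The essence (a minimal sketch with placeholder values; the event name 'weather' is an assumption, and the Azure integration must be configured to forward that event to the IoT Hub) is to format the JSON string yourself before publishing:

    // a minimal sketch: placeholder values, a real sketch would read the shield's sensors
    float humidity = 45.0;     // placeholder for the HTU21D reading
    float temperature = 72.1;  // placeholder for the HTU21D reading (Fahrenheit)

    void setup() {
    }

    void loop() {
        char payload[128];
        // hand-craft the JSON message
        snprintf(payload, sizeof(payload),
                 "{\"humidity\":%.1f,\"temperature\":%.1f}", humidity, temperature);
        // the Particle integration forwards this event to the Azure IoT Hub
        Particle.publish("weather", payload, PRIVATE);
        delay(60000); // publish once a minute
    }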

Continue reading “The Photon as a weather station, connected to Azure IoT Platform”