Using Influx database and Grafana on the Azure IoT Edge

Over the last couple of years, this blog has demonstrated multiple databases that can be deployed on the Azure IoT Edge.

Persisting incoming telemetry in a local database is useful for multiple purposes. One of them is creating a custom dashboard, e.g. written in Blazor.

In this blog post, we explore how to deploy the popular InfluxDB, an open-source time-series database specialized in Internet of Things usage. On top of that, we explore how Grafana, an open-source analytics and interactive visualization web application, can be deployed on the edge too.

We will also see how it can be connected to the Influx database:

As you can see here, we need a custom Azure IoT Edge module (called ‘writer’) that is capable of writing incoming telemetry to the Influx database. There, the telemetry is picked up and displayed by Grafana.

Let’s see how this works.

Simulated telemetry ingestion

Let’s begin with the incoming telemetry. We use the simulated temperature sensor module from Microsoft:
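
If you want to follow along, this module is available in the Microsoft Container Registry; a sketch of the image URI (assuming the standard simulated temperature sensor from the module marketplace; check the marketplace for the current tag):

mcr.microsoft.com/azureiotedge-simulated-temperature-sensor:1.0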

This module generates random temperature, humidity, and pressure values.

We will ingest the ‘ambient temperature’ and write it into the Influx database.
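
The body of such a message looks roughly like this (a sketch based on the message structure the writer module deserializes below; the values are illustrative):

{
  "machine": { "temperature": 21.6, "pressure": 1.01 },
  "ambient": { "temperature": 20.9, "humidity": 24.5 },
  "timeCreated": "2020-10-10T10:15:00Z"
}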

Note: this is just an example of persisting telemetry. Each kind of message needs its own writer module. It’s up to you to write a custom module based on your own telemetry ingestion.

Influx database deployment

Telemetry is created and available on the internal routing mechanism of the Azure IoT Edge.

We want to write this telemetry to an Influx database, deployed as a docker container using Azure IoT Edge.

The company behind the Influx database provides a Docker container on Docker Hub.

Although this is a generic third-party container, Azure IoT Edge can still deploy it. This can be done because we are able to include environment variables, ports, volumes, etc. during deployment.

In the Azure portal, we start the ‘Set modules’ wizard for our IoT Edge device in the IoT Hub. We add a new module.

First, we provide the location URI of the Influx container image:

Note: In this example, we deploy the 1.8 version of InfluxDB. The writer module, which is introduced below, also uses client libraries for that version. Technically, it should be possible to deploy the InfluxDB 2.0 image too.

We also have to provide a folder on the filesystem (so incoming telemetry is not lost when the container restarts but persisted on disk) and the port information. This is done in the Container Create options:

These Create Options are made available in the Deployment Manifest of the edge device:

"influxdb": {
  "settings": {
    "image": "influxdb:latest",
    "createOptions": "{\"ExposedPorts\":{\"8086/tcp\":{}},\"HostConfig\":{\"Binds\":[\"/var/influxdb:/var/lib/influxdb\"],\"PortBindings\":{\"8086/tcp\":[{\"HostPort\":\"8086\"}]}}}"
  },
  "type": "docker",
  "version": "1.0",
  "status": "running",
  "restartPolicy": "always"
}

Before we deploy this manifest to the device, we need to create that folder on the target device:

sudo mkdir /var/influxdb
sudo chmod 777 /var/influxdb

After InfluxDB is deployed, you can see the creation of new subfolders in the /var/influxdb folder:
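
A quick look at that folder is a simple sanity check (a sketch; InfluxDB 1.8 typically creates ‘data’, ‘meta’, and ‘wal’ subfolders here):

ls /var/influxdb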

Next to a successful deployment seen in the edgeAgent module log, this is also a good way to check whether the deployment of InfluxDB has succeeded.

Testing the database access

Let’s see if we can ‘do something’ with the database.

There is a client tool available for the database. Once installed, you can start it with ‘influx’:

sudo apt install influxdb-client
influx

As you can see, it connects automatically to port 8086 on localhost:
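
If you run the client from another machine, a minimal sketch of the same connection (assuming the edge device is reachable at the IP address used later in this post) looks like this:

influx -host 192.168.1.91 -port 8086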

Now we know for sure the Influx database is sound and well.

We get a prompt where we can execute database commands and queries.

Let’s do some database stuff:

show databases

create database iotedger

use iotedger

Here, we check which databases are available. Initially, the list should be empty (except for _internal). Then we add a database named ‘iotedger’ and select it; we will use it in the rest of this blog post.

Note: If you are interested in some basic Influx database commands, check out this YouTube video.

Writing telemetry to the database

We have simulated data and we have a database. Let’s write that telemetry to the Influx database.

Although many programming languages support the Influx database, I went for a C# library.
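
The namespaces used in the code below match the InfluxDB.Collector NuGet package; assuming that package, it can be added to the module project with:

dotnet add package InfluxDB.Collector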

Based on that library, I wrote this C# IoT Edge module using the IoT Edge extension for Visual Studio Code.

It is capable of reading the ambient temperature from incoming messages. Every message is then written to the database:

namespace InfluxWriterModule
{
    using System;
    using System.Runtime.Loader;
    using System.Text;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Azure.Devices.Client;
    using Microsoft.Azure.Devices.Client.Transport.Mqtt;
    using Newtonsoft.Json;

    using InfluxDB.Collector;
    using System.Collections.Generic;
    using InfluxDB.Collector.Diagnostics;

    class Program
    {
        static readonly char[] Token = "".ToCharArray();

        static void Main(string[] args)
        {
            Init().Wait();

            // Wait until the app unloads or is cancelled
            var cts = new CancellationTokenSource();
            AssemblyLoadContext.Default.Unloading += (ctx) => cts.Cancel();
            Console.CancelKeyPress += (sender, cpe) => cts.Cancel();
            WhenCancelled(cts.Token).Wait();
        }

        public static Task WhenCancelled(CancellationToken cancellationToken)
        {
            var tcs = new TaskCompletionSource<bool>();
            cancellationToken.Register(s =>
                    ((TaskCompletionSource<bool>)s).SetResult(true), tcs);
            return tcs.Task;
        }

        static async Task Init()
        {
            MqttTransportSettings mqttSetting = 
                    new MqttTransportSettings(TransportType.Mqtt_Tcp_Only);
            ITransportSettings[] settings = { mqttSetting };

            ModuleClient ioTHubModuleClient = await
                           ModuleClient.CreateFromEnvironmentAsync(settings);
            await ioTHubModuleClient.OpenAsync();
            Console.WriteLine("IoT Hub module client initialized.");

            Metrics.Collector = new CollectorConfiguration()
                    .Batch.AtInterval(TimeSpan.FromSeconds(2))
                    .WriteTo.InfluxDB("http://192.168.1.91:8086", "iotedger")
                            .CreateCollector();

            CollectorLog.RegisterErrorHandler((message, exception) =>
            {
                Console.WriteLine($"Infux Error. {message}: {exception}");
            });

            await ioTHubModuleClient.SetInputMessageHandlerAsync(
                          "input1", PipeMessage, ioTHubModuleClient);
        }

        static async Task<MessageResponse> PipeMessage(
                                         Message message, object userContext)
        {
            var moduleClient = userContext as ModuleClient;
            if (moduleClient == null)
            {
                throw new InvalidOperationException(
                        "UserContext doesn't contain " + "expected values");
            }

            try
            {
                byte[] messageBytes = message.GetBytes();
                string messageString = Encoding.UTF8.GetString(messageBytes);
                Console.WriteLine(
                          $"Received message with body: '{messageString}'");

                if (!string.IsNullOrEmpty(messageString))
                {
                    var jsonMessage = JsonConvert.
                             DeserializeObject<JsonMessage>(messageString);

                    // Write to InfluxDB
                    Metrics.Measure("temperature", 
                                    jsonMessage.ambient.temperature, 
                                    new Dictionary<string, string> {
                                              { "area", "ambient" }, 
                                              { "prodline", "1" } } );
                    System.Console.WriteLine("entry saved.");                
                    await Task.Delay(1);                    
                }                    
            }
            catch (System.Exception ex)
            { 
                System.Console.WriteLine($"Exception: {ex.Message}");
            }

            return MessageResponse.Completed;
        }
    }

    public class JsonMessage
    {
        public JsonMessage()
        {
            machine = new Machine();
            ambient = new Ambient();
        }

        public Machine machine {get; private set;}
        public Ambient ambient {get; private set;}

        public DateTime timeCreated { get; set; }
    }

    public class Machine
    {
        public double temperature {get; set;}
        public double pressure {get; set;}
        
    }

    public class Ambient
    {
        public double temperature {get; set;}
        public double humidity {get; set;}
        
    }
}

Notice the Influx client connects to the Influx database on port 8086. This is the default port of the service. If this collides with other services, you can pick another (host) port.

The temperature values are written to a table named ‘temperature’. In an Influx database, this is called a measurement (not a single value but a list of values). It contains all values, and each value is tagged with zero, one, or more tags.

Here, each temperature is tagged with both an “area=ambient” and a “prodline=1”.

These tags can be used to group values and to make subsets.
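
As a sketch of what the writer sends to the database (InfluxDB 1.x line protocol; the value and the nanosecond timestamp are illustrative), a single point looks like this:

temperature,area=ambient,prodline=1 value=21.5 1602288000000000000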

This IoT Edge Module code is also available as open-source code on GitHub. There, you can also read how it must be deployed (using the right module twin desired properties). On Docker Hub a precompiled version is available.

Connect the writer module with the simulation module using this route:

FROM /messages/modules/sim/outputs/temperatureOutput INTO BrokeredEndpoint("/modules/writer/inputs/input1")
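
In the deployment manifest, this route lands in the routes section of the $edgeHub desired properties; a sketch (the route name ‘simToWriter’ is just an example, and the module names match the route above):

"routes": {
  "simToWriter": "FROM /messages/modules/sim/outputs/temperatureOutput INTO BrokeredEndpoint(\"/modules/writer/inputs/input1\")"
}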

We connect to the ‘iotedger’ database. The writer module does not create the database itself; that is why we already created it in a previous step.

Incoming messages

Once the writer module is deployed, we can see telemetry arriving in the database, in a measurement (which acts as a table):

show measurements

show field keys

SELECT LAST(value) from temperature GROUP BY prodline

SELECT LAST(value) from temperature GROUP BY area

As you can see, telemetry is arriving:

See how the temperature ‘measurement’ is accessed. We check the latest values grouped by the tags.

The timestamps could be a bit intimidating. You can change them into something recognizable during the session:

precision rfc3339

This will make the timestamps human-readable.

Database retention

The way the database is created now, it will be filled without limits.

This is perhaps unwanted. The hard disk of our device could fill up. Or data that is obsolete after a few days or months is still persisted and can slow things down.

Influx supports automatic retention at the database level. Here, we create the same database with a retention of two hours:

CREATE DATABASE "iotedger" WITH DURATION 2h REPLICATION 1 NAME "iotedgerplcy" default

After two hours, or a little later (the Influx database enforces the retention once in a while), the oldest values are automatically deleted.

I tested this with “SELECT first(value) from temperature GROUP BY area”. This gives me the oldest value and the count.

After a while, I actually noticed that the count was reset every two hours and the oldest value changed:

You can check the oldest value with:

SELECT FIRST(value) from temperature GROUP BY prodline

In a production environment, the two hours should be more like 180 days 🙂
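
If you want to change the retention afterwards without recreating the database, a sketch (assuming the retention policy name ‘iotedgerplcy’ created above):

ALTER RETENTION POLICY "iotedgerplcy" ON "iotedger" DURATION 180d DEFAULT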

Now that we have telemetry, it’s time for a dashboard.

Deploying the Grafana dashboard

Again, we change the deployment manifest by adding a module.

The Grafana module basically needs the same settings as the Influx database: a port and a volume.

We also deploy some environment variables:

Notice the password for this dashboard.

So we need to deploy this container with these Create options, extended with the environment variables:

"grafana": {
  "settings": {
    "image": "grafana/grafana:latest-ubuntu",
    "createOptions": "{\"ExposedPorts\":{\"3000/tcp\":{}},\"HostConfig\":{\"Binds\":[\"/var/grafana:/var/lib/grafana\"],\"PortBindings\":{\"3000/tcp\":[{\"HostPort\":\"3000\"}]}}}"
  },
  "type": "docker",
  "version": "1.0",
  "env": {
    "GF_INSTALL_PLUGINS": {
      "value": "grafana-clock-panel 1.0.1,grafana-simple-json-datasource 1.3.5"
    },
    "GF_SECURITY_ADMIN_PASSWORD": {
      "value": "00000000"
    }
  }
}

For this Grafana module we need to provide a folder on the device too:

sudo mkdir /var/grafana
sudo chmod 777 /var/grafana

Creating your first local dashboard

Once the module is deployed using Azure IoT Edge, a dashboard is available on port 3000 of our edge device.

Log in to Grafana with the standard admin username ‘admin’ and the password seen above:

We first connect to the Influx database. Add your first data source:

From the list of possible types of data sources, select InfluxDB:

Fill in the database URL (including the port number):

Then, fill in the database name:
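
As a sketch (assuming Grafana reaches InfluxDB via the device’s IP address, just like the writer module does), the relevant data source settings would be:

URL: http://192.168.1.91:8086
Database: iotedger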

The last step for the data source is saving and testing the connection:

If you get that green bar, the connection succeeded. The data source is available:

Now, add a new dashboard:

A dashboard is made up of one or more panels. Add a new panel:

This new panel editor shows the actual widget (a chart by default) at the top of the editor and the settings to be changed below it.

It is a good idea to select the measurement (the InfluxDB table) in the ‘FROM’ line first:

There, select ‘temperature’:

As soon as the measurement is selected, you see the ambient temperature values shown in the chart:
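
Behind the scenes, the panel’s query builder generates an InfluxQL query; a sketch of what it typically produces for the ‘temperature’ measurement (the exact grouping depends on the panel settings):

SELECT mean("value") FROM "temperature" WHERE $timeFilter GROUP BY time($__interval) fill(null)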

There it is! We are able to represent incoming telemetry using an Influx database and Grafana.

Conclusion

With InfluxDB and Grafana, a lot of new use-cases for IoT Edge can be explored.

It’s great to see how easy it is to just deploy generic Docker containers on the IoT Edge, complete with all settings needed.

Only if you need to connect to the IoT Edge routing mechanism do you need to provide a custom module and routing.

There is a maximum of fifty modules on one edge device.

The Azure IoT Edge Influx writer module as demonstrated here is available under MIT license. It works in conjunction with the Microsoft simulated temperature sensor module.