Dashboarding Azure Data

Good morning. I have just connected my edge gateway to the Azure cloud, and data is being sent to the cloud after following this tutorial: How to connect a PLCnext Control to Microsoft Azure via Node.js SDK | Makers Blog - YouTube .
The main issue is that the data delivered to Azure looks something like the contents of the attached file, which arrived as an Avro file (I converted it to a .txt file because I couldn't upload an Avro file):
PLCNextDataPlaintext.txt
The data seems to contain the input and output values I listed in the edge gateway, but as you can see, this isn't really readable for dashboarding. Does anyone have experience with dashboarding PLCnext data in Azure?

Doc2_PLCnext_CODESYS_VisuPlus.pdf
PLCNextDataPlaintext.txt

Hello Keith,
Below is the solution that was worked out for this issue.

For building an Azure pipeline, JSON should be used instead of the Avro format. The current version of the Azure driver for the PLCnext Edge Gateway already sends data to Azure in JSON format; however, Azure IoT Hub encodes this data in the 'Body' of the message as Base64, which is why the payload you received appears garbled. (This is an artifact from older versions of Azure.) To process and format this data, a Stream Analytics pipeline should be created that decodes the Base64 data and reshapes it into the desired schema. For this, I created two user-defined functions (UDFs) for a Stream Analytics job. The first simply Base64-decodes the Body of the message, while the second allows accessing the value of a JSON key inside the Analytics query.

My two functions were “base64decode”:

function base64decode(str) {
    // Lookup table mapping ASCII codes to 6-bit Base64 values (-1 = invalid).
    var base64DecodeChars = new Array(
        -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1,
        -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1,
        -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 62, -1, -1, -1, 63,
        52, 53, 54, 55, 56, 57, 58, 59, 60, 61, -1, -1, -1, -1, -1, -1,
        -1,  0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11, 12, 13, 14,
        15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, -1, -1, -1, -1, -1,
        -1, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40,
        41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, -1, -1, -1, -1, -1);

    var c1, c2, c3, c4;
    var i, len, out;

    len = str.length;
    i = 0;
    out = "";
    while (i < len) {
        /* c1 */
        do {
            c1 = base64DecodeChars[str.charCodeAt(i++) & 0xff];
        } while (i < len && c1 == -1);
        if (c1 == -1)
            break;

        /* c2 */
        do {
            c2 = base64DecodeChars[str.charCodeAt(i++) & 0xff];
        } while (i < len && c2 == -1);
        if (c2 == -1)
            break;

        out += String.fromCharCode((c1 << 2) | ((c2 & 0x30) >> 4));

        /* c3 */
        do {
            c3 = str.charCodeAt(i++) & 0xff;
            if (c3 == 61)                       // '=' padding: message complete
                return JSON.parse(out);
            c3 = base64DecodeChars[c3];
        } while (i < len && c3 == -1);
        if (c3 == -1)
            break;

        out += String.fromCharCode(((c2 & 0xF) << 4) | ((c3 & 0x3C) >> 2));

        /* c4 */
        do {
            c4 = str.charCodeAt(i++) & 0xff;
            if (c4 == 61)                       // '=' padding: message complete
                return JSON.parse(out);
            c4 = base64DecodeChars[c4];
        } while (i < len && c4 == -1);
        if (c4 == -1)
            break;
        out += String.fromCharCode(((c3 & 0x03) << 6) | c4);
    }
    return JSON.parse(out);
}
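As a quick local sanity check (outside Stream Analytics), the same round trip can be expressed with Node's built-in Buffer. Stream Analytics UDFs run in a restricted JavaScript environment without Node APIs, which is presumably why the UDF above carries its own decode table; base64decodeLocal below is only a stand-in with the same input/output contract, and the sample message is hypothetical:

```javascript
// Stand-in for the base64decode UDF, using Node's Buffer for a local test.
// Same contract as the UDF: Base64 string in, parsed JSON object out.
function base64decodeLocal(str) {
    return JSON.parse(Buffer.from(str, "base64").toString("utf8"));
}

// Hypothetical message body, mimicking what IoT Hub delivers in 'Body'.
const sample = { name: "Motor1", value: 42, dateTime: "2021-01-01T00:00:00Z" };
const encoded = Buffer.from(JSON.stringify(sample)).toString("base64");

const decoded = base64decodeLocal(encoded);
console.log(decoded.name, decoded.value); // Motor1 42
```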
  

And “getProp”:

function main(object, key) {
    return object[key];
}
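This UDF is just bracket access wrapped so the query can reach into the decoded object. A minimal local check, with a hypothetical record standing in for the decoded Body (locally the function is named getProp; in Stream Analytics the function body is named main and invoked as udf.getProp):

```javascript
// The getProp UDF body: plain bracket access on the decoded JSON object.
function getProp(object, key) {
    return object[key];
}

// Hypothetical decoded record, as produced by the base64decode UDF.
const record = { name: "Motor1", value: 42, dateTime: "2021-01-01T00:00:00Z" };
console.log(getProp(record, "name"));  // Motor1
console.log(getProp(record, "value")); // 42
```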
  

My Analytics Query was the following:

WITH step1 AS (
    SELECT udf.base64decode(Body) AS Data
    FROM [containerstoreinput]
),
step2 AS (
    SELECT
        Data,
        Name = udf.getProp(Data, 'name'),
        Value = udf.getProp(Data, 'value'),
        Ts = udf.getProp(Data, 'dateTime')
    FROM step1
)

SELECT
    Name,
    Value,
    Ts
INTO
    [explorer1]
FROM
    step2

In this query, “step1” decodes the Body, “step2” takes the JSON-formatted “Data” and extracts its values out as columns, and the final SELECT statement formats that data as desired and sends it to a test output endpoint.
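To see what each stage produces, the whole query can be mimicked locally in JavaScript (the input event below is hypothetical, and Buffer stands in for the base64decode UDF):

```javascript
// Local sketch of the three query stages, using a hypothetical input event.
const events = [{
    Body: Buffer.from(JSON.stringify(
        { name: "Temp1", value: 21.5, dateTime: "2021-01-01T00:00:00Z" }
    )).toString("base64"),
}];

// step1: decode the Base64 Body into a JSON object.
const step1 = events.map(e => ({
    Data: JSON.parse(Buffer.from(e.Body, "base64").toString("utf8")),
}));

// step2: lift the JSON keys out into named columns.
const step2 = step1.map(r => ({
    Name: r.Data["name"],
    Value: r.Data["value"],
    Ts: r.Data["dateTime"],
}));

// Final SELECT: keep only the three columns for the output sink.
const output = step2.map(({ Name, Value, Ts }) => ({ Name, Value, Ts }));
console.log(output[0]); // { Name: 'Temp1', Value: 21.5, Ts: '2021-01-01T00:00:00Z' }
```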

In an upcoming update to the Azure driver, the data will be sent in a way that makes Azure IoT Hub deliver it as plain JSON, making the queries much simpler. An example input data set in Stream Analytics will then appear as below:

(inline image from the original post: a sample Stream Analytics input table with “name”, “value” and “dateTime” columns)

Here the data can be accessed directly via the column names “name”, “value” and “dateTime”, with the resulting query looking something like this:

SELECT
    name,
    value,
    dateTime
INTO
    [explorer1]
FROM
    [deviceJSON]