Uptimerobot is a fantastic and affordable solution for checking any number of webpages for availability and, based on HTTP status codes, for the type of issue that occurred, if any.

Sometimes, however, you want the gathered data in your own systems – say for your own data analytics, combining outage and error information about your websites with other data you have available. Uptimerobot offers the functionality to send multiple types of notifications in parallel: you can receive emails and at the same time have the error data pushed to any web endpoint you control. The technical term for that endpoint is “webhook”. The data payload pushed in “your” direction from Uptimerobot looks like this, stripped of headers:

 "queries": { 

        "monitorID": "778784728", 

        "monitorURL": "(...)", 

        "monitorFriendlyName": "My shaky website", 

        "alertType": "2", 

        "alertTypeFriendlyName": "Up", 

        "alertDetails": "HTTP 200 - OK", 

        "monitorAlertContacts": "(...)", 

        "alertDateTime": "1492651585" 

    }

Now, the webhook Uptimerobot pushes that payload to can be implemented perfectly well using Azure Logic Apps, which comes with multiple benefits:

  • Detailed monitoring
  • A plethora of integrations ready to use with Logic Apps, all visually aided by the Logic Apps Designer
  • Everything can be done from the Azure Portal without any need to leave it

In fact, at the time I wrote this I only had a (rather simple, cheaper flavour of) Chromebook available and found no issues building the integration ad hoc.

So what I will be doing is capture the Uptimerobot error/outage data and pump it into a DocumentDB instance on Azure for further transformation later on. (Not covered in this tutorial. Maybe more on that later.) This would of course have worked with Azure SQL Database as well; in fact, the support for SQL Server in Logic Apps is excellent.

How would that all work?

1) Create a Logic App workflow and add an “HTTP” trigger step. You can define the schema of the payload there by providing a sample payload the step will receive. Your Logic App webhook will accept POST and GET calls, and the Logic App blade will show you the URL. I opted for processing POST requests going forward.
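For reference, in the Logic App code view that trigger comes out roughly like this – a minimal sketch; the empty "schema" is where the schema generated from your sample payload would land:

    {
        "triggers": {
            "manual": {
                "type": "Request",
                "kind": "Http",
                "inputs": {
                    "method": "POST",
                    "schema": {}
                }
            }
        }
    }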

2) Finish the flow by pointing the output to a DocumentDB instance.
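In code view, such a DocumentDB action shows up as an “ApiConnection” step along these lines – again just a sketch, with hypothetical database and collection names (“uptimedb”, “alerts”) and the document body left empty for step 3:

    {
        "Create_or_update_document": {
            "type": "ApiConnection",
            "inputs": {
                "host": {
                    "connection": {
                        "name": "@parameters('$connections')['documentdb']['connectionId']"
                    }
                },
                "method": "post",
                "path": "/dbs/uptimedb/colls/alerts/docs",
                "body": {}
            }
        }
    }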

3) All of this is extremely straightforward. What I did next was craft the JSON data to be put into the DocumentDB database using Logic App expressions. That is a sort of meta language; the downside is that you have to get familiar with it, the upside is that it is powerful.


So with a couple of functions we can easily include some useful extra information:

{
    "body": {
        "id": "@guid()",
        "time_utc": "@utcnow()",
        "uptimedata": "@triggerOutputs()['queries']"
    }
}

Create a new id and assign a GUID to it, capture the current UTC timestamp in another field, and for the rest pass the input from the webhook trigger on into a single field, filtering out everything but the “queries” node, since that is what I am interested in.
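Assuming the sample payload from above, the document that ends up in DocumentDB would then look roughly like this (the id and timestamp values here are purely illustrative):

    {
        "id": "9b2f6a0e-3c41-4d5a-8f7b-2e1d0c9a8b7f",
        "time_utc": "2017-04-28T15:35:26.0000000Z",
        "uptimedata": {
            "monitorID": "778784728",
            "monitorURL": "(...)",
            "monitorFriendlyName": "My shaky website",
            "alertType": "2",
            "alertTypeFriendlyName": "Up",
            "alertDetails": "HTTP 200 - OK",
            "monitorAlertContacts": "(...)",
            "alertDateTime": "1492651585"
        }
    }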

4) Save the Logic App flow and set it to active.

5) In Uptimerobot, specify that HTTP POST notifications be sent right to the URL mentioned in step 1.
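That URL, as copied from the Logic App blade, will look something like the following, with the region, workflow id and signature as placeholders:

    https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<signature>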

And that is about it. New data will keep flowing into my DocumentDB instance, and for every single webhook call captured I can check what data flowed in and how it got transformed.


You could use the same flow to dispatch any type of event data and, for example, chain it up with other Logic App components that screen for new Twitter posts or anything alike. So there are quite some complex workflows you could build up there, comparably easily.

Yet not everything is dandy about Logic Apps, as it is still maturing – the Logic App code viewer and editor will remain one of your best friends when working with Logic Apps. I read the ambition is to change that and include, for example, IntelliSense (beyond what is already available).

On the other hand, this little “how to” is of course only scratching the surface – you could add a lot more security, pre-validation and more by hooking in, for example, Azure Logic App proxies. With Azure Monitor you could add a great deal of governance to the whole data ingest.

All in all, this little and admittedly simple use case was again a very pleasant experience with few hurdles in my way. Logic Apps remains promising, and it certainly is very useful already. I’ll keep watching it for sure.
