Monitoring Humidity…

TL;DR

This boils down to:

  • A Raspberry Pi with a humidity sensor attached
  • Two Python scripts: one serving a web endpoint, and another, run from cron, to generate the push notifications.
  • A simple iPhone app to receive the push notifications and to graph the humidity data.
[App screenshot]

I have a small collection of guitars, one of which suffered damage from humidity over the winter. It was a tough lesson: I had no idea that they were so sensitive. I’ve subsequently bought another instrument which I’m now nervous about keeping out of its case.

Having looked unsuccessfully on Amazon for USB connected humidity sensors, I decided to re-purpose a sensor which I’d been using with an Arduino a couple of years back, and which had subsequently been consigned to the attic. I was reasonably satisfied with my first pass: the connection to the GPIO pins on the Raspberry Pi is a lot less faff than processing serial data over USB from the Arduino. I wrote a Python script which ran from cron every hour, and if either the temperature or humidity was over / under a threshold, it would email me. Side by side testing with a borrowed standalone sensor suggested it was reading consistently a few degrees hotter. The only male-to-female jumper wires I had lying around were pretty short, so the most obvious culprit was the heat from the Pi. I ordered some longer ones — and duly broke the sensor somehow when I reconnected it. Doh!
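For flavour, here’s a minimal sketch of the shape of that first script – the thresholds, addresses and the read_sensor() stub are all placeholders rather than my actual values:

import smtplib
from email.message import EmailMessage

MAX_TEMP = 30.0       # degrees C; hypothetical threshold
MAX_HUMIDITY = 60.0   # percent RH; hypothetical threshold

def read_sensor():
    # Stand-in for the real GPIO read; returns dummy values here.
    return 24.5, 55.0

temperature, humidity = read_sensor()

if temperature > MAX_TEMP or humidity > MAX_HUMIDITY:
    msg = EmailMessage()
    msg['Subject'] = 'Indoor weather warning!'
    msg['From'] = 'pi@example.com'
    msg['To'] = 'me@example.com'
    msg.set_content(f'Temperature: {temperature}C, humidity: {humidity}%')
    with smtplib.SMTP('localhost') as smtp:
        smtp.send_message(msg)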

I ordered up a replacement [GY-BME280] which seems a lot more sensitive. A hot spell of weather suggested that something less intrusive than email would be better, so I thought I’d have a quick play with iOS push notifications. I ended up using OneSignal, who have a generous free tier. One slight constraint is that their Python library is Python 2 only, and the sensor’s dependencies had already committed me to Python 3. So, having gone through the app setup on OneSignal’s admin interface and had a quick browse of their API [simple for my purposes], I use requests to send the notification instead:

import requests

# pushEndpoint is OneSignal's create-notification URL:
# https://onesignal.com/api/v1/notifications
if ALERT:
    post_body = {
        "app_id": "app-id-from-One-Signal",
        "headings": {"en": "Indoor weather warning!"},
        "contents": {"en": CONTENT_TXT},
        "included_segments": ["Active Users"]
    }
    r = requests.post(pushEndpoint,
                      headers={"Authorization": "Basic api-key-from-OneSignal"},
                      json=post_body)
 

Obviously the vendor library takes care of the exact syntax of the Authorization header; it took a bit of testing to see exactly what was required when rolling my own API call.

I generate the CONTENT_TXT based on some if / elif statements earlier in the code. I took a different approach when I was using email: the content was additive, so if both the temperature and humidity were over the threshold, I’d get all of the info. Now, the first one to trigger wins as there isn’t a lot of real estate in the notification window. As an aside, the code above doesn’t process the response from the OneSignal endpoint. If it failed for some reason, I could resort to email, but that is a little circular for my implementation.
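For illustration, the content-selection logic is roughly this shape – the thresholds and messages here are made up rather than lifted from my script:

# Illustrative thresholds and a sample reading; the real values differ.
MAX_HUMIDITY, MIN_HUMIDITY, MAX_TEMP = 60.0, 40.0, 30.0
temperature, humidity = 24.5, 65.2

# First condition to trigger wins: only one line fits the notification.
ALERT = True
if humidity > MAX_HUMIDITY:
    CONTENT_TXT = f'Humidity is high: {humidity}%'
elif humidity < MIN_HUMIDITY:
    CONTENT_TXT = f'Humidity is low: {humidity}%'
elif temperature > MAX_TEMP:
    CONTENT_TXT = f'Temperature is high: {temperature}C'
else:
    ALERT = False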

It’s rather one-dimensional having a boilerplate app on the phone which does nothing other than receive the push notifications, so charting seemed like a sensible addition. The state of the art has certainly moved on from when I first used charting in my weight tracking app [it was 7 years ago!]. There are a couple of components to this: a very simple web service [using Flask] with a couple of endpoints, the first of which returns historic data generated by a script called from cron – which I’ll come back to – and the second of which calls a script to get live data from the sensor.
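As a sketch of the shape of that service – the route names, the file path and the get_reading() stub are my placeholders, not the real code:

from flask import Flask, jsonify
import json

app = Flask(__name__)
DATA_FILE = '/home/pi/humidity.json'  # hypothetical path

def get_reading():
    # Stand-in for the script that reads the sensor over the GPIO.
    return 24.5, 55.0

@app.route('/history')
def history():
    # The rolling 24 hours of data maintained by the cron script.
    with open(DATA_FILE) as f:
        return jsonify(json.load(f))

@app.route('/live')
def live():
    temperature, humidity = get_reading()
    return jsonify({'temperature': temperature, 'humidity': humidity})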

The cron script – the same one that generates the push messages – retains the last 24 hours of data on a rolling basis. I managed to make pretty heavy work of this, but here’s what I’ve come up with:

with open(dataFile) as json_file:
    data = json.load(json_file)

data['allData'].append(dataPoint)
currentLength = len(data['allData'])
if currentLength > 24:
    # This should always only be deleting the single, front-most element:
    delta = currentLength - 24
    trimmedData = data['allData'][delta:]
    data = {"allData": trimmedData}

with open(dataFile, 'w') as outfile:
    json.dump(data, outfile)

The JSON structure is an outer dictionary with a single value, the array of ‘dataPoint’ dictionaries, where each of those is a single reading. Where I got myself into a bit of trouble was with this line:

trimmedData = data['allData'][delta:]

…which returns the ‘inner’ array reduced by the delta number of values from the front, rather than the ‘outer’ dictionary which wraps the array. There is probably a better way of doing this – not using the outer dictionary, for instance – but it works.
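For what it’s worth, dropping the outer dictionary would collapse the whole trim into a single slice. A sketch, not what I’m actually running:

import json

# Hypothetical alternative: store the readings as a bare JSON array.
dataFile = 'weather.json'                       # as in the script above
dataPoint = {'time': '09:00', 'humidity': 55}   # one reading; shape illustrative

with open(dataFile) as f:
    readings = json.load(f)    # a plain list of dataPoint dicts

readings.append(dataPoint)
readings = readings[-24:]      # keep only the most recent 24

with open(dataFile, 'w') as f:
    json.dump(readings, f)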

Programmatically, the iPhone app itself is pretty trivial. The more complicated part – relatively – is stepping through the install and config of the OneSignal SDK [plus admin processes on their web UI] and cutting the push certificate, but it’s all well documented.

I’m still clinging on to writing Objective-C, which is starting to feel akin to admitting that I like to code in Latin. One aspect of Charts which took some digging into was how to put custom labels on an axis – in Objective-C. At the same time as I parse out the readings into a mutable array of data entries…

[hums addObject:[[ChartDataEntry alloc] initWithX:xCounter y:humidity]];

…I also grab the timestamps – actually just the hours:minutes – into another mutable array and then…

 lineChart.xAxis.valueFormatter = [ChartIndexAxisValueFormatter withValues:timeLabels];

The Buried Bodies

Functionally, that’s pretty much it. There are a couple of extra decorations, the most significant of which is only attempting to haul the data down from the endpoints when the phone is connected to my home network. While I have the facility to carve out a DMZ on my network, this endpoint isn’t going to be the trigger for me to take the plunge. One body I’ve buried up to this point is that the Python scripts which connect to the GPIO need to run as root. There are ways round this, but it’s a utility vs effort trade-off thing. The same goes for Flask, which calls a script that runs as root.

Well, I work in security, so it’s nice to let my hair down at weekends 🙂 .

The second minor detail of this implementation is that unfortunately Apple doesn’t allow the configuration of the push notification capability without signing up as a full developer. When I deleted my apps from the App Store last year, I decided to let my developer subscription expire.

While you can do quite a lot without it, having to recompile apps every 3 months or so can be a nuisance – something that was reinforced when my wife asked me what I was doing on my Mac 10 minutes before a taxi was due to take us to the airport last week. The answer: re-compiling the iOS password manager that I wrote, and am quite reliant on when travelling. Instinctively, I knew this was the wrong answer 🙂 .

Anyway, I’ve signed up for another year.

Containerising The Plot

A couple of months ago, what should have been a routine upgrade via the WordPress UI set alarm bells ringing: fail2ban was complaining about the version of PHP I was using. My attempt to update PHP turned into a can of worms, because the version of Debian on the VM I use for hosting was pretty old. I thought it would be an interesting learning exercise to build out a new VM for the upgrade using Docker [and Docker Compose] for WordPress, MySQL, Postfix and Dovecot. ‘Interesting’ turned out to be ‘quite a lot of work’.

I didn’t use the stock WordPress container: there are a few customisations that I wanted to add, and which [at the time] seemed a bit tricky to shoehorn in.

The main thing I wanted was to get a certificate from LetsEncrypt on startup, and I’ve created a cron job which should request a replacement cert every month. I also have a pretty nasty script which scans the Apache log files for people trying to break in and adds them to iptables [I was never convinced that fail2ban was working as intended]; that’s another cron job, which runs every half hour, and there’s a sketch of its shape below. I’ve also added in a few utilities for debugging. Finally, I have some static content – a pretty large galleries directory – which lives outside WordPress, and which I need to copy across.
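The log scanner boils down to something like this – the pattern, the threshold and the lack of any de-duplication are all simplifications of what I actually run:

import re
import subprocess
from collections import Counter

# Count requests for wp-login.php per source IP. The path and the
# threshold below are illustrative, not my real rules.
hits = Counter()
with open('/var/log/apache2/access.log') as log:
    for line in log:
        if 'wp-login.php' in line:
            m = re.match(r'(\d+\.\d+\.\d+\.\d+)', line)
            if m:
                hits[m.group(1)] += 1

# Ban anything over the threshold. Note: no attempt here to skip IPs
# that are already in the chain.
for ip, count in hits.items():
    if count > 5:
        subprocess.run(['iptables', '-A', 'INPUT', '-s', ip, '-j', 'DROP'])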

There is a very heavily starred container – docker-mailserver – which took care of Dovecot and Postfix; it’s fantastic. I’ve configured it to mount in the volume where the WordPress certificate lives. I’ve noticed that its fail2ban is configured in turbo-fascist mode: I initially couldn’t understand why I couldn’t connect to the server from my laptop, until I realised that fail2ban had immediately blocked our public IP when my phone hit the server with an old password.

One other gotcha was around port exposure. I obviously didn’t want to map the MySQL port to the outside world, but I got stuck at the WordPress install page that asks for the database details. I finally figured out that I needed to refer to the server by its container_name.
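In Docker Compose terms it looks something like this – the names are illustrative, and the point is simply that the database service gets no ports: mapping, so it’s only reachable on the internal network by its container_name:

# Illustrative compose fragment, not my actual file.
services:
  db:
    image: mysql:5.7
    container_name: wordpress-db
    # no 'ports:' section, so 3306 is never exposed to the host;
    # 'wordpress-db' is what goes in the install page's database host field
  wordpress:
    build: .
    ports:
      - "443:443"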

There is a balancing act in terms of what to bake into the container. My it’s-exploded-and-I-need-to-start-from-scratch approach is to build my WordPress Docker image; deploy it using Docker Compose; then import a backup using the All-In-One plugin. I had a bit of a think over whether or not to bake that in as well, but decided against it. While I could wget it [the same way I do with WordPress itself], it would have to be a reference to a particular version rather than the latest.

So my WordPress container isn’t exactly a work of art: it’s pretty flabby, with all of the nonsense I [tell myself I] need. In some performance testing, I also noticed that WordPress returned its results considerably more slowly than the hand-rolled equivalent. It’s hardly surprising: this is running on OVH’s cheapest VM and, with 3 containers running, it has a fair bit on its mind.

Anyway, if you actually know anything about Docker, you can amuse yourself with my efforts here.

Sky Broadband, YouTube Restricted by Network Administrator and Other Woes

TL;DR: if you’re a Sky customer and YouTube suddenly seems a bit broken, have a look at a service called ‘Broadband Shield’. Read about what it’s doing before you turn it off: you may actually want to live with it.

During the week, I started noticing something weird was happening with the DNS on my home network. I don’t have anything vastly complicated set up, but I use pfSense to apply rules to route traffic between three physically separate networks. I’ve had a mess around with DNS configuration before, so that was my first port of call. I tried changing over from Google to Cloudflare to see if that helped, in a just-home-from-work-let’s-poke-a-stick-at-it sort of way. It didn’t. I was able to get access to various sites in the .com domain – and we’re talking imap.google.com, so not exactly off the beaten track – for a while, and then all of a sudden the names would stop resolving. Oddly enough, I seemed to be having the same problems with my VPN, which I thought offloaded DNS. I didn’t test this thoroughly though.

The real clue came with YouTube, where I noticed that some videos and all comments were being blocked, along with some rendering problems in the app’s layout: areas where there should have been content were blank. Initially I thought this was because Google was delivering content from different domains, and my DNS woes were continuing. Looking at the settings for the YouTube app, the switch for ‘restricted mode’ was greyed out, with some text explaining that it had been enabled by the network administrator. Up until this point, I’d have assumed that was me :).

A little googling to see what sort of parental controls Sky were offering landed me at the Broadband Shield site. I turned it off, and suddenly all of the name-resolution weirdness disappeared and the YouTube comments were back.

I appreciate the intent behind the Shield service, but I’d appreciate it a lot more if Sky had told me that they were turning it on, or changing it in a way that would apply a pretty broken policy to name resolution.