Building an Electronic Programme Guide [Part 1]

[Edit: 12/06/2016. The source I used for the programme data, called Radio Times XML TV, is just about to be shut down and replaced with another service which restricts the amount of data that can be hauled down to 24 hours. The whole premise of this app was to facilitate offline access. If you landed here via Google note that, while you might find some utility in the various postings on the UI, the app itself is useless without the data source. If I find another suitable data source I may update the app.]

This is the first in a series of posts about a new TV guide app that I’ve been working on in my spare time for the last six weeks or so. The motivation comes from a number of quarters. I’d been scratching around for ideas for a reasonably useful Apple Watch app, and I’d also found that the guide I’d got off the shelf from the App Store was becoming increasingly reliant on in-app advertising. While I’m sure there are premium, ad-free variants out there, I thought it was an interesting area for development, and one that I could extend to the watch with a ‘what’s on now / later’ mini view.

I now have a working app for my phone, but without all of the UI frills I’m aiming for. The main guide looks like this…

[Screenshot: the main EPG grid]

… and with programme previews that look like this:

[Screenshot: a programme preview]
I’m continuing the development, and it’s already a pretty large codebase. It’s not one that’ll end up on GitHub [for reasons I’ll get on to shortly], but there are some pretty interesting challenges that I didn’t find anything approaching kit-form answers for, so I’m going to pluck out the interesting parts for a deeper dive.
The first and obvious question is where to get the Electronic Programme Guide data and, all-importantly, data that is free. After some research, I settled on a service offered by the Radio Times, which uses a standard format called XMLTV. As I’m not into hacking PVRs, I’d never come across this standard before. The first thing to say is that the Radio Times variant isn’t actually XML at all – it’s character-delimited. The next is that it’s very clearly identified as being for personal use only – hence I’m not going to publish the app in full.
First, let’s deal with the data download from the per-channel URLs. Currently, I’ve hard-wired the app to download 12 channels’ worth of data, something that I ultimately plan on linking back to the UI so that the user can self select them. Each channel download contains two weeks of programme data, so over a mobile network, the overriding UX consideration is progress.
Having previously lost a couple of hours trying to hook up a fancy UI component that shows progress graphically, it’s something to put front and centre when you’re designing the networking component. Quite commonly, servers generate responses dynamically, in which case there’s a very good chance the Content-Length header won’t be set.
The progress indicator needs the total number of bytes to be able to calculate progress. The Radio Times site isn’t setting the required header, so no fancy progress meters – sad face. This is a more complex use case anyway, because we’d have needed to calculate progress across a dozen downloads, with the progress of each interlaced.
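This is easy to confirm in code. A minimal sketch, assuming an NSURLConnection-style delegate [the property name is hypothetical]: when the server omits Content-Length, the expected length comes back as the NSURLResponseUnknownLength sentinel, and a byte-level meter is a non-starter.

```objectivec
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
    // Without a Content-Length header, expectedContentLength reports unknown:
    if ([response expectedContentLength] == NSURLResponseUnknownLength)
    {
        // Fall back to coarse, per-download progress rather than a byte meter.
        self.byteLevelProgressAvailable = NO;
    }
}
```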
What I’ve done is to initiate each download in a for loop using AFNetworking, and then in the setCompletionBlockWithSuccess block:

// The response arrives as NSData; parse it into programme objects,
// then hand them off for storage:
NSData *channelData = (NSData *)responseObject;
ChannelDataParser *channelProgs = [[ChannelDataParser alloc] init];
NSMutableArray *asProgObjects = [channelProgs programmeChunker:channelData
                                                andChannelName:thisPhoneChannel.channelName];
[self storeChannelProgData:asProgObjects forThisChannelName:thisPhoneChannel.channelName];

I’ll come back to the parser in a second, but one point to focus on is the method that stores the parsed results. As the network responses are interleaved, and they’re all ultimately going to be written to the same Core Data entity, I’ve wrapped the method body in an @synchronized(managedObjectContext) block.
In that same @synchronized method, I increment a completed download counter, which I use to set a label in the UI. Pretty crude, but given the progress limitations, it seems like a reasonable compromise. I’ve actually spent the vast majority of the time on the UI for the programme guide, so I might take another look at this later.
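Putting the two paragraphs above together, the store method looks roughly like this. This is a sketch under assumptions – the entity name, the ParsedProgramme class and the counter/label properties are my own names, not necessarily the app’s:

```objectivec
- (void)storeChannelProgData:(NSArray *)progObjects forThisChannelName:(NSString *)channelName
{
    // Responses from the 12 downloads interleave, so serialise access
    // to the shared managed object context:
    @synchronized(self.managedObjectContext)
    {
        for (ParsedProgramme *prog in progObjects)
        {
            NSManagedObject *record = [NSEntityDescription
                insertNewObjectForEntityForName:@"Programme"   // hypothetical entity name
                         inManagedObjectContext:self.managedObjectContext];
            [record setValue:prog.title forKey:@"title"];
            [record setValue:channelName forKey:@"channelName"];
        }
        NSError *saveError = nil;
        [self.managedObjectContext save:&saveError];

        // Crude progress: one tick per completed channel download.
        self.completedDownloads++;
        self.progressLabel.text = [NSString stringWithFormat:@"%ld of 12 channels",
                                   (long)self.completedDownloads];
    }
}
```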
Back to the parser itself. Given the NSData containing the AFNetworking response for a channel, the first step is to convert it into a string, and then into an array of strings split on line breaks:

NSString *stringFromData = [[NSString alloc] initWithData:channelData
                                                 encoding:NSUTF8StringEncoding];
// Line break vs carriage return: \n was settled on through experimentation.
NSArray *eachLineOfString = [stringFromData componentsSeparatedByString:@"\n"];

The parsing required some trial and error, as the file contains some copyright info at the start, and then some carriage returns at the end:

// Start at index 2 to skip the copyright header, and stop short of the
// trailing lines:
for (NSUInteger progCount = 2; progCount <= [eachLineOfString count] - 2; progCount++)
{
    NSString *oneLine = eachLineOfString[progCount];
    if (![oneLine isEqualToString:@"\r"])
    {
        [.....]
    }
}
Within the loop, I then strip out the carriage returns, and finally split the data into an array based on the separator character [a tilde]:

NSString *oneLineNoBreak = [oneLine stringByReplacingOccurrencesOfString:@"\r" withString:@""];
//NSLog(@"oneLineNoBreak is %lu long", (unsigned long)oneLineNoBreak.length);
NSArray *oneProgrammeData = [oneLineNoBreak componentsSeparatedByString:@"~"];

I think I would have struggled with the whole \r versus \n thing if I hadn’t been bitten by it back in the mid-1990s. I can’t recall whether it was FTPing files with the wrong transfer type or using Samba between Windows and Unix machines, but either way you were prone to oddities with control characters in text files. Anyway…
At this point we’re ready to assign values, based on the content of the array conforming to the XMLTV standard: the 0th element of the array is the programme title, then comes a sub-title, and so on. I assign these to an instance of a custom class. I found it useful to add three additional fields, all related to the UI. The first is the running order in which the programmes appear: once you save them to Core Data as individual records representing the programmes, they become unordered – standard database stuff. Next is a GUID: I’ll get into this in more detail in a subsequent posting on the UI, but in summary it associates programme data with a button action. Finally, I also include the channel name for the programme: this works around a limitation in the UI design – again, I’ll come back to this later.
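The custom class described above can be sketched as follows – the property names are my guesses rather than the app’s actual ones, but the shape is as described: the XMLTV fields in file order, plus the three UI-related additions:

```objectivec
@interface ParsedProgramme : NSObject

// Fields parsed straight out of the ~-separated line:
@property (nonatomic, copy) NSString *title;      // element 0: programme title
@property (nonatomic, copy) NSString *subTitle;   // element 1: sub-title
// ... the remaining XMLTV fields ...

// The three extra UI-related fields:
@property (nonatomic, assign) NSInteger runningOrder; // preserves file order, lost on Core Data save
@property (nonatomic, copy) NSString *guid;           // associates the programme with a button action
@property (nonatomic, copy) NSString *channelName;    // works around a UI design limitation

@end
```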
I cycle through parsing each channel, ultimately saving the programmes to Core Data in the @synchronized method I mentioned earlier.
One final point on the networking side. I do a simple check against a single-value Core Data entity to see if it’s the first run. If it is, I haul the data down before presenting the UI. I also haven’t finished the part of the UI where the user can self-select the channels; for now, it’s hardwired. The list of channels and their corresponding URLs also gets written to another simple entity during that first run.
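The first-run check amounts to asking whether that flag entity has any records yet. A sketch, with a hypothetical entity name standing in for the real one:

```objectivec
// If the single-value 'AppState' entity is empty, this is the first run:
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"AppState"];
NSError *fetchError = nil;
NSArray *results = [self.managedObjectContext executeFetchRequest:request error:&fetchError];
if ([results count] == 0)
{
    // First run: download all channel data before presenting the UI,
    // and write the hardwired channel list and URLs to another entity.
    [self downloadAllChannelData];
}
```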
I’ll follow up with a description of the UI design….

First Apple Watch App Using WKInterfaceMap

I was lucky enough to get my Apple Watch last week through the ‘expedited delivery’ option that was offered to developers on a random basis, and I’m really pleased with it.

I’ve written a very simple app to understand the communication between the watch and the phone, and it’s unearthed a couple of interesting points. Before that, setup: you need to add the watch’s unique ID to the rest of your devices on the Apple dev portal, and create a new provisioning profile. You’ll also find you want to mess around with Xcode to change the logging: it can’t show logs from the phone and the watch at the same time, as they are separate processes, so you’ll need to switch between them.

So the simple app: I thought it would be fun to display the current location of the International Space Station on a map. Note that there is no ‘glance’ for this app. Having first added the WatchKit app template to a single view project, I added the following elements to the Watch storyboard:

[Screenshot: the Watch storyboard – a button, a WKInterfaceMap and two labels]

So simply a button, the WKInterfaceMap and two separate labels. I’ve then created outlets and actions as you’d expect, in the InterfaceController.h:

[Screenshot: InterfaceController.h with the outlets and actions]
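Reconstructed from the description, the header looks roughly like this – the outlet and action names are my own guesses, not necessarily those in the screenshot:

```objectivec
#import <WatchKit/WatchKit.h>

@interface InterfaceController : WKInterfaceController

@property (weak, nonatomic) IBOutlet WKInterfaceMap *issMap;
@property (weak, nonatomic) IBOutlet WKInterfaceLabel *latitudeLabel;
@property (weak, nonatomic) IBOutlet WKInterfaceLabel *longitudeLabel;

- (IBAction)refreshButtonPressed;

@end
```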

In the awakeWithContext method, I call a method to do the communication with the phone and render the results:

[Screenshot: awakeWithContext: calling the fetch-and-render method]
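A sketch of what that method does, assuming the phone replies with the latitude and longitude as strings under hypothetical dictionary keys [requires MapKit/CoreLocation for the coordinate types]:

```objectivec
- (void)fetchAndDisplayISSLocation
{
    // Ask the phone app for the current ISS position; the reply block
    // runs back on the watch extension with an NSDictionary payload.
    [WKInterfaceController openParentApplication:@{@"request" : @"issLocation"}
                                           reply:^(NSDictionary *replyInfo, NSError *error)
    {
        NSString *lat = replyInfo[@"latitude"];   // marshalled as strings
        NSString *lon = replyInfo[@"longitude"];
        [self.latitudeLabel setText:lat];
        [self.longitudeLabel setText:lon];

        CLLocationCoordinate2D coord =
            CLLocationCoordinate2DMake([lat doubleValue], [lon doubleValue]);
        // A standard pin [the transparent ISS image didn't pan out]:
        [self.issMap addAnnotation:coord withPinColor:WKInterfaceMapPinColorRed];
        [self.issMap setRegion:MKCoordinateRegionMakeWithDistance(coord, 500000, 500000)];
    }];
}
```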

I also call this method from the action for the ‘refresh’ button, after deleting the current location pin.

So the main communication is via openParentApplication, where you both send and receive data in an NSDictionary – all nice and clean. A quick note on how I’ve marshalled the data: I’ve sent the latitude and longitude values over in the dictionary as strings. Not one for the purists, but you have just as much work to do with NSNumbers, and the two values are ultimately going to be set as string values on the labels anyway.

One interesting point to call out here is the part of the code I’ve commented out. Rather than a pin, I thought it would be nice to display a little image of the ISS as an alternative. I spent quite a lot of time on this, and the conclusion that I’ve come to, so far at least, is that the Watch doesn’t support images with transparency [an alpha channel]. I posted to the Apple Developer Forum [the thread is here; authentication required], and that seems to be the consensus. I also tested this with a WKInterfaceImage and had the same result. While I’ve seen quite a few references to transparency, especially in articles about animation, I’ve failed to get it to work – and the same goes for the other people on the developer forum thread. Either there are other options in the SDK, or there may be something baked into the image metadata that the WatchKit doesn’t like. I’ve tested with transparent images I’ve used in traditional iOS apps, which I’ve created myself using Gimp.

Anyway, on the phone side of the communication, you use handleWatchKitExtensionRequest:

[Screenshot: handleWatchKitExtensionRequest in the app delegate]

So the first thing to notice here is that it’s completely synchronous: that reply(responseDict) has to be populated within this method call. It took me a while to figure out the implications. Initially I was going to use an async variant of NSURLConnection, until I realised that the connectionDidFinishLoading delegate method wasn’t going to be much help here: there’s no way of joining the dots between identifying the end of the web server’s response in that delegate method and then populating and calling the reply back up in handleWatchKitExtensionRequest.
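Given that constraint, the simplest fit is a synchronous fetch inside the handler. A sketch only – the URL is a placeholder, the JSON key paths are invented, and a real app would want error handling around the parse:

```objectivec
- (void)application:(UIApplication *)application
    handleWatchKitExtensionRequest:(NSDictionary *)userInfo
                             reply:(void (^)(NSDictionary *replyInfo))reply
{
    // Placeholder URL for an ISS-position web service:
    NSURL *url = [NSURL URLWithString:@"http://api.example.com/iss-now"];
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    NSURLResponse *response = nil;
    NSError *error = nil;
    // Synchronous on purpose: reply() must be populated within this call.
    NSData *data = [NSURLConnection sendSynchronousRequest:request
                                         returningResponse:&response
                                                     error:&error];

    NSDictionary *json = data
        ? [NSJSONSerialization JSONObjectWithData:data options:0 error:nil]
        : nil;
    // Marshal the coordinates back to the watch as strings [key names assumed]:
    reply(@{@"latitude"  : [json[@"latitude"]  description] ?: @"",
            @"longitude" : [json[@"longitude"] description] ?: @""});
}
```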

There are so many methods that return results asynchronously, not just at the network level, that dealing with them in iOS is a constant refrain. The way I normally do this is to farm the entire functionality out to a class, and then set a key value observer on the setting of a property in the class instance. I’ve used this recently for getting results back from the keychain – which I’ll come back to in a second.
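That KVO pattern, sketched with a hypothetical fetcher class whose result property gets set once the async work completes:

```objectivec
- (void)startFetch
{
    self.fetcher = [[DataFetcher alloc] init];   // hypothetical worker class
    // Observe the 'result' property; it's set when the async work finishes.
    [self.fetcher addObserver:self
                   forKeyPath:@"result"
                      options:NSKeyValueObservingOptionNew
                      context:NULL];
    [self.fetcher beginAsyncWork];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"result"])
    {
        // The async work has finished; self.fetcher.result is safe to use.
        [self.fetcher removeObserver:self forKeyPath:@"result"];
    }
}
```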

I’m not sure what the full implications of this are: there may be a way round it that’s beyond my knowledge of either WatchKit or Objective-C. One option would be to prepare the data in advance and save it locally – this is all on the phone – through background processing, ready for the response; but there may be reasons you don’t want to do that. Data that depends on the keychain is an obvious example.