
Yeltzland - Stuff about Halesowen Town

Everything you wanted to know about Halesowen Town FC in one simple app.

No original content, but having everything in one place makes it easier for you to find.

I've also developed an Alexa Skill for Amazon Echo and a Google Assistant Action, so you can ask a voice assistant about the fixtures and scores for the mighty Yeltz.

There is also an Apple TV app!

The app is an Open Source project, so if you're interested in the code it's available on GitHub - iOS, Android, Alexa Skill, and Google Assistant repositories.

The app is available to download on the App Store and on Google Play.


Blog Posts about Yeltzland - Stuff about Halesowen Town:


iOS 16 Yeltzland Release

iOS 16 should be available any day now, so I’ve been adding some new features to the Yeltzland app to take advantage.

Lock Screen widgets

The biggest visible change in iOS 16 is the new lock screen widgets - which presumably will be great with the heavily rumoured always-on screens of the new phones.

The widgets show either the latest score, the last result or the next upcoming fixture, whatever is the most relevant at the current time.

The app already had Apple Watch complications, so it was very little work to add similar widgets on the lock screen.

Screenshot of the new lock screen widgets
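
Under the covers these are just standard WidgetKit widgets that opt in to the new accessory families. As a rough illustration - with made-up type names rather than the actual app code - the whole thing boils down to something like this:

    import WidgetKit
    import SwiftUI

    // Minimal sketch of a lock screen widget (illustrative names only)
    struct ScoreEntry: TimelineEntry {
        let date: Date
        let text: String
    }

    struct ScoreProvider: TimelineProvider {
        func placeholder(in context: Context) -> ScoreEntry {
            ScoreEntry(date: Date(), text: "Halesowen 2-1 Stourbridge")
        }

        func getSnapshot(in context: Context, completion: @escaping (ScoreEntry) -> Void) {
            completion(placeholder(in: context))
        }

        func getTimeline(in context: Context, completion: @escaping (Timeline<ScoreEntry>) -> Void) {
            // The real app picks the latest score, last result or next fixture here
            let entry = ScoreEntry(date: Date(), text: "Next: Stourbridge (H)")
            completion(Timeline(entries: [entry], policy: .atEnd))
        }
    }

    struct LatestScoreWidget: Widget {
        var body: some WidgetConfiguration {
            StaticConfiguration(kind: "LatestScoreWidget", provider: ScoreProvider()) { entry in
                Text(entry.text)
            }
            .configurationDisplayName("Latest Score")
            .description("The latest score, last result or next fixture.")
            // The accessory families are what make the widget available on the lock screen
            .supportedFamilies([.accessoryRectangular, .accessoryCircular, .accessoryInline])
        }
    }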

It took a bit of fiddling to get the update frequency working correctly, especially during the match. This works quite well now, particularly if you have the game time tweet notifications feature turned on (which I can use to trigger a widget update outside the scheduled timeline).
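
The trigger itself is nothing fancy - just a timeline reload whenever one of those tweet notifications comes in, along these lines (again a sketch, reusing the illustrative widget kind string from above):

    import WidgetKit

    // Sketch: called when a game time tweet notification arrives, so the lock screen
    // widget refreshes outside its normal timeline schedule
    func gameTimeNotificationReceived() {
        WidgetCenter.shared.reloadTimelines(ofKind: "LatestScoreWidget")
    }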

As you can see from the screenshot above, I’ve also added a new “badge only” widget, which just shows the club badge if you want to personalise your phone a bit.

Improved complications on the Apple Watch

Watch complication and lock screen widget code has been unified, so I’ve been able to combine both into a single extension.

This also means the watch complications are a bit better now, as they benefit from the work I put in to keep them better updated as the goals fly in.

Better Siri Shortcuts

Thankfully the old intent definition code has been deprecated in favour of a much nicer system using AppIntents. This makes it much easier to both define and build the logic for Shortcuts support, as well as to donate shortcut suggestions directly from the app.

This means I have been able to delete a whole load of code - which is always nice - and simplify the discovery of the shortcuts the app supports.
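
For illustration, an AppIntents shortcut is really just a struct with a perform() method - here's a rough sketch with made-up names and a hard-coded score, rather than the app's actual intent:

    import AppIntents

    // Sketch of a "latest score" shortcut using the iOS 16 AppIntents framework
    struct LatestScoreIntent: AppIntent {
        static var title: LocalizedStringResource = "Latest Yeltz Score"

        func perform() async throws -> some IntentResult & ProvidesDialog {
            // The real app fetches this from the fixtures/score data feed
            let score = "Halesowen Town 2-1 Stourbridge"
            return .result(dialog: "The latest score is \(score)")
        }
    }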

Here’s how I use the shortcuts (so I can ask Siri the latest score) …

Screenshot of a Yeltzland action in Shortcuts

… plus how it looks visually when this shortcut is run

Screenshot of the shortcut result dialog

Summary

It’s been the quietest summer for a while on the new feature front, with not many customer facing changes of note in iOS 16.

However, I think the lock screen widgets are a big improvement, and I’m pretty happy with how they’ve turned out.

Let’s see if we have any surprises later today when the new hardware is announced 🤞


Playing with Siri Intents

I’ve been enjoying playing with the new Siri Intents in iOS 12, and obviously didn’t need much of an excuse to get my Yeltzland app on yet another platform!

Shortcuts from NSUserActivity

It was pretty easy to get some basic integrations with Siri Shortcuts working. I was already using NSUserActivity on each view on the app to support Handoff between devices, so it was quite simple to extend that code for Siri shortcuts.

For example, on the fixture list page I can add the following:

    // activity is the current NSUserActivity object

    if #available(iOS 12.0, *) {
        activity.isEligibleForPrediction = true            
        activity.suggestedInvocationPhrase = "Fixture List"
        activity.persistentIdentifier = String(format: "%@.com.bravelocation.yeltzland.fixtures", Bundle.main.bundleIdentifier!)
    }

Making the activity eligible for Prediction means it can be used as a Siri Shortcut, and the suggested invocation phrase is the hint shown when you set up the shortcut in Settings - so you can then open the app directly on the Fixture List view from Siri.

Building a full custom Siri Intent

Probably the most useful app feature to expose via a full Siri Intent is the latest Halesowen score. By that I mean an intent that will speak the latest score, as well as showing a custom UI to nicely format the information.

There are plenty of good guides on how to build a custom Siri Intent out there, so I won’t add much detail on how I did this here.

However, one strange issue I couldn’t find a workaround for was that, when trying to put a number as a custom property in the Siri response, I couldn’t get the response to be spoken.

As you can see from the setup below, I got around this by passing the game score as a string rather than a number, but I wasted a long time trying to debug that issue. Still no idea why it doesn’t work as expected.

Custom response properties
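
In code terms that just means setting the score on the generated response object as a String - roughly like this (the response class and its properties are generated from the intent definition, so the names here are illustrative):

    // Inside the intent handler's handle(intent:completion:) method (sketch only)
    let response = LatestScoreIntentResponse(code: .success, userActivity: nil)
    response.opponent = "Stourbridge"
    // Passing the score as a String - a number property never got spoken for me
    response.latestScore = "2-1"
    completion(response)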

Building a custom UI to show alongside the spoken text was also pretty easy. I’m quite happy with the results - you can see it all working in the video below

To make the shortcut discoverable, I added an “Add to Siri” button on the Latest Score view itself. This is really easy to hook up to the intent - simply pass it in the click handler of the button, like this:

    if let shortcut = INShortcut(intent: intent) {
        let viewController = INUIAddVoiceShortcutViewController(shortcut: shortcut)
        viewController.modalPresentationStyle = .formSheet
        viewController.delegate = self
        self.present(viewController, animated: true, completion: nil)
    }
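
For completeness, the delegate side of that is tiny - both callbacks just need to dismiss the sheet, something like this (the view controller name here is hypothetical):

    import IntentsUI

    extension LatestScoreViewController: INUIAddVoiceShortcutViewControllerDelegate {
        func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                            didFinishWith voiceShortcut: INVoiceShortcut?,
                                            error: Error?) {
            // Dismiss the "Add to Siri" sheet whether or not the shortcut was added
            controller.dismiss(animated: true, completion: nil)
        }

        func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
            controller.dismiss(animated: true, completion: nil)
        }
    }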

I’m sure you’ll agree the view itself looks pretty classy 🙂

Latest Score view

Summary

It was a lot of fun hooking everything up to Siri, and I’m really pleased with how it all turned out.

Overall I think opening up Siri to 3rd party apps could be game-changing for the platform. Previously Siri was so far behind both Google and Amazon it was almost unusable except for the most basic of tasks. However, now that it can start working with the apps you use all the time, I can see it becoming a truly useful assistant.

Siri is still a way behind of course, but once custom parameterised queries are introduced - presumably in iOS 13 - and if the speech recognition can be improved, it is definitely going to be a contender in the voice assistant market.

I’m also looking forward to Google releasing their similar in-app Actions at some point soon.

Exciting times ahead!


Google Assistant vs Alexa Development

I’ve been pretty much all-in on the Amazon Echo as my home voice system, and still loving having multiple devices around our home we can call out commands to.

However I’m always looking to expand Yeltzland onto new platforms, so I’ve ported the Alexa skill I wrote about a while back on to the Actions on Google platform.

This is a summary of what I learnt doing that, and my view on the advantages and disadvantages of developing for each platform.

Actions on Google Advantages

Also available on phones

The main advantage of Google Assistant - one I hadn’t realised until I started this, even though it’s actually pretty obvious - is that it’s available on phones as well as the Google Home speaker.

On newer Android phones Google Assistant comes installed out of the box (and it can be added on other recent versions), and there is also a nice equivalent iOS app.

I’ve just bought a Google Home Mini to try out, and it’s definitely comparable to the Echo Dot it sits next to, but I’ve found myself using Google Assistant a lot more on my iPhone than expected.

Visual responses are nicer

Because the Google Assistant apps are so useful, there is a lot more emphasis on returning visual responses to questions alongside the spoken responses.

Amazon does have the Echo Show and the Echo Spot that can show visual card information, but my uneducated guess is they make up a small percentage of the Echo usage.

Google offers a much richer set of possible response types, which unsurprisingly look a lot like search result answers.

In particular, the Table card - currently in developer preview - offers the chance to provide a really rich response, which suits the football results my action returns very well.

Screenshot of Google Assistant answer

Nice development environment

Both the Actions on Google console (used for configuring and testing your action), and the Dialogflow browser app (used for configuring your action intents) are really nice to use.

Amazon has much improved their developer tools recently, but definitely a slight win to Google here for me. In particular, for simple actions/skills Dialogflow makes it easy to program responses without needing to write any code.

Using machine learning rather than fixed grammars to match questions to intents

Google states it’s using machine learning to build models that match questions to your stated intents, whereas Amazon expects you to be specific in stating the format of the expected phrasing.

Now from my limited testing - and since I’m basically implementing the same responses on both platforms - it’s hard to say how much better this approach is. However, assuming Google are doing a good job (and with their ML skills it’s fair to assume they are!), this is definitely a better approach.

Allowing prompts for missing slot values

Google has a really nice feature where you can specify a prompt for a required slot when it has matched an intent but not been able to parse the parameter value.

For example, one of my intents is a query like “How did we get on against Stourbridge?” where Stourbridge is an opposition team matched from a list of possible values.

Amazon won’t find an intent if it doesn’t make a full match, but on Google I can specify a prompt like “for what team?” if it makes a partial match but doesn’t recognise the team name given, and then continue on with the intent fulfilment.

Actions on Google Disadvantages

Couldn’t parse “Yeltzland” action name

A very specific case, and not a common word for sure, but Google speech input just couldn’t parse the word “Yeltzland” correctly. This was very surprising, as I’ve usually found Google’s voice input to be very good but it kept parsing it as something like “IELTS LAND” 😞

You also have to get specific permission for a single-word action name - not really sure why that is - so I’ve had to go with “Talk to Halesowen Town” rather than my preferred “Talk to Yeltzland” action invocation. It all works fine on Amazon.

SSML not as good

A couple of my intents return SSML rather than plain text, in an attempt to improve the phrasing of the responses (and add in some lame jokes!).

This definitely works a lot better on the Echo than on Google Assistant.

What about Siri?

All this emphasises how far Siri is behind the other voice assistants right now.

Siri is inconsistent on different devices, often has pretty awful results understanding queries, and is only extensible in a few limited domains.

I really hope they offer some big changes at next week’s WWDC 2018 - maybe some integration with Workflow as I hoped for last year - but I really don’t hold much hope any more that they can make significant improvements at any sort of speed. Let’s hope I’m wrong.

Conclusion

As you can tell I’m really impressed with Google’s offering here, and it definitely seems slightly ahead of Amazon in offering a good environment for developing voice assistant apps.

In particular, having good mobile apps offering the chance to return rich visual information alongside the voice response is really powerful.

My “Halesowen Town” action is currently in review with Google (as of May 30th, 2018), so all being well should be available for everyone shortly - look out for the announcement on Twitter!

P.S. If you are looking for advice or help in building out your own voice assistant actions/skills, don’t hesitate to get in touch at johnp@bravelocation.com


My first app for Apple TV

My quest to get my Yeltzland (Halesowen Town FC) app on every platform continues apace, and I’ve just released a version for Apple TV.

It was my first time developing for tvOS, so it was a fun project despite being of interest to a very limited audience.

Design challenges

tvOS and iOS share a lot of the same frameworks under the covers, so it was quite easy to reuse existing code to fetch the data.

However designing a nice UI for the TV was more of a problem (for me!), and it took quite a few iterations to get to an interface I’m happy with.

tvOS screenshot of the app

Some key learnings were:

  • Horizontal scrolling of collections/tables makes much more sense on the TV than vertical
  • It’s really important to clearly highlight the currently selected cell, to aid navigation (which often isn’t easy with the over-sensitive Apple TV remote control) - see the sketch below for one approach
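
On that second point, the tvOS focus engine gives you the hook to do the highlighting in the cell itself - roughly along these lines (a sketch, not the exact Yeltzland code):

    import UIKit

    // Sketch of making the focused cell obvious on tvOS
    class FixtureCell: UICollectionViewCell {
        override func didUpdateFocus(in context: UIFocusUpdateContext,
                                     with coordinator: UIFocusAnimationCoordinator) {
            super.didUpdateFocus(in: context, with: coordinator)

            coordinator.addCoordinatedAnimations({
                if self.isFocused {
                    // Scale the focused cell up slightly so navigation is easy to follow
                    self.transform = CGAffineTransform(scaleX: 1.1, y: 1.1)
                } else {
                    self.transform = .identity
                }
            }, completion: nil)
        }
    }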

Technical Notes

Developing the app in Xcode uses exactly the same techniques as for iOS apps, so it was really easy to get up and running.

What is really good is wireless debugging directly on your Apple TV. I can’t imagine how painful on-device testing would have been if I had to connect my laptop via a wire.

Summary

I’m really pleased with the results, and as you can see from the screenshot above it actually looks pretty decent!

Installing the app

You can check out the app using the App Store link below - although to be honest I don’t really know how you link directly to an Apple TV app.

I think if you have the iOS app installed it might show up directly on your Apple TV. Alternatively search for “Yeltzland” in the App Store directly on the Apple TV.

The Yeltzland code is all open source, and can be viewed on GitHub.


Building an Alexa Skill

I recently bought an Amazon Echo, which I absolutely love. It’s great in the kitchen as a hands-free music player, as the speaker is really good, and has replaced the slightly unreliable Siri completely for setting timers when cooking.

Obviously I wanted to develop my own Alexa skill, so I’ve built one to tell me about the latest Halesowen Town scores and fixtures information. Right now there’s almost certainly an audience of one for it, but it makes me very happy!

Here’s a video showing the exciting things it does …

Technically, the easiest way to write a skill is on AWS Lambda, Amazon’s on-demand compute engine. Alexa skills can run on Lambda with minimal setup, which looks a lot easier than running code on your own server.

The Alexa Skills SDK explains pretty well how to get set up - not always a given on Amazon’s platforms in my experience - so I won’t repeat anything here that isn’t covered in detail in the documentation.

In essence, to get your skill running you have to complete the following tasks:

  1. Add a list of intents you want your skill to answer
    • You can add custom content slots for known lists of names (think Enum), e.g. I have a list of teams in Halesowen’s league to match against
  2. Add sample phrases which will match to each of your intents
    • The content slots are used here, e.g. “FixtureIntent when is the game against {Team}”
  3. Point to where the skill code runs
    • Trivial to do if you are using AWS Lambda, but you can use your own https server with a bit more effort
  4. Upload your skill code in your AWS Lambda functions
    • The code can be written in Python, Java or Javascript/NodeJS - I chose the latter as I prefer the easy extensibility.

The example skill code for Lambda is pretty easy to understand and adapt as necessary. Note you can either update the code directly in the browser-based Lambda console, or upload a ZIP file containing your code. The latter is much preferable if you have any node modules you want to include in your solution.

My code simply loads existing JSON data from my server used in the iPhone/Android apps for the fixtures, results and latest score, parses it appropriately based on the user’s intent, and then returns a string which the Echo reads back to the user.

The voice recognition accuracy isn’t too bad - I suspect it struggles because of the slightly obscure names of some of the teams in the Northern Premier League!

All in all, it was pretty easy to knock together something based on existing data, and it’s really cool to be able to ask Alexa what the latest Halesowen score is. I look forward to being able to do this on Siri in about 3 years’ time, at Apple’s currently glacial pace of opening up their systems.

The code is all available on GitHub at https://github.com/bravelocation/yeltzland-alexa/ which hopefully makes things a little clearer if you want to dive in and take a look.


Getting Ready For iOS 10 Widgets

It’s summer, which means updating all the Brave Location apps for the new version of iOS.

Thankfully this year the UI changes weren’t too big, and the real work was down to how Today widgets have changed in iOS 10.

In iOS 9, the widgets are on a dark background, so it makes sense for the text to be generally white. Here’s how my stunningly well-designed widgets look in the Today section on my iPad running iOS 9 …

iOS9 Today widget

Now in iOS 10, the widgets are much more accessible - they can be reached directly by swiping right, even on the lock screen - but the design has also fundamentally changed. The background is now a light, semi-transparent colour by default, on which the white text of the existing widget design is obviously basically unreadable.

Now I didn’t want to have an iOS 10-only release, as all of my apps currently target iOS 9.0 and above and I want to keep it that way for a while.

So at runtime I detect whether we are on iOS 10.0 or above with the following code snippet:

    let ios10AndAbove: Bool = NSProcessInfo.processInfo().isOperatingSystemAtLeastVersion(
        NSOperatingSystemVersion(majorVersion: 10, minorVersion: 0, patchVersion: 0))

I can then set the background and text colours of the widgets to something appropriate for iOS 10 when necessary, but keep the “traditional” look and feel on what will soon be legacy versions.
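
In practice that’s just a conditional on the flag above - something like this, in the same Swift 2-era style as the snippet (the label name is illustrative):

    // Pick widget colours based on the iOS version detected above
    if ios10AndAbove {
        // iOS 10 widgets sit on a light, semi-transparent background
        self.titleLabel.textColor = UIColor.darkTextColor()
    } else {
        // Keep the white-on-dark look for the iOS 9 Today view
        self.titleLabel.textColor = UIColor.whiteColor()
    }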

Quite happy with the way it’s turned out, even though I say it myself.

iOS10 Today widget

Updates should all be in the App Store shortly (in case anyone else is running the iOS 10 beta), and for everyone else in September I assume!


Yeltzland Android app released

By “popular demand”, I’ve now made an Android version of my Yeltzland app.

Just like the iOS version, the app just makes it easy to read everything about the mighty Halesowen Town FC (Forum/Official site/Yeltz TV/Twitter/…) in one place rather than have to click around different places. There is no new content you can’t get elsewhere.

It’s my first Android app that made it as far as the Google Play Store, and it was interesting to port a relatively simple native iOS app over to Android - but I’ll leave my more technical observations to another post.

The first version of the app is available on the Google Play Store now.


Azure vs Firebase for iOS push notifications

I wanted to add push notifications for my Yeltzland app, an interesting task which took longer than expected to complete.

What I wanted to do was push notifications to the app during game time, copying tweets from the club’s official account as notifications. This means I can get goal updates etc. without having to turn on notifications from the account in the main Twitter app (which can get a little tiresome!)

The server component was pretty easy, as I already have a data feed of the team’s fixture list, so I knocked together a quick NodeJS script, run in a cron job, that uses that plus the Twitter feed to generate the notifications.

However I didn’t want to build a full web app to manage user registrations myself just to do this, so it made sense to use an “off the shelf” notification hub instead.

Firebase

Firebase seemed a good choice as a notification hub to try first, especially since at Google I/O they’d just done a major release to pull many disparate developer tools under the Firebase brand.

Firebase offers unlimited notifications, plus support for both Android and iOS, and it looked easy to set up.

In fact I did get it working pretty quickly, but during testing it seemed really unreliable. Notifications wouldn’t get delivered, or only arrived sporadically, and I spent a couple of days trying to figure out what was going wrong without much success.

Now it could be that the iOS support has issues (my guess is the service has been mainly Android only before), or more likely I was doing something wrong, but in the end I gave up very frustrated.

Azure

I thought I’d give Azure a go instead, and it was really easy to set up a working Notification Hub without much effort at all.

Out of the box, the notifications seem much more reliable, and I’m really happy with the service so far.
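
For reference, the client-side registration is little more than handing the APNs device token on to the hub - roughly like this (the Azure Messaging class and method names are from memory, and the connection string and hub name are placeholders):

    // In the AppDelegate, once APNs has handed us a device token (sketch only)
    // SBNotificationHub comes from the Azure Messaging library, via a bridging header
    func application(application: UIApplication,
                     didRegisterForRemoteNotificationsWithDeviceToken deviceToken: NSData) {
        let hub = SBNotificationHub(connectionString: "<listen connection string>",
                                    notificationHubPath: "<hub name>")
        hub.registerNativeWithDeviceToken(deviceToken, tags: nil) { error in
            if error != nil {
                print("Error registering with notification hub: \(error)")
            }
        }
    }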

I really wish I’d started with Azure rather than wasting two days wrestling with Firebase.

One minor thing I did have to do was set up separate hubs for “sandbox” and “production” modes, which wasn’t clear from the Azure portal interface.

Conclusion

Your mileage may vary, but I found Azure a much more reliable notification hub for iOS and the client-side code was much simpler too.

I’d definitely recommend checking it out for small projects, although for heavy usage there is a cost involved compared to the unlimited free usage Firebase offers.


I made a Yeltzland iPhone/iPad app

So I made a Yeltzland app for iPhones and iPads.

I really just made it for myself, to make it easy to read everything about the mighty Halesowen Town FC (Forum/Official site/Yeltz TV/Twitter/…) in one place rather than have to click around different places. There is no new content you can’t get elsewhere. I am lazy.

Anyway it turned out OK so I thought I’d stick it on the App Store so other people can use it if they want.

You can read a bit about it/install it on the App Store

Before you ask, it isn’t available for Android right now I’m afraid, and I have no immediate plans to write an Android version. I may change my mind of course.