Last night’s best-idea-ever: Duet Hookup App

This idea came from a conversation at last Friday’s Silicon Drinkabout in Manchester. It was generally agreed to be the best idea ever.

Picture the scene:

You’re in a bar after a long day in work. All you want is to sing and feel true joy and love. But alas, you are alone, and that would be weird.

You load up [catchy app name here] and using the Tinder-style selection method you start swiping through duets. Meatloaf -swipe left- West Side Story -swipe right- Sonny and Cher -swipe right-

A match! And it’s “Tonight” from West Side Story, your favourite! [catchy app name here] gives you a location to start walking towards.

You’re a couple of minutes away from your destination and you hear the music start. Lyrics appear on your phone’s screen and your heart swells.

Only you, you’re the only thing I’ll see forever,

In my eyes, in my words and in everything I do.

You begin to sing as you walk. Turning a corner your heart jumps. You hear, faintly in the distance, a voice…

And there’s nothing for me but Maria,

every sight that I see is Maria.

It’s your duet partner. You’ve been walking and singing towards each other.

The song builds to its final climactic chorus as you lock eyes across the road. You sing wildly, loudly, without a care in the world. You don’t notice the looks people are giving you. You are now singing a duet in public with a stranger. All is right in the world. The city is filled with song.

Key features:

  • Voice tracking to make sure you’re in key.
  • Ratings so you match with people as good at singing as you.
  • Watch others’ duets with Stalker Audience mode.

We’re looking for $100 million in seed funding.

And if anyone can remember who agreed to be CFO, that’d be great. I forgot to write it down.

Changing your document structure without downtime or risk

Often, our document structure (the schema of our data) is forgotten about. It’s a hassle to change, so we don’t bother updating it. This, as we all know by now, is bad. We should be treating our data just like every other part of our system, and refactoring it regularly. This is the only way we can keep future updates simple and painless.

When the document structure changes we need to do two things: change the existing data, and change the code which uses the data. If we try to use the system when we have done one but not the other, we have a bad time. This is why we sometimes use a maintenance mode. Or, if we’re feeling recklessly adventurous, we just do it on live and hope nobody clicks anything before we’re done.


But there’s a better way. And as with all great refactoring methods, it’s safe and less exciting. Here are the steps:

For this example let’s assume we want to move from flat fields to a nested structure; something like this (the field names are purely illustrative):

    bioText: "Loves beards",
    bioLocation: "Manchester"

To this:

    bio: {
        text: "Loves beards",
        location: "Manchester"
    }

Step 1: Do both

The first thing we change is the code which generates or edits the data. We update it to save both the old document structure and the new document structure. We will end up with data which looks like this:

    bio: {
        text: "Loves beards",
        location: "Manchester"
    },
    bioText: "Loves beards",
    bioLocation: "Manchester"
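In code, the Step 1 change can be sketched like this. It’s a minimal Node.js-style sketch, and every field name in it is hypothetical:

```javascript
// Step 1 sketch: whenever we create or edit a user, write the data in
// both the old (flat) structure and the new (nested) structure.
// All field names here are hypothetical, for illustration only.
function saveWithBothStructures(user) {
  const bio = user.bio || { text: user.bioText, location: user.bioLocation };
  return {
    ...user,
    bio,                       // new structure
    bioText: bio.text,         // old structure, kept so existing code still works
    bioLocation: bio.location,
  };
}

const saved = saveWithBothStructures({ bioText: "Loves beards", bioLocation: "Manchester" });
// saved contains bioText and bioLocation as before, plus the nested bio object
```

Because the function accepts either shape and always emits both, it doesn’t matter which version of the write code touched a document last.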

Step 2: Update everyone

Write an update script (ideally re-using the well-tested code from Step 1) which takes each document in your store and saves the data in the new structure, leaving the old structure in place.

At this point we know all the data will always be in both the new and old structures.
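The update script itself can stay tiny if it re-uses the Step 1 transform. Here’s a sketch (hypothetical field names again, with the Mongo loop shown only as a comment since it needs a live database):

```javascript
// Step 2 sketch: walk every document and re-save it with the new
// structure added, leaving the old structure in place.
// All field names are hypothetical, for illustration only.
function addNewStructure(doc) {
  if (doc.bio) return doc; // already migrated, so the script is safe to re-run
  return { ...doc, bio: { text: doc.bioText, location: doc.bioLocation } };
}

// With the Node.js MongoDB driver, the loop would look something like:
//
//   const users = db.collection("users");
//   for await (const doc of users.find({ bio: { $exists: false } })) {
//     await users.replaceOne({ _id: doc._id }, addNewStructure(doc));
//   }
```

Making the transform a no-op on already-migrated documents means the script is idempotent: if it dies halfway through, just run it again.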

Step 3: Use the new structure

We can now update the code which reads the data to read from the new document structure. What was a risky update before is now safe. Worst case: we can roll back, and all our data is still fine.

Step 4: Remove the old

Once we’re using the new structure, and we’re happy it’s all working, we can safely remove the old stuff: start with the code which updates the data in both locations, then update the data to remove the duplication.
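The data half of Step 4 can be sketched the same way as the earlier steps (field names hypothetical):

```javascript
// Step 4 sketch: once nothing reads the old fields any more, strip the
// duplicated data out. Field names are hypothetical, for illustration only.
function removeOldStructure(doc) {
  const { bioText, bioLocation, ...rest } = doc;
  return rest;
}

// In MongoDB, removing the duplication can be a single command:
//
//   db.users.updateMany({}, { $unset: { bioText: "", bioLocation: "" } });
```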

The Tech Which Launched Bristlr



A few days ago Bristlr launched its public beta. The website is a sort-of dating site, mostly social networking site, focussed around the love of beards. As the tag line reads: “Connecting those with beards to those who want to stroke beards.”

The goal is to show you people nearby who might want to meet up, with the ability to filter those results by whether or not they have a beard.

Here’s a quick overview of the products and services used to get it to its proof of concept launch:

The App

  • Node.js (Language) – I’m just in love with server-side JavaScript. Notable modules:
    • Express.js (Server)- the application framework of choice
    • Passport.js (Authentication) – Easy authentication that’ll scale well
  • MongoDB (Database) – Plays nice with Node.js and is fast to develop with
  • jQuery (JavaScript library) – Why re-invent the wheel?
  • Bootstrap (CSS framework) – Pretty, simple, and makes the site instantly work on any device
  • Mandrill (Email infrastructure) – Easy to use, with pretty powerful monitoring and analysis
  • Google’s Geocoding API (location services) – Turns user entered addresses into longitude and latitude values

Development Tools

  • Sublime Text 2 – Simply the best text editor I’ve ever used.
  • GitHub – Everyone’s favourite source code repository
  • CircleCI – Clean and simple continuous integration and deployment service
  • Mocha – Easy to use and versatile testing framework
  • Dollar Photo Club – High quality and cheap stock photography

Hosting and Services

  • Namecheap – Not-terrible domain management
  • Heroku – Easy, scalable, cloud hosting
  • Amazon S3 – Store every file ever
  • MongoLab – Fast, simple, cheap MongoDB hosting
  • – Resize and crop images, with bonus CDN

Monitoring

  • UptimeRobot – Emails me when the site goes down. So simple, and it works so well
  • Librato – Great monitoring of your Heroku traffic and nodes
  • Logentries – Somewhere to shove all the Heroku logs
  • Google Analytics – In-depth analysis of who is using the site, and how

The Men’s Roller Derby World Cup

The first ever Men’s Roller Derby World Cup happened, and I built and managed its website. The website requirements were simple:

  • Display news and information in the run-up to the tournament
  • Show multiple embedded video feeds and score updates during the tournament
  • Handle traffic spikes estimated at between 1,000 and 10,000 concurrent users

All code mentioned in this article is available on GitHub.

The Front Page

The tournament needed a home on the web.

WordPress is the obvious choice, and obvious is good. I’m a fan of MediaTemple’s GridService cloud hosting as it copes well with traffic spikes. In general terms, and remember this is on a cloud shared service designed for spiked usage, this gave us:

  • 2,000+ CPUs
  • 1TB of bandwidth
  • SSD backed mySQL

To be sure the website would hold up under load I added the WP Super Cache plugin, which causes the site to only serve static content where possible.

Basic stress testing showed the site able to handle around 90 requests per second for dynamic content, with an average response time under load of around 500ms. The predetermined worst-case benchmark of 17 requests per second was comfortably passed.

The main site was built with the knowledge that during the tournament we would be funnelling users away from it where possible, and towards a sub-domain acting as the hub for all coverage.

The Live Site

The majority of my time was spent working on the live sub-domain. This is the site that cannot have downtime during the tournament, and is where fans go to get the bulk of the coverage.

It’s got a couple of basic jobs to do: show the video feeds, and show the current and past scores.

The video feeds are simple embed codes so no trouble there. Getting the site to display the latest data was the hard part.

With reliability being the most important part of the site, I designed a basic architecture which would serve the live site as a static page, loading the dynamic “state” of the tournament (scores, current games, past games etc.) via a very simple Ajax call from an external source.

This makes the one page that must not go down as simple as possible, and gives us options on where to load the dynamic information. And diversity like this gives us reliability.
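As a sketch, the client side of this only needs a render step fed by that Ajax call. The endpoint and the state shape below are hypothetical (the real code is on GitHub):

```javascript
// Turn part of the tournament "state" object into the string the page
// displays. The state shape here is hypothetical, for illustration only.
function renderGame(game) {
  return game.home.name + " " + game.home.score +
    " v " + game.away.score + " " + game.away.name;
}

// On the live page this would be fed by a very simple Ajax call, e.g.:
//
//   fetch("/state.json")                    // hypothetical endpoint
//     .then((res) => res.json())
//     .then((state) => {
//       document.querySelector("#track1").textContent =
//         renderGame(state.track1.currentGame);
//     });
```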

For ease of scale I chose Heroku to host the live site. You can scale up and down your resources simply and instantly. Hosting this part of the site on Heroku also puts it in a different data centre to the MediaTemple site (MT is in the US, and the Heroku apps are in the EU).

As the live site is now only serving a very basic single page (using node.js and Express), its benchmark results were astronomical: it could easily handle 10,000 requests per second on just 10% of the available resources.

Real Time Data

One of the features of the site, which hadn’t ever been done before for a derby tournament of this scale, was to stream live data from the scoreboards to viewers’ computers.

The software used by the scoreboards could, assuming you had a rock-solid internet connection, post the data to the RDNation server, which grants API access to the data. In theory, this makes it very simple to hook into the API and fetch the data. In practice, getting information from the API was indeed straightforward; the real challenge was getting the scoreboard software to send its data over less-than-perfect Wi-Fi.

Once the data had reached the Roller Derby Nation API, my code took over and through the system outlined below, sent the information out to any device that requested it.

The rate at which the scoreboards on viewers’ machines updated could be adjusted remotely, allowing me to slow down or speed up the request rate when needed. For most of the weekend the refresh rate was set at around 8 seconds.
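One way to make the refresh rate remotely adjustable is to have the state payload carry its own polling interval, so the server can slow every client down at once. A sketch, with hypothetical field names:

```javascript
// Pick the next polling delay: trust the interval the server sent back,
// falling back to a default if it's missing. (Hypothetical field names.)
function nextDelayMs(state, fallbackMs) {
  const rate = state && state.refreshSeconds;
  return rate && rate > 0 ? rate * 1000 : fallbackMs;
}

// In the browser this drives a self-rescheduling poll, e.g.:
//
//   function poll() {
//     fetch("/state.json")                     // hypothetical endpoint
//       .then((res) => res.json())
//       .then((state) => {
//         updateScoreboard(state);             // hypothetical render step
//         setTimeout(poll, nextDelayMs(state, 8000));
//       })
//       .catch(() => setTimeout(poll, 8000));  // keep trying on errors
//   }
```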

This put the time from an official entering the score into the scoreboard to a user at home seeing it on their device at somewhere between around 2 seconds and 18 seconds, plus latency. Given our video feed had a lag of a few seconds, this was fine.

The state was saved in a Mongo database hosted by MongoLab. The MongoLab plan chosen uses lots of nice redundancy for reliability and, as it’s Mongo, allows several thousand concurrent database connections.


An early draft of the eventual architecture we would end up with. The diagram shows the relationship between each major component.

The following description of the architecture (loosely based around the idea of microservices) for the rest of the system is taken from the GitHub Readme I wrote when the code was made public.

You’ll see there are four different services which make up the system. This allows for easy scaling and a division of labour. Here’s a brief description of each module.

mrdwc-live

This is the static website you see when you visit the live site. It’s super simple and does nothing dynamic. All the data is fetched via a browser Ajax call to the next module.

mrdwc-query

The horizontal scaling of the site all comes from here. It has a simple job: return a JSON representation of the current tournament state, to be displayed on the site. This includes current game stats, as well as brackets, tables, and even the text to display on the alternate language feed options. The module gets the state from a Mongo database (hosted on MongoLab).

mrdwc-command

Easily the most complex component, this is the admin interface used throughout the tournament via its own URL. If the front-end of the site dies for any reason, the state can still be calculated and served via a back-up process. This module also allows manual entry of game scores if the scoreboard software in the building has issues.

This module loads all the components which make up the tournament state, builds the complete state, and updates the Mongo database.


This simple little module pokes the mrdwc-command module every few seconds and prompts it to rebuild the status.

Distribution and Protection

In the past, roller derby tournaments have had problems when it comes to the reliability of their websites & live video. This put pressure on me to deliver a service that would stay up, no matter what, without hiccups. Because of this I built it all with two questions in mind:

  1. What’s the most reliable solution?
  2. What if that breaks?

The basic principle I tried to apply at all levels of the site was to break the problem down into assorted smaller parts, each with its own Plan B (and C, and sometimes D). For example (this list is far from exhaustive):

  • If the main site goes down … the live site is on a different server so will stay up.
  • If Heroku (hosting the live site) goes down … we can switch to a backup on the MediaTemple server.
  • If there’s an unforeseen bottle-neck and mrdwc-query goes down … we can switch to a manual backup for the state delivery.
  • If the scoreboard in the building breaks … we have an override for the website and backup software in the building.

During the weekend, the only backup system we needed to use was the one handling the Wi-Fi breaking for the laptops running the scoreboards. When this happened, the score data couldn’t automatically reach the outside world. I had built manual score-entry forms into the control module for the website, so when the Wi-Fi went down, I manually updated the scores.

For The End User

The main page for the tournament worked well, looked clean (thanks Bootstrap), and did exactly what we needed it to do. There was a tab for each track which the viewer could easily switch between.

The front page

As well as the two track tabs, there’s a third for scores. This lets fans follow the tournament really simply. The data is duplicated manually on the main site, so if either site goes down, the results are still accessible.

There is some basic, nicely abstracted code which reliably fetches the state and populates a page based on jQuery selectors. So when I came to build some auto-populating overlays for the video feed, the logic was already there and reusable. I created two different overlays; one for the tables and one for the knockout stage. The latter was so popular it was released to the public, and widely used.
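The abstraction is roughly: a map of jQuery selectors to paths in the state object, plus one function that walks the map. A hypothetical sketch (the real code is in the GitHub repo):

```javascript
// Given the tournament state and a bindings map of selector -> path,
// work out what text each element should show. (Hypothetical shapes.)
function resolve(state, path) {
  return path.split(".").reduce((obj, key) => (obj ? obj[key] : undefined), state);
}

function populate(state, bindings) {
  const updates = {};
  for (const [selector, path] of Object.entries(bindings)) {
    updates[selector] = resolve(state, path);
  }
  return updates;
}

// With jQuery, applying the updates is then one line per element:
//
//   Object.entries(populate(state, bindings))
//     .forEach(([sel, text]) => $(sel).text(text));

const updates = populate(
  { track1: { score: "120 - 56" } },
  { "#track1-score": "track1.score" }
);
```

Because the page logic is just a data structure, building a new overlay is only a matter of writing a new bindings map and some CSS.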

The knockout overview, auto-populated
The round-robin results page

All of these pages combined to provide a fantastic user experience.

Photo taken by Pitchit showing his set up for watching.

Whilst at the tournament I also saw a lot of people sat watching one game, with the video feed or scoreboard for the other track open on their phone or tablet.

Vanity Metrics

  • Number of concurrent devices viewing the final: 3,400
  • Live Scoreboard updates: 24 million
  • Website uptime: 100.00%
Hourly stats gathered over the weekend

Costs

I covered the full cost of the website. This was the first tournament of its type, and it was being privately funded with some uncertainty about whether it’d break even. Covering the costs myself also gave me more control over what I purchased and in what quantity; the emphasis moved from getting it cheap to getting it done.

Excluding my time, which would arguably be the most significant cost to the project, the rough breakdown of pricing is as follows:

  • MediaTemple hosting: $40 ($20 GridService, $20 mySQL Grid Container)
  • MongoLab: $40ish
  • Heroku: $76 (mrdwc-live $26, mrdwc-query $50)
  • $5

Total cost: $161

These services were not optimised for price, and in many cases I intentionally provisioned significantly more than was required, to guarantee reliability in the face of uncertainty. As such, the total price is roughly double or triple what I think we could have got away with, and that’s knowledge I’ll bring forward to future projects. If the site wasn’t being privately funded by myself, I would almost certainly have been more conservative.

The Internet

During the tournament I took the lead in ensuring our data connection out of the building was as solid as possible. This isn’t detailed much in this post, but as I know there was some interest, here’s a quick run-down of what we had:

  • 3x satellite connections (roughly 4Mbps each) for video
  • 4G connection (20Mbps) for general use
  • 3G connection for media Wi-Fi
  • 150m Cat5 cable* from the satellite truck to our switch
  • One Wi-Fi access point
  • 2x 5GHz Wi-Fi bridges to connect each scoreboard

*There was a very high error rate on the cable, but upping the link speed from 100Mbps to 1Gbps allowed just enough bandwidth for us to cope.

Things I’d Like For Next Time

Even more automatic backups/redundancy/recovery

The redundancy and backup services we had worked really well over the weekend, handling everything that was thrown at them. However, there’s still scope to improve the way switching from plan A to plan B happens. Whilst there was plenty of outside-in monitoring of the system (to ensure viewers had a perfect experience), the internal and third-party systems didn’t have much exposure to check their health.

When the Wi-Fi to the scoreboards broke, for example, I would only see this manifest in the scores on the live site not updating for a few minutes. A way to monitor these third-party dependencies would be great.

Release the overlays to the public earlier

The overlays produced just for TV would have been fantastic as stand-alone pages to release to the public. The distribution and popularity of the auto-updating knockout page shows this. Even if they’re not perfect (e.g. not mobile friendly) pages like these are still fine to release to the public.

It was also apparent that the information was far easier to get at when at home on the website than when in the building. Having screens up around the building with the website on would have been very helpful.

Choice in scoreboard software

Much of my energy at the weekend went into helping officials use the scoreboard software. The software was chosen because it could send data out of the building. Given the issues we had with it (there were real bugs, as well as problems sending the data out when the connection isn’t flawless), I’d like to devote some time to building ways for other scoreboard software to send data to remote third parties.

A developer’s toolbox (February 2014)

It seems a lot has changed since I wrote my last toolbox post. I’m now happily employed at Allegro Networks, enthusiastically embracing and using TDD (Test Driven Development/Design), and my toolbox has been slimmed and almost entirely replaced.

Operating System

  • Ubuntu – used inside VirtualBox so if/when I break something important, I can very quickly clone a backup base machine image and get moving again.

Development Tools

I’ve slimmed down my dev tools a lot. It’s been a pretty cathartic experience, and now forces me to be closer to the code I write, without relying on heavy tools to do my work for me.

  • Sublime Text – an amazing text editor that’s very extensible. I tend to only use a code-formatting plugin, and keep everything else the same.
  • FileZilla – Fantastic FTP client, though I don’t use it all that often because…
  • Git & GitHub – My entire dev process now seems to flow through Git and/or GitHub. Now I’ve got the hang of it, and I’m not trying to over-complicate things, Git is overtaking FTP as my deployment method of choice.
  • CircleCI – Used only at work, as the price is a little too steep for pet projects (for those I’m looking into Travis CI). The continuous-integration-as-a-service model is fantastic: simple, easy and effective.


Being interested in what I’m doing – When I can keep my work engaging (which I’m lucky enough to be able to do at Allegro), I can concentrate on it far more effectively.


  • Google Docs – I don’t tend to need much office-related work doing, but my budgeting and text documents are now pretty much exclusively handled by Google Docs.
  • Chromium for Ubuntu – it’s better than Firefox.
  • Heroku – now I actually understand what Heroku is and how to use it, it’s amazing.

A freelancer’s toolbox (July 2013)

I seem to have an ever-changing collection of preferred software and tools, so I thought I’d document what I use. This list is for anyone who may be looking for inspiration, and for future-me to see how my development process has evolved. So, here’s a rough list of every piece of software I currently use as a developer:

Web Browser

  • Chrome – My default web browser
  • IE, Firefox, Safari, Opera – Installed for testing

Development Tools

  • PHPStorm – It’s fantastic! I uninstalled Eclipse last week after it kept crashing, and was recommended PHPStorm. I’ve been very impressed.
  • FileZilla – Fantastic FTP client
  • PuTTY – Everyone’s favourite SSH program
  • GIMP – Free image editor which can do everything I need
  • GitHub – Git version control; I tend to only use this for personal projects as clients have their own version control methods (or not).

Productivity

  • Chromodoro – Chrome plugin to time 25 minutes then 5, to help focus my mind then give me set breaks
  • Coffitivity – Plays sounds of a coffee shop. Ambient noise is really helpful when trying to concentrate.
  • Grooveshark – There are new “broadcasts” which are kind of like hand-picked radio stations, and there’s one which plays electronic and alternative music specifically to complement coding.
  • RescueTime – Helpful for spotting trends, though I only really bother to check the weekly digests. I’m only on the free version, so there’s not a huge amount of control over what it monitors.
  • Paymo – Time tracking software which has everything I need. It also lets you share reports and logs with clients, which really is the only reason I use it.

Office

  • Google Docs – my internet connection isn’t great, which has been causing trouble, but all in all Google Docs is still my go-to office suite.
  • Open Office – If I need to do something more complex. OO also has a very helpful save-to-PDF option.

I’m a freelance web developer, hire me!

A photo of me, dressed in my casual clothes.

If you’re reading this then you’re hopefully on the lookout for a web developer to build you a website. May I be the first to offer you congratulations as you’ve found what you’re after.

I’ve been a freelance web developer since 2008, when I graduated with a Masters in Advanced Computer Science. I’ve worked on teeny-tiny projects, as well as large international sites. If you’d like to know more about my history, my CV is here and my portfolio is here.

If you have a website and need it updating, or would like a new website building, then the easiest place to start is to just tell me what you need. Be as specific as you can be, but don’t worry if you don’t have all the details yet.

[contact-form-7 id=”49″ title=”What would you like?”]

I work mostly with the following technologies:

  • PHP (4 & 5)
  • mySQL
  • JavaScript
  • CSS
  • HTML5
  • WordPress (themes & plugins)
  • jQuery (Mobile & UI)
  • PhoneGap based Android and iOS apps
  • MVC frameworks
  • SEO
  • eCommerce solutions
  • Web Security
  • Responsive web design
  • Performance & optimisation

One of the first things a lot of people want to know is how much it will cost, so I’ll just be open and honest with you:

  • For small projects my rate is £30/$45 per hour
  • For larger projects my rate is £200/$300 per day or fixed cost

These rates are negotiable and depend a lot on your budget, what skills of mine you require, how busy I currently am, and how straight-forward (or not) the work may be.

Over-engineering and horsing around


There’s been a scandal. A scandal! Horse meat has made its way into a number of typically not-horse-meat-based foods. This inspired me to spend a couple of spare hours on a website I named Percentage Horse.

The idea is simple:

Given the name of a thing, what percentage of it is horse?

Or, in terms of the technical task:

Given a word or phrase, generate a number between 0 and 100 which relates to its connection to horse meat.

Where’s a guy to start? I first tried searching for a way to automatically scan a combination of online supermarkets and news sources to research the ingredients lists for food items. I realised pretty early on that whilst this would work for all the obvious things, the supermarkets don’t have APIs and the scope of what you can give the site to search for is limited.

The solution I went with consists of three parts (and lots of caching, which I won’t go into in detail beyond: cache everything):

  1. Ask Wikipedia about the search word or phrase
  2. Search for key words in the resulting text
  3. Add a small random number

As I only had a few hours to make the site, I cut to the chase and made a crude curl-based function which returned the Wikipedia entry for a given piece of text. Wikipedia is nice because they have a public API.

function return_page($url) {
	// create a curl resource
	$ch = curl_init();

	// identify ourselves to the API
	curl_setopt($ch, CURLOPT_USERAGENT, 'User-Agent: YOUR-DETAILS-HERE');

	// set the url
	curl_setopt($ch, CURLOPT_URL, $url);

	// return the transfer as a string
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);

	// $output contains the output string
	$output = curl_exec($ch);

	// close the curl resource to free up system resources
	curl_close($ch);

	return $output;
}

$string = "" . urlencode($search_value) . "&prop=revisions&rvprop=content";
$search_page_string = return_page($string);

The next step is to process the lump of text we get back.

Quick note here on redirects: If you search for some words, they redirect to others (“sausages” redirects to “sausage”, for example). I made an ugly-but-it-works detector which searches for redirects and then follows the URL accordingly if it finds one:

function get_search_results_word($search_value, $allow_redirect = true) {
	$string = "" . urlencode($search_value) . "&prop=revisions&rvprop=content";
	$search_page_string = return_page($string);

	// redirect pages are tiny, and contain the word "redirect"
	// followed by the [[target]] of the redirect
	if ($allow_redirect && stristr($search_page_string, "redirect") && (strlen($search_page_string) < 1000)) {
		$search_page_string = explode("[[", $search_page_string);
		$search_page_string = $search_page_string[1];
		$search_page_string = explode("]]", $search_page_string);
		$search_page_string = $search_page_string[0];

		// follow the redirect once, but don't allow chains
		if ($search_page_string) {
			$search_page_string = get_search_results_word($search_page_string, false);
		}
	}

	return $search_page_string;
}

I could have used a sexy regular expression, but who’s got time for that?

Right now I can search for a word and get a whole tonne of related words back (in the form of the text on the Wikipedia page). If we break the user-submitted phrase into its constituent words, then merge all our results into one big text string, we’re almost there.

We don’t really care about sentences or structure, so I pushed the large text string through a word-frequency analyser, which gives me an array of words and their popularity in the results.

Some words are more important than others so I made a small array of key words, like “meat” and “smart” (any food that’s “smart price” is probably pretty low quality), and another array of words to ignore like “at” and “the”. Assuming we don’t care about words that only appear once, we can step through the words and calculate $is_horse and $not_horse values. With a little bit of weighting we can subsequently divide one value by the other and get our percentage.

foreach ($wiki_text_array as $wiki_text_word => $frequency) {
	if (($frequency > 2) && !in_array($wiki_text_word, $ignore_words)) {
		if (in_array($wiki_text_word, $key_words)) {
			$is_horse += ($frequency * 1);
		} else {
			$not_horse += ($frequency * 0.05);
		}
	}
}

// calculate the score, and prevent it being more than 100
$score = round(($is_horse / ($is_horse + $not_horse)) * 160, 2);

if ($score > 100) {
	$score = 100;
}

The random number added on the end is purely theatrical. The whole website is a joke, and as long as the percentage is within 25% of what we want, the joke works. Adding a random number prevents similar phrases having identical percentages, which is a surprisingly frequent occurrence and makes the website look broken.

And that’s it! Season lightly with some basic CSS and jQuery, and you have yourself a novelty website in a couple of hours.

You can see the website for yourself at

WFTDA App Proposal: A PhoneGap based iOS and Android App

The Women’s Flat Track Roller Derby Association were looking for somebody to build them a new mobile app for their rules. I submitted a proposal, but in the end they chose to go with somebody else, as they’d prefer to work with a larger company with more resources. Given what they’re looking for, this decision makes a lot of sense to me, and whilst I’m a little disappointed, it’s understandable.

It’s important to go for projects like this, as a freelancer. Partly because there’s always a chance you can get it, and hey, more work. And partly because you learn a lot. The research I did for this little project led me to brainstorm with the very talented Gemma Cameron, which has inspired me to actually go to all the fantastic developer events happening around Manchester.

It also led me to closely examine some cool technologies (like PhoneGap) and learn more about some interesting development processes (like Test Driven Development), which fed directly into other projects I’m working on (like the Minimum Skills app).

For anyone curious, I’ve made my proposal public now (you can get the pdf here).

If you have any feedback, I’d be interested to hear it.

Building The Official WFTDA Rules App: Android & Beyond

This is a proposal by John Kershaw, in response to the WFTDA’s “Rules App For Android Devices Request For Proposal” issued on February 14, 2013.


I’m John Kershaw (skate name Sausage Roller), a freelance web developer for 5 years working primarily on custom solutions to interesting problems. Since graduating in 2008 with a degree in Advanced Computer Science I’ve been building complex websites and consulting for a number of both for-profit and nonprofit organisations. I’m based in Manchester, UK, and have clients in the UK, USA and Australia.

In my spare time I develop the Roller Derby Test O’Matic website, a tool used globally by thousands of skaters each month to learn the rules of flat track derby. I am the Training Manager for Manchester Roller Derby, one of the UK’s largest roller derby leagues. I also skate for the league’s men’s A-team.

For this project I would be working closely with Gemma Cameron (skate name Ruby on Rails). She’s a senior software developer with seven years of commercial development experience, and 20 years of skating experience including three years of derby at two roller derby leagues. She is also prolific in the local software community, hosting many events each year. During Leeds Hack 2011 (an event where participants aim to complete a project from start to finish in 24 hours) Gemma built an (unpublished) native Android app to track roller derby circuit training, with her group also creating a native iOS version.

I am currently developing the Roller Derby Test O’Matic mobile application. When completed in the next few weeks it will be structurally and technologically very similar to the proposed WFTDA Rules app detailed on the following pages.

In my freelance career I typically have one large project on the go and several smaller projects. Small projects range from upkeep on past work, consultancy, or simple client requests which take a few hours to sort. Currently I have one part-time client with the rest of my energy going into the Roller Derby Test O’Matic app. With the acceptance of this proposal I would decrease the workload I have with my existing main client (I have a second programmer on the project who can take over), and the app I am building is due to be completed by the start-date of this project. This allows me to prioritise the WFTDA app and work on it full-time.

The Solution


The application will be developed as an HTML5 app using the platform-agnostic PhoneGap framework. Native apps (ones built specifically for iOS or Android) have greater power and are likely to be more responsive than their HTML5 counterparts; however, they are limited to a single platform and are much more expensive to develop. Using a framework like PhoneGap allows a single core application to be developed, then ported and sold on iOS, Android, BlackBerry and Windows Phone devices without significant effort. For the purpose of this proposal it is assumed the project’s initial focus is on Android devices.

HTML5 brings with it a clear advantage when it comes to responsive design: using a combination of HTML, CSS and jQuery Mobile (an HTML5-based user interface system, optimised for mobile devices), the app will display correctly on a device of any size. Flexibility of this kind is essential when tackling the huge range of Android devices on the market.

The design of the new app will be very close to the current iOS app, allowing a smooth continuation from the style of the current app while reflecting and complementing the WFTDA’s website branding and style. The new design will differ from the existing design in the implementation of device-specific design preferences; for example, Android’s standard button size. Here, the existing WFTDA design is still used, but is subtly modified to give more “expected” behaviour and style from the perspective of the end user.

It’s important to note that whilst the new app will look and feel similar to the existing app, it will be fundamentally different under the hood, enabling the WFTDA to manage the data via a remote Content Management System. All this whilst still providing content when the user is offline, and as an app that can be ported to multiple devices.

Structure and Navigation

The app will follow a simple, hierarchical structure that’s easy to follow, and easy to expand. A basic home page listing the main sections will greet the user. From here they can select any number of key features, split across three main sections and a menu bar: Rules Documents for all rules, track images and publications (both Clarifications and Q&A); Officiating for official-related reference material; Training to group together everything skaters need for improvement (including tick-list minimum skills and mock rules tests). Along the bottom of every page is a common bar to allow users to easily access their bookmarks, search, profile and settings.
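As a rough illustration of this hierarchy (the section and page names below are placeholders taken from the description above, not a final sitemap), the content tree could be modelled as a simple nested structure with a lookup for navigation:

```javascript
// Minimal sketch of the app's hierarchical content tree.
// Section and page names are illustrative placeholders only.
const contentTree = {
  "Rules Documents": ["Rules", "Track Images", "Clarifications", "Q&A"],
  "Officiating": ["Hand Signals", "Standard Practices"],
  "Training": ["Minimum Skills", "Mock Rules Tests"]
};

// Resolve which top-level section a page belongs to, so navigation
// (and features like bookmarks) can work from a page name alone.
function sectionFor(page) {
  for (const [section, pages] of Object.entries(contentTree)) {
    if (pages.includes(page)) return section;
  }
  return null;
}
```

A structure like this keeps the hierarchy easy to expand: adding a page is a one-line change, and navigation code doesn’t need to know about it in advance.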

During the initial discussion at the start of the project we will get the details of the project sorted: we will assess the important features and plan accordingly. I have ideas on key features, as I’m sure the WFTDA does.

The Profile page will list a user’s accomplishments in the app so far; rules tests taken, minimum skills % complete, etc.

The Settings page allows users to customise important features, such as when and how the app updates its data.

A basic example showing the standard layout and navigation follows (the style is not representative of the final product; this is just a mock-up).


Back and forward buttons as found in the current iOS app will be included, to allow horizontal navigation of the content tree.

Every page can be bookmarked using the handy bookmark button at the bottom of the page, or by holding the left-hand side of the page.

Offline & Updating

It is essential that the application works with any kind of data connection, from persistent always-on connections to non-existent ones. As such the app will rely on locally stored data, and update this data as and when possible from WFTDA’s server.

The app will come with a copy of all the data it needs, and will simply update its local copy when possible. If no data connection is available, the local copy will continue to be used. If, however, there is a data connection available, updates for the data will be fetched automatically.
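The offline-first rule above can be sketched as a small decision function (the function and field names here are assumptions for illustration, not the real implementation):

```javascript
// Minimal sketch of the offline-first update rule: always keep a usable
// local copy, and replace it only when a newer remote copy can actually
// be fetched. Names and the version field are illustrative assumptions.
function resolveContent(localCopy, remoteCopy) {
  // No data connection (or the fetch failed): keep the bundled/local data.
  if (remoteCopy === null) return localCopy;
  // Connection available: adopt the remote copy only if it is newer.
  return remoteCopy.version > localCopy.version ? remoteCopy : localCopy;
}
```

The point of the sketch is that the app never blocks on the network: the local copy is always valid, and a failed update simply leaves it in place.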

The data will be sent to the user’s device from the WFTDA server via an XML API, powered by a Content Management System. The details of the CMS are given in the next section.

A diagram made using Balsamiq showing, in basic terms, how the app would auto-update.

Some data (such as new software features, or videos) may be impractical to update using this method and would require a software update.

Whilst users should be encouraged to keep their data up to date, in some cases they won’t be able to, so it is important the app continues to be as useful as possible under all circumstances.

Content Management System

The power of the automatic updates will come from the API. This will be generated by custom PHP code running on the server and will provide the app with all of the information it needs. A basic CMS will likely need to be built to give you full control over what is updated and what isn’t.

Having had no view into how the current website and data are stored, I can only make assumptions about how easy or hard it will be to tap into the current data and feed it to the app. The worst-case situation is that we end up with a stand-alone CMS built for the app.

Basic training will be provided, and the CMS will be well tested to make it as easy to use as possible, whilst also being secure and flexible.

Additional Feature Recommendations

Premium Content

The use of in-app purchases enables any number of features, items or data to be made available to users. What this content is will depend on what you want and your business model. One option is to have a free app that provides the rules and has the other features visible but unavailable, then charge users for access to each specific feature, or all of them at a discount.

An Interactive Quiz

I’ve been building the Roller Derby Test O’Matic for some time now, and have access to and fully own the contents of its database. It’s been a learning curve, and an occasionally challenging experience, and I will bring the knowledge I have gained with me into this project. I have large quantities of user data which I’ve used to learn how to make the quiz site increasingly effective.
It currently has a basic API, which we would use to give users balanced, automatically generated tests. Many leagues already use the Test O’Matic as part of their training, so this would be a logical inclusion.

Given that I am also developing a very similar app to the one in this proposal, specifically for the Test O’Matic, the potential for useful interaction between the two apps is high.
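As a hedged sketch only (the Test O’Matic API’s actual behaviour isn’t documented in this proposal), “balanced and automatically generated” could mean something like drawing an equal number of questions from each section of the rules:

```javascript
// Illustrative sketch of a balanced test generator: take the same
// number of questions from each section of a question pool.
// The pool structure and function name are assumptions, not the real API.
function generateTest(pool, perSection) {
  const test = [];
  for (const questions of Object.values(pool)) {
    test.push(...questions.slice(0, perSection));
  }
  return test;
}
```

In practice the selection would be randomised and weighted by user data, but the balancing idea is the same: every section is guaranteed representation.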

A Shared Data Source

This should be a high priority for this project. However, without knowing more about the current system, it’s hard for me to advise on the best course of action. Assuming it has been produced in a logical manner and with common technologies, I am confident it will be possible to centralise the data.

Multi-lingual Support

With little overhead, any app can be built to allow for multiple languages. My recommendation is to build in this functionality now, but add alternate supported languages after the main app has been completed.
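Building in the functionality early mostly means routing every user-facing string through a lookup table from day one. A minimal sketch (the keys, languages and translations here are placeholders):

```javascript
// Minimal sketch of string lookup with an English fallback.
// Translation keys and languages are illustrative placeholders.
const strings = {
  en: { bookmarks: "Bookmarks", search: "Search" },
  fr: { bookmarks: "Signets" } // partial translation: "search" not yet done
};

function t(lang, key) {
  // Fall back to English when a key has no translation yet, so adding
  // a new language later can be done incrementally without breaking the UI.
  return (strings[lang] && strings[lang][key]) || strings.en[key];
}
```

Because untranslated keys fall back gracefully, new languages can ship partially complete after launch, which is exactly the deferral recommended above.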

Development Process

Agile Code

The key to fast software development is rapid iteration: delivering minimum viable features to the app, reacting to their successes and failures, then advancing. This lean and agile development process, favoured by successful technology startups and companies such as Facebook, is what I will be using rather than the more traditional step-by-step process of design, implementation, then testing. Experience and history show the step-by-step approach leaves little room for reacting to feedback, users and other factors.

Almost immediately a working app will be made with a basic set of features, and it will be steadily improved and built upon with additional functionality as the weeks progress. Test-driven development and the skills of two successful and experienced developers will ensure the quality of the application is high; features will be released, but not bugs. Continuous delivery of features (new updates every few days, rather than every few weeks) will allow us to gain feedback from testing of the application right from the very start of the project. This will allow the look, feel and functionality of the project to develop and react with ease.

Client Responsibilities

At the start of the project I will need to work with the WFTDA to get as accurate a description of the desired product as possible. I will need either detailed information on, or access to, any currently existing system that’s likely to be involved in this app (for example, I’d need to know how the current rules are stored, and how I can interface the app with them).

As the project progresses I would expect regular check-ins. If there are members of the WFTDA who would like to join in with the testing process, that is very much encouraged.
I would need a copy of all the data you wish to be used by the app within a few weeks of starting the project.

Project Schedule

The following table shows the estimated order of events, along with which week of development it falls within. The project as a whole is divided into four phases to make things simpler. Phase 1 gets the core functionality in place, Phase 2 adds the server-side features, Phase 3 adds in the remaining app features and Phase 4 makes sure everything is working.

The time each feature takes to build is deliberately overestimated, on the assumption that there will be unforeseen problems. The agile approach will allow the schedule to become more detailed and accurate as the project develops.

Phase 1

Week 1

Specification document created

Establishing development environments

Communication patterns set

Testing plan

Week 2

First usable app generated

Rules pages built – static content used

Week 3

Rules completed

Q&A and Standard Practices started

Week 4

Basic CMS created for content to be generated from

Updating via the website, available offline

Phase 2

Week 5

Complex content begins to be added – zoomable Track layout added.

Hand signals & flash cards begin being added.

Week 6

Minimum skills & video content added

Settings & Profile pages completed

Week 7

Bookmarks feature added

The CMS is completed

Phase 3

Week 8

Search feature added

User Guide written & testing begins with WFTDA volunteers

Week 9

Notifications feature added

Week 10

Interactive Quiz added

Phase 4

Week 11 & 12

Detailed feedback with WFTDA

Training of volunteers completes

Final debugging

App goes live!

Training And Ongoing Support

A user guide will be provided to be used as reference material.

As the system is relatively simple to use, I will sit with a trainer (using something like Skype) and walk them through everything, taking approximately an hour or two. I would expect to do this with a number of people, and to make myself available at any time for questions. Details of what I am explaining, to be referenced later, would be given in the user guide.

Customer support would be provided on terms we would work out. Initially the support would be free; following this, either a retainer would be negotiated or a simple hourly rate would be established.

Time spent fixing bugs would not be billed.

Future additional features would be charged at an hourly or daily rate, depending on their size. Large features (taking more than a week to complete) would likely come with a negotiation to establish a more detailed description of the work being done and a price-per-feature option.


The total cost of the project will be $12,000.

This is divided by phase (see Project Schedule) with all phases other than Phase 1 being a payment on completion to the WFTDA’s satisfaction. Phase 1 would need to be paid in advance to allow development to begin immediately.

Prices are based on the estimated number of hours each phase will take, as well as any specialist software or hardware the phase may require. The breakdown of pricing is as follows:

Phase 1: $5,000

Phase 2: $3,000

Phase 3: $2,000

Phase 4: $2,000

A more detailed breakdown of the pricing can be provided once more detail on the specifications of the project is known.

Future work will be billed at $38/h, or $245/d, or at a fixed rate (which would work out at a decreased hourly rate) for larger updates. This is negotiable.


The WFTDA would maintain full ownership of any software produced. I would ask to be able to refer to and use screenshots of the application on my own website and in future proposals.

The Roller Derby Test O’Matic and its accompanying code & data would remain my property; however, a free-to-use licensing agreement with attribution would be expected.

My Micro Start-Up Adventure: Idea to Profit in 7 Hours

[A few years ago I had a stupid idea. It turned a profit 7 hours later, and had slipped into obscurity by the following day. At the time I made notes on how it happened, which are republished here.]

Around seven hours ago I came up with an idea: an online club where you pay to be added to the members list, and that’s it. Half an hour ago that idea turned a profit. Mostly that’s down to luck.

5:00 pm-ish (all times are GMT)

I’ve been toying with the idea of an exclusive (very pretentious) online collective for a while. I was talking in the hallway about it with housemates this afternoon and made a joke that turned into this evening’s start-up.

What if you had a club you had to pay a dollar to be a member of, but all membership gave you was your name on an official and public list of members?

5:44 pm

I’m a guy who likes to see immediate results, so within a few minutes I had the URL nailed down and pointing at my existing server (useful tool: Instant Domain Search). The site now existed.

5:49 pm

I’m also a guy who wants people’s opinions, so I quickly started telling Sam about my idea. He thought it was amazing (though probably more in a humorous way than a business way) and the first payment was agreed.

6:30 pm

40 minutes later the site was in a working state. Sam formally applied for membership to The Club, and payment was received.

6:35 pm

After celebrating my first income I triumphantly began my PR campaign, which was this tweet.

7:48 pm

The Club’s membership reached double digits. Hurrah! I took this opportunity to buy a URL version of the question that greets you on loading the site [which has since gone offline]. That doubled the total cost of the project.

9:18 pm

With orders steadily coming in, mostly from my Twitter followers, I decided to make a video advert (phase two of my cunning PR drive).

11:56 pm

In less than seven hours I had the 23 members needed (each, apart from me, paying $1) for the site to cover the costs of the URLs. It turned a (tiny) profit.

Domain Registration: $15.34
Membership dues: $23.00
PayPal Fees: $7.59

Total Profit: $0.07

[The site went on to be popular for a couple of days, and I actually got a few more members.]