We are republishing this post by Adrian Short / Mash the State under a Creative Commons Attribution 2.0 UK licence

Why I’m throwing down the gauntlet to our councils over RSS feeds
14 April 2009
You’re free to republish this article under the Creative Commons Attribution 2.0 UK licence with credit and a link to Adrian Short / Mash the State

Today I connected 66 councils to their citizens by making it easy to subscribe to their news by email. It took me around ten minutes. I’d say this was a fairly good use of my time in terms of the ratio of effort to value produced, but I can’t claim to have done it single-handed. What made it possible is that all 66 of these councils serve an RSS feed from their websites — and they’re the only ones in the country that do. Hooking those feeds up to FeedMyInbox through the council pages at Mash the State was a simple matter of dropping a single web link into a template and pushing it to the live site. Job done.
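
By way of illustration, the whole trick amounts to something like the sketch below, assuming a hypothetical RSS-to-email service that takes the feed address as a query parameter (the endpoint, parameter name and feed URL here are placeholders, not FeedMyInbox’s actual interface):

```python
from urllib.parse import quote

# Placeholder endpoint for a hypothetical RSS-to-email service; the real
# service's URL scheme may well differ. The point is simply that one
# templated link per feed is all it takes.
SUBSCRIBE_ENDPOINT = "http://feed-to-email.example.com/subscribe?feed="


def email_subscribe_link(feed_url):
    """Build a 'subscribe by email' link for a council's RSS feed."""
    return SUBSCRIBE_ENDPOINT + quote(feed_url, safe="")


# Placeholder feed address for an imaginary council:
print(email_subscribe_link("http://www.anytown.gov.uk/news/rss.xml"))
```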

RSS is a simple way of getting data out of a website and into another program. The technology is ten years old and RSS feeds are ubiquitous on blogs, on mainstream news media websites and in Web 2.0 applications. The three leading web browsers — Internet Explorer, Firefox and Safari — all contain built-in RSS readers. Yet despite running websites that cost tens of thousands of pounds a year each, only 15% of UK councils bother with RSS. Nothing could be more symbolic of large parts of government’s unwillingness to think beyond the confines of their own websites than making it practically impossible to receive basic local council information like news and events except by taking a trip to anytown.gov.uk to do it on the council’s own terms.
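
To make that concrete, here is a minimal sketch of a program reading such a feed with nothing but the Python standard library; the feed address is a placeholder:

```python
from urllib.request import urlopen
import xml.etree.ElementTree as ET

# Placeholder feed address -- councils that do publish RSS serve it from
# all sorts of paths on their own sites.
FEED_URL = "http://www.anytown.gov.uk/news/rss.xml"

# An RSS feed is plain XML: a <channel> containing <item> elements, each
# with a <title>, a <link> and usually a <pubDate>.
with urlopen(FEED_URL) as response:
    tree = ET.parse(response)

for item in tree.iter("item"):
    print(item.findtext("title", default=""))
    print("  " + item.findtext("link", default=""))
```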

The ten minutes it took to emailify those 66 councils compare quite favourably with probably a similar number of hours I’ve spent trying to scrape Sutton Council’s news into a database, and from there through Delicious into RSS and Twitter. Writing screen scrapers — programs which extract text from web pages and turn them into structured, reusable data — is sometimes tricky, but Sutton’s news is trickier than most. The news archive has inconsistent page structures and even dynamically changing URLs to contend with. I vowed never to write another scraper, though as we’ll see, that’s a promise I soon had to break.
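
For contrast, a scraper has to infer structure from whatever HTML a site happens to serve rather than read a published format. A minimal sketch, assuming a hypothetical news archive and tidy CSS selectors (Sutton’s real markup was far less cooperative, which was the whole problem):

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical archive URL and selectors -- a real scraper has to be tuned
# to whatever markup the site serves, and re-tuned whenever it changes.
ARCHIVE_URL = "http://www.example-council.gov.uk/news/archive"

html = requests.get(ARCHIVE_URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

# Pull out each headline and its link from the (assumed) page structure.
stories = [
    {"title": a.get_text(strip=True), "url": a["href"]}
    for a in soup.select("div.news-item h2 a")
]

for story in stories:
    print(story["title"], story["url"])
```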

Screen scraping and copyright infringement are the dirty not-so-secrets of the civic hacking world. Show me a useful, innovative third-party civic website and I’ll most probably be able to show you the terms and conditions that were ignored and the data that was taken and repurposed without permission or legal licence. Similar behaviour is not unknown in the public sector itself, in some cases because government organisations are recycling that very same stolen data from third-party applications into their own websites. The recent Rewired State National Hack the Government Day saw some incredibly inspiring, innovative and useful projects produced in very short order. How many of these projects didn’t involve citizens jailbreaking their own government to get the data they’ve paid for? What kind of society not only massively impedes but actually criminalises — in principle if not in practice — citizens devoting their own time, skills and money to write software to improve democracy and public services? Our society, it seems.

This has to stop. Hackers have shown their ability and willingness to surmount technical obstacles and run legal risks to get the data they need but less technical citizens simply cannot. No-one should have to. A rich, technologically-advanced and supposedly forward-thinking society such as ours should make citizens’ access to government data so commonplace that it doesn’t deserve comment. No technical wizardry required. No legal minefields to navigate. Just all the data served through common protocols with open licences that permit, well, anything. Then we can focus our time and energy on the considerably more interesting higher-order opportunities that come from actually using government data, not just getting hold of it.

Last week I launched Mash the State, a national campaign to get government data to the people. It’s not a new idea but our method is. We’ll be setting up a series of challenges to the public sector, asking one group of public bodies at a time to release one specific set of data. Our first challenge asks all local councils to serve up an RSS news feed by Christmas. I wouldn’t have bet good money in 2003 that by 2009 370 councils would still be without RSS, but here we are. I’ve thrown the gauntlet down and I’m pleased to see that a couple of hundred people have signed up to our website or followed us on Twitter to help make this happen. The councils have over eight months to do what in most cases will be no more than half a day’s work: serving RSS from their websites. Others less fortunate will have to persuade their content management system suppliers to enable this feature for them. All have plenty of time to perform this technically trivial task and give the public a small but highly symbolic Christmas present that shows that government in this country is prepared to trust its citizens with their own data.

As for my promise never to write another scraper, it didn’t last long. The very first task in building Mash the State was an hour spent writing a scraper to tease a list of councils from a government website. Join us and help to hasten the day when no-one will ever have to do anything like that again.

