This blog post first appeared on blogs.fco.gov.uk on 19 February 2009
I met a delegation of Ukrainian government officials the other day to talk about digital diplomacy. It’s one of the perks of my job that people outside the UK are interested in what we do. They provide a challenge that I don’t necessarily get from my peers in the UK digital community. (Who else is going to tell me that Twitter won’t work as a tool in Ukraine – because you only get about 3 Ukrainian words for 140 characters?)
We talked about the online campaigns that we’ve run recently in the Foreign Office, the way we manage and present web content, and some of the tools we’ve been using for digital engagement. I think I surprised them (and myself) by how excited I got when they asked how we evaluate our work in Digital Diplomacy Group. But the fact is I am very excited about proving that digital engagement works. And more than that: I think we have a responsibility to measure the actual impact of digital campaigns, rather than get carried away with the ease with which we can develop new tools.
Of course, web practitioners are notoriously lazy about evaluation because everything we do on the web produces numbers. Stats are almost always interesting, and it’s easy to present them as evaluation. But they’re not enough. The Foreign Office web platform had 2.5 million unique visitors in January. But so what? I know that I could significantly drive up traffic to the Foreign Office YouTube channel by posting a film of 150 ambassadors line dancing (I’m sure they’d be up for it). But traffic doesn’t deliver foreign policy objectives. It just delivers traffic.
Our approach to evaluation was developed by Liam King, who is even more excited than I am about evaluation. It’s not complicated – this is what we aim to do:
1. Insist on setting objectives and identifying target audiences for everything we do on the web.
2. Pick something that we can measure that will give us an indication of how well we met our objectives and reached our target audience.
3. Measure it.
We do use stats, and we welcome independent evaluation (the Hansard Society are evaluating our blogs and our London Summit campaign at the moment), but we concentrate on providing evidence that tells us something about what we set out to achieve. This approach means that all the evaluation we do is useful for the people we’re working with (because we are very clear about expectations right at the start), and it’s useful for us (because we can use it to improve what we do).
I’ve pasted below the objectives and performance indicators that Liam and Paul set Digital Diplomacy Group in January for our work on the London Summit website. Our approach will develop, and we’ll measure KPIs for each of our engagement exercises over the next 6 weeks. But the original performance indicators won’t change – once the summit is over we’ll be able to say with authority whether we delivered what we set out to.
London Summit website objectives and performance indicators:
1. The focal point for engaging and shaping global opinions
- http://www.londonsummit.gov.uk is the most authoritative site for “London Summit” according to major search engines.
- influential sites in every target country link back and quote from http://www.londonsummit.gov.uk
- all visitors find it quick and easy to find the info they are looking for
2. Authoritative provision of in-depth briefings on Summit
- all unclassified policy papers accessible from londonsummit.gov.uk in web-friendly form
- only the highest-quality content goes on the site, based on the content guidelines. If it doesn’t help to achieve an objective it doesn’t go on
- at least four expert bloggers providing authoritative real time content for London Summit
3. Effective operational functions for 2,000 journalists
- Media centre regarded by journalists as most respected government media site ever
- live streaming of all press conferences/keynote speeches
- the site is reliable (minimal downtime) and meets AA accessibility at all times
4. Respected Platform for discussion and debate
- seamless integration with all partner engagement sites
- clear evidence of link between pre-summit web debate and post-summit outcomes
- visitors return to the site, go to other areas of our London Summit web presence or subscribe to feeds/emails
- the site (and related wider web-presence) becomes a best-in-class example of digital engagement
I’ll report back on how we did against these in April.