London Summit digital campaign evaluation

This blog post first appeared on blogs.fco.gov.uk on 17 April 2009

So our website is now a historical record – the debate has moved on. There’s still plenty to do of course, and for many the summit marks the start of a new debate. But our digital campaign is over, and we can take a step back and assess how we did. Did it deliver what we intended? And what lessons can we learn and share?

We’ll publish all our evaluation on the London Summit website itself, including all the numbers, measurement against all our KPIs, and the independent evaluation of others. But it’s been a couple of weeks now, and I think I have enough distance and data to reflect on what we did.

We started this work with a degree of confidence. We’ve been thinking about – and practising – how to do digital diplomacy in the Foreign Office for a while now, and as a result we were pretty confident about our methods in January. The lead-up to the summit seemed like the ideal opportunity for digital engagement. I published here the performance indicators we set for ourselves at the start of the process, but if I’m honest, I really didn’t know whether we’d deliver them all, or how much of an impact our digital campaign would have.

So did we succeed? Well, I’m not sure yet.

There are bits that I think we did well. I think we made good judgments about the ambition for the London Summit website, and the tone of the content that appeared there. We did well to ensure that the site aggregated content from around the world, and provided space for a range of voices. Some of our partnerships with other web platforms and forums worked really well. We produced a lot of video and photo content, which helped bring the debate to life. We provided good authoritative content on the purpose of the summit, an efficient service for journalists and accreditation, live streamed video throughout the summit, and lots of structured ways for people to contribute to the debate.

We’ve learned a lot about how to deliver intensive digital campaigns. We had a great team working on the campaign, and we relied on the expertise of others across government, the FCO network and our partners. We published new content many times a day, which enabled us to focus our work on the narrative of the debate as it emerged.

We also learned a lot about the FCO web platform, about delivering content to a global audience with a huge concentrated peak of traffic. And we learned a lot about citizen engagement, how and when to encourage debate, and when to just take a step back and reflect what’s going on elsewhere.

There were plenty of things that didn’t work. As you’ll see from the performance indicators below, there are some things that we set out to deliver, but just didn’t achieve. Some of our ideas were never realised and some of our partnerships just didn’t come off. We hosted and commissioned a lot of debate, and we listened to what others were saying online, but we didn’t often actively participate ourselves other than in delivering messages, aggregating, or summing up.

Our website was a destination for authoritative content and specialist debate, but of course other websites provided alternative places for the popular global debate to take place online. And our success in encouraging debate in some countries wasn’t matched across all our target audiences.

Of course, this is all just subjective, and I’m possibly not in the best place to judge. So we’ll publish all the bits of evaluation that we and others produce on the now frozen summit site, and I’ll highlight the best bits here.

But I promised to report on our performance against the original KPIs. So here you are (we’ll publish the full version of this on the summit site):

1. The focal point for engaging and shaping global opinions

– http://www.londonsummit.gov.uk most authoritative site for “London Summit” according to major search engines.

Met. Our content was well optimised for search, so we were quickly at the top of natural search rankings for our key terms. Having said that, we found that other terms were far more widely used than some of the initial key terms we identified.

– influential sites in every target country link back and quote from http://www.londonsummit.gov.uk

Partially met. Of the 23 countries targeted, influential sites (those with a Google PageRank of 6 or above) from 12 countries linked to the London Summit site.

– all visitors find it quick and easy to find the info they are looking for

Our user survey will provide more data here.

2. Authoritative provision of in-depth briefings on Summit

– all unclassified policy papers accessible from londonsummit.gov.uk in web friendly form

Met. We published everything we had, including communiqués, transcripts, summaries and video. And we didn’t resort to PDFs.

– only the highest-quality content goes on the site based on the content guidelines. If it doesn’t help to achieve an objective it doesn’t go on

Partially met. We wrote good copy, kept to our content guidelines, and had a clear process for clearing policy-sensitive content. But I know we sometimes published in a hurry, and corrected later.

– at least four expert bloggers providing authoritative real time content for London Summit

Not met. We ran an editors’ blog, Tom Watson blogged from the summit, Stephen Timms posted tweets (#timms), the Foreign Secretary posted blogs, and government economists posted articles on the VoxEU debate. But we didn’t have specific attributed expert bloggers contributing to the online debate throughout the campaign.

3. Effective operational functions for 2,000 journalists

– Media centre regarded by journalists as most respected government media site ever

I think we did ok, but our survey of users will tell us more.

– live streaming of all press conferences/keynote speeches

Partially met. We streamed the whole summit. We didn’t live stream any other events, but we published a lot of same-day video thanks largely to our partnership with British Satellite News.

– the site is reliable (minimal down time) and meets AA accessibility at all times

Partially met. The site had 99.82% availability, the page templates all met AA standards and were tested with real users for accessibility, and our key content was always provided in accessible formats. But our content did not always validate as AA accessible (for example, we did not caption, or provide text alternatives for, all our content, and when we were faced with a choice between publishing a video without a transcript or not publishing it at all, we published the video).
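As a rough illustration of what that availability figure means in practice – my own back-of-the-envelope sum, not part of the formal evaluation, and it assumes the 99.82% covers the same 29 January – 6 April window as the traffic figures below – it works out at a little under three hours of downtime:

```python
from datetime import date

# Back-of-the-envelope sketch: convert 99.82% availability into hours of
# downtime, assuming it covers the 29 January - 6 April campaign window.
campaign_days = (date(2009, 4, 6) - date(2009, 1, 29)).days   # 67 days
total_hours = campaign_days * 24                              # 1,608 hours
availability = 0.9982

downtime_hours = total_hours * (1 - availability)
print(f"~{downtime_hours:.1f} hours of downtime over {campaign_days} days")
# prints: ~2.9 hours of downtime over 67 days
```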

4. Respected Platform for discussion and debate

– seamless integration with all partner engagement sites

Not met. Our referral stats show that user journeys between the parts of our web presence were not common. Although we did some cross-promotion through RSS feeds and promos, we didn’t always deliver reciprocal links, so our users could not move seamlessly between our partner sites.

– clear evidence of link between pre-summit web debate and post-summit outcomes

Met. Our debate issues reflected many of the outcomes. It is less easy to measure whether the online debate influenced – or just reflected – the outcomes, but I hope our detailed evaluation (and the evaluation of others) will provide some data here.

– visitors return to the site, go to other areas of our London Summit web presence or subscribe to feeds/emails

Partially met. 27% of visitors to the site were repeat visitors, and 2,273 people subscribed to receive our email newsletters, but there wasn’t significant traffic between areas of the summit web presence.

– the site (and related wider web-presence) becomes a best-in-class example of digital engagement

For others to decide.

And some selected numbers (because however much I protest that evaluation isn’t just about numbers, I know I can’t get away with not mentioning them at all):

London Summit website (29 January – 6 April): 466,159 visits, 1,572,643 page views

London Summit YouTube: 149,580 video views

Live video streaming: 165,000 views on 2 April

London Summit Flickr: 1,065,825 photo views

VoxEU Global Crisis debate (ongoing): 160 articles, 300,000 page views

Yoosk London Summit: 327 questions, 36 answers from public figures

Chosun debate (Korean London Summit forum): 231,832 unique visitors (by 2 April)

I’ll post more on the digital engagement, and on the other evaluation, as it’s published.