In many ways the measurement still produced for various stakeholder dashboards has not changed for many years. We still treat the volume of activity as the thing to be measured; the value of that activity is something stakeholders rarely ask for. To measure the value of the relationships and transparency created by the individuals, groups and communities residing on collaborative or social platforms, we still need to do a lot of manual digging to find measurement around such artifacts as:
• Social knowledge – this can be defined in many ways, such as assets being shared around a community (and beyond) and the related practices that emerge.
• Relationship development – the ability to create new relationships and networks that previously didn’t exist
• Number of relationships created by individuals and their depth – look at followers and participation in threads
• Discovery of communities – have members joined communities outside their ‘physical’ or existing network
• What collaborative activities are emerging
• What threads, replies, comments or connections contain referrals to potential collaborators
• What threads contain creative or innovative ideas
• Are members sharing personal stories and how much emotional support is provided
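None of the above comes out of the box, but a first cut at the ‘number of relationships created by individuals and their depth’ bullet can be dug out of raw thread data. A minimal sketch, assuming a hypothetical export in which each thread is simply a list of the member names who participated in it:

```python
from collections import Counter
from itertools import combinations

def relationship_depth(threads):
    """For each pair of members who posted in the same thread, count how
    many distinct threads they have shared - a crude proxy for the depth
    of the relationship. `threads` is a list of lists of member names
    (a hypothetical export format, not any particular tool's API)."""
    depth = Counter()
    for members in threads:
        # Each pair of distinct participants in a thread counts once.
        for pair in combinations(sorted(set(members)), 2):
            depth[pair] += 1
    return depth

def relationships_created(threads):
    """The set of unique member pairs connected at all - a rough
    'relationships created' count."""
    return set(relationship_depth(threads))
```

So two threads, one between ann and bob and one between ann, bob and cat, yield three relationships, with the ann–bob pair at depth 2. It is deliberately simplistic (it ignores followers and the direction of replies), but it is the kind of manual digging the paragraph above describes.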
The various web metric packages and social business tool reports do not provide this type of information, and much of it will be anecdotal evidence. Social analytics are poor within most social tools (it will be a major revenue stream for a vendor that can start to provide some of the softer metrics that articulate quality and not just quantity).
Over the years I’ve reported numerous ROI figures and metrics to various groups of stakeholders. My top 3, in no particular order, are:
Creating an online community platform saw a 25% increase in the production of material for clients – by providing a collaboration platform for an existing professional service group, their monthly ‘physical’ meetings were supported by an online community platform. It enabled the sourcing of wider expertise (from across the country) that resulted in a 25% increase in the production of thought leadership material issued to clients (you could argue whether that was a good thing, but that is missing the point).
An IA change resulted in senior managers saving an hour per month searching for documents – by conducting user research into how audit managers worked, a change to the information architecture (IA) and navigation within their community site saw senior managers save, on average, one hour per month in sourcing the relevant methodology documentation, enabling more time to be spent on finding and minding clients.
Developing the online community sees a rise in employee satisfaction scores – a large customer service group within a global organisation was given access to form its own online community. With good strategy, governance and stewardship the community thrived. In annual employee satisfaction surveys the group’s average score increased significantly (I’m sure there were many other factors involved, but why spoil a good tale) and was over 20% higher than those of other similar customer service groups. In some areas a 1% rise in employee satisfaction equates to £2m extra revenue – so you can work out the potential benefit!
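Taking those figures at face value – and only as a rough, back-of-the-envelope sketch, since the £2m-per-point relationship holds only ‘in some areas’ – the potential benefit works out as:

```python
# Back-of-the-envelope sketch using the figures quoted above.
# The £2m-per-point figure applies only "in some areas", so treat
# the result as illustrative, not a forecast.
revenue_per_point = 2_000_000  # £2m extra revenue per 1% satisfaction rise
satisfaction_gap = 20          # the group scored over 20 points higher

potential_benefit = revenue_per_point * satisfaction_gap
print(f"£{potential_benefit:,}")  # £40,000,000
```

Which is exactly why a softer metric like satisfaction, once linked to revenue, gets stakeholder attention.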
On the downside, my most disappointing metric was reporting the drop in homepage visits after an expensive rebranding exercise on our intranet homepage – although that did reflect an increasing trend of the homepage’s value becoming diminished.
My favourite ‘metric’ involves a community set up to bring two very diverse groups together to collaborate in reporting common faults and sharing workarounds and fixes. I am hard pressed to call it a community, as neither group had any previous interaction (which was part of the issue), and I do preach that unless a conversation is already taking place in the physical world it is hard to develop it online.
One group was a skilled manual workforce based across the UK. The other dealt with customer service and could be located anywhere across the globe. With governance and stewardship in place, the volume of activity began to increase.
When it came to the assessment report, the ‘metric’ I took most pride in was not the volume of activity nor the number of cases solved but the anecdotal evidence from both sides of the fence that the visibility and transparency created through the forum had begun to build a greater appreciation in each group, an understanding of the issues each faces and how to work with them.
You could then spend months evaluating how much benefit these continuing connections could save the organisation, but sometimes the user comments mean so much more than a hard metric.