The ‘hardy perennials’ of our trade have no trouble finding their place among the key themes to focus on in 2017, according to the recently published State of the Sector report from Gatehouse. This annual report always provides valuable and thought-provoking insight into the day-to-day work of IC practitioners, and tucked alongside the hot topics of current digital trends and the latest social channels, one theme in particular caught my eye: Measurement. How do we demonstrate our value?
Lee Smith, co-founder of Gatehouse, says: “The survey data suggests that practitioners are focused on output measures, rather than outcome measures. Showing how many employees received a message or how many clicked on an intranet page measures activity. But does it help us understand the difference that message has made in terms of attitudes, behaviours or knowledge?”
simply spoke to Jonathan Phillips, founder of ClarityDW, for whom measurement in IC is a bit of a hobby horse: “It’s tremendously good news that the IC community is interested in measurement and even better news to see that the State of the Sector report suggests that more measurement is going on than ever before. But there is a ‘But’. Too often, across the IC profession we are investing energy in producing the wrong information, amassing data that at best doesn’t tell us very much at all, or worse, misleads us.”
Proving the value and impact of communications is central to our survival as a function so it is worth looking in a little more detail at why it is so important to measure the right things. At the basic level we want to know whether our communications did what we intended them to do – we need measures that help us to understand our PERFORMANCE. Then, we want to use our data and insights so that we can improve on the outcome next time; we want to make PROGRESS.
Let’s first consider why we communicate at all. Jonathan again: “Being able to ask the right questions at the outset is the key to knowing exactly what we need to measure. Every communication should have a purpose designed to make employees Think, Do or Feel something new. Identifying this critical outcome tells us what needs to be measured. It’s how we know whether or not our communication has impact.” It is worth noting that the owner of ‘our’ success metrics may well be colleagues in HR, IT or Finance – or whoever in the business owns the project we are communicating on. “Co-sharing metrics builds better alignment between project and communication outcomes and, in my eyes, makes for better communications,” he added.
Five Golden Rules of Measurement
To help us get a handle on establishing what exactly to measure, Jonathan shared his thought process with Five Golden Rules of Measurement.
1. Never measure anything that can be unduly influenced by individuals or small groups
At the most basic level, this tells us that measuring page views, for example, is not a robust measure of anything useful. Page views are easily inflated by repeat visits, or even by a browser tab left open on auto-refresh. On a more serious level, think about the VW emissions scandal, where the level of exhaust emissions recorded could be influenced by a handful of clever engineers executing a ‘fix’. Likewise, the US Veterans Health Administration scandal of 2014, in which appointment records were falsified to meet a 14-day target. While no one condones the falsifying of records, it was acknowledged that the 14-day target was unrealistic in several states, yet that was what hospitals were being assessed on. When you measure, people may end up gaming the system, and what starts out with good intentions can backfire as the incentive to hit the target overshadows the purpose behind the measurement in the first place. Poor measurement criteria inevitably lead to poor behaviour.
2. Measure the change you want to see: the outcome, not the output
Identify the purpose of your communication, your intranet page or your feature, and measure that. Remember: the purpose is what you want the employee to think, do or feel as a result of the communication. If you really can’t measure the outcome – and it is not always straightforward to do so – be 100% certain of the causal link between your outputs and that outcome. Using the page views example again, this might be the difference between page views and unique visitors: the latter tells you a good deal more than the former. If your page is promoting a new training course, the outcome measure is surely the number of people who attend that course. Unique visitors and click-through rates may be useful data to discuss if attendance falls short of expectations. This may also mean that the true owner of the success measures sits outside the internal communications community.
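The distinction between outputs and the outcome can be sketched in a few lines. This is a minimal illustration with invented data: the access log and the sign-up list are hypothetical, standing in for whatever your intranet analytics and course-booking system actually provide.

```python
# Hypothetical intranet access log: one entry per hit on the
# training-course page, recording which employee viewed it.
page_hits = ["alice", "bob", "alice", "alice", "carol", "bob", "alice"]

# Employees who actually booked the course: the outcome we care about.
course_signups = {"alice", "carol"}

page_views = len(page_hits)            # output: raw activity, easily inflated
unique_visitors = len(set(page_hits))  # better output: distinct people reached
outcome = len(course_signups)          # outcome: the change we wanted to see

print(page_views, unique_visitors, outcome)  # 7 3 2
```

Seven page views shrink to three actual people, and only two of them did the thing the page existed to achieve: each step strips away noise and gets closer to the measure that matters.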
3. The ‘So What?’ or ‘5 Whys’ test
There’s a small child in all of us: set it free! Ask yourself why you are measuring a particular thing. Then ask again; and again… It’s how we drill down to the nub of something, to distil an issue to its most salient point. Being able to identify the context, and to work out whether what you are measuring will really help you make progress, is a skill worth developing. Data alone is just numbers; add some context and you can start to draw conclusions. That is your valuable information. Consider this as a light-hearted example:
4. Beware of Heisenberg
Drawing on some senior school physics and a touch of irony, this rule prompts us to be mindful of how much effort we put into measuring for measurement’s sake. ‘Energy invested in measuring the system distorts the system’, the school books say, and it holds true for communications. It urges us to be proportionate in our investment in measurement and to check for distortion along the way. The diagram below illustrates this clearly.

The ‘star ratings’ point is an interesting one. What does a star rating actually tell us? In our consumer lives many of us are influenced by star ratings on our purchases, whether that be our online shop from Sainsbury’s, our Amazon purchases or our holiday bookings on TripAdvisor. It is likely, though, that in most cases we read the reviews behind the star rating to understand the context in which it was given. Armed with these richer insights, we are able to make our purchasing decisions. Many organisations (those with SharePoint intranets, for example) use a star rating system on pages too, yet without full accountability and with no documented context, the star rating alone is just data. Without context, you run the risk of failing to adhere to the first three golden rules.
5. Ensure there is real clarity in each measure
When deciding what to measure it’s important to make sure that you really know what you are looking at. Through our IC lens, we may be guilty of too much positive interpretation of measures simply because the things we are measuring are open to misinterpretation. The table below explains this:
The clear message here is to ensure that there is a causal link between what you are measuring and the success you are looking for. Take dwell time on a web page, for example: on an externally facing website this is generally a good thing. But is it a good thing on your intranet? Time on the intranet is time not spent doing work, so perhaps this is not the best measure in this case. “A comfortable measure of success for an intranet is adoption levels. Tracking how many of your employees are visiting the intranet on a daily, weekly and monthly basis is a very good place to start,” suggests Jonathan.
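Adoption tracking of this kind reduces to counting distinct visitors within rolling windows. The sketch below assumes a hypothetical visit log of (employee, date) pairs; real analytics platforms expose this differently, but the calculation is the same.

```python
from datetime import date, timedelta

# Hypothetical visit log: (employee, date) pairs from intranet analytics.
visits = [
    ("alice", date(2017, 3, 1)), ("bob", date(2017, 3, 1)),
    ("alice", date(2017, 3, 2)), ("carol", date(2017, 3, 6)),
    ("alice", date(2017, 3, 7)), ("dave", date(2017, 2, 10)),
]

def active_users(visits, as_of, window_days):
    """Distinct employees who visited in the window_days ending at as_of."""
    start = as_of - timedelta(days=window_days - 1)
    return {user for user, day in visits if start <= day <= as_of}

as_of = date(2017, 3, 7)
daily = active_users(visits, as_of, 1)      # visited on the day itself
weekly = active_users(visits, as_of, 7)     # visited in the last 7 days
monthly = active_users(visits, as_of, 30)   # visited in the last 30 days
print(len(daily), len(weekly), len(monthly))  # 1 3 4
```

Because each count is a set of distinct people rather than a tally of hits, it satisfies the first golden rule: no individual can inflate it with repeat visits.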
What does success look like?
“Very simply,” explains Jonathan, “if you’re not moving the dial, your communications are not successful! However brilliantly written, however beautifully designed, if your communication fails to change behaviour in some way, it is not successful. As communicators we need to tie our KPIs to those of our stakeholders. We are working to the same business goals, after all. This could be as straightforward as asking your stakeholder: ‘How are you going to measure your success?’ If your HR team are looking to switch to a new benefits system, their likely success measure will be how many sign-ups the new system gets. That’s your success measure too. Video views, downloads, click-throughs, open rates – it’s all just noise and data. Collect it by all means: it may help you set targets for future campaigns, but on its own it does not tell the whole story.”
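The benefits-system example can be expressed as a simple funnel, with the stakeholder's outcome at the bottom. All figures here are invented for illustration; the point is only that the outputs explain the journey while the sign-up count is the shared KPI.

```python
# Hypothetical campaign figures for a benefits-system launch.
emails_opened = 480         # output: the message was received
unique_page_visitors = 220  # output: interest was shown
signups = 132               # outcome: the stakeholder's success measure

# The shared KPI is the outcome; the outputs only explain the funnel.
signup_rate = signups / unique_page_visitors
print(f"{signup_rate:.0%} of visitors signed up")  # 60% of visitors signed up
```

If sign-ups fall short, the intermediate figures tell you where the funnel leaked (message not opened, or opened but not acted on), which is exactly the diagnostic role the article assigns to output data.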
Jonathan Phillips is the founder of ClarityDW and a globally-recognised thought leader in Digital Workplace technologies. He is also co-founder of intranetizen.com, advisor to the UK Government and a non-exec charity leader.