Measuring social media reach

It used to be so simple. For years the pace of change in the media measurement industry reflected the pace of change in PR itself – pretty glacial. As an industry, media analysis and evaluation started to become more widespread in the late 1980s and early 1990s.

Back in those days my colleagues and I used to receive literally sack loads of paper clippings each day, sent to us either directly by our clients or by the press cuttings agencies. Clips would arrive days and often weeks after publication. The usual turnaround period that the evaluation agencies would work to was then a further two weeks as we read and analysed the clips according to each client’s brief. Written reports would often arrive back with the clients weeks after the content had been published.

All of the evaluation companies tended to measure similar things. Categories would include the size of the article, its position in the paper and on the page, whether it included a headline (and if so the size of that), a photo, the article’s tone, brand mentions, organisational and industry issues, inclusion of positive or negative messages, the media type and publication, the date, the journalist, whether spokespeople were included, the readership figure and, frequently, Advertising Value Equivalents (widely known in the industry as AVEs), among other things.

The only way we differed from each other was in the amount and quality of the supporting written text and our approach to how we reported on the above categories. Many companies looked for unique angles with their own proprietary scoring systems, weighting different elements from the aforementioned categories to come up with their own magic-formula number. And the vast majority of evaluation programmes concentrated on measuring purely the outputs (like newspaper clips) rather than the out-takes (what the target audience now thinks) or the outcomes (what they now may have done). Of course, newspaper clippings alone could only ever answer the outputs part of the equation.
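To make concrete how arbitrary these magic formulas could be, here is a minimal sketch of such a weighted roll-up. The category names, weights and scoring scale are invented for illustration and do not represent any agency’s actual system.

```python
# Hypothetical "magic formula" media score: a weighted roll-up of the
# standard clip categories. Categories and weights are illustrative only.
WEIGHTS = {
    "size": 0.2,       # article size, normalised to 0-1
    "position": 0.15,  # prominence in the paper / on the page
    "headline": 0.1,   # headline present (and how large)
    "photo": 0.1,      # photo included
    "tone": 0.25,      # negative / neutral / positive, mapped to 0-1
    "messages": 0.2,   # key messages delivered
}

def media_score(clip: dict) -> float:
    """Roll the per-category ratings (each 0-1) into a single 0-100 score."""
    return 100 * sum(WEIGHTS[k] * clip.get(k, 0.0) for k in WEIGHTS)

print(media_score({"size": 0.8, "position": 1.0, "headline": 1.0,
                   "photo": 0.0, "tone": 0.5, "messages": 0.6}))  # roughly 65.5
```

Precisely because the weights are arbitrary, two providers scoring the same clip could produce very different numbers, which is part of the confusion described below.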

The side-effect of this disparate approach by the industry was that clients would often be confused, and the majority of the metrics themselves did not speak the language of the boardroom.

As the years passed, media evaluation became more mainstream around the world, and the differing methods of analysing the content spread. Then, in 2009, great progress was made when AMEC (the International Association for the Measurement and Evaluation of Communication), working in partnership with other global PR bodies, declared the Barcelona Principles, which were widely accepted. These were seven statements detailing how PR measurement could and should be done as best practice.

The seven principles are:
1. Goal setting and measurement are fundamental aspects of any PR programme.
2. Measuring the effect on outcomes is preferred to measuring outputs.
3. The effect on business results can and should be measured where possible.
4. Media measurement requires quantity and quality.
5. AVEs are not the value of public relations.
6. Social media can and should be measured.
7. Transparency and replicability are paramount to sound measurement.

To lead the way, AMEC published its Valid Metrics Framework, which provides a template (shown below) whereby each communication tactic can be measured down the marketing funnel: from awareness to knowledge and understanding, interest and consideration, support and preference and, finally, action.
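As a purely illustrative sketch of how one tactic might be mapped down that funnel, the snippet below represents a single row of such a template as a simple lookup. The example metrics are placeholders chosen here, not AMEC’s prescribed grid.

```python
# One tactic (e.g. a media relations campaign) mapped down the funnel.
# The metric names are illustrative placeholders, not AMEC's official grid.
valid_metrics = {
    "awareness":               ["audience reach", "share of voice"],
    "knowledge_understanding": ["key message delivery", "accuracy of coverage"],
    "interest_consideration":  ["website visits from coverage", "brochure downloads"],
    "support_preference":      ["stated purchase intent", "positive survey shift"],
    "action":                  ["sales enquiries", "sign-ups attributable to the campaign"],
}

for stage, metrics in valid_metrics.items():
    print(f"{stage:<26} -> {', '.join(metrics)}")
```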


It appeared that we were enjoying a new era in the PR and measurement industry – one where the industry had agreed to turn away from spurious scores and weighted outputs and to focus on measuring outcomes based on objectives of campaigns. But something else had been changing.

Throughout the ‘noughties’, and with increasing urgency from 2005 onwards as early case studies revealed the potential crises to which companies were now exposed, online and social media were growing in popularity and importance. While organisations were working out what it all meant, whether it mattered, how to engage with the channel and, if so, which department should be involved, new companies like Radian6, Sysomos and Brandwatch were being set up with the purpose of monitoring all of this new content.

Back to the Dark Ages?

They quickly realised that purely monitoring the content was not going to be enough and that they needed to analyse and attempt to bring meaning to the information as well. So they began inventing scores and systems to measure content as if the previous 20 years hadn’t existed. Worse, because these services take a one-size-fits-all approach, the metrics had to be the same for all organisations, regardless of each one’s differing objectives, goals and tactics.
To cope with the vast volumes of content, which humans couldn’t analyse fast enough or cheaply enough, automated algorithms using natural language processing (NLP) techniques were deployed to analyse topics and tone in real time.
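For illustration only, the snippet below shows the general flavour of automated tone scoring, using NLTK’s off-the-shelf VADER analyser as a stand-in; the monitoring vendors’ actual NLP engines are proprietary, and the sample posts here are invented.

```python
# Automated tone analysis illustrated with NLTK's VADER lexicon scorer.
# Real monitoring platforms use their own proprietary NLP pipelines.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download
sia = SentimentIntensityAnalyzer()

posts = [
    "Love the new release - setup took two minutes.",
    "Support still hasn't replied after a week. Useless.",
]
for post in posts:
    compound = sia.polarity_scores(post)["compound"]  # -1 (most negative) to +1 (most positive)
    tone = "positive" if compound > 0.05 else "negative" if compound < -0.05 else "neutral"
    print(f"{tone:<8} {compound:+.2f}  {post}")
```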
Other automated scores also started to be added as the industry tried to understand this new medium. 

It became increasingly apparent that influence spreads differently across social media than across mainstream media. No longer were communicators dealing with individual channels delivering their own unique audiences; now it was an interconnected channel offering multiple audiences – and each member of each of these audiences had their own audience. Although the vast majority of consumers might just observe, or lurk, there were some who seemed able to make things happen – whether accidentally or in a more deliberate manner. These types of people came to be known as influencers. From a PR point of view, if they could be identified and targeted, the potential value to a brand would surely be significant.
The automated tools, of which there were more than 200 at all price levels by 2012, rushed to come up with ways to identify these influencers using automated scoring and weighting systems. The best known of these new tools that focus on influence is probably Klout, which calls itself ‘The Standard of Influence’. But, as Philip Sheldrake, author of The Business of Influence, argues, true influence cannot be measured with a score. According to Sheldrake, influence has been exerted when someone thinks something they wouldn’t otherwise have thought or does something that they wouldn’t otherwise have done.
As he says:
‘What influence is not … Influence is definitely not some quantity invented by a PR firm, analytics provider, or measurement and evaluation company that rolls up a number of indices and measures into some relatively arbitrary compound formula that makes any appreciation of the underlying approach, variables and mathematics completely opaque to the end-user, thereby radically attenuating any little use it may have been but in such a way that it can be branded nicely and sold as “unique”.’
Sheldrake develops his thoughts on understanding social media influence further in his book.
The similarities with the early media influence indexes and scores of so many traditional media analysis companies are striking.
The problems with the measurement and analysis of social media didn’t stop there, though: my own company, Metrica, trialled three of the best-known social media monitoring companies to see how they stacked up against each other. We set each company the same brief. We were looking for English-language content only, based around a tightly defined search string relating to a well-known IT company. We gave them the same two-week time period and set up feeds direct from their databases to our own, where we then pored over the results. To check the automated analysis, we had three expert analysts verify the sentiment that the automated systems had assigned, for 500 posts each. We were shocked by the results. Some of the startling facts we discovered included:
The total numbers of posts returned ranged from 281,000 found by one provider to 451,000 found by another. Some of the providers found the same piece of content repeatedly but counted it many times, failing to deduplicate effectively.
At least one-third of the content found by each provider was not relevant to the brief that we had set them.
Only 20% of the total content was found by all three of the providers on test.
All three providers varied wildly in their ability to find content in the differing social media channels.
The speed at which they returned results also varied dramatically. One company averaged over 24 hours to return the content.
Their sentiment analysis was at best correct 61% of the time and at worst only 29% of the time.
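That sentiment-accuracy figure comes down to a simple agreement check between the analysts’ codes and the machine’s verdicts; a minimal sketch of the same calculation, with made-up labels, might look like this.

```python
# Agreement between human analysts' sentiment codes and an automated system's
# verdicts on the same posts. The labels below are invented for illustration.
human     = ["pos", "neg", "neu", "neg", "pos", "neu", "neu", "pos", "neg", "neu"]
automated = ["pos", "neu", "neu", "pos", "pos", "neg", "neu", "pos", "neg", "pos"]

matches = sum(h == a for h, a in zip(human, automated))
accuracy = matches / len(human)
print(f"Automated sentiment agreed with the human coding {accuracy:.0%} of the time")  # 60%
```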
These sorts of results are not unique to Metrica’s research; many other companies have looked into the monitoring platforms and found similar results.
The unsuitability of automated systems for measuring social media in a meaningful manner has meant that there is now a concerted drive to look at setting standards and best practice.
AMEC, having previously succeeded in driving the PR industry forward in its approach to measurement with the Barcelona Principles, is looking to repeat that success with a similar strategy. The organisation has set up a social media measurement standards group working in a coalition with global PR trade bodies, including the Institute for Public Relations and the Council of Public Relations Firms (both US trade associations), and in collaboration with the CIPR and PRCA too. The CIPR has already made great progress, with its research, measurement and planning toolkit updated to include an approach to measuring social media. AMEC has now assembled a team of experts representing companies from across the world to help drive the standards.
Areas of confusion identified by AMEC that will benefit from further clarity include:
Influence – is there a best practice way to identify who has got it?
Sentiment – has your content been analysed by man or machine? Do you know what error rate your approach should expect?
Engagement – what constitutes engagement? Just because someone has clicked on a Facebook ‘Like’ button, does this mean that they engaged with your organisation or brand?
Monitoring/content – what methods have you used to source your content, how fast is the company that you are using at retrieving data, how broad are its searches? Within the content returned, what amount should you expect to be relevant or spam or even porn?
Demographics/target audiences – is there a way to bring target audience information into the reporting?
Interestingly, as marketers got used to the concepts of paid media, owned media and earned media, it became apparent that the PR industry was not the only marketing discipline looking to establish standards. In October 2011, the first Social Media Measurement Standards Conclave was held, bringing together a number of different trade bodies from across marketing to identify areas of potential collaboration and overlap. In addition to AMEC, the IPR, the CPRF and the CIPR, the SNCR, the Web Analytics Association, the IABC and WOMMA were all represented. The Conclave largely agreed with AMEC on the key areas, although with a slightly different emphasis, and is currently focusing its work on:

1. Reach and engagement.
2. Influence and relevance.
3. Value/impact.
4. Content definition.
5. Sentiment and advocacy.
This is all still a work in progress and the results will not be available until after this book’s publication, but it seems clear where the work is likely to take us. As with the Barcelona Principles, there won’t be one number which will work across all social media measurement. Nor will there be one approach which will definitely be the right one. I’d expect to see, as one attendee at the Conclave put it, a GAAP (Generally Accepted Accounting Principles) approach. I expect the standards to be guidelines rather than prescriptive techniques, and a framework approach similar to the Barcelona Principles to be endorsed at AMEC’s 2012 summit in Dublin in June.

Practical next steps

Measuring social media can sound overwhelmingly complex. When confronted with a channel that is still new to many, and with so much content flowing in real time, it can be tempting to settle for the metrics that come bundled with monitoring platforms. My advice is: don’t.
When looking to analyse your coverage, think as you have always been trained to think. Stop and ask yourself what your objectives are. Make sure that you tailor your metrics back around these objectives. Every time you feel you have a potential metric, ask yourself why it is relevant – the ‘so what?’ test. Do this two or three times per metric before you accept it. If it doesn’t pass the ‘so what’ test, it shouldn’t make it into your analytics.
Until AMEC, the Coalition and attendees of the Conclave publish the next steps, I would recommend following Don Bartholomew’s approach. Don blogs regularly and is one of the brains at the forefront of the measurement industry.
Don has adapted the marketing funnel and suggests it should now read:

He also believes that a framework approach is the right one. Below we can see how he suggests appropriate metrics for each of the steps of the funnel and against each of the integrated types of media.

So, for each of your campaigns, in advance of executing any work, think about how what you are trying to achieve will fit into a similar framework. And before you rush out to spend money on the latest all-singing, all-dancing automated platform, ask yourself whether it will really give you the insight and measurement that you need. As always, ask yourself what success would look like, and then think about which metrics would be appropriate to demonstrate this.
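By way of a rough sketch only, a campaign grid in the spirit of this framework approach could be kept as simply as the mapping below. The stage names, media types and metrics are placeholders chosen here for illustration, not Bartholomew’s published grid.

```python
# A campaign-planning grid in the spirit of the framework approach described
# above: (funnel stage, media type) -> candidate metrics. Stage names and
# metrics are illustrative placeholders, not Bartholomew's published grid.
framework = {
    ("awareness", "earned"):  ["unique visitors reached", "share of conversation"],
    ("awareness", "owned"):   ["blog post views"],
    ("engagement", "earned"): ["comments", "shares", "@mentions"],
    ("engagement", "owned"):  ["time on page", "repeat visits"],
    ("action", "paid"):       ["click-throughs converting to enquiries"],
    ("action", "owned"):      ["newsletter sign-ups", "white-paper downloads"],
}

def metrics_for(stage: str, media: str) -> list[str]:
    """Look up the candidate metrics for one cell of the grid."""
    return framework.get((stage, media), [])

print(metrics_for("engagement", "earned"))  # ['comments', 'shares', '@mentions']
```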
The last word should go to Amber Naslund, who co-authored with Jay Baer the recent must-read book The Now Revolution. In a post entitled The Most Powerful Social Media Measurement Tool Money Can Buy, Amber says:
‘With so many pieces of information floating around, we are more pressed than ever to find something, anything that can help us make sense of the mess. Tools and apps and platforms abound, smashing together data with alacrity, and pouring out more data as a response.
Measurement has become almost as bad of a battle cry as ‘influence’ or ‘awareness’ or ‘Community’. We have millions of pieces of information out there, and if we can come up with any way of distilling them into something that feels simple, we cry eureka! and slather it all over our reports like it tells us everything we want to know.
But software and tools and automated rankings and everything of the stripe leaves one feature off the list, the feature that only that Human 1.0 can bring to you: Critical Thinking.
The ability to look at a number and ask hmmm, where did that come from? Is that accurate? Complete? Relevant? Does it matter? Why does it matter, and what other information do I need to pair it with in order to make it matter? What is this number actually telling me, and can I improve upon it by changing how we gather it somehow?
Only the human brain is capable of accurately and consistently critiquing and evaluating some of the most important qualitative things around data: context, nuance, sarcasm, unspoken implication, the dynamics of the ecosystem that sprouted the numbers, the impact that the gathering mechanism has on the numbers, understanding what other numbers and data should be related to one in order to make it potentially meaningful.’
Amber’s advice makes perfect sense. No amount of numbers, scores and indexes is going to tell you whether you are achieving your specific objectives. That takes some thought, a tailored approach and human brain power.
