Vanity Reporting: Feeling Good Doesn’t Get Results

Sep 1, 2020 | Strategy

What looks better to share with the leadership team? “This month we had 10 new followers on our Facebook page” or “This month we increased our Facebook audience by 25%”?

Of course the latter option looks better, so you may be inclined to use it in your monthly reporting. But if you present that percentage, get asked how much that actually is, and have to admit it is an increase of 10, will there be disappointment? Will they think you’re misleading the team?

Consider this example: in your community, would you prefer an announcement that your property taxes are increasing by $250 per year, or that they are increasing by 125% per household over last year? Can both be true? Yes. Depending on what the average bill was last year, a 125% increase could work out to just $250 per household. It’s better to present that as a dollar amount so there are no heart attacks over such a high percentage increase.
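To make the arithmetic behind both examples concrete, here is a minimal sketch; the baselines (40 followers, a $200 average tax bill) are assumptions chosen only so the quoted percentages work out.

```python
# Minimal sketch: the same change expressed as an absolute number and as a percentage.
# The baselines (40 followers, a $200 average tax bill) are assumed values chosen
# only so that the percentages quoted above work out.

def absolute_change(old, new):
    return new - old

def percent_change(old, new):
    return (new - old) / old * 100

# Facebook example: 40 followers growing to 50
print(absolute_change(40, 50))    # 10 new followers
print(percent_change(40, 50))     # 25.0 (% increase)

# Property tax example: a $200 average bill growing to $450
print(absolute_change(200, 450))  # $250 more per household
print(percent_change(200, 450))   # 125.0 (% increase)
```

Same underlying change, two very different headlines.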

It may seem like a paradox, but if you do it right, you can report metrics in a meaningful way that everyone understands and that leaves no room for unintended data lies.

For a strategist, collecting and analyzing the data surrounding initiatives is essential; it tells us whether goals were met, whether a campaign was successful, and how to improve current initiatives. We all want to report good news to stakeholders, but sometimes the numbers just aren’t pretty. This is when the temptation to use vanity reporting sets in.

Like vanity metrics, vanity reporting means using data that “looks good” and “feels good” when the actual reporting metrics aren’t looking so great. It can also be described as looking at the overall data and cherry-picking the most positive statistic when there is a choice between two metrics that explain the same result. Whether it’s a way to sugarcoat bad news or a fear of reporting a less-than-stellar outcome, it may be tempting, but steer clear of it.

Data can be wonderful; in theory it is absolute, and “numbers don’t lie.” But we all know the same data can be interpreted in different ways, and there is always an inclination to use the most positive interpretation.

When it comes to social media strategy, analysis and reporting are vital; they drive decisions, help build and maintain the initiatives that move a business toward its goals, and explain the why behind what the marketing team is doing.

So what’s a strategist to do when it comes to reporting?

Here are some things to keep in mind as you prepare regular reports for team members and key stakeholders:

  1. Make sure expectations and metrics are communicated clearly at the onset. The first part of strategy is planning and goal setting. It is oftentimes the least favorite activity, but it is essential. In this context, the importance lies in setting KPIs with an eye to not only what will be measured, but how it will be measured.

This will take some thought and consideration: which metrics will accurately show whether a KPI was met? Take it one step further and decide how each metric will be presented – will it be a raw number or a percentage increase/decrease? For engagement, are you counting all engagement, regardless of whether it’s a like, share or comment, or does the KPI focus on a specific type of engagement? Take the time to talk it through and decide how you want to report before you begin (a simple way to write this down is sketched after this list). This is just as important as creating the content you publish or any other piece of the process.

  2. Don’t report “just” the metrics. Simply presenting a series of performance numbers with no information or context doesn’t mean anything. Stakeholders may skim the report, see a bunch of increases (or decreases), and make assumptions that aren’t accurate.

Offer explanations that remind readers what each metric measures and why it matters, then close with a description of what the metrics mean in relation to the goals and KPIs.

  3. Metrics do matter, but context matters more. Just like vanity metrics, it’s great when a published post does exceptionally well, maybe sending the “like” count off the chart. Does that automatically mean it was successful and that all future content should replicate it?

Maybe, maybe not. 

Looking at the context behind it is important. Was it a positive spike, or did something happen where the post was taken the wrong way and spiked for negative reasons? Remember this when preparing reports: you may have bad news to share because this month’s KPIs weren’t met, but adding context can offer an accurate explanation without resorting to misleading statements or suggestions. I am not saying that context can “explain away” a less-than-desirable outcome, but it is good insight to look at and share with stakeholders whether the metrics are trending positive or negative.
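Coming back to the first point above, a measurement plan can be written down explicitly before any reporting begins. The sketch below is one hypothetical way to do that; the KPI names, metrics, and targets are illustrative assumptions, not a prescribed format.

```python
# A hypothetical measurement plan, agreed on before reporting begins.
# KPI names, metrics, and targets below are illustrative assumptions only.
kpis = [
    {
        "name": "Grow Facebook audience",
        "metric": "net new followers",
        "reported_as": "absolute number plus % change vs. prior month",
        "target": "+10 followers per month",
    },
    {
        "name": "Increase Instagram engagement",
        # decide up front whether all engagement counts, or only a specific type
        "metric": "likes + shares + comments",
        "reported_as": "% change vs. prior month",
        "target": "+20% per month",
    },
]

for kpi in kpis:
    print(f'{kpi["name"]}: measure {kpi["metric"]}, '
          f'report as {kpi["reported_as"]}, target {kpi["target"]}')
```

Writing it down this way forces the “how will it be measured” conversation to happen once, at the start, instead of at reporting time.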

So, how can you avoid vanity reporting when analyzing social initiatives?

  1. Stick to the goals. When creating goals, the M in SMART stands for measurable. Not only does a KPI need to be measurable, but the measurement should be decided beforehand, and then you should stick with it. If the goal is to increase engagement on an Instagram account, for instance, are you targeting X shares and likes per month, or a 20% increase in overall engagement? It will depend on the situation, but once the measurement is decided, that is how it should be reported, consistently.
  2. They say to look for the silver lining, but not when it comes to data. While looking at the context may provide insight into the reporting metrics, it should not be used as a silver lining (“hey, we didn’t meet our CTR goal, but we did get more likes, so let’s focus on that for now”). That doesn’t explain why the CTR goal wasn’t met, though it could be helpful to know for future initiatives.
  3. If data collection changes, so should your reporting. Use metrics to make changes, but if you change the metrics, change them everywhere, including in how you look at past data. I hate using this example, but it might be helpful. Stripping it down to the numerical analysis, consider the tracking of the coronavirus (I’m sorry, but stick with me).

At the onset, the daily number of positive cases was announced, and we used its rise or fall as an indication of the spread of the virus. A couple of months in, things shifted and testing increased significantly; you know what happened next – the more tests, the more opportunities for positives. Looking at the number of daily positives was no longer an effective way to understand spread. In fact, it may have looked like there was a huge spike with no context behind it.

A better way to look at the data at that point, given the change, is the percentage of positives relative to tests given. This is more accurate and gives a better sense of what might actually be happening.

However, the change needs to be made with some parameters in place to avoid confusion and to ensure it doesn’t look like an attempt to suppress data. It has to be clearly stated WHY the reporting metric is changing and what implications it may have, and then past data should be recalculated using the new metric (percentages instead of raw positives). The data can then be evaluated with this new metric over time. We may find things have really improved, or maybe not, but at least we can now look back on a level playing field.
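As a hypothetical illustration of that recalculation (the daily figures below are invented purely for this sketch), switching from raw positives to a positivity rate can flip the story:

```python
# Hypothetical illustration of recalculating past data under the new metric.
# The daily figures are invented for this sketch only.
daily = [
    {"tests": 1_000, "positives": 100},  # early period: limited testing
    {"tests": 1_000, "positives": 110},
    {"tests": 4_000, "positives": 300},  # testing expands: raw positives jump
    {"tests": 5_000, "positives": 320},
]

for day in daily:
    rate = day["positives"] / day["tests"] * 100
    print(f'{day["positives"]:>4} positives, {rate:5.1f}% positivity')

# Raw positives roughly triple over the period, but the positivity rate
# falls from 10.0% to 6.4%, which tells a very different story.
```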

A different example, related to social engagement, might be an initial goal of increasing engagement at a rate of 20% per month. As time goes on, the focus may shift because the content being published is changing, and its goal is now to drive people to click through to the website and register for an upcoming webinar.

In this case, measuring all engagement isn’t effective anymore – shares and likes may indicate interest, but we don’t know whether those people clicked through, or whether the people they shared it with did. It’s part of the story (tracking interest in the webinar), but it doesn’t accurately measure the goal of this initiative (clicking through to the registration page). Going forward, only URL clicks should be measured – we still won’t know for certain whether people registered after getting there, but it’s a signal of intent.

When reporting this month, it isn’t helpful to measure against the old goal of increasing engagement by 20% – you’re now focused on one type of engagement, clicks, so the numbers may look very different this time around. Instead, you’ll have to start fresh, or go back and pull only the click engagements out of past data if you want a true comparison with prior months. You may need to do this anyway so you know the baseline CTR before setting goals for future growth.
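Here is a hypothetical sketch of that baseline exercise; the field names and figures are invented for illustration and assume per-month impressions and URL clicks are available from the platform’s analytics export.

```python
# Hypothetical sketch: pull click-only engagement out of past data to set a baseline CTR.
# Field names and figures are invented; they assume an analytics export with
# per-month impressions and URL clicks.
history = [
    {"month": "June", "impressions": 20_000, "likes": 900,   "shares": 150, "url_clicks": 240},
    {"month": "July", "impressions": 22_000, "likes": 1_050, "shares": 180, "url_clicks": 220},
]

for month in history:
    ctr = month["url_clicks"] / month["impressions"] * 100
    total_engagement = month["likes"] + month["shares"] + month["url_clicks"]
    print(f'{month["month"]}: {total_engagement} total engagements, '
          f'{month["url_clicks"]} clicks, CTR {ctr:.2f}%')

# June: 1290 total engagements, 240 clicks, CTR 1.20%
# July: 1450 total engagements, 220 clicks, CTR 1.00%
# Total engagement rose month over month, but the metric that now matters for the
# webinar goal (click-through) slipped; that is the baseline to report against.
```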

Reporting and analysis are a vital part of the strategic process. Not only do they allow for data-driven decisions that improve the strategy itself, they legitimize the work that we do and give stakeholders the information they need to execute strategy and meet business goals. Making this part of the process as objective and well-constructed as possible will lead to successful marketing initiatives, and your team will thank you for it.

Author: Marianne Hynd, SMS

Marianne Hynd is the Director of Operations at the Social Media Research Association, a global trade organization dedicated to forming a community of researchers who aim to define & promote best practices and share ideas to help enhance the effectiveness and value of conducting research using social media.

Take a listen to the SMRA Podcast featuring fellow NISM board member Joe Cannata.
