1. If you aren’t measuring anything else, social media measurement isn’t the problem.
Measurement is a discipline, and it needs to be business-wide. If you’re going to ask about the ROI, value, or impact of social media and how to measure it, I’m going to ask how you’re going about determining those things for other areas of your business, and ask you to translate or adapt some of those practices over to social initiatives.
If you’re not measuring anything else, you’ll have a learning curve. A steep one. It’ll come complete with needing the right tools and platforms to collect data, the right people to analyze it, the buy-in from management to spend the time doing all of this, and the commitment to use the measurement as a means to underscore your strategy. The social media data is available for the taking, so that’s not the problem. The *real* issue is connecting the dots. See #4.
2. Measurement is not the goal.
The goal is to derive insights that teach you something of value, and then act on them. Measurement is a waystation, a path, but is not the goal in itself. You don’t get a cookie for measuring.
You probably need to spend three times as much time and effort evaluating and acting on your data as you do collecting and formatting it. Why? Because the analysis is what yields direction, plans, action steps, you name it. You START with the data. You need to end up with a course of action, or the act of measuring (and all the time you spent doing it) is wasted.
3. Measuring activity isn’t as important as measuring results.
Gathering fans on Facebook is an activity. How those fans chose to respond to your offer, sign up for your newsletter, or buy your product (or not) is a result.
Number of forum posts is an activity. How many of those forum posts converted into new downloads of your latest ebook is a result. (Even better if you can take it a step further and show the ebook downloads that became leads).
Follow me? We’re very caught up in trying to track all the stuff we’re doing, and not spending enough time connecting dots between those actions and how they drive progress toward the goals we’ve set. Speaking of which…
4. Metrics are determined by goals.
Learn how to create measurable objectives and the metrics practically jump out at you. If you know where you’re headed and have a clear definition of what it means to reach your goal, it becomes pretty apparent which signals (metrics) will tell you whether you’re close, far, or right on target.
And you don’t need 40 different metrics to underscore a hypothesis or progress toward a goal. Typically you just need a few. If your goal is to raise awareness for a cause, you can look at reach of mentions and messages, increased donations, or a surge in volunteer signups. If those go up, you can be reasonably certain that what you’re doing is contributing to them, and can likely justify staying the course. Which leads me to a biggie…
5. Cause and correlation are different.
Cause means that something you did drove someone to act. Directly, and usually singly. There’s a clear line between initiative and result. (We could argue that nearly every causal relationship has external influences, but that’s a discussion for a headier day).
Correlation is fuzzier, and where most folks get hung up with measurement. It’s about a relationship between two things, usually an action and a result, but that relationship isn’t exclusive of other factors.
We struggle with these two, because we’re often trying to prove cause, when correlation can be just as valuable in terms of justifying our efforts. Think of correlation as “contributing to” or “influencing”. So if you do an outreach campaign in social media and lead numbers through those channels increase, you can say that those two events are likely strongly correlated. (By contrast, if you do a campaign in social media and your offline event attendance increases, they might still be related, but likely more loosely).
Remember that today, we have any number of points where prospects and customers can be impacted by what we do. Proving cause can be tricky, because you can’t trace every interaction someone has with your company.
But we strive for cause why? Because we want CREDIT. We want to be able to say that OUR effort is what moved the needle so we can justify time, budget, headcount. But the only way to truly prove cause to a major degree is to adjust ONLY one thing while leaving all other factors the same. We rarely if ever do that in business, because we’re not conducting science experiments. We’re simply trying to understand what helps and what hinders. Get comfortable with this phrase: reasonable degree of confidence.
6. Analysis is the hard part, not measurement.
The human brain factor is the complicated bit. Data is easy to collect, easy to smash together, easy to do math around. The REAL question is: what does all this MEAN to me and why? What does this tell me about the effect and impact of my actions?
That’s the hard part because no tool in the world can do that for you. No case study will show you precisely the map you need to follow for YOUR business (though it might spark some ideas). No one person can hand you a turn-key set of metrics that will suddenly give you a lightbulb moment and show you the path ahead of you. Put the effort into goal setting on the front end and analysis on the back end, and let measurement be a process in between.
7. Standardization has limitations.
You might have some *types* of metrics that can be bucketed together – such as engagement or awareness metrics – but the unique ones that matter to your business aren’t likely to be standardized anytime soon. That’s a departure from the way we’ve always done it, but then again, some of our “standard” metrics haven’t really gotten us very far (like ad equivalency) and others are standard in name, but not in how they’re calculated (like customer satisfaction).
Instead of striving for metrics that are universally applicable, focus instead on the ones that consistently deliver valuable intelligence for your business. It doesn’t matter what the guy down the street is measuring unless you’re just looking for a little inspiration.
8. Reporting is not an outcome.
Related to #2, delivering the graph isn’t the end of the road. What matters is what you outline as the next steps: either a) keep doing what you’re doing, or b) adjust something in order to try to change the results.
The report, in fact, is often the starting line. And reports full of data alone aren’t very useful. The art in reporting isn’t just packaging the information, but interpreting and translating it. When you give your boss the monthly report of PR impressions or lead volume, do they ever ask you what you attribute those numbers to, and what recommendations you would make based on that information? Have YOU ever thought about that? If not, why not?
9. Measurement doesn’t have to be complex to be effective.
You don’t necessarily need convoluted indices to get you where you’re headed, especially when you’re starting. Sometimes, just a simple correlation between an awareness metric and a sales timeline can tell you whether there might be a positive relationship, and you can act on that. Think of it this way: pair one qualitative metric – like customer satisfaction – with one related quantitative one, like sales or call center costs or website hits. One metric alone rarely tells you anything valuable.
Are they both headed in the direction you want them to be? Over time, do you see them moving together, away from each other, or in unrelated ways? Do the strategies you have in place to move them both tie into one another?
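That “moving together” check can be sketched in a few lines of code. This is a hypothetical illustration, not a prescribed tool: the metric names and monthly numbers below are invented, and Pearson correlation is just one simple way to quantify whether two paired series trend together.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series.

    Returns a value between -1 (moving in opposite directions) and +1
    (moving together); near 0 means no linear relationship.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    std_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    std_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (std_x * std_y)

# Invented monthly data: a qualitative metric (satisfaction score, 1-10)
# paired with a quantitative one (units sold).
satisfaction = [6.8, 7.1, 7.4, 7.2, 7.9, 8.1]
sales = [102, 110, 118, 115, 131, 140]

r = pearson(satisfaction, sales)
print(f"correlation: {r:.2f}")  # values near +1 suggest the metrics move together
```

A high coefficient here doesn’t prove the satisfaction work caused the sales, per #5; it just gives you a reasonable degree of confidence that the two are related and worth watching as a pair.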
CAN measurement be complex? Sure. Some really detailed measurement formulas can help you get super scientific and granular. But again, if you’re getting mired in the process of measurement instead of the practice of deriving some intelligence from what you measure, you’re doing it wrong. The average business simply needs a guide, not a dissertation.
10. Measurement is a constant evolution.
You set a goal. You back out a few metrics. Then you evaluate, and realize you haven’t learned anything of value, or that you need more clarity, more specifics, a broader view, or whatever. That’s okay. Look, business is an iterative process. It’s part art, part science, and so is measurement. Who wrote the rule that said we had to have the perfect, bulletproof set of metrics before we start measuring?
If something doesn’t get you the information you need, change what you’re doing and try something else. If you’re missing something, add it. Eventually, you’ll settle into a few combinations of metrics that really illustrate to you those Almighty Actionable Insights.
We’re way too caught up with being perfectionists about gathering and presenting information, and not nearly good enough with FRAMING the information in a way that gives us something to chew on.
11. Measurement is cultural as well as operational.
We’re taught to fear failure, so if we track and measure failure, we don’t want to share it. We manipulate numbers to show our work in its best light, instead of showing the hard truth in order to identify what we need to improve to be more effective. That’s a *culture* problem, based in businesses where accountability is absolute, blame is personal, and failure is a dirty word. That’s a problem that can’t be fixed with a PPT presentation.
12. Measurement is more than ROI.
Measuring ROI is something we can and should do. Track how much we spend (in time and capital), track how much we net in terms of return (usually $$). That’s a smart move.
But we can’t limit the discussion about measurement to ROI. We have to talk about qualitative metrics, like brand perception, customer satisfaction, advocacy. We have to talk about quantitative metrics that tie to things other than revenue, like reduced costs. We also have to understand the difference between justifying something from a “good use of time” perspective, and looking at a financial return as the way of determining success.
Ultimately, all roads lead to Rome. But so much of social media isn’t about being the sales channel; it’s about positively impacting the likelihood of sales through all the other experiences.
13. “Social media isn’t measurable” is an excuse.
Here’s what people really mean:
- I don’t have the right tools in place to collect the data I need
- When I have all the data, I don’t know where to start
- I don’t know which pieces of data might relate to one another, so I can’t analyze them well
- I don’t want to or am not empowered to spend time doing data collection and analysis as part of my job
- I’m afraid of what measuring will actually tell me about the effectiveness of my work
The first one is a functional problem. The second and third ones are knowledge-based, with no exact “right” answers, and require a bit of practice and applied effort, but they’re solvable too. The last two are cultural, and are probably much more firmly rooted in the people than in the process. That’s a different discussion.
Above all, we have to stop blaming the medium for hindering the measurement process. It’s not social media’s fault at all. If anything, it’s guilty of providing us too much information.
What we need to understand about our own measurement practices is whether we’re equipped with the right tools and data, whether we’re willing to spend the time evaluating that data and extracting the juicy bits, and whether we’re functionally and culturally prepared for what it might show us, for better or for worse.
But make no mistake, folks, basic social media measurement isn’t someone else’s responsibility to sort out for us. And waiting for the manual is simply burning time and money.
Measurement is our job. It’s our responsibility. And it’s within our capabilities, without doubt. So let’s get cracking.
image credit: BigTallGuy