Digital marketers have a problem: We’ve got too much data.
It sounds like a ridiculous complaint coming from a data analyst, but it’s true.
Google Analytics alone has more than 150 default metrics, which can be explored with more than 100 dimensions. And that’s without including advanced implementations.
Currently, a default Facebook export includes seven spreadsheets with 10 or more columns of data each. A default Twitter export includes 40 columns of data.
And we’re expected to choose one KPI?
Having this much data was supposed to make marketing easier. And while companies with the resources to hire data science experts have prospered from ubiquitous measurement, the small business marketer often doesn’t know where to begin.
It’s overwhelming. There’s so much of it, and it seems to be constantly changing. So most of us end up running the same reports over and over, instead of exploring and asking questions.
And when we don’t ask questions, we can miss important changes and new ideas that could help us better serve our audiences.
I’ve written before about how marketers can use prediction to improve their intuition about what works in their marketing.
But today, I’d like to talk about some smaller, more fundamental mindset shifts that can help you cut down on the overwhelm and be more curious about your data — whether it’s website analytics or Facebook ad performance.
Marketing analytics is not (necessarily) science
For a few years now, A/B testing has been the darling of the digital analytics scene.
Want to improve your landing page? Test it! Do your customers respond better to an orange or a purple button? Test it! Should your website home page use a photo of a cute puppy or a smiling little girl? Test it!
It’s a great thought.
Wouldn’t it be excellent if we could conduct conclusive experiments that would tell us exactly how to market to our audiences?
For several years, I was a strong believer in the power of A/B testing. I enthusiastically hypothesized dozens of tests. But more often than not, whenever I ran one, the results were inconclusive.
Usually, the problem was that the sample size was too small to reach statistical significance. Other times, there was a variable beyond my control — a traffic fluctuation or a website issue.
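To see why the sample-size problem bites so hard, it helps to run the numbers. Here's a minimal sketch in Python using the standard two-proportion, normal-approximation formula for sample size; the 2% baseline conversion rate and 10% relative lift are hypothetical, but they're in the range many small sites deal with:

    from scipy.stats import norm

    def visitors_per_variant(base_rate, relative_lift, alpha=0.05, power=0.80):
        """Rough visitors needed per variant to detect a relative lift in
        conversion rate (two-sided z-test, normal approximation)."""
        p1 = base_rate
        p2 = base_rate * (1 + relative_lift)
        z_alpha = norm.ppf(1 - alpha / 2)  # 1.96 for a 5% significance level
        z_beta = norm.ppf(power)           # 0.84 for 80% power
        pooled = (p1 + p2) / 2
        return ((z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
                / (p2 - p1) ** 2)

    # A 2% baseline conversion rate and a hoped-for 10% relative lift:
    print(round(visitors_per_variant(0.02, 0.10)))  # roughly 80,000 per variant

If your landing page gets a few hundred visitors a week, you can see why so many tests never get anywhere near a conclusive result.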
Eventually, I realized that the problem wasn’t the method.
A/B testing just wasn’t always a good fit for the questions I had about what worked in our marketing.
In grade school, we learned that the scientific method depends on having a controlled environment. In digital marketing, that kind of controlled environment is almost impossible to achieve.
That doesn’t mean we should never borrow from academia. We should.
But we have to be cautious about applying it. Because marketing isn’t a controlled experiment — it’s messy, complicated, unpredictable real life.
Some metrics don’t matter
Remember those bloated Facebook and Twitter reports I mentioned in the introduction?
Here’s a secret: I use less than half the metrics in those reports.
If I don’t know specifically what a metric means, or if its definition is too nebulous, I won’t use it. If a metric is something I could calculate myself, I don’t need Facebook to calculate it for me. If it’s not relevant to what I’m measuring, I delete it.
I didn’t always do this. When I first started creating digital marketing reports, I looked at everything. I worried that if I didn’t, I’d miss something vital.
But if a metric isn’t relevant to what you’re measuring, it’s only going to make it harder for you to see important trends. If you don’t know how a metric is calculated, or what it specifically refers to, it will only make it harder for you to spot patterns.
Before you freak out about discounting all that data, remember who developed those reports.
Facebook, Twitter, and Google can’t predict what’s going to be important to you. Only you can know that.
Decide in advance what’s important to you, then delete what’s irrelevant, without worrying that you’re going to miss something.
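If you work with these exports in code rather than a spreadsheet, "decide first, then delete" can be as simple as keeping a whitelist. Here's a minimal sketch using pandas; the file name and column names are hypothetical stand-ins for whatever you've decided matters to your own reporting:

    import pandas as pd

    # Columns decided on in advance (hypothetical names).
    KEEP = ["Date", "Impressions", "Link Clicks", "Amount Spent"]

    df = pd.read_csv("facebook_export.csv")
    report = df[[col for col in KEEP if col in df.columns]].copy()

    # Anything you could calculate yourself gets computed here instead of
    # trusting a pre-baked column, so its definition is never nebulous.
    report["Cost per Click"] = report["Amount Spent"] / report["Link Clicks"]
    print(report.head())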
Don’t ignore your confirmation bias; use it
I used to take my role as an “impartial observer” seriously. I tried to eradicate every bit of bias as soon as I noticed it, so it wouldn’t affect my work.
But over time, I realized that forcing yourself to be impartial is like trying to meditate by forcing yourself to think about nothing. It doesn’t work, and besides that, it’s missing the point.
Meditation isn’t about having an empty mind; it’s about observing your thoughts, so you can learn to detach.
Confirmation bias isn’t good or bad. It’s simply how the mind works. But when you try to pretend it doesn’t exist, you’re more susceptible to it.
Instead, learn to identify your confirmation bias, and then use it to ask better questions.
Before you even start looking at the metrics, ask yourself, “What am I hoping to find in the data?”
Then, once you know what conclusions you’re inclined to find, turn them around. Play devil’s advocate and make a conscious effort to fight for the “other side.” What results would you expect to see if you were wrong?
When you shift your perspective like this, it opens your mind to patterns you might not have caught through your lens of confirmation bias.
Seek good questions, not good answers
If there’s one lesson I’ve learned working with digital marketing data, it’s that you have to be a perpetual skeptic. Of the metrics, of your reporting, of yourself.
If it feels like you’re always questioning everything, you’re doing it right.
But don’t let this paralyze you. It’s easy to want a “final” answer, even though you’re never going to get one.
The world, the platforms your marketing relies on, and your customers are perpetually changing. And that’s not going to change anytime soon.
The only way to stay open to those changes — to notice them and adapt — is to seek good questions, not just good answers.
Reader Comments (3)
sweta says
Data saturation is everywhere. We’ve often had the belief that more is better; however, that actually isn’t true in the case of data. The rapid rise in our ability to collect data hasn’t been matched by our ability to support, filter and manage the data. As an example, think about the first problem that people complain about when a city experiences great growth – the roads are too crowded. The infrastructure can’t keep up.
Loryn Cole says
Totally agree, Sweta. Also, just because something is easy to measure, doesn’t mean it’s useful.
JD Ebberly says
I know what you mean. More data is probably way too much data. It’s just like all the explosive growth down here in the Austin, TX region: way too much traffic, and no way these roads can handle it. Result: gridlock and a multi-hour trip just to buy groceries. Sometimes I feel like there is way too much data online.