All That Glitters Isn't Gold

September 24, 2019 | Ryan Hurley

In a digital world where virtually everything is quantifiable, analysts, marketers, and business stakeholders alike turn to data to inform their opinions and tell their stories.

Whether it’s to validate a strategic decision, confirm a hypothesis about a test plan, or simply report on the performance of an initiative, marketers are hungry for insights to guide their decisions. (If you’re reading this, you’re likely one of them.)

But despite the abundance and availability of often persuasive performance metrics, it’s best to keep your guard up. For every golden nugget the data turns up, there are entire sifters full of fool’s gold.

Though it runs counter to what many case studies suggest – and what many stakeholders would like to believe – large increases in performance are rare. Actionable insights and findings are typically incremental, and the exceptions are almost always artifacts of misinterpreted data. In other words, they’re mirages.

I’m often reminded of the ad below for Adobe, which, while exaggerated for comic effect, captures the risk of failing to scrutinize digital marketing data.

This is why we take a posture of open-minded skepticism when we interpret results. We embrace the null hypothesis; that is, we start from the assumption that nothing “new” is happening in our data, and we interrogate the findings until the evidence is strong enough to prove that assumption wrong. This approach is especially necessary when the results are strong and exciting. It’s better not to find gold at all than to elevate and celebrate a pound of fool’s gold.
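For readers who want to see what that posture looks like mechanically, here is a minimal sketch in Python: a two-proportion z-test on an apparent lift in click-through rate. The numbers are invented, and the choice of test is just one illustration of “assume nothing new until the evidence says otherwise,” not a description of any particular toolset.

```python
# Minimal sketch: is this month's click-through rate genuinely better than last month's,
# or is the "lift" consistent with the null hypothesis that nothing has changed?
# All counts below are hypothetical.
from math import sqrt, erf

def two_proportion_z_test(clicks_a, impressions_a, clicks_b, impressions_b):
    """Return (z, two-sided p-value) for H0: both periods share the same true CTR."""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    # Pool the rate under the null hypothesis of a single underlying CTR.
    p_pool = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical campaign counts: last month vs. this month.
z, p = two_proportion_z_test(clicks_a=420, impressions_a=50_000,
                             clicks_b=480, impressions_b=52_000)
print(f"z = {z:.2f}, p = {p:.3f}")
# A CTR of 0.92% vs. 0.84% looks like a win, but p is roughly 0.16 here:
# not nearly enough evidence to reject "nothing new is happening."
```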

When we dig deeper in this way, we often come to conclusions that run counter to what a surface analysis may suggest. For example:

  • A sudden increase in ad click-through performance is a great story to share with colleagues, but discovering it’s driven by fraudulent activity can really flatten the excitement.
  • A drop in website engagement can be alarming, until you realize it’s simply a shift in the mix of traffic sources (the sketch after this list walks through the arithmetic).
  • A new offer in the market failing to gain measurable traction? A tracking requirement that no one knew about could be the culprit.
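To make the second scenario concrete, here is a hypothetical back-of-the-envelope sketch in Python. The channel names, traffic volumes, and engagement rates are all invented; the point is only that a blended average can fall even when no individual channel gets worse.

```python
# Hypothetical illustration of a traffic-mix shift:
# engagement within each channel is unchanged, yet the blended average drops,
# because a larger share of visits now comes from the lower-engagement channel.

def blended_engagement(visits_by_channel, engagement_by_channel):
    """Traffic-weighted average engagement rate across channels."""
    total_visits = sum(visits_by_channel.values())
    return sum(visits_by_channel[ch] * engagement_by_channel[ch]
               for ch in visits_by_channel) / total_visits

engagement = {"organic": 0.60, "paid_social": 0.30}  # per-channel rates, held constant

last_month = {"organic": 80_000, "paid_social": 20_000}
this_month = {"organic": 80_000, "paid_social": 60_000}  # paid traffic grew

print(blended_engagement(last_month, engagement))  # 0.54
print(blended_engagement(this_month, engagement))  # ~0.47: looks like a decline,
# even though no channel actually performed worse.
```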

The purpose of this analysis isn’t to skirt accountability for bad results, downplay good ones, or cherry-pick data points that support our priors. It’s to be right. Poorly founded insights steer us, and our stakeholders, off course. But well-vetted data allows us to act with greater confidence, and sometimes even discover solid, useful insights that are hidden from view. In data analytics, all that glitters isn’t gold. And all that’s gold doesn’t glitter.