Ever since the world’s first banner ad in 1994 asked users to click “right here,” clicks have been the default measure of success online.
But clicks, it turns out, are a poor indicator of how successful one's marketing really is. Consider a consumer who saw 10 digital ads before purchasing: by measuring clicks, you can't definitively attribute credit to any one of those individual ads. That makes it difficult to determine which ad to replicate in the future.
Multi-touch attribution — an alternative to “last click” that awards value to each digital ad a consumer sees — is an advertiser’s dream. But collecting the depth of data to measure every factor affecting a buyer’s decision is not possible in reality.
However, lift measurement — a staple of TV and billboard ad measurement for decades — is a much better gauge of online ad performance than any click-based attribution. While digital advertising may be a quantum leap over traditional media (marketers can target precisely who sees their campaigns), lift measurement is one case in which TV and billboard advertising had it right.
How lift studies for TV and billboards work.
Whether examining a drug treatment or an ad campaign, the best way to measure success is via a randomized control trial (RCT). When applied to traditional offline advertising, the preferred RCT is a geographic lift measurement, in which one regional area is put in a treatment group and shown TV commercials or billboard ads. Meanwhile, a similar regional area acts as a holdout or control group and isn’t shown any advertising.
If sales jump in the treatment group relative to the control group's baseline, the campaign was successful. The size of that rise in sales, commonly known as lift, tells you whether the advertising spend was worth it.
For example, imagine if McDonald's ran TV commercials in Cincinnati for its McRib sandwich but didn't run them in Cleveland (assuming customers in those cities show similar purchasing behavior). If McDonald's found that sales were up 20 percent in Cincinnati versus Cleveland, it could surmise that the commercials drove that 20 percent increase.
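The arithmetic behind a geographic lift readout is straightforward. A minimal sketch, using hypothetical sales figures for the two markets (the numbers are illustrative, not real McDonald's data):

```python
# Hypothetical sales during the campaign period (illustrative numbers only)
treatment_sales = 120_000  # Cincinnati, where the commercials aired
control_sales = 100_000    # Cleveland, the holdout market

# Lift: incremental sales relative to the control baseline
lift = (treatment_sales - control_sales) / control_sales
print(f"Lift: {lift:.0%}")  # → Lift: 20%
```

The control market supplies the counterfactual: whatever Cincinnati sold above Cleveland's baseline is credited to the campaign.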
Digital gave birth to a flawed attribution approach.
Lift measurements aren’t perfect. Other factors could have increased McRib sales in Cincinnati. Maybe an influential radio DJ talked up the McRib during the morning commute. Also, separating by geography means the groups may not be exactly comparable, due to regional differences in personal preferences, culture, weather, etc.
Regardless, lift studies are far more accurate than click-through rates. Clicks don’t tell you much about whether an ad drove incremental revenue. So why are marketers so focused on them?
One reason is that search pioneers decided that pay-per-click was the best model for sales. It makes sense: clicks provide solid proof that a user responded to an ad. This model underscored the advantage search advertising had over display advertising in identifying short-term intent. Even better, marketers could analyze the on-site behavior of said users.
In the ideal scenario for this model, a consumer sees an ad for a sweater, clicks on it, and buys the sweater online. While that certainly happens, a consumer can also view the sweater ad, think about it for a week, look at other sweater ads, forget about the sweater altogether, and then visit the site and buy the sweater. Or maybe it was the combination of online ads and a billboard that sent the consumer down the purchase funnel. The picture isn’t a clear one for marketers.
TV and billboards have always had one glaring problem: the marketer can’t tell who saw the ads. After all, you can’t click on a billboard or TV commercial. With digital ads, the advertiser does know (at least on an anonymized basis), which creates an opportunity for much better measurement.
But, many advertisers falsely assume that because a consumer clicked on an ad, that ad led to whatever behavior followed. Confusing correlation with causation is what’s holding most marketers back from truly understanding the data behind their campaigns and which ones are proving successful. Those who wish to overcome this challenge should take a hard look at their current measurement practices. Then they should begin experimenting with lift tests.
Lift in a digital world.
The advantage of doing lift measurements in the digital realm is that audiences can be truly randomized and used as part of continuous testing and measurement. Analysts need not find a new city to use as a control group, as they would for billboards and TV commercials. This is particularly true for some of the largest online retail brands. It's no real headache for L.L. Bean or Adidas to withhold digital ads from 10 percent of their online audiences.
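In digital, the split can be made at the level of individual users rather than whole cities. One common approach, sketched here with a hypothetical 10 percent holdout and a made-up user-ID scheme, is to hash each user ID so assignment is effectively random yet stable across sessions:

```python
import hashlib

HOLDOUT_PCT = 10  # withhold ads from 10% of the audience

def assign_group(user_id: str) -> str:
    """Deterministically bucket a user into 'holdout' or 'treatment'."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100  # roughly uniform bucket in [0, 100)
    return "holdout" if bucket < HOLDOUT_PCT else "treatment"

# The same user always lands in the same group, so the holdout
# stays clean for the life of the test.
assert assign_group("user-42") == assign_group("user-42")
```

Hashing rather than storing a random flag keeps the assignment stateless: any ad server that sees the user ID reaches the same verdict, which is what makes a persistent, clean control group practical at scale.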
Taking a continuous approach to lift tests is the key takeaway. It's critical to create these holdout or control groups continuously, not just once every six months or once a quarter, because the revenue from your holdout group provides your baseline (it represents the revenue you'd make without advertising). That baseline will vary over time due to seasonality, competition and changes to your products. Continuous digital lift measurements provide the most accurate baseline. From that you can calculate the true incremental return on your advertising.
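With a continuously maintained holdout, the incremental-return calculation falls out directly: subtract the holdout baseline from treatment revenue, then divide by spend. A rough sketch with hypothetical per-user figures (groups normalized to the same size):

```python
# Hypothetical per-user revenue from a continuously maintained split
treatment_revenue_per_user = 5.60  # users eligible to see ads
holdout_revenue_per_user = 4.80    # baseline: users shown no ads
ad_spend_per_user = 0.50

# Revenue the ads actually caused, above the no-ads baseline
incremental_revenue = treatment_revenue_per_user - holdout_revenue_per_user
# Incremental return on ad spend: caused revenue per dollar spent
incremental_roas = incremental_revenue / ad_spend_per_user
print(f"Incremental revenue per user: ${incremental_revenue:.2f}")
print(f"Incremental ROAS: {incremental_roas:.1f}x")
```

Note that the holdout's revenue is not zero: people buy without seeing ads, and it's exactly that organic baseline that click-based attribution silently takes credit for.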
Unfortunately, many marketers are still focused on the wrong metrics, with no ability to show the effectiveness of their campaigns when it comes to the bottom line. But, measuring and optimizing your budgets based on lift measurement finally closes this loop. It directly ties the performance of individual campaigns to the revenue they provide. For CMOs, this is the holy grail of measurement insight.
While this approach isn’t new, it is hiding in plain sight for digital advertisers. Billboard, print and traditional TV advertising budgets may be in decline, but the tried-and-true method of lift testing to measure ad performance is timeless. So, marketers, look to the past to update your measurement strategy today. Stop prioritizing clicks, and you’ll gain tremendous insight into the performance of your campaigns.