6 Things You Need to Know About Incrementality


For years, marketers have relied on simple metrics like ROAS and CPA that are clean and easy to interpret. But there’s a problem: These metrics can’t tell you if your ad spend actually made a difference.

Incrementality asks the harder, more important question: Would this result have happened if we hadn't invested in this channel? Answering that question is what separates "we got lucky" from "we made a great investment."

The trouble with incrementality is that many marketers struggle to measure it accurately. A performance agency we spoke with reported that less than 1% of their brand clients had proper incrementality testing in place, leading to an estimated 10–30% of budgets being misallocated.

If you want to move beyond guesswork, here are six things you need to know.

Incrementality is a curve, not a number

More precisely, incrementality is measured as a curve based on the decaying effect of an ad. The impact of an ad starts to decay immediately after it is seen.

When platforms report incrementality, they don't always tell you where on the curve they're measuring. Is it a 7-day window? 30? If you're comparing across channels, this really matters.
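To see why the window matters, here's a minimal sketch, assuming (purely for illustration, not as any platform's actual model) that an ad's incremental effect decays exponentially with a hypothetical 7-day half-life:

```python
import math

def cumulative_effect(window_days, half_life_days=7.0):
    """Fraction of an ad's total incremental effect captured within
    `window_days` of exposure, under assumed exponential decay."""
    decay_rate = math.log(2) / half_life_days
    return 1 - math.exp(-decay_rate * window_days)

for window in (7, 30):
    share = cumulative_effect(window)
    print(f"{window}-day window captures {share:.0%} of the total effect")
```

Under these assumptions, a 7-day window captures only half of the total incremental effect while a 30-day window captures nearly all of it, so the same campaign can look very different depending on which window a platform reports.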

This is especially painful for marketers because they usually want to know right away whether a campaign is going well. Did the new creative hit? Is this audience worth retargeting? Are we up or down this week? Incrementality doesn’t operate on that kind of instant gratification timeline.

You can’t tack incrementality onto a weekly campaign dashboard. By design, incrementality requires time, control groups, and a well-structured test. It’s less about what happened yesterday and more about answering, “Did this strategy actually make a measurable difference?”
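The readout from such a test is simple once the control group exists. This sketch uses entirely hypothetical numbers (an exposed group and a holdout group) and skips the significance checks a real test would need:

```python
def incremental_lift(treated_conversions, treated_size,
                     control_conversions, control_size):
    """Absolute incremental conversion rate, and the share of
    treated conversions actually caused by the ads."""
    treated_rate = treated_conversions / treated_size
    control_rate = control_conversions / control_size
    lift = treated_rate - control_rate       # conversions the ads added
    incrementality = lift / treated_rate     # share that was truly incremental
    return lift, incrementality

# Hypothetical test: 500k exposed users, 100k holdout
lift, inc = incremental_lift(6_000, 500_000, 1_000, 100_000)
print(f"Incremental conversion rate: {lift:.2%}")  # 0.20%
print(f"Incrementality: {inc:.0%}")                # 17%
```

In this made-up example, only about 17% of the exposed group's conversions were incremental; the rest would have happened anyway, which is exactly what ROAS and CPA can't tell you.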

The right question comes before the test

Here’s where many marketers go wrong: They rush to set up an incrementality test—control groups, creative, the whole nine yards—without ever asking what they’re actually trying to learn. It’s like building a laboratory before deciding whether you’re testing for the flu or baking cookies. Sure, you might get data that looks impressive, but if you don’t answer the business question leadership actually cares about, what’s the point?
