Certain statements seem so obviously true that seeing statistics to back them up at first seems superfluous and then, oddly, helpful.
Like, take this. Sharethrough, a native advertising company that works with brands to distribute branded content, conducted a study with five advertisers to determine the brand-building effectiveness of native video ads against standard pre-roll. In this test, all the pre-roll was the standard 15- or 30-second variety, while all the native ads were user-initiated, ran slightly longer versions of the same ads, and matched the visual style of the publisher page. Sharethrough arranged for Nielsen Online Brand Effect to determine brand lift.
Here are the results: Pre-roll drove no brand lift, or very little. Native advertising drove brand lift, in some cases quite significantly.
I am so not amazed, and not just because Sharethrough did the experimenting and wouldn’t have allowed the case study to be publicized if it hadn’t produced a favorable result.
For one thing, users actually initiated the viewing of the native ad, an indication that they were primed to lift some brands. And that they did!
Brand lift by definition—Nielsen unit Vizu’s definition, at least—measures the extent to which advertising has “shifted consumer perception against one of the key purchase funnel metrics” (awareness, attitude, favorability, intent, preference).
And it would seem to me, as a consumer, that an opt-in that shared the visual style of the page I was visiting would seem safe and welcoming. So of course, roll that ad, lemme see it. I’m allowing you to try to shift my perception.
In this study, Jarritos, a naturally flavored soft drink (that’s wildly great with a burrito, but man it’s sweet), ran a campaign to drive favorability. Its native ads generated an 82% brand lift among users exposed to them, but pre-roll generated just a 2.1% brand lift.
In fact, people exposed to the pre-roll were 29.3% more likely to say they viewed Jarritos “unfavorably” or “very unfavorably” than those who weren’t exposed to the campaign at all.
Well, thanks a lot, pre-roll!
In another study, a consumer packaged goods (CPG) brand was trying to drive purchase intent. Native ads generated a 42.2% brand lift. And as with the Jarritos campaign, those who were exposed to the pre-roll were 18.9% more likely to respond that they “probably would not” or “definitely would not” buy the product, compared to those who hadn’t seen the ad at all.
In fact, Nielsen says, “native ads outperformed pre-roll regardless of the category or the marketing objectives of the five measured advertisers.” In this study, all the advertisers had the ability to make in-flight adjustments. Of course, Nielsen says, without the right data and optimization an advertiser can lose a bundle, even losing, as this report suggests, some people who would have remained potential customers if they hadn’t seen the pre-roll at all.
Certainly, this seems to be a great advertisement for native advertising. But it is also disquieting information for the average advertiser that is planning to launch a flight of pre-roll ads.
pj@mediapost.com
I can't work out if you're supporting the research conclusions or not; maybe I'm reading some cynicism into the article...
But this: "Sharethrough did the experimenting and wouldn’t be allowing the case study to be publicized if it hadn’t resulted in a favorable result." is clearly an important point to remember.
And, without wishing to impugn Vizu's approach too much, as you say, the opt-in nature of the native ads tested means that the native-ad-exposed group is a self-selecting subgroup of the audience, while the pre-roll may show lower lift figures but across a larger audience. 1% of 1000 people is still more than 25% of 10 people.
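The commenter's base-rate point can be made concrete with a back-of-the-envelope sketch. The audience sizes and lift rates below are the hypothetical numbers from the comment itself, not figures from the Sharethrough/Nielsen study:

```python
# Back-of-the-envelope comparison: lift RATE vs. absolute number of
# viewers lifted. Numbers are the commenter's hypothetical example,
# not data from the study.

def lifted_viewers(audience_size: int, lift_rate: float) -> float:
    """Absolute number of viewers whose perception shifted."""
    return audience_size * lift_rate

# A low lift rate over a large pre-roll audience...
preroll = lifted_viewers(1000, 0.01)  # 1% of 1000 = 10 people
# ...can still beat a high lift rate over a small opt-in native audience.
native = lifted_viewers(10, 0.25)     # 25% of 10 = 2.5 people

print(preroll, native)  # 10.0 2.5
```

So a campaign with a much smaller percentage lift can still move more actual people, which is why comparing lift percentages across self-selected and broadcast audiences is apples to oranges.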