Monday, 30 July 2012

Does the marketing industry bury bad news?

This article turned up on Adage last week. It's a proper, well-thought-out, scientific piece of marketing research, with an extremely important conclusion.

So why haven't you seen it anywhere?

Well, unfortunately, it strongly suggests that most of the clicks we see on display advertising may be just noise. Accidents. Slips of the mouse. There aren't that many clicks, you see - click rates on display ads are around 0.02% to 0.04% - and with a click rate that low, a lot of them could easily be flukes.
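To see how little accidental clicking it takes to swamp rates that low, here's a quick back-of-the-envelope sketch. All the figures are assumptions for illustration, not numbers from the study:

```python
# Hypothetical illustration (assumed figures, not from the Adage study):
# if even a tiny fraction of impressions produce accidental clicks,
# they can account for most of a typical display CTR.

impressions = 1_000_000
measured_ctr = 0.0003          # 0.03%, mid-range of typical display CTRs
accidental_rate = 0.0002       # assumed: 2 in 10,000 impressions are mis-clicks

measured_clicks = impressions * measured_ctr       # 300 clicks in total
accidental_clicks = impressions * accidental_rate  # 200 of them by accident

noise_share = accidental_clicks / measured_clicks
print(f"{noise_share:.0%} of measured clicks could be noise")
```

On those assumed numbers, two-thirds of the clicks you're optimising against would be slips of the mouse.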

Read the Adage article. It's important.

You didn't read it, did you?

OK, quick summary. The authors ran blank display ads and got click-through rates on them that were significantly better than industry benchmarks for branded display ads that don't carry a call to action.

Stop trying to think up reasons why that might happen which wouldn't cast any doubt on the effectiveness of many display campaigns (which is exactly what most commenters on the Adage article immediately did). The authors even covered the possibility that people might click a blank space out of curiosity, by asking those who clicked why they clicked. Like I said, it's a proper, well-thought-out, scientific study.
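If you want to check a "significantly better" claim like that yourself, a standard two-proportion z-test does the job. The click counts below are made up for illustration; they are not the study's data:

```python
import math

# Two-proportion z-test sketch with made-up numbers (NOT the study's data):
# did one set of ads out-click another by more than chance would allow?
def two_prop_z(clicks_a, n_a, clicks_b, n_b):
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Assumed example: blank ads get 160 clicks on 1M impressions (0.016%),
# benchmark branded ads get 100 clicks on 1M impressions (0.010%).
z = two_prop_z(160, 1_000_000, 100, 1_000_000)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 5% level
```

Even with click rates this tiny, a million impressions per cell is enough to tell a real difference from noise.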

Adage ran the story, which is great. Adage do like a bit of controversy. As far as I can tell, the industry has buried it. If it had definitively proved that display ads made significant numbers of people go and search for products on Google, you can bet your life you'd have heard about it - it would be all over Twitter. Or if it had proved that Facebook advertising had a massive ROI? We've all seen those studies (and they're not very scientific...)

If we're going to take marketing measurement seriously, we need to accept that sometimes ads will be shown not to work as hard as we hoped, and the studies that return those results shouldn't be buried without a trace. The authors here are also very careful not to be entirely negative. They're most interested in the fact that we use clicks to tune display placements, but those clicks look largely random. They don't try to make the step to any ROI implications.

Going back to item #3 on last week's top ten, we're going to see this result again. Third time around, maybe the industry might decide it can't just be ignored.

(Adage article originally found via @AdContrarian)

Tuesday, 17 July 2012

Ten rules of marketing analysis

It's been a while since we had a top ten on Wallpapering Fog. Number one on this list came up (again) today, so let's have Wallpapering Fog's top ten rules of marketing analysis.
  1. If you think you've discovered a radical, unexpected, new result that nobody's ever noticed before, your data is wrong.

  2. More complicated analysis can help you measure your marketing much more accurately. But if simple analysis can't find any impact at all from a marketing campaign, then there probably wasn't one.

  3. Nobody ever abandons a campaign that doesn't work, the first time that you prove it doesn't work. Three is the magic number.

  4. ROI means return on investment and it's measured in money. Not clicks, likes, web traffic or re-tweets.

  5. If you're not selling ice-cream, then the weather isn't responsible for your 50% year on year sales decline. Even Noah needed food and clothes.

  6. Never trust a piece of research that was funded by a media owner.

  7. Ten thousand respondents is plenty. A million is very rarely necessary - it just takes much longer to open the spreadsheet. You only need a spoonful of soup to know what the whole bowl tastes like.

  8. That means the BARB TV ratings panel is fine. Leave it alone, online people.

  9. When forecasting next year's sales, assume that your new adverts aren't any better than your old adverts. I'm sorry if that's depressing, but it's almost always true.

  10. The world is never changing so fast that you can't learn something from the past couple of years. People's basic motivations haven't changed since the dark ages.
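On rule 7, the spoonful-of-soup point can be made precise with the standard margin-of-error formula for an estimated proportion. The survey figures here are assumed for illustration:

```python
import math

# Approximate 95% margin of error for an estimated proportion p
# with n respondents: z * sqrt(p * (1 - p) / n), with z = 1.96.
def margin_of_error(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

# Assumed example: 50% of respondents agree (the worst case for the error).
for n in (10_000, 1_000_000):
    print(f"n = {n:>9,}: +/- {margin_of_error(0.5, n):.2%}")
```

Ten thousand respondents gets you to within about a percentage point either way; the extra 990,000 only shave that to a tenth of a point, while making the spreadsheet a hundred times slower to open.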