Driving Online Donations: A Real-World Test

Mar 23, 2012

During pledge week at stations all across the country, every boy's and girl's fancy turns to one thing: Show me the money. Teams focus on the best ways to drive awareness (and conversion), and we wonder how to best use our web site as part of the package. So we try everything: 300x250 ads, big banners, lightboxes, interstitial pages, and more.

But here's the question: What actually works? Which of these online formats actually drives a higher conversion rate and pledge amount? And do the intrusive ad formats hurt user engagement -- do they annoy people so much that it discourages them from using the site?

We decided to run a test and find out.

In October, we partnered with Michigan Radio during their pledge week. Our plan: run an A/B test with their site traffic split equally between three test formats and a control version. Because we wanted to isolate the effect of format, we used the same messaging and visual design in all four versions.

During pledge week, about 23,000 visits to the home page were included in this test. Each user was randomly assigned to one of our four groups, then cookied so that they would see the same version on every visit. We used DoubleClick for Publishers (DFP) to serve the ad formats, and session-level custom variables in Google Analytics to track user behavior.
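The assignment logic can be sketched in a few lines. This is a hypothetical illustration, not the actual DFP or Google Analytics setup: the cookie jar is simulated with a dict, and the cookie name `ab_group` is made up.

```python
import random

# The four versions tested; names are shorthand for illustration.
GROUPS = ["control", "lightbox", "pencil-pushdown", "bottoms-up"]

def assign_group(cookies):
    """Return the visitor's test group, picking one at random on first visit.

    `cookies` stands in for the visitor's cookie jar; in production this
    would be a real cookie read/write, not a dict.
    """
    if "ab_group" not in cookies:
        cookies["ab_group"] = random.choice(GROUPS)  # equal 25% split
    return cookies["ab_group"]

# A returning visitor always sees the same version:
jar = {}
first_visit = assign_group(jar)
assert assign_group(jar) == first_visit
```

The cookie is what keeps the test clean: without it, one person could see two different formats and muddy the comparison.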

We measured three things for each version:

  • Conversion rate: What percentage of people viewing the home page ended up making an online pledge?
  • Pledge amount: How much was the average individual pledge?
  • Bounce rate: What percentage of home page visits included only the home page (i.e., users left and never clicked deeper into the site)?
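Given a log of visits, the arithmetic behind these three metrics is simple. A minimal sketch, assuming a hypothetical per-visit record with a `pledge` amount (0 if no pledge) and a `pages` count; this is not how Google Analytics reports them, just what each number means.

```python
def summarize(visits):
    """Compute conversion rate, average pledge, and bounce rate for one group.

    `visits` is a list of dicts with hypothetical fields:
      pledge -- pledge amount in dollars, or 0 if the visit had no pledge
      pages  -- number of pages viewed in the session
    """
    n = len(visits)
    pledges = [v["pledge"] for v in visits if v["pledge"] > 0]
    return {
        "conversion_rate": len(pledges) / n,
        "avg_pledge": sum(pledges) / len(pledges) if pledges else 0.0,
        "bounce_rate": sum(1 for v in visits if v["pages"] == 1) / n,
    }

# Four made-up visits: two pledges ($60 and $120), two single-page bounces.
sample = [
    {"pledge": 60, "pages": 3},
    {"pledge": 0, "pages": 1},
    {"pledge": 0, "pages": 2},
    {"pledge": 120, "pages": 1},
]
print(summarize(sample))
# → {'conversion_rate': 0.5, 'avg_pledge': 90.0, 'bounce_rate': 0.5}
```

Note that a visit can both pledge and bounce (the $120 pledge above was a one-page visit), which is why the two rates are tracked separately.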

Here are the four versions we tested:

  • Control: Standard 300x250 at the top of the right column
  • Lightbox: A larger ad covering much of the home page when users first visit, requiring them to close the ad before seeing the full page
  • Pencil push-down: When the page first loads, a thin ad strip animates to expand into a large banner. Users can click to minimize the ad.
  • Bottoms up: This large banner is "docked" to the bottom of the browser window. As users scroll, it stays fixed and visible. Users can click to minimize the ad.

Which one do you think did the best? Click through the presentation slides to view each version, then keep clicking to see the results!

The results in a nutshell: None of the intrusive formats did any better than the simple 300x250 control version. There were slight differences, but none of them were statistically significant.
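For conversion rate, "statistically significant" can be checked with a standard two-proportion z-test on pledges vs. visits in each pair of groups. A sketch with made-up counts: roughly 23,000 visits split four ways gives about 5,750 per group, but the pledge counts below are hypothetical.

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    conv_*: number of conversions; n_*: number of visits in each group.
    Returns the p-value of the observed difference.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical: 58 pledges out of 5,750 visits vs. 63 out of 5,750.
p = two_proportion_z(58, 5750, 63, 5750)
print(round(p, 3))  # well above 0.05, so not a significant difference
```

With conversion rates this low, a few extra pledges in one group is well within the range of random noise, which is why "slightly better" in the raw numbers doesn't mean "actually better."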

So what does this mean? It suggests that putting in extra effort to create some special, in-your-face format for pledge messaging is basically a waste of time. It doesn't get more people to pledge.

For many stations, resources and time are always limited. It makes sense to dedicate resources to things that we know will increase pledge activity, such as more effective mailers, e-mail campaigns, and offers. Why spend time on something that doesn't move the needle?

Like any test, this one also leads to more questions:

  • Would these results hold true if the test were run in markets other than Michigan? We'd expect so, since format effects like these tend to be driven by user behavior more than geography, but repeating the test in other markets would confirm it.
  • What if we ran this test at other times, not during pledge week? Would more intrusive ad formats increase pledges? Good question; we should test this and find out.
  • What if we tested different messaging or visual design? Good idea. In the meantime, check out this NPR research on why people give.
  • Does the presence of a pledge landing page (before the actual pledge form) help encourage donations? We're ramping up to test this with a station right now. Stay tuned.