In a guest post on Clayton Makepeace's blog yesterday, Daniel Levis revealed the results of a split test between single and double opt-in mailing lists. The results were surprising.

First, he showed how many subscribers each list got, and how many of the original subscribers were still subscribed a year later. Single opt-in won both of these competitions handily (by 37% and 27%). No surprise there.

Then came the interesting stuff -- conversion rates. Here's the data:

                        Double Opt-in   Single Opt-in   Difference
Subscribers Mailed              1921            2624         -27%
Opens                            199             141         +41%
Clicks                            43              29         +46%
Conversions                        1               0
Average Visitor Value         $18.19              $0

Despite having fewer subscribers, the double opt-in list got more opens, more clicks, and the only sale!

In the comments, a lot of people complained about the results -- that one sale doesn't tell you anything. And they're right that there's no statistical evidence that double opt-in makes more sales. But let's run the numbers and find out what they do tell us. I plugged the numbers into my split test analyzer and this is what I found.
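(If you want to check these confidence figures without my analyzer, a standard two-proportion z-test should get you very close to them. Here's a minimal sketch in Python -- it's just the textbook test, not the analyzer itself, so the numbers it prints may differ from mine by a fraction of a percent.)

```python
# Two-proportion z-test, standard library only. Counts come straight from the
# table above. "Confidence" here means 1 minus the two-sided p-value,
# expressed as a percentage, which is how the figures below are reported.
from math import sqrt, erf

def confidence(success_a, total_a, success_b, total_b):
    """Confidence (in %) that the two underlying rates really differ."""
    p_a = success_a / total_a
    p_b = success_b / total_b
    pooled = (success_a + success_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = abs(p_a - p_b) / se
    # Normal CDF: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))
    return 100 * (1 - p_value)

print(confidence(199, 1921, 141, 2624))  # opens:                ~100.0
print(confidence(43, 1921, 29, 2624))    # clicks:               ~99.7
print(confidence(43, 199, 29, 141))      # clicks among openers: ~18
```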

Double opt-in had a 10.36% open rate vs. 5.37% for single opt-in. The analyzer puts the confidence in that difference at 100.00%. In other words, there's virtually no chance whatsoever that the gap between the two is random. At least for that mailing list, double opt-in led to higher open rates.

Next, double opt-in had a 2.24% click rate, vs. 1.11% for single opt-in. The statistical analysis shows 99.59% confidence that double opt-in gets more clicks than single. Also pretty darned conclusive (for that mailing list).

Obviously with only one sale between them, the difference in sales can't be proven. So here's where we put on our thinking caps.

The question we need to ask ourselves is whether the people who open the email act differently depending on which list they're on. Here's what the numbers tell us.

Of the 199 on the double opt-in list who opened the email, 21.61% clicked. Of the 141 on the single opt-in list who opened the email, 20.57% clicked. The percentages are almost identical, and my split test analyzer puts the confidence that the difference is real at only 18.36%. In other words, without gathering a lot more data, we have to assume that there's no difference.

Now I can't prove that there's also no difference between the percentage from each list who would buy, but since their clicking behavior is practically identical, that's good enough for me. I'd be comfortable assuming that their buying behavior is about the same too.

If that's true, then the list with the most opens will also have the most sales. Double opt-in wins.
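To put rough numbers on that: if both lists turn clicks into sales at the same (unknown) rate, expected sales are proportional to clicks -- 43 vs. 29 in this mailing, or roughly 48% more sales for double opt-in over time, even though it went out to 27% fewer subscribers.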