The other day I was reviewing this great Alertbox article on user experience methods, and I came across a method I was unfamiliar with: A/B testing. Well, sure enough, just as when you learn a new word and then suddenly seem to hear it a couple of times in the same day, this 37 Signals post points to a real-world example of A/B testing. In this case, Less Everything simply measured the number of "conversions" (I guess the number of people who tried the application for free?) when a button was placed on the left side of the screen versus the right, and found that putting the button on the right led to a conversion rate of 13.8%, versus 12.3% when the button was on the left.
The study seems to follow the guidelines for A/B testing: the sample size is large ("thousands of users"), and it was performed on an essentially finished design, so it was easy to change just one part of the design and observe the effect of that change in isolation.
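One thing the post doesn't report is whether the 13.8% vs. 12.3% difference is statistically significant, which depends on the exact sample sizes. As a rough sketch, here's a standard two-proportion z-test in plain Python; the figure of 2,000 users per variant is purely my assumption (the source only says "thousands of users"), so the numbers below illustrate the method, not the actual study.

```python
from math import sqrt, erf

def two_proportion_z_test(conversions_a, n_a, conversions_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    # Pooled proportion under the null hypothesis (no difference)
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 12.3% of 2,000 users is about 246 conversions,
# and 13.8% of 2,000 is about 276. These sample sizes are assumed.
z, p = two_proportion_z_test(246, 2000, 276, 2000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With 2,000 users per variant, the p-value comes out well above 0.05, so a gap that size could plausibly be noise; at larger (but still "thousands") sample sizes the same gap would cross into significance, which is exactly why the raw rates alone don't tell the whole story.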