October 29, 2007

90 Days Later: RECONSIDERED

My method for determining whether an item has become a hit is flawed and needs to be reconsidered.

That’s a realization I came to recently. What I do now is wait 90 days to see if an item “crosses over” – that is, makes it onto the two charts that monitor DRTV success (produced by the Infomercial Monitoring Service and Jordan Whitney). If an item is still around 90 days after launch, I figure it has to be for real. But there are two problems with this approach:

1. Many items I know are successful aren’t appearing on the charts.

This can happen for several reasons. Sometimes companies intentionally under-report their spending numbers to stay off the radar (thereby preventing knockoffs from entering the market). Such a tactic would only affect the Jordan Whitney chart, however, since it is the only service that factors in self-reported numbers in addition to monitoring actual airings.

The IMS chart is based strictly on monitoring, but that brings up a second reason why a hit commercial might be missed: It isn’t airing heavily on the stations being monitored. The number of stations any service can monitor is limited for obvious reasons (manpower, logistics). A third and final reason a hit commercial might be missed is that the marketer is busy solving internal problems, such as ramping up production or changing agencies, and isn’t ready to roll out (even though the numbers are excellent).

2. Several items I know aren’t successful are appearing on the charts regularly.

This is the dark side of DRTV: people of questionable ethics and integrity manipulating what many perceive to be impartial information. The simplest technique is to report intentionally inflated spending numbers, which is one reason I use both lists as a check against each other. Another is to spend heavily for a few weeks (at a loss) just to get noticed. There are other techniques, but explaining them here would be a little like printing the recipe for a dirty bomb. Suffice it to say, it can be done, and it is being done far too often.

So what to do? How can one account for these distortions and know what’s really a hit and what isn’t? Here are my new rules:

  • “Hits” must survive 90 days on the charts. It may seem like a subtle change, but it will make a big difference. Instead of waiting 90 days and checking the charts once, I will check the charts each week and flag items that have appeared for at least 90 consecutive days. Doing so corrects both for unethical marketers who pump spending to create a false perception and for ethical marketers who are simply taking time to get their act together.

  • Both charts must be considered together. This rule carries over from my current method, and it’s important because it corrects for false reporting and under-reporting. If a product has been appearing on the Jordan Whitney chart for 90 days but hasn’t appeared on the IMS chart at least a few times, something is wrong with that result and it must be discarded. Conversely, if an item has been appearing on the IMS chart week after week and has never appeared on the Jordan Whitney chart, someone is trying to suppress a winner. (Both rules are sketched in code after this list.)
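
For those who like to see rules spelled out as code, here is a rough sketch of how the two rules above could be applied mechanically each week. Fair warning: the weekly-snapshot data format, the 13-week stand-in for 90 days, and the three-appearance threshold for “a few times” are all my own illustrative choices, not anything the chart services publish.

    # A minimal sketch of the two rules. Assumes weekly chart snapshots
    # are available as lists of (week_label, set_of_item_names), oldest
    # week first. All thresholds below are placeholders, not official
    # numbers: 13 weekly charts ~ 90 days; "a few times" ~ 3 appearances.

    WEEKS_REQUIRED = 13        # ~90 consecutive days of weekly charts
    MIN_CROSS_APPEARANCES = 3  # "at least a few times" on the other chart

    def current_streak(weekly_charts, item):
        """Count how many consecutive weeks, ending with the most
        recent chart, the item has appeared."""
        streak = 0
        for _, items in reversed(weekly_charts):  # newest week first
            if item not in items:
                break
            streak += 1
        return streak

    def total_appearances(weekly_charts, item):
        """Count every week the item has charted, consecutive or not."""
        return sum(item in items for _, items in weekly_charts)

    def classify(item, jw_charts, ims_charts):
        """Apply both rules to one item and return a verdict."""
        jw_hit = current_streak(jw_charts, item) >= WEEKS_REQUIRED
        ims_hit = current_streak(ims_charts, item) >= WEEKS_REQUIRED

        # Rule 2a: a long Jordan Whitney run with almost no monitored
        # airings on IMS smells like inflated self-reported spending.
        if jw_hit and total_appearances(ims_charts, item) < MIN_CROSS_APPEARANCES:
            return "discard: JW run not corroborated by IMS monitoring"

        # Rule 2b: a long IMS run that never shows on Jordan Whitney
        # suggests the marketer is under-reporting to hide a winner.
        if ims_hit and total_appearances(jw_charts, item) == 0:
            return "hit (likely suppressed by under-reporting)"

        if jw_hit or ims_hit:
            return "hit"
        return "not (yet) a hit"

In practice, I would run something like classify() over every item appearing on either chart each week and keep only the “hit” verdicts; the “discard” and “suppressed” cases are exactly the two distortions described above.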

Is this a perfect methodology? No. But short of having inside information and confidential reporting for every DRTV product marketed, I can’t think of a better way right now. Feel free to post your disagreements!

In my next post, I’ll share the results of a recent analysis using this new methodology.