The Future of Radio? Part 1: PPM

David Bray – Hennessy and Bray

The radio industry stands at a crossroads as it fights to ensure its viability and financial survival in the years to come. This is part one in a series of three articles discussing the future of radio.

Were you lying to me then, or are you lying to me now? That is the question posed to me time and again when it comes to Diary vs. PPM methodology. It is a fair question, given that the results from each format are markedly different. Essentially, the answer is neither; both are estimates based on two different methodologies. Proponents believe PPM is a better estimate, with a panel whose behaviour you can track with precision, while the diary struggles with the volatility of having a different group of respondents each week. In essence, truth is relative. Both systems offer estimates with a certain margin of error.

Let’s back up to see how we got here. While BBM’s deal to partner with Arbitron on the development and licensing of PPM was inked in 1992, the breakthrough came in 2003 with the development of a compact lithium-ion battery providing 28 hours of life. That made it possible to carry a meter for 24 hours before docking. From there, the technology was optimized. Then the stations had to be encoded and the panels polled. Montreal kicked things off, with a good deal of tweaking and hand-wringing. On Dec. 10, 2009, we completed the first 13 weeks for Toronto, Edmonton, Calgary and Vancouver. From now on, measurement will be continuous, with an updated report coming out every month. Now comes a firestorm of comparative analysis. What are the pros and cons of the two methodologies?

With Diary we have eight weeks of measurement using a different sample every week, so there is a lack of continuity. Tuning is broken down into fifteen-minute blocks, which is less than ideal since that doesn’t really match radio consumption patterns. Most significantly, reporting is based on recall, which is less than perfect. Sometimes respondents end up effectively voting for a station when they try to recap their tuning habits for the week. As you might expect, breakfast tends to be over-reported since it is easy to remember your drive to work. Conversely, weekends and evenings are comparatively under-reported since it is more difficult to recall exactly what you did on a Saturday afternoon driving to the shopping mall. On the positive side, the fact that we get a different set of diary keepers each week ensures a broader sampling base of individuals.

With PPM we get passive measurement. Panel members carry the meters and dock them each evening. We can now see tuning behaviour in tremendous detail, and reporting is reliable. Tuning is broken down into one-minute increments, which are much more precise. We can watch as a listener moves through his presets or walks from store to store with different stations playing over the sound system. Diary generally reports that people listen to three stations; PPM says six to seven. Certain types of stations seem to suffer with PPM. Heritage stations such as CBC can experience a bit of “halo” tuning in diary reporting (which relies on the listener’s memory) that disappears with the passive recording of PPM. In effect, they get a bit of what I’ll call “aspirational reporting” in the diary, then lose that benefit with PPM. Conversely, “office” stations thrive under the passive reporting format. Younger rock stations, which had trouble getting young adult males to fill out diaries, seem to come back strong with PPM. Smaller or more tightly targeted stations seem to suffer under PPM. More importantly, it is very difficult to turn these results around, given the stability and low turnover of the panels.

Some critics would argue that listening to a background station in the office or shopping mall, for example, is not comparable to actively listening to your favourite station. Under the diary method, this sort of tuning would usually not be reported by the diary keeper. Another anomaly is the increase in female tuning to sports stations with PPM. The explanation is fairly obvious: girlfriends or wives are sitting in close proximity to their mates listening to sports. The females may actually hate the stations, but their tuning is captured. These same females would never have reported that tuning in the days of the diary. Yet another anomaly we have observed in recent months is that office stations dominate categories in which they wouldn’t traditionally be strong. For example, office stations (skewed Female 25-54 or Female 35-54) have, in some instances, taken the #1 spot in hours tuned for Females 18-34 and even Males 18-34. It is impossible to believe that a beer advertiser, for example, would put large sums of money on these soft AC stations.

As the diary was phased out in these top five markets, the BBM Fall ’09 Diary numbers and the Fall ’09 PPM numbers were both released covering essentially the same time period. There were a lot of heads being scratched and brows being furrowed. BBM is calling the 13-week PPM numbers the currency release. Airware, unlike Micro BBM or Infosys, will deal with PPM only in 13-week increments for buying purposes. While the share-of-hours-tuned rankings from the two books are not radically different, a dilemma pops up when we note the vastly different numbers for ratings, cume, etc.

There is no getting around the fact that PPM reports significantly less overall radio tuning. For any radio believer like myself, that is troubling. Some correctly argue that radio tuning has not changed, just the method of reporting it, so buyers should simply apply a conversion factor when switching from one set of figures to the other. That logic may be pragmatic, but it is faulty from a research standpoint, especially for proponents of the three- (or four-) hit theory. Are we to completely set aside our theories of reach and frequency, which are fundamental to the buying of a highly targeted medium like radio? Moreover, single-source qualitative data, critical to a targeted medium like radio, isn’t quite as detailed as that offered up by Diary. Yes, we have RTS, which is excellent. But that is a completely separate issue.

Flip the coin and we have the fact that continuous measurement allows for quick analysis of an ever changing marketplace where formats are flipped more often than pancakes.

Cume, or reach, does in fact generally increase for each station under PPM as opposed to Diary. The problem is that cume for a 13-week period is almost meaningless. Any listener who has registered even one minute of tuning during the measured period is included in a station’s cume, so office stations can report a cume of virtually everyone in the market. All that means is that panel members have passed by a radio playing a given station. Of course, you can modify that by looking at Average Daily Cume, or by setting a filter that includes only listeners who tune in to a station for a minimum of 15 minutes, to screen out transient listenership. The bottom line is that any station that plays its cards correctly can post a spectacular cume. While the figure meant something in the Diary system, it is not nearly as important with PPM.
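
To make the cume point concrete, here is a minimal sketch in Python of the difference between a raw one-minute cume and one filtered to a 15-minute minimum. The exposure records, call letters and the choice to aggregate minutes across the whole period are my own inventions for illustration; BBM’s actual processing is certainly more involved.

    from collections import defaultdict

    # Hypothetical minute-level PPM exposures: (panelist_id, station, minutes heard that day)
    exposures = [
        ("P01", "CXXX-FM", 1),    # walked past a store radio
        ("P01", "CXXX-FM", 45),   # office listening on another day
        ("P02", "CXXX-FM", 3),
        ("P03", "CXXX-FM", 120),
        ("P04", "CYYY-FM", 60),
    ]

    def cume(records, station, min_minutes=1):
        """Count distinct panelists credited to a station.

        min_minutes=1 reproduces the raw cume (any single minute counts);
        min_minutes=15 approximates the kind of filter described above,
        aggregated across the period here for simplicity.
        """
        minutes_by_panelist = defaultdict(int)
        for pid, stn, mins in records:
            if stn == station:
                minutes_by_panelist[pid] += mins
        return sum(1 for total in minutes_by_panelist.values() if total >= min_minutes)

    print(cume(exposures, "CXXX-FM"))                  # raw cume: 3 panelists
    print(cume(exposures, "CXXX-FM", min_minutes=15))  # filtered cume: 2 panelists

The walk-by listener (P02) inflates the raw cume but disappears once even a modest minimum-listening threshold is applied, which is exactly why the unfiltered 13-week figure says so little.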

Buyers and programmers will have to focus on hours tuned.

We have to keep in mind the size of the panels, which is limited by expense. They are as follows:

Toronto: 600 households
Vancouver: 450 households
Calgary: 400 households
Edmonton: 400 households
Montreal: 400 Anglo households, 400 Franco households

Panels experience approximately 2-3% turnover per month. Detractors argue that a station that does poorly with a given panel will have to wait an extended period of time before enough of the panel has changed to allow for a significant change in its fortunes. It is too early to assess whether there will be long-term compliance issues with certain age groups in the participant households. Another relevant question, given the use of phones to secure panels, is whether or not panels reflect the appropriate number of cell-phone-only households. This is something BBM is working on.
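
To put the detractors’ concern in perspective, here is a rough back-of-the-envelope calculation in Python. It is my own simplification, assuming turnover is independent from month to month, of how much of the original panel remains after a given stretch:

    # At 2-3% monthly turnover, how much of the original panel is still in place?
    # Assumes turnover is independent month to month (a simplification).
    for monthly_rate in (0.02, 0.03):
        for months in (6, 12, 24):
            remaining = (1 - monthly_rate) ** months
            print(f"{monthly_rate:.0%}/month, {months:2d} months: "
                  f"{remaining:.0%} of the original panel remains")

At 2% per month, nearly 80% of the panel is still in place a full year later, which is why a struggling station cannot count on fresh panelists to rescue it quickly.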

One of the most important aspects of audience analysis is margin of error. It is rarely discussed, but it is critically important. Buyers and sellers wield numbers like daggers, with supposed accuracy: “you’re down 10.2%, give me makegoods.” On the programming side, people lose their jobs over a 20% decline that is actually still within the margin of error. Let’s not kid ourselves: the margin of error was a significant challenge with Diary. As for PPM, for a 13-week period, A12+, Mon.-Sun. 5a-1a, the standard error as a percentage of a station’s share is as follows:

Toronto: 13.5% to 24.5%
Calgary: 15.3% to 24.5%
Edmonton: 14.1% to 30.4%
Vancouver: 15.1% to 42.1%

Keep in mind that these numbers go up as you narrow the demographic. A25-54 would be a bit higher; F25-54 higher still. By the time you get to M18-24, or M18-24 who own a cell phone, you are in the 60%+ region for margin of error. My point is simply that these are estimates, not a census. It is critical to understand the parameters.
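
To see why a 20% book-to-book decline can still sit inside the margin of error, consider a worked example in Python. The numbers are illustrative choices of my own, not drawn from any BBM release, using a relative standard error of 20%, roughly the middle of the ranges listed above:

    # Illustrative only: translating a relative standard error into a
    # confidence interval around a reported share.
    share = 8.0                 # reported share of hours tuned (share points)
    relative_se = 0.20          # standard error as a percentage of the share

    se = share * relative_se    # 1.6 share points
    low_95, high_95 = share - 1.96 * se, share + 1.96 * se

    print(f"Standard error: {se:.1f} share points")
    print(f"~95% interval: {low_95:.1f} to {high_95:.1f}")
    # A book-to-book drop from 8.0 to 6.4 (a 20% decline) still lands inside
    # this interval, so it may be statistical noise rather than a real loss
    # of audience.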

In this article I have just skimmed the surface. The months to come will be critical ones in the industry as we adapt to a new reality. There is much more I could say, but that would take a book. For now, I will only say that the truth, when it comes to PPM, is relative.

David Bray is one of the country’s leading radio analysts. David may be contacted at 416-431-5792 or davidbray@hennessyandbray.com  
