Greg Diamond – ByrnesMedia
The owner of a client station recently asked me if there were any trends that have struck me in the last year or so. One music-related thing immediately came to mind. The ‘distance’ between AC, Hot AC, and CHR seems to be decreasing. So, are we headed toward a ‘big crunch’?
Using auditorium music test (AMT) research performed at a specific station in the Hot AC (middle-ground) format between this year and last, we can see if and how this is progressing.
The 2012 test was done exclusively with females between 25 and 54, the sample size was a respectable 137, and 800 selections were tested. The 2013 test was also 100% female with the same number of songs, but the sample size was doubled to an even more accurate 237.
Now, there are numerous ways to examine AMT data, including whether a selection is someone’s favourite, whether they are tired of it (aka ‘burn’), whether they dislike it, whether they are unfamiliar with it, and so on. For our purposes, though, we will streamline things and just look at the general Appeal Index for each decade of music.

Keep in mind that any score of 80 and above can be considered the ‘best of the best’. Scores in the 60 to 79 range are still playable; depending on the ‘depth’ of the test (whether or not a large number of selections scored strongly), these can end up being the best available, and one’s benchmark needs to be lowered accordingly. Ideally, in a sufficiently deep test, songs that come in under the 60 mark can be discarded outright.

It should be noted that if a test comes back overly shallow, a rethink is in order. You can either refine your sample to include better-targeted participants and retest everything, or, in some cases, a perceptual callout project could be required to find out whether the station is even on the correct path to begin with. Trust me; a deep test is much more fun to work with!
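To make those thresholds concrete, here is a minimal sketch of how one might bucket Appeal Index scores in a sufficiently deep test. The function name and cutoffs as coded are illustrative assumptions drawn from the description above, not part of any testing vendor’s toolkit.

```python
# Hypothetical helper: bucket an Appeal Index score using the
# thresholds described above, assuming a deep test in which
# sub-60 songs can be discarded outright.
def classify_appeal(score: float) -> str:
    if score >= 80:
        return "best of the best"
    if score >= 60:
        return "playable"
    return "discard"

print(classify_appeal(83))  # best of the best
print(classify_appeal(67))  # playable
print(classify_appeal(58))  # discard
```

In a shallow test, the 60 cutoff would be lowered rather than applied rigidly, which is exactly why depth matters.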
Here, then, are the findings.
1980’s

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 103* | 97* |
| Entire Sample | 59 | 62 |
| Hot AC P1’s** | 61 | 65 |
| CHR P1’s | 59 | 56 |
| AC P1’s*** | n/a | 67 |
| Demo – 25-34 | 52 | 54 |
| Demo – 35-44 | 61 | 62 |
| Demo – 45-54 | 62 | 68 |
*No Canadian selections were tested for this decade
**P1 indicates listeners who primarily prefer a given format
***AC P1’s were not included in the 2012 sample
You’ll note that with the expanded sample in 2013, the 80’s scored slightly better, but there doesn’t seem to be much passion for this decade anymore.
Okay, you’re probably thinking “I know where you’re going with this, Greg. Things will get better with each decade.” Um, not so fast.
1990’s

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 198 | 180 |
| Entire Sample | 57 | 59 |
| Hot AC P1’s | 57 | 62 |
| CHR P1’s | 55 | 54 |
| AC P1’s | n/a | 57 |
| Demo – 25-34 | 55 | 57 |
| Demo – 35-44 | 55 | 55 |
| Demo – 45-54 | 60 | 64 |
Those numbers combine both foreign and domestic titles. Now, I’m not going to beat the dead Cancon horse that’s seen more than its fair share of the whip, but I will present the data and leave it to you to form your own conclusions. Here are the 90’s broken out into Canadian and International.
1990’s – Canadian

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 51 | 47 |
| Entire Sample | 57 | 58 |
| Hot AC P1’s | 57 | 61 |
| CHR P1’s | 55 | 50 |
| AC P1’s | n/a | 60 |
| Demo – 25-34 | 55 | 54 |
| Demo – 35-44 | 55 | 52 |
| Demo – 45-54 | 60 | 66 |
1990’s – International

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 147 | 133 |
| Entire Sample | 57 | 59 |
| Hot AC P1’s | 57 | 62 |
| CHR P1’s | 56 | 55 |
| AC P1’s | n/a | 59 |
| Demo – 25-34 | 56 | 58 |
| Demo – 35-44 | 55 | 56 |
| Demo – 45-54 | 61 | 64 |
When the Cancon is removed, we see no discernible change in the scores in either test. In fact, they were identical or within the statistical margin of error.
That said, with Canadian titles making up only about 25% to 35% of the decade tested, the Cancon scores would have to be significantly lower to make a noticeable impact on the list as a whole. In other words, the Canadian songs did not test much lower than the International.
These results from the 1990’s might seem surprising. However, that decade has been mined extensively by all three formats, and people have grown largely tired of it. This is reflected in an average burn score of 16 on both tests (a burn of 20 and up is considered high). Increasingly, we are seeing stations de-emphasize the 90’s, and these statistics illustrate why.
So far we’re not seeing much love for the older music, but it’ll get better in the 2000’s, right? Well…
2000’s

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 372 | 337 |
| Entire Sample | 59 | 58 |
| Hot AC P1’s | 59 | 60 |
| CHR P1’s | 58 | 55 |
| AC P1’s | n/a | 54 |
| Demo – 25-34 | 57 | 54 |
| Demo – 35-44 | 60 | 56 |
| Demo – 45-54 | 61 | 61 |
2000’s – Canadian

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 126 | 114 |
| Entire Sample | 56 | 54 |
| Hot AC P1’s | 57 | 57 |
| CHR P1’s | 54 | 50 |
| AC P1’s | n/a | 51 |
| Demo – 25-34 | 54 | 49 |
| Demo – 35-44 | 55 | 52 |
| Demo – 45-54 | 60 | 59 |
2000’s – International

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 246 | 223 |
| Entire Sample | 61 | 59 |
| Hot AC P1’s | 61 | 61 |
| CHR P1’s | 61 | 57 |
| AC P1’s | n/a | 55 |
| Demo – 25-34 | 58 | 57 |
| Demo – 35-44 | 62 | 59 |
| Demo – 45-54 | 62 | 61 |
Once again, the results were surprising, with the scores lower than might be expected.
The Canadian songs did show a greater decline in scores when compared to the 90’s, but they still didn’t pull down the combined list to any great extent.
Now, let’s look at 2010 to 2012.
2010’s

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 129 | 184 |
| Entire Sample | 64 | 60 |
| Hot AC P1’s | 64 | 62 |
| CHR P1’s | 66 | 60 |
| AC P1’s | n/a | 53 |
| Demo – 25-34 | 60 | 56 |
| Demo – 35-44 | 67 | 62 |
| Demo – 45-54 | 64 | 61 |
2010’s – Canadian

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 48 | 70 |
| Entire Sample | 52 | 50 |
| Hot AC P1’s | 52 | 53 |
| CHR P1’s | 52 | 48 |
| AC P1’s | n/a | 43 |
| Demo – 25-34 | 49 | 45 |
| Demo – 35-44 | 52 | 51 |
| Demo – 45-54 | 54 | 53 |
2010’s – International

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 81 | 114 |
| Entire Sample | 71 | 66 |
| Hot AC P1’s | 71 | 68 |
| CHR P1’s | 74 | 67 |
| AC P1’s | n/a | 59 |
| Demo – 25-34 | 67 | 63 |
| Demo – 35-44 | 75 | 68 |
| Demo – 45-54 | 71 | 67 |
It’s in this era that Cancon drags down the combined-list scores, with the Canadian overall coming in 12 full points (2012) and 10 full points (2013) below the combined list. Thus, the International average jumped considerably when the Canadian titles were removed.
It’s pretty clear that the passion for newer, foreign music is much greater than that of songs even a few years older. As for the Canadian, like I said, I’ll leave it to you to form your own conclusions.
The 2013 test also showed slightly lower overall scores when AC P1’s were added to the sample. Also noticeable was the slight decline for Hot AC P1’s and a more significant reduction for CHR P1’s. To illustrate further the passion for newer music in these two formats, let’s strip out 2010 from the 2013 test and rerun the numbers.
2011-2012

| Test Year | 2013 |
| --- | --- |
| Songs | 129 |
| Entire Sample | 61 |
| Hot AC P1’s | 63 |
| CHR P1’s | 61 |
| AC P1’s | 54 |
| Demo – 25-34 | 57 |
| Demo – 35-44 | 63 |
| Demo – 45-54 | 62 |
2011-2012 – Canadian

| Test Year | 2013 |
| --- | --- |
| Songs | 50 |
| Entire Sample | 51 |
| Hot AC P1’s | 53 |
| CHR P1’s | 49 |
| AC P1’s | 43 |
| Demo – 25-34 | 46 |
| Demo – 35-44 | 52 |
| Demo – 45-54 | 53 |
2011-2012 – International

| Test Year | 2013 |
| --- | --- |
| Songs | 79 |
| Entire Sample | 67 |
| Hot AC P1’s | 69 |
| CHR P1’s | 68 |
| AC P1’s | 61 |
| Demo – 25-34 | 63 |
| Demo – 35-44 | 70 |
| Demo – 45-54 | 68 |
The scores inch up a little. Now, we’ll look at just 2012 titles.
2012

| Test Year | 2013 |
| --- | --- |
| Songs | 63 |
| Entire Sample | 64 |
| Hot AC P1’s | 65 |
| CHR P1’s | 64 |
| AC P1’s | 55 |
| Demo – 25-34 | 60 |
| Demo – 35-44 | 66 |
| Demo – 45-54 | 64 |
2012 – Canadian

| Test Year | 2013 |
| --- | --- |
| Songs | 25 |
| Entire Sample | 53 |
| Hot AC P1’s | 56 |
| CHR P1’s | 51 |
| AC P1’s | 45 |
| Demo – 25-34 | 48 |
| Demo – 35-44 | 55 |
| Demo – 45-54 | 55 |
2012 – International

| Test Year | 2013 |
| --- | --- |
| Songs | 38 |
| Entire Sample | 71 |
| Hot AC P1’s | 71 |
| CHR P1’s | 73 |
| AC P1’s | 62 |
| Demo – 25-34 | 68 |
| Demo – 35-44 | 73 |
| Demo – 45-54 | 70 |
Keep in mind that as the number of songs decreases, the reliability of the data becomes shakier, but the evidence continues to suggest that the newer music is playing a key role.
Now let’s take a closer look at how the scores broke out in each decade, keeping in mind that anything 80 and up is a “killer”, 79-60 are “safe songs”, 59-50 are “unsafe songs”, and anything below 50 should be considered a discard.
80+

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 52 | 41 |
| 1980’s | 6% | 12% |
| 1990’s | 9% | 8% |
| 2000’s | 37% | 24% |
| 2010’s | 48% | 56% |
79-60

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 328 | 323 |
| 1980’s | 14% | 16% |
| 1990’s | 19% | 22% |
| 2000’s | 50% | 38% |
| 2010’s | 17% | 24% |
59-50

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 258 | 288 |
| 1980’s | 14% | 12% |
| 1990’s | 31% | 29% |
| 2000’s | 46% | 43% |
| 2010’s | 9% | 16% |
49-

| Test Year | 2012 | 2013 |
| --- | --- | --- |
| Songs | 162 | 148 |
| 1980’s | 13% | 3% |
| 1990’s | 28% | 14% |
| 2000’s | 44% | 56% |
| 2010’s | 15% | 27% |
When looked at as individual decades, we find that in 2012 the 2010’s were made up of 19% 80+, 43% 79-60, 19% 59-50, and 19% 49 or less. This gives us a Playable/Unplayable ratio of 62/38.
Songs from the 2000’s had 5% from the killer 80+ range, 44% in the safe 79-60 zone, 32% fell between 59 and 50, and 19% were below 50. The Playable/Unplayable ratio was 49/51.
The 90’s were as follows: 3% 80+, 33% 79-60, 41% 59-50, and 23% under 50. The Playable/Unplayable ratio for this decade is 36/64.
The 80’s were comprised of 3% killers, 44% safe songs, 34% unsafe songs, and 19% discards. This decade had a 47/53 Playable/Unplayable ratio.
And from the “I’m Not Sayin’, I’m Just Sayin’” file, the Canadian songs broke out with 2% being 80+, 34% in the 79-60 range, 37% between 59-50, and 27% under 50. Cancon had a Playable/Unplayable ratio of 36/64. Given the high number of songs that tested below 50, it is clear why Cancon has to be viewed in a different light when sorting the data from an AMT. It also shows us why some stations don’t even bother testing Canadian – it’s kind of depressing… I’m not sayin’, though.
The decade breakout in the 2013 test saw 13% of selections from the 2010’s in the 80+ range, 40% were safe, 25% fell between 59 and 50, and 22% were discards. The Playable/Unplayable ratio was 53/47.
The 2000’s were 3%, 36%, 37% and 24%, respectively, giving us a Playable/Unplayable ratio of 39/61.
The 90’s broke out as follows: 2% 80+, 40% 79-60, 47% 59-50, and 11% below 50. The Playable/Unplayable was 42/58.
And the 80’s were 5%, 54%, 35% and 6% for a Playable/Unplayable ratio of 59/41.
As we did previously with the 2013 test, taking out 2010 and recalculating for just 2011 and 2012 gives us the following: 15%, 43%, 22% and 20%. The Playable/Unplayable ratio was 58/42.
2012 alone came in at 21%, 41%, 21% and 17%. The more current titles had a Playable/Unplayable ratio of 62/38.
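The Playable/Unplayable arithmetic used throughout is simply the two upper buckets versus the two lower ones. A quick sketch (illustrative only; the function name is my own) reproduces the 2012-test figures for the 2010’s:

```python
# Playable = killers (80+) plus safe songs (79-60);
# Unplayable = unsafe songs (59-50) plus discards (under 50).
# Inputs are the bucket shares as whole-number percentages.
def playable_unplayable(killers: int, safe: int, unsafe: int, discards: int) -> tuple[int, int]:
    return killers + safe, unsafe + discards

# 2012 test, 2010's titles: 19% / 43% / 19% / 19% from the text
print(playable_unplayable(19, 43, 19, 19))  # (62, 38)
```

Running the same arithmetic on the 2000’s figures (5/44/32/19) gives the 49/51 split quoted above.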
While these results do not provide overwhelming evidence of a convergence, they do give us more than just a glimpse of a trend. I chose to showcase this station because it was close to the average of the other stations we have researched. Some showed less of a shift, whereas others showed an unmistakable move toward a more current/recurrent-based direction.
Another less scientific, but in some ways more interesting way to gauge whether convergence is taking place is to look at the crossover percentages of current music over the last few years.
ByrnesMedia provides our clients with weekly suggestions on current music in all formats. By going back and choosing the same week from each of the last 5 years (in this case the first week in May), we can see what, if any, duplication trends have taken place for both individual songs and artists. Take into consideration, though, that this exercise does not take into account the varying speed with which each format adds songs. It only gives a snapshot of what is cross-pollinated in the given week being examined. As such, one could, and probably should, assume that the figures are actually higher.
| Year | AC-Hot AC (Songs/Artists) | Hot AC-CHR (Songs/Artists) | CHR-AC (Songs/Artists) |
| --- | --- | --- | --- |
| 2009 | 3%/12% | 29%/33% | 1%/10% |
| 2010 | 11%/20% | 33%/35% | 9%/15% |
| 2011 | 14%/29% | 30%/33% | 11%/23% |
| 2012 | 20%/30% | 32%/39% | 13%/23% |
| 2013 | 23%/29% | 30%/33% | 13%/20% |
The above table gives us a great indication of how the change in currents on AC has made the format more tempo-driven.
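One plausible way to arrive at song-crossover figures like those above is the share of one format’s weekly list that also appears on another’s. The sketch below is an assumption about the method, not ByrnesMedia’s published formula, and the song titles are placeholders:

```python
# Hypothetical sketch: percentage of format A's weekly list that
# also appears on format B's list for the same week.
def crossover_pct(list_a: set[str], list_b: set[str]) -> int:
    if not list_a:
        return 0
    return round(100 * len(list_a & list_b) / len(list_a))

hot_ac = {"Song 1", "Song 2", "Song 3", "Song 4"}
chr_list = {"Song 2", "Song 3", "Song 5"}
print(crossover_pct(hot_ac, chr_list))  # 50
```

Note that the figure depends on which list you use as the base, which is one reason a single-week snapshot understates the true overlap.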
ByrnesMedia also provides clients with “Safe Lists” in all formats. These are compiled by looking at airplay data from the most successful stations in the country over the previous 6 months. As such, they take into consideration not only currents, but recurrents and gold. Here we will concentrate only on songs since artist crossover is already understandably high.
| Year | AC-Hot AC | Hot AC-CHR | CHR-AC |
| --- | --- | --- | --- |
| 2009 | 23% | 38% | 13% |
| 2010 | 24% | 43% | 14% |
| 2011 | 28% | 45% | 17% |
| 2012 | 34% | 44% | 19% |
| 2013 | 35% | 44% | 19% |
While I have been mostly non-committal to this point, a look at this table compels one to say that convergence is, indeed, taking place. Duplication between all three formats is up from where it was 5 years ago.
Finally, by using another client station’s AMT data over the past 3 years, we see how the era loading on the station has adjusted accordingly.
| Year | C/R* | 00’s | 90’s | 80’s |
| --- | --- | --- | --- | --- |
| 2009 | 27.9% | 17.7% | 26.6% | 27.8% |
| 2010 | 37.2% | 27.5% | 24.1% | 11.2% |
| 2011 | 37.9% | 27.4% | 25.1% | 9.6% |
| 2012 | 37.3% | 28.1% | 25.1% | 9.5% |
| 2013 | 37.2% | 32.5% | 21.2% | 9.1% |
*Current/Recurrent
This station is an AC/Hot AC hybrid, and what stands out is the jump in new music that occurred in 2010 and has remained at that level since. Also visible are the de-emphasis of the 80’s and the more recent reduction in 90’s titles. This is a trend being seen across the country.
So, in fact, AC, Hot AC, and CHR are cozying up to one another more than they have in the past. With less exclusivity in music, what you do between songs takes on even greater importance. Is your imaging helping to set you apart and play up the benefits of your station over the other guy’s? Are your announcers staying relevant to the local listener? Do your promotions effectively dovetail with the direction your station is heading and are they compelling and fun? Is your music rotating properly and are your best songs being exposed sufficiently? These are just some of the questions you need to be asking yourself. If you need help, just send me an email at greg@byrnesmedia.com and I’ll be happy to assist.
After all, it’s “Crunch Time!”