It’s a common and understandable myth that TV shows stay on the air by getting as many viewers as possible. They don’t. In commercial television, shows stay on the air by making money, something that relates to, but does not correlate directly with, getting as many viewers as possible. If you’re a network that airs commercials, you make money chiefly by attracting as many viewers as possible whom advertisers are willing to pay to reach.
Figuring out who those viewers are, and how to count them, is probably the biggest challenge networks face today in keeping your shows on the air. If you’re over 50, do you matter? (Probably not where advertising is concerned.) If you watch on TiVo and skip the ads? (I am part of the problem.) If you watch TV online? And does anyone even know you’re watching?
New York magazine’s Joe Adalian has an interesting story looking at the conundrum for networks, advertisers and Nielsen: it’s not so easy anymore to even say how many people are watching a show. The more people time-shift and watch shows online, the harder it is to say the next morning whether a show is a hit, since people may watch the program at any point over the coming week, or later.
As Adalian points out, nowadays the live airing of an episode, especially one with a tech-adopting audience, may garner well under half of its eventual audience. Yet it’s that morning-after number that still creates much of the perception of whether a show is a hit, and creates the lasting impression of the size of series’ respective audiences. (One eye-popping stat: over the week and on various platforms, Mad Men’s audience rises from under three million to over six million.) Nielsen, he writes, is trying to catch up to Hulu and company by measuring how many people watch shows online, which it will start tracking in the spring.
But counting viewers is one thing; valuing them is another. As it is, we have access to “live plus seven” ratings—the number of viewers who watch live plus the number who watch on DVR up to a week later—though those figures take weeks to arrive. But should we pay attention to them? Maybe when it comes to assessing a show’s cultural reach. When it comes to selling ads, though—and thus, to a show’s odds of staying on the air—advertisers have been willing to pay for only live plus three days. (The thinking, in part, is that some ads have a shorter shelf life—and come on, how many ads do you think I’m watching on TiVo?)
And online? Anyone’s guess—and you could argue that online ads can theoretically be better targeted and more engaging than TV ones—but for now, advertisers pay much less for them, and there are fewer of them. So if you watch Fringe on Hulu, you may love the show as much as a live viewer, but you’re not doing it as much good. (Though more than if you missed it altogether.)
Bottom line: we now have very different metrics for whether a show is a “hit” (in terms of reach and influence among an audience) and whether it’s a “hit” (in terms of paying for itself and staying on the air). It’s something I’ll increasingly have to think about as a critic. In practical terms, it’s important for me to know whether a show can survive. But if I’m assessing its cultural influence and clout, why should I care how much the show is worth to Procter & Gamble?
Of course, we’ve had different metrics for audience hits vs. financial hits for a while now, the most famous of which is demographics, especially age. Advertisers pay most for audiences aged 18 to 49, so that rating is what determines whether a show makes it. As Brian Lowry notes in Variety, though, the average viewer age at most big networks is now older than that demographic. Blue Bloods is a big new hit in terms of eyeballs, but it’s relatively anemic in terms of ad demos—its average fan is over 60 years old. (Tom Selleck is the new Andy Griffith!)
So should networks value their older viewers more? Well, maybe in a moral sense. But the problem is, they can’t simply will advertisers to pay more for an older audience.
Now, advertisers’ reasons for targeting younger viewers may be wrongheaded, or they may not; I’d say it’s a little of both. It’s logical, if not exactly fair, to pay more to reach viewers (like the youngest adults) who watch relatively less TV; if a 60-year-old watches twice as much TV as a 25-year-old, it’s easier for me to find a cheaper program (say, the nightly news) to reach him on, whereas I have limited options for Junior. The notion that younger adults develop hardwired brand preferences seems more like pseudoscience (yes, a younger consumer has more years to live if you hook them early, but I suspect consumers at every age are more fickle than this old Madison Avenue belief assumes). But until and unless TV networks can persuade advertisers otherwise, they can only make money off what someone will pay for.
This all creates an interesting dilemma for those of us who write about TV: do we judge hits on the basis of the criteria advertisers should use, or on the criteria advertisers actually do use? As for you, the viewer, for now here’s the best way to keep your favorite show on the air: be a young person, and watch TV like an old person—live, with all the commercials.