The broadcast TV incentive auction officially kicked off last week with the deadline for stations to declare their participation in the auction. This triggered a number of pieces about what the auction is, how it works, and what its implications are. In that vein, I decided to write my own explainer for anyone wondering what this auction they may have heard about actually is.
So what’s this incentive auction thing I’ve heard about?
The FCC is reclaiming spectrum from broadcast TV stations to sell to wireless carriers. The idea is that there are a ton of broadcast stations taking up space on the airwaves, but most people don’t watch TV this way anymore, so it makes sense to free up the spectrum “wasted” by these stations and resell it to wireless carriers with much more demand for it. The unanswered questions this raises, of course, are why people don’t watch TV over-the-air, whether that reflects a shortcoming of the technology or of the way it’s regulated and the state of the marketplace, and whether the very technological revolution that’s making such demands on the spectrum might make broadcasting more useful if it’s allowed to be. Still, thanks to the digital transition stations can offer multiple streams of content within their existing 6 MHz of channel space, so there is some value in packing them into less of the band.
How does the auction work?
Last week was the deadline for stations to declare their participation in the auction – which does not necessarily mean they’ll be surrendering spectrum, only that they’re interested in doing so. Even if stations do surrender spectrum, they won’t necessarily go off the air; they can instead elect to share a channel with another station. The FCC will take that data and run software to try to optimize the spectrum occupied by stations – even if a station doesn’t surrender spectrum, it’s likely to be moved down to a lower channel. (This won’t affect the channel you tune to if you watch a station with an antenna, which since the digital transition has had no relationship to the channel the station physically occupies.) Once the FCC has reached an estimate of how tightly it can pack stations in, it can release a “clearing target” indicating how much spectrum it can free up based on what stations have declared their participation. As much as 126 MHz of spectrum could be made available at this stage, though that’s considered unlikely; somewhere between that and 100 MHz is more likely. After that, the FCC will collect upfront payments from companies interested in bidding on the spectrum, probably sometime in May.
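(For the technically curious: the FCC’s actual repacking software has to respect thousands of pairwise interference constraints, but you can think of the problem as something like graph coloring – stations are nodes, potential interference is an edge, and channels are colors. Here’s a toy sketch, with made-up call signs, conflicts, and channel range, just to show the shape of the problem; the real thing is vastly larger and has to prove a feasible assignment exists at every step.)

```python
# Toy sketch of the repacking problem as graph coloring: stations that
# would interfere with each other can't share a channel, and the goal is
# to fit everyone into as few channels as possible. Call signs, conflicts,
# and the channel range are all made up for illustration.

conflicts = {
    "WAAA": {"WBBB", "WCCC"},
    "WBBB": {"WAAA"},
    "WCCC": {"WAAA", "WDDD"},
    "WDDD": {"WCCC"},
}
surviving_channels = range(14, 30)  # pretend these channels remain for TV after the auction

assignment = {}
for station in sorted(conflicts, key=lambda s: -len(conflicts[s])):  # most-constrained first
    taken_nearby = {assignment[n] for n in conflicts[station] if n in assignment}
    assignment[station] = next(ch for ch in surviving_channels if ch not in taken_nearby)

print(assignment)  # {'WAAA': 14, 'WCCC': 15, 'WBBB': 15, 'WDDD': 14}
```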
The auction itself is a complicated, two-step process. The first step is a “reverse auction” in which the FCC buys spectrum back from the broadcasters willing to sell. The commission starts with the highest possible prices and moves them down each round until just enough stations remain willing to sell to meet the clearing target. The cost of buying out those stations determines how much money the commission needs to raise in the subsequent “forward auction” of the spectrum to wireless bidders in order to end the process; if it can’t raise that much, it has to run a new reverse auction at a lower clearing target. If the auction takes only one such stage, it could be finished by August, but if the FCC has to repeat the process it could stretch into next year. The FCC could even call the whole thing off if it doesn’t raise enough money (unlikely, given the value of the spectrum at stake) or doesn’t obtain enough spectrum (still unlikely, but possible if attitudes towards the value of broadcasting change dramatically).
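To make the descending-price dynamic a little more concrete, here’s a heavily simplified sketch – not the FCC’s actual rules, which set separate prices per station and measure the target in megahertz rather than station counts – and every call sign and dollar figure below is invented:

```python
# Heavily simplified sketch of a descending-clock reverse auction. Each
# station has a private minimum it will accept; the clock price starts
# high and ticks down, and stations drop out (i.e. keep broadcasting)
# once the price falls below their minimum. The clock stops when just
# enough willing sellers remain to hit the clearing target. All call
# signs and dollar figures are invented.

reserve_prices = {  # station -> lowest acceptable payout, $ millions
    "WAAA": 400, "WBBB": 250, "WCCC": 150, "WDDD": 90, "WEEE": 60,
}
stations_needed = 2   # stand-in for the real target, which is measured in MHz
price = 500           # opening clock price, $ millions
decrement = 25

willing = set(reserve_prices)
while len(willing) > stations_needed:
    price -= decrement
    willing = {s for s, reserve in reserve_prices.items() if reserve <= price}

print(f"Clock stops at ${price}M with {sorted(willing)} selling; "
      f"the forward auction now has to cover roughly ${price * len(willing)}M")
```

In the real auction the prices are station-specific and the whole thing reruns at a lower clearing target if the forward auction can’t cover the bill, but the descending-clock intuition is the same.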
Once the auction is completed, the FCC will reallocate the 6 MHz television channels into pairs of 5 MHz blocks and assign the resulting frequency bands to the forward auction winners, trying to give each carrier contiguous bands whenever possible for maximum flexibility and to accommodate the real-world demands of the carriers. They won’t be able to move in for several years, however, because of the more complex task of repacking the remaining stations; the FCC has laid out a 39-month timeline for repacking, but broadcasters say that’s not enough time and AT&T has dropped hints it might be fine with a longer timetable as well, noting that a similar process of awarding spectrum that commenced in 2005 and was supposed to take 36 months is still going on over a decade later.
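As a back-of-the-envelope illustration of how 6 MHz TV channels turn into paired 5 MHz wireless blocks: some of the cleared spectrum goes to guard bands and the gap between uplink and downlink, and what’s left is carved into 5+5 MHz pairs. The overhead figure below is an assumption for illustration; the actual band plans vary by clearing target.

```python
# Back-of-the-envelope: paired 5+5 MHz wireless blocks from a given
# clearing target, after setting aside spectrum for guard bands and the
# uplink/downlink gap. The 14 MHz overhead is an illustrative assumption,
# not the FCC's figure for every scenario.

def paired_blocks(cleared_mhz, overhead_mhz=14):
    usable_mhz = cleared_mhz - overhead_mhz
    return usable_mhz // 10  # each paired block is 5 MHz up + 5 MHz down

print(paired_blocks(84))   # 7 paired blocks out of an 84 MHz clearing target
print(paired_blocks(126))  # 11 under this assumption; the real plan sets aside more overhead
```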
Who is going to give up spectrum in the reverse auction?
That’s kept confidential and anonymous until the auction is over, but some companies have dropped clues about their participation. Companies that own or control multiple stations in a market may look to consolidate their holdings into fewer channels; NBC, Fox, and CBS all own these “duopolies” and have signaled that they could participate. Companies that don’t own a duopoly but devote much of their spectrum to low-demand outlets like a shopping channel or a rerun farm may elect to give up that spectrum and share with someone else. Some outfits with other business interests, like a university that owns a public television station, may decide to give up their spectrum in favor of other means of distributing content, or two smaller stations may elect to share a single frequency and sell the other. And then there are the speculators who have gobbled up unwanted stations purely in hopes of a payday from selling them in the auction, including Dell Computer founder Michael Dell.
Who is going to bid on spectrum in the forward auction?
The three largest carriers – AT&T, Verizon, and T-Mobile – are unsurprisingly part of the bidding, with analysts predicting that AT&T is budgeting $10 billion, T-Mobile $8 billion, and Verizon $5 billion. Sprint says it’s fine with its existing spectrum holdings and will sit this out. Smaller carriers are also in the running to participate, as are venture capital firms and some entities outside the wireless industry. Comcast will be in the bidding, though it probably won’t spend heavily, looking primarily for spectrum it can find “strategic value” in, and Dish, which dominated another spectrum auction the FCC held a while back, may have an outfit bidding on its behalf. However, hopes for tech companies like Facebook and Google to drive up the bidding for their own innovative services appear to have been dashed. AT&T and Verizon’s participation will be curbed by the FCC reserving some blocks of spectrum for companies that have less of it, as those two companies already hold the lion’s share of existing low-band spectrum.
How much will the auction raise?
Although early forecasts suggested the auction could raise $60 billion to $80 billion, more recent analysis suggests it might come in closer to $30 billion given tighter budgets and some would-be participants sitting out. The last major auction, which Dish dominated, raised an unexpected $45 billion for spectrum that isn’t as valuable as this is, so raising less money this time would be a disappointment for broadcasters and the government alike.
Why are stations considering giving up their spectrum in the auction?
It’s hard to pass up a piece of even a $30 billion pie, and for many station owners that’s money that can be reinvested in other parts of the business. It’s especially hard to pass it up when this is likely to be the only opportunity they have to get money for their spectrum; if broadcasting turns out to have no future and/or the auction doesn’t produce enough spectrum to meet wireless carriers’ demand, the FCC could simply reclaim the broadcast spectrum by fiat with little or no compensation.
Why is the incentive auction so important to wireless carriers?
Broadcasters are sitting on “low-band” spectrum – spectrum at lower frequencies than most of what wireless carriers use today. This spectrum is considered “beachfront” real estate because its propagation characteristics make it especially valuable to wireless carriers, particularly in urban areas – it travels significantly longer distances and penetrates buildings much better (both of which, of course, broadcast TV benefits from today). Some believe the spectrum auctioned off here will serve as the basis for next-generation 5G services, or even wireless broadband services that can compete with present-day wired broadband. It’s unlikely there will be any more auctions of low-band spectrum for the foreseeable future, so everyone wants to get in now while the gettin’s good. Wireless companies are also worried about a potential spectrum crunch as more people embrace mobile devices and use them to watch bandwidth-heavy video like – to quote one of the articles linked to above – “sports [and] cat videos”.
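If you want a feel for why frequency matters, the free-space path loss formula gives the rough flavor – real-world propagation through buildings and terrain is far messier, but the frequency dependence is the point. The frequencies below are just representative examples:

```python
# Rough illustration of why lower frequencies are "beachfront" spectrum:
# free-space path loss grows with frequency, so at the same distance a
# 600 MHz signal arrives stronger than a 1900 MHz (PCS-band) one. Real
# propagation through buildings and terrain is messier, but the trend holds.
import math

def free_space_path_loss_db(distance_km, freq_mhz):
    # standard free-space path loss formula for distance in km, frequency in MHz
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

for freq in (600, 700, 1900):  # representative low-band and mid-band frequencies
    print(f"{freq} MHz over 5 km: {free_space_path_loss_db(5, freq):.1f} dB of path loss")
# 600 MHz comes out roughly 10 dB better than 1900 MHz at the same distance.
```

Ten decibels is a tenfold difference in signal power, which is why carriers treat low-band spectrum as a different class of asset entirely.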
Wait, video? Sports? Don’t broadcast stations already show those things?
I see you’re starting to realize why this whole thing is so stupid. But I’m getting ahead of myself. Reallocating spectrum to wireless providers means it can be used to deliver whatever content the consumer wants (not necessarily video) rather than whatever a network or station programs (which doesn’t include anything on cable networks like ESPN anyway), and to a wider variety of devices than broadcast stations can reach today, which for now is mostly limited to traditional fixed television sets.
Okay, so why should I care about broadcasting? Why not have them give up all their spectrum to wireless providers with more demand and that can offer more flexibility?
Broadcast television operates on what backers call a “one-to-many” distribution architecture: a signal sent out once can reach any number of devices. So if a large number of people in one area want to access the same thing at the same time, it’s more efficient to beam it directly into their phones or tablets with one signal. If you had to rely on the broadband network to watch the Super Bowl, it would have to deliver the game to each and every one of the 100 million people who want to watch it. Perhaps, with technological advancement and investment in infrastructure, that’s entirely doable! But if a large number of people want to watch something on their mobile devices, there are physical constraints on the ability to deliver that content to each device individually, and the additional spectrum freed up by the auction can ease those constraints but not get rid of them. Every wireless provider relies on spectrum to deliver content, and each provider has its own chunk of spectrum, which may not line up with which provider’s customers are taxing the network the most at any given moment. Even if you add up all the spectrum wireless providers have, including what’s freed up by the auction, there are all sorts of other uses competing for that spectrum even beyond the broadcasters that stay put, not to mention the buffers that have to be put between each use to keep them from interfering with each other. Again, technological advancements and infrastructure investments might make it possible, but would it be worth the effort when we already have a technology perfectly situated to take the load off of wireless providers?
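The arithmetic here is blunt. Using an assumed 5 Mbps HD stream and a Super Bowl-sized audience (both numbers illustrative, not measurements):

```python
# Blunt arithmetic behind the one-to-many argument: unicast delivery
# scales with the audience, while a broadcast signal is transmitted once
# no matter how many devices tune in. Bitrate and audience figures are
# illustrative assumptions, not measurements.

stream_mbps = 5                 # assumed bitrate of one HD stream
viewers = 100_000_000           # Super Bowl-scale simultaneous audience

unicast_total_mbps = stream_mbps * viewers   # every viewer gets an individual copy
broadcast_total_mbps = stream_mbps           # one transmission reaches everyone in range

print(f"Unicast: about {unicast_total_mbps / 1_000_000:,.0f} Tbps of simultaneous delivery")
print(f"Broadcast: {broadcast_total_mbps} Mbps per transmitter, regardless of audience size")
```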
By the way, this technology isn’t necessarily limited to video! It’s also possible to do something called “datacasting”: beaming sports scores, weather reports, stock quotes, you name it, directly to various devices. But video is likely to be where this sees the most use and where it would be most needed, because of how bandwidth-heavy it is.
Can’t wireless providers use this technology themselves, though, if they really need to?
Sure! In fact AT&T and Verizon have already started investing in their own “multicast” networks to serve just this purpose. But that’s the thing: in order to access those networks, you presumably need to subscribe to those providers. At least in theory, broadcast television is completely decentralized, not dependent on a subscription to any wireless provider or pay-TV company, not directly controlled by a small number of behemoths. Sure, the major networks may be owned by a small number of companies, but the stations they directly control don’t encompass much more than a third of the country, if that, with affiliates owned by other groups covering the rest of the country, not to mention the various independent, public, niche-audience, and other stations there are. At the very least, if we’re looking to maximize spectrum efficiency, which is preferable: one station broadcasting the Super Bowl out to everyone, or a separate broadcast for each wireless carrier?
So if broadcast is so useful for all of this, why don’t people use it? And why don’t more people embrace or even hear about this argument?
The short answer is that in the 80s, 90s, and early part of the new millennium, consumers embraced cable as a way to get more video options than broadcast TV could offer at the time, when Internet video was a joke that could never replace traditional TV, and the large majority of people still subscribe to cable for this reason. The longer answer is twofold. First, broadcasters have effectively fallen into a self-imposed reliance on cable for their distribution, because the monetary model of cable, where cable networks collect money directly from your cable bill in addition to advertising, can’t help but outcompete the broadcast business model of advertising alone, something I go into in more detail in my book The Game to Show the Games. The “retransmission consent” system laid out by Congress in 1992 allows broadcasters to collect their own cable subscription revenue, but it doesn’t actually correct for the superiority of the cable business model, meaning broadcasters (especially those with investments in actual cable networks) have no incentive to make it easier for people to actually consume their over-the-air signal, whether by improving how much of the population it reaches or expanding the ways it can be accessed.
That brings us to the other problem: cord-cutting and its adherents’ rallying cry of watching whatever you want whenever you want – effectively a backlash against the entire notion of linear television, broadcast or cable. Between that and broadcasters’ disincentive to embrace cord-cutting, fewer cord-cutters use broadcast television than you might expect, and cord-cutting has been less of a boon to broadcasting than one might have hoped. It doesn’t help that retransmission consent has resulted in broadcasters receiving much of the blame for the high price of cable, arguably unfairly. While broadcasters’ need to play catch-up means much of the recent increase in cable bills is attributable to them, the underlying monetary motivation for cutting the cord is the overall lack of value people perceive in the cable bundle, and that’s because a disproportionate amount of it is tied up in sports, namely ESPN and regional sports networks, and to a lesser extent other outlets with significant investments in sports that rely on subscriber fees to pay for sports rights, including, ultimately, broadcasters. As I suggest in my book, that’s because sports (and live events more generally) are precisely the sort of content linear television is best suited for: something people have to consume at a specific time, leaving them a captive audience for whatever ads are inserted into the feed.
This is all nice in theory, but is there any evidence that it’s actually possible to use broadcasting in this way? I mean, if it were possible to receive a broadcast signal on a mobile device there would be SOME application of it, right?
Well, the digital television standard was developed in the late 90s – and see what I said earlier about Internet video being a joke then; “mobile devices” meant pagers and bulky cellular phones that no one would mistake for anything remotely comparable to the desktop computers of the time, let alone the television set. Because of the different challenges of digital transmission, the standard effectively shut out devices like the portable TVs that used to be able to pick up analog signals. That was understandable when fixed television sets were still the norm, but mobile viewers turned out to be exactly the sort of audience broadcasting needed to court to survive. During the late 2000s the industry released an addendum to the standard allowing for the transmission of lower-quality feeds that mobile devices could pick up, but it was very poorly supported: virtually no devices supported it natively (meaning you had to plug an antenna dongle into your device), and because of all the disincentives to support broadcasting mentioned above, not many stations even offered a mobile feed. Right now the industry is working on a brand-new standard, called ATSC 3.0, that will include native mobile reception, datacasting, and other advanced features – enough to justify changes on the scale of the original digital transition – but the FCC isn’t waiting for it to be finished before commencing the auction. Ideally, broadcasters would switch to the new standard in conjunction with their post-auction repack; instead they may have to go through two major shakeups requiring massive infrastructure investments rather than one, if they go through the second one at all.
Well, that doesn’t sound good, but is the auction all that bad? You yourself said it made sense to repack stations to take up less space.
The problem is that not only will the auction not address any of the obstacles facing broadcasting, it could lock in their effects and create new ones. As far as I can tell, the auction won’t physically relocate any stations or allow them to broadcast at higher power to reach wider audiences; it will only move them to a different part of the spectrum. In fact, the FCC may reclaim different amounts of spectrum in different markets, meaning the band is likely to be compressed to just enough room for whatever stations stay in the business – with no room for new stations if it turns out we really did need some of the ones that got out, and potentially little to no flexibility for existing stations to optimize their transmission patterns. To my knowledge the FCC will try to refrain from compressing the broadcast band tighter than what’s needed to preserve all stations in the markets serving the majority of the population, meaning low-demand areas would still have some free spectrum, but there could still be interference between areas where a part of the band is used for wireless services and areas where it’s used for TV, with deleterious effects for both. In effect, precisely because it is the only chance to determine the new broadcasting landscape, the auction will ensure that broadcast television can only continue to shrink, never grow, no matter how useful it may later prove to be, and it will provide new reasons for it to do so.
The crunch against the creation of new stations is especially problematic because of the incentives surrounding channel-sharing. After the auction is over, a “channel” in a given area could be home to one or two licensed “stations”, and FCC rules regarding duopolies are based on the number of stations a company owns in a market. So if you’re an NBC, Fox, or CBS that owns one of those duopolies mentioned earlier, you could elect to have one of your duopoly stations channel-share with the other, or you could simply give up one license outright and achieve the exact same effect while being counted as owning one station instead of two. Considering that some other companies have been using various legal constructs to get away with owning more stations than they’re supposed to for over two decades now, you can see why they’d want to take the latter option – and indeed some have already been doing so, separately from the auction, as the FCC has attempted to crack down on those legal constructs. The downside is supposed to be that channel-sharing gives you two stations with “must-carry” rights on cable systems instead of one, but most duopoly owners can use retransmission consent to bully cable operators into carrying whatever feeds they want them to carry, so the end effect is going to be to reinforce and strengthen the power of the largest programmers. But it gets worse: at least if two stations exist on a single channel, one of them can be sold to someone else who wants to get into the industry, but I don’t know of any provision to “split” a single-channel license in two, so not only will the auction preclude the creation of new stations and reinforce the power of the largest station owners, it’ll also effectively lock today’s duopolies in place in perpetuity. (And if any of the alternative uses of spectrum the ATSC 3.0 standard makes possible turn out to be fruitful, channel-sharers might not be able to take advantage of them at all.)
By the way, the FCC is currently undertaking a statutorily mandated review of its ownership rules that might serve as an occasion to address this problem (a review that also encompasses the aforementioned crackdown on legal constructs). Last I heard it was slated to wrap up in June. That’s also the month the reverse auction is likely to begin. So if the FCC did anything that would warrant a change in anyone’s auction strategy, there would be no time to react to it – which in and of itself suggests the FCC isn’t likely to do anything that would affect the auction, unless maybe it would have the effect of spurring greater participation. And I haven’t even gotten to the issue the industry is perhaps most vocal about, to the point of causing legal headaches for the FCC: unless they’ve already signed up for a specific type of protection, certain stations licensed to operate at “low power” could be forced off the air without even having a chance to cash in through the incentive auction, and those stations are more likely to have diverse ownership and serve as outlets for independent content, especially in larger markets.
Okay, that’s a problem, but the way you described it, it seems like broadcasting serves a fairly specific purpose in fairly specific situations, so how likely is it that we would need that many more stations? Would we even have use for the four major broadcast networks that exist today?
It doesn’t take as much demand to justify the use of broadcasting as you might think. If just three people in a given area want to watch the same thing at the same time, using a single linear television signal to reach them cuts the bandwidth demand to a third of what it would otherwise be. That might be less than 1% of the people covered by a single cell tower, let alone a broadcast station with much broader reach. Anything that’s even remotely popular by Internet standards can benefit from broadcasting if it can arrange for people to consume it at a single time. We probably don’t need 100+ linear feeds (just look at the state of cable television networks today), but I could easily see enough content benefiting from broadcasting to justify more than the four major networks we have today, and potentially more than survive the auction. That is, if that content is viable to place on broadcast stations under current regulations, but that’s another question entirely.
I don’t watch any video content of any kind. Why should I care about any of this?
If you care about net neutrality, this is of vital importance. As it stands, video makes up over three-quarters of all Internet traffic in the United States, with Netflix and YouTube alone accounting for close to half of all peak traffic combined, even if they aren’t that popular on a per-capita basis – and that’s with cord-cutting having taken hold only to a limited degree so far. If cord-cutting really takes off – if anything remotely close to the level of video consumed on broadcast and cable television today moves to the Internet, if all the biggest events and shows are consumed that way – the Internet would effectively become a conduit for the delivery of video that happens to be robust enough to allow other purposes to ride along with it. If broadcast television isn’t able to take some of the load off, it would make too much sense to find some way to prioritize video content over other types of content, using such schemes as T-Mobile’s Binge On or Comcast’s Stream TV. You would have to spend that much more effort defending net neutrality against Internet providers’ attempts to dismantle it – assuming you didn’t decide it was worth sacrificing considering the loads the industry would be facing.
But isn’t broadcast television a form of prioritization itself? You’re stuck with whatever whoever owns the station programs at any given time, and that’s the content that gets all the benefits of broadcasting. You’re basically saying we have no choice but to give up on net neutrality for the sake of video.
As I mentioned earlier, under our current broadcast television system the content that benefits from it is the same for everyone, not dependent on which carrier you subscribe to. With strong ownership restrictions and outlets for independent content creators, the broadcast realm can be fairly democratic. But the main point is that broadcast television is applicable to a fairly specific situation that happens to apply primarily to the most popular content and which a particular type of content, live video, benefits most from. Making sure that type of content benefits is an acceptable compromise of net neutrality, and properly managed, any independent producer that becomes popular enough to benefit from broadcasting, indeed to actually need it, should be rewarded by the free market (or at least an outlet specifically designated for the purpose, like today’s public television stations) with a broadcast slot. The drawbacks of broadcasting are significant enough and the situations it benefits most in are specific enough that I don’t think it will crowd out any other content – after all, the mere existence of cable television isn’t stopping the cord-cutting revolution.
You’re awfully pessimistic about the ability of the Internet to handle the load. With better compression technologies, fatter pipes, and other infrastructure investments (especially if ISPs would stop being lazy and upgrade to gigabit technologies we already have), it shouldn’t be a problem. We’ve handled the Internet’s growth just fine in the past despite what the doomsayers keep saying, we can handle this.
Well, see what I said before about whether it’s worth all the effort when we can at least reduce the need for it, with the added wrinkle that any advances in compression are likely to be matched by increases in the quality of the content – witness the hype already gearing up for “4K” and other next-generation technologies. But this isn’t really about the level of traffic so much as the balance of it. If the architecture of the Internet is going to be oriented toward prioritizing video – which I suspect is what those “infrastructure investments” will increasingly amount to – let’s minimize the degree to which that compromises net neutrality and use every tool we have that makes sense.
I speak of broadcast television as separate from the Internet, but forces are already at work blurring the lines between them. The FCC’s current set-top box proceeding, no matter the outcome, seems destined to create an IP interface for accessing wired linear television content, and embracing IP is also part of broadcast’s ATSC 3.0 standard. The day may come when you won’t be able to tell whether a feed is coming from a linear television stream or over the Internet, a distinction Comcast is already intentionally blurring with Stream TV. If that sounds like all the more reason to fear linear television morphing into a means to circumvent net neutrality, then it’s all the more reason to preserve a linear television architecture that’s free to everyone and not dependent on any particular ISP – something that can be done even with cable.
You said it might be a good idea to shrink the amount of spectrum devoted to broadcasting; you just don’t like how the FCC’s doing it. So how would you go about the auction differently, or would you hold it at all?
I wouldn’t hold it right now, at a time when the benefits of broadcasting aren’t widely recognized and broadcasters are dependent on retransmission consent. The legislation authorizing the auction allows it to be held any time from now to 2022, so if the FCC cared at all about broadcasting they’d wait and see whether the cord-cutting revolution increases appreciation for broadcasting among the general public and whether the resulting collapse of the cable bundle obviates the need for retransmission consent. At the very least they’d wait to complete the ownership review and for ATSC 3.0 to be ready; frankly, the fact that they aren’t waiting suggests they’re trying to get the auction done as quickly as possible on behalf of the wireless industry, before broadcasters start having second thoughts about the value of their medium. If that value doesn’t become apparent on its own, I’d articulate a clear vision for it and institute programs to put it into effect and transition broadcasters to it. If all else fails, Congress could force the issue by instituting a la carte cable for at least the most expensive channels.
I would institute ownership rules based on the amount of spectrum controlled, not the number of stations owned, and create a clear process for splitting a single-channel station. At the most extreme, I might force all stations in a market to effectively channel-share and allocate two stations to each channel. That would give every station some auction revenue and make it easier to compress the band while potentially creating opportunities for new stations. The repack could be made easier still by colocating as many of a market’s stations as possible in a single place and assigning them contiguous frequencies, effectively designating a subset of the television band for each market. That would require a lot of cooperation, but it could serve as an opportunity to improve broadcast stations’ coverage areas along with other salutary effects, giving the industry more of a fighting chance. (Again, the fact that the FCC is doing none of this suggests their goal is just getting broadcasters out of the way rather than actually optimizing spectrum usage.)
Although those moves would make it possible to obtain more spectrum, my preference for the amount reclaimed would be about 84 MHz – on the low end of what wireless providers would prefer – thanks to a quirk in the existing band plan. Currently, physical channel 37 is completely devoid of stations, being reserved for radio astronomy instead, and that will remain the case no matter what else happens in the auction. Under most of the post-auction plans, about 7-11 MHz of buffer would be needed between the remaining TV stations and the new wireless spectrum, but if about 84 MHz is taken, only a 3 MHz buffer is needed between channel 37 and the new wireless spectrum, maximizing efficient use of the band. (Anything over that creates 3 MHz buffers on either side of channel 37.) This strikes a balance between giving wireless providers enough spectrum for the auction to succeed and maximizing the amount of leeway in case broadcasting recovers more than expected.
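The channel arithmetic behind that 84 MHz figure is simple enough to check yourself: UHF channels are 6 MHz wide and channel 14 starts at 470 MHz, so clearing everything above channel 37 – channels 38 through 51 – frees exactly 84 MHz, with the cleared band butting right up against channel 37’s upper edge.

```python
# Channel arithmetic: UHF TV channels are 6 MHz wide and channel 14
# starts at 470 MHz. Clearing channels 38-51 frees exactly 84 MHz, and
# the cleared band starts right at channel 37's upper edge (614 MHz).

def channel_edges_mhz(ch):
    lower = 470 + (ch - 14) * 6
    return lower, lower + 6

ch37_low, ch37_high = channel_edges_mhz(37)   # (608, 614)
_, ch51_high = channel_edges_mhz(51)          # (..., 698)
cleared_mhz = ch51_high - ch37_high           # 698 - 614 = 84

print(f"Channel 37 occupies {ch37_low}-{ch37_high} MHz")
print(f"Clearing channels 38-51 frees {cleared_mhz} MHz, ending right at channel 37")
```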
That seems unrealistic at this late date. Is there anything anyone can do now?
Realistically, not much. It basically comes down to trying to convince as many station owners as possible that broadcasting has more value than they think, so that they hold on to their spectrum or at most channel-share, in hopes of derailing the whole auction or at least ensuring an oversupply of broadcast spectrum going forward by keeping the amount of reclaimed spectrum to the 84 MHz range – but even that won’t really solve the structural issues with the auction. The best thing you can do is spread the word and see if enough public outcry can convince the FCC, Congress, and broadcasters to hold off and make changes, but realistically that probably needed to happen at least a year ago, and Congress, at least, has high hopes for using auction proceeds to help balance the budget. At the very least, though, the people trying to explain the incentive auction to the general public can do a lot better than simply parroting the notion of broadcasters as outdated spectrum hogs without giving a full appreciation of the obstacles facing them and the trade-offs to society of having them surrender their spectrum.