On Friday, Mother Nature willing, we will file our first set of comments analyzing the data submitted in the special access proceeding. As a refresher course for the uninitiated, the special access services at issue here are not very special at all. They are services, sold mainly to businesses, that provide fairly low-speed data connections – almost 90% of which run at 1.5 Mbps. Businesses have traditionally used those connections to provide voice and data services as well as Internet access service. Special access services were deregulated by the FCC back in 1999 (during the Bill Clinton administration) because of the significant competitive fiber investment that occurred in the wake of the Telecommunications Act of 1996. The theory was that regulation was not necessary where competitive fiber had been deployed in a market. However, after deregulation, some carriers continued to complain to the FCC that the agency’s rules had deregulated services even in areas where no one had built competitive facilities.
So, some 13 years later, the FCC announced it was going to re-examine the level of competition in these markets. A funny thing had happened in those 13 years, of course. The Internet – and particularly the broadband Internet market – exploded. The FCC commissioned a massive team to craft a National Broadband Plan, which concluded that the focus of Internet policy should be to encourage investment in fiber and other technologies that could deliver gigabit services. The Broadband Plan also cited the need to develop a strategy to retire legacy technologies – like “not so special” access services – to make that fiber investment more economic. The Commission has even more recently raised the definition of broadband to 25 Mbps (roughly 17x faster than the bulk of these services). Consequently, when the FCC began its tortured trek over this 1.5 Mbps Bridge to Nowhere, we predicted that this backwards-looking regulatory effort would waste an enormous amount of industry and agency time on a meaningless and futile exercise over services that would be obsolete by the time we crossed the bridge. Guess what? While we are still – 3 ½ years later – just beginning our trek across that bridge, our predictions have quite predictably already come to pass.
To refresh, back in 1999 when the Bill Clinton FCC issued its order, it realized it did not have perfect insight (or data) into the level of actual facilities competition in the marketplace. Therefore, it attempted to approximate that competition by looking at the number of competitors who had built fiber into a particular area (an MSA) and connected that fiber to the incumbent telephone company’s facilities. To ensure that it did not over-deregulate that market, the FCC permitted deregulation only in those areas where a very high percentage of telephone company offices were connected to competitive fiber (the specific criteria were called “triggers”). To be sure, those triggers captured only some of the competitive providers in any particular MSA. To give just one example, cable companies generally don’t build to telephone company offices, choosing to rely instead solely on their own network investment. Consequently, they are not captured by the triggers. Despite that, the application of the triggers resulted in more than 250 MSAs across the country realizing some level of price deregulation, including the ability to lower prices from the tariffed rate.
Fast forward to 2012. The FCC decided that it was finally going to do a mandatory data collection to determine once and for all whether the triggers really worked (mandatory because when the FCC previously asked the competitive carriers to voluntarily cough up data on the location of their facilities in 2010-11, the majority of them politely declined the FCC’s data playdate). So, the FCC has spent the past 3 ½ years defining and collecting data about these legacy technologies and the level of competitive fiber that exists across the country, as well as collecting data on the IP services that are quickly replacing the special access services at issue in this proceeding. Of course, given the nature and speed of regulatory data collection, all of that information is more than 2 years old already and doesn’t necessarily resemble the market that exists today.
Despite that, the data provided in this proceeding shows not that the FCC prematurely deregulated special access services, as competitors claim, but that the FCC did not go far enough. Specifically, the data shows two things: first, that facilities-based competitors are serving 95% of all MSA census blocks (on average, about 1/7 of a square mile in an MSA) nationally where there is demand for special access services, and, second, that 99% of all business establishments are in those census blocks. To be sure, there are areas of the country where there is no competitive fiber, but the data shows that there is virtually no special access demand in those areas.
Given these facts, it is evident that the triggers actually understated the level of facilities-based fiber competition in the marketplace. In cities like Chicago and Atlanta, the data collected by the Commission shows that competitors are literally tripping over each other even though application of the triggers did not result in deregulation. Thus, the true facts demonstrate that the triggers need to be relaxed, not tightened. A real Dewey defeats Truman moment!
So, after 3 ½ years of data gathering, we can finally assert on a data-driven basis that competitive facilities exist wherever we see demand for the obsolete, not-broadband technology of “not so special” access service, and that deregulation of this obsolete technology should actually be accelerated. This shows the FCC was right back in 1999. It also shows that we’ve likely wasted years at the behest of some interests who are looking for a price break on their use of antique technology even if the facts don’t support them. They will no doubt keep arguing, but it’s time for the FCC to move on to more important tasks like advancing investment in broadband. And, one doesn’t do that by re-regulating an old technology that’s already being replaced by far faster and competitive alternatives.