Last Friday, a group of lower 700 MHz A block licensees submitted new interference testing and a report purporting to analyze the relative impact of Channel 51 and E block signals on Band 12 and Band 17 devices. The lengthy report claims to show that Band 12 LTE devices are unlikely to experience interference levels high enough to translate to reduced performance in a 700 MHz B and C block LTE deployment like that being completed by AT&T.
While we have not yet had a chance to fully review the submission, even a cursory review of the report raises significant credibility issues for both the testing methodology employed and the field results submitted.
Consider these two examples:
First, the report presents the results of a purported Channel 51 interference field test using actual Band 12 devices in Waterloo, Iowa, where U.S. Cellular is apparently now operating a Band 12 network using the B and C blocks. The report asserts that this field test demonstrates that the Band 12 devices worked fine in the presence of Channel 51 interference. But the nearest Channel 51 transmitter (Cedar Rapids/Crowley) is located about 30 miles away from Waterloo proper, and the drive route used for the testing was largely in the surrounding countryside even farther from the Channel 51 tower.
It is not surprising that Channel 51 transmissions originating up to 40 or 50 miles away would have little or no measurable impact on the performance of a Band 12 device – at that distance the Channel 51 signal is simply too weak to generate a strong interfering reverse intermodulation product in the device. And because the report discloses only averages of the field test readings, any poor performance measured in the very small portion of the drive test route that ventured within 20 miles of the Channel 51 station would certainly be masked by the large number of test points in areas where Channel 51 signal levels are necessarily very low.
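The arithmetic behind that masking effect is easy to sketch. The numbers below are purely hypothetical – the report does not disclose individual readings – but they show how a route-wide average can hide severe localized degradation:

```python
# Illustrative only: hypothetical drive-test throughput readings (Mbps).
# Suppose 95% of the test points lie far from the Channel 51 tower and
# see full throughput, while 5% lie near it and are severely degraded.
far_points = [20.0] * 95   # healthy throughput far from the tower
near_points = [2.0] * 5    # degraded throughput near the tower

readings = far_points + near_points
average = sum(readings) / len(readings)

print(f"Route-wide average: {average:.1f} Mbps")       # 19.1 Mbps
print(f"Worst reading:      {min(readings):.1f} Mbps")  # 2.0 Mbps
```

Under these assumed numbers, the reported average is within about 5% of the healthy value even though the points near the tower lost 90% of their throughput – which is why disclosing only averages says little about interference near the station.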
The Waterloo field test therefore offers little evidence relevant to the interference that can be expected in the many areas where a Channel 51 broadcaster’s tower is located in or near a major urban area.
For example, the Waterloo field test says nothing about the harmful interference that would be expected from KPXE’s CH 51 transmitter, which is located in the south central part of Kansas City, or WPWR’s CH 51 transmitter, which sits atop Chicago’s Willis Tower in the middle of the Loop – or even the harmful interference that customers in the vicinity of the Cedar Rapids/Crowley Channel 51 tower itself would actually experience.
Second, the report claims that its lab tests confirm that Band 12 devices are unlikely to experience much harmful interference from Channel 51 or E Block transmissions at expected Channel 51 and E Block signal levels. But the report omits critical information needed to assess the validity of that claim and to evaluate why the report reached such different results from other lab testing performed by well-regarded independent wireless testing facilities with similar equipment.
For example, the report never discloses the LTE signal level (from the simulated base station to the tested device) generated in the lab tests. That is one of the most critical values when using lab tests to evaluate the impact of interference, because device performance is a function of the ratio of LTE signal level to interference signal level. Where the LTE signal is particularly strong, the device can tolerate higher interference levels. But in the real world, LTE signal levels will not be uniformly strong at all locations and times. To determine the likelihood that interference will degrade service quality in practice, a proper lab test must use an LTE signal level close to the point where the device can just detect the signal and receive packets at the target quality of service (e.g., a specified packet error rate).
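The dependence on that ratio can be illustrated with a simple link-budget sketch. The signal levels and the required signal-to-interference threshold below are hypothetical, chosen only to show why the same interference level is harmless against a strong LTE signal but crippling near the sensitivity point:

```python
# Illustrative link-budget sketch with hypothetical numbers.
# Device performance depends on the ratio of desired LTE signal to
# interference, which in dB terms is simply the difference.

def sir_db(lte_dbm: float, interference_dbm: float) -> float:
    """Signal-to-interference ratio in dB (noise ignored for simplicity)."""
    return lte_dbm - interference_dbm

REQUIRED_SIR_DB = 0.0        # hypothetical threshold to hold the target PER
interference_dbm = -90.0     # same interference level in both scenarios

scenarios = {
    "strong LTE signal": -60.0,   # well above sensitivity
    "near sensitivity":  -95.0,   # just at the edge of detection
}

for label, lte_dbm in scenarios.items():
    margin = sir_db(lte_dbm, interference_dbm) - REQUIRED_SIR_DB
    verdict = "tolerates interference" if margin > 0 else "service degraded"
    print(f"{label}: SIR = {sir_db(lte_dbm, interference_dbm):+.0f} dB -> {verdict}")
```

With these assumed values the strong-signal case enjoys a 30 dB margin while the near-sensitivity case falls 5 dB short – which is why a lab test run only at a strong, undisclosed LTE level cannot predict real-world performance.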
The reports of two independent labs submitted on Monday with AT&T’s reply comments demonstrate that when such tests are properly designed and executed, Band 12 devices experience severe performance degradation at the Channel 51 and E Block signal levels that this report claims would have no impact.
A detailed response to this report will be prepared and submitted into the FCC record. But even these preliminary examples illustrate how mistaken the Commission would be to rely on it to impose a Band 12 mandate.
At stake is the performance of AT&T’s multi-billion-dollar LTE deployment, one that is currently operating free from performance-degrading interference from CH 51 and the E Block. This ill-conceived and badly executed analysis provides no basis to adopt an unprecedented technology mandate that would undermine the rigor of the 3GPP standards-setting process, the incremental deployment of LTE networks in the United States, and the rollout of significant performance-enhancing features of LTE-Advanced, such as carrier aggregation – depriving customers of the best possible service experience.
On an even more fundamental level, this submission is enormously discouraging. The challenges of the lower 700 MHz band will be resolved only through an honest and credible assessment of the problem and a commitment by the industry to work together on real solutions.
This submission instead chooses to obscure and conceal the real world challenges in the hopes of securing an unlawful mandate that would still leave the A block significantly impaired and largely undeployable. AT&T, for its part, will now be required to spend significant time and resources responding to this submission when we would prefer to be pursuing the merits of our proposed solutions. In that regard, this report does much to move the industry away from the consensus solutions that the FCC has strongly encouraged.