Broadcasters Flag Problems With FCC Wireless Interference Plan
WASHINGTON—Last week, the final round of comments on the methodology proposed by the FCC’s Office of Engineering and Technology (OET) for determining interference between TV broadcasters and broadband wireless was posted on the FCC’s Electronic Comment Filing System. While comments from the wireless side generally supported the FCC’s approach, the National Association of Broadcasters, in the largest filing in the proceeding, dismantled it piece by piece, showing how it is fundamentally flawed and how its use would violate the law Congress passed authorizing the incentive auctions.
With few exceptions, the wireless industry accepted the FCC’s analysis, although most players said it was imperative that the FCC clearly disclose any interference impairments in the forward auction. Their support for the FCC proposal isn’t surprising, considering it was skewed toward providing the most spectrum for auction--even at the cost of less protection for TV viewers--by using unrealistic assumptions about LTE base station facilities and applying clutter loss to interfering wireless signals.
Some commenters from the wireless industry argued that the FCC should use the F(50,10) probability for analyzing interference from DTV into wireless LTE base stations instead of the proposed F(50,50) probability, explaining that even though LTE base stations have more options for mitigating interference, interference present 50 percent of the time would impair service from the base station.
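For reference, F(50,50) denotes the field strength exceeded at 50 percent of locations for 50 percent of the time, while F(50,10) is the higher level exceeded for only 10 percent of the time. A minimal sketch, assuming time fading that is roughly Gaussian in dB with an assumed 5 dB standard deviation (an illustrative value, not one from the filings), shows how much higher the F(50,10) level can be:

```python
from statistics import NormalDist

def field_exceeded_db(median_dbu, sigma_time_db, percent_time):
    """Field strength (dBuV/m) exceeded for the given percentage of time,
    assuming the time variability is Gaussian in dB about the median value."""
    z = NormalDist().inv_cdf(1.0 - percent_time / 100.0)
    return median_dbu + z * sigma_time_db

# Illustrative values only: a 40 dBuV/m median interfering field and an
# assumed 5 dB time-fading standard deviation.
print(field_exceeded_db(40.0, 5.0, 50))  # F(50,50) analog: 40.0 dBuV/m
print(field_exceeded_db(40.0, 5.0, 10))  # F(50,10) analog: about 46.4 dBuV/m
```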
Sprint’s filing, at 30 pages, was the largest next to NAB’s 50-page filing. That company provided maps showing that “the Longley-Rice model yields remarkably similar results from those derived from the Hata model for the markets studied.” Sprint’s maps show coverage from WPBT in Miami, Fla.
Longley-Rice is a terrain-sensitive model, while Hata does not include terrain in its calculations, though it does include urban clutter. It isn’t surprising that the two models match in South Florida, given that there is really no terrain to speak of. The results would be different if Hata had been used to generate the coverage map of WCBS-TV that Sprint used later in its comments to justify use of the F(50,10) instead of the F(50,50) probabilities for DTV interference to base stations. The impact of terrain is obvious in the WCBS-TV map.
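For readers who want to see why the two models can agree over flat terrain, here is a minimal sketch of the Okumura-Hata median urban path-loss formula; the inputs are illustrative assumptions, and the point is simply that the formula has no terrain input at all, whereas Longley-Rice evaluates the actual terrain profile along each path:

```python
import math

def hata_urban_path_loss_db(freq_mhz, h_base_m, h_mobile_m, dist_km):
    """Okumura-Hata median urban path loss in dB (small/medium city form).
    Note there is no terrain input, unlike Longley-Rice, which works from
    the terrain profile along each individual path."""
    a_hm = (1.1 * math.log10(freq_mhz) - 0.7) * h_mobile_m \
           - (1.56 * math.log10(freq_mhz) - 0.8)
    return (69.55 + 26.16 * math.log10(freq_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(dist_km))

# Illustrative values only: 600 MHz, 100 m transmit height, 10 m receive
# height, 20 km path. The result (roughly 136 dB) is the same no matter
# what lies between the two antennas.
print(round(hata_urban_path_loss_db(600.0, 100.0, 10.0, 20.0), 1))
```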
In its comments, Sprint said: “the commission should not bar auction of spectrum in areas where desired to undesired (D/U) ratios are minimally exceeded.” Sprint suggested wireless operators could use directional antennas or reduced power to operate in these areas. It also suggested raising the allowed interference to TV viewers from 0.1 to 0.5 percent, the same level used for interference between TV stations. These comments might make sense if the service areas and interference were modeled using LTE stations exactly as they would be deployed, rather than on a 10-km grid with reduced antenna height and reduced power. The errors inherent in the assumptions used to predict interference from LTE base stations are so large that they completely mask any distinction based on tenths of a percent of interference.
CTIA, the Wireless Association, raised concerns about use of Longley-Rice for modeling propagation from LTE base stations in its comments, stating: “CTIA also believes that use of the Longley-Rice radio propagation model is appropriate for modeling the effects from DTV transmissions. However, CTIA believes that other propagation models may be better suited for modeling the propagation losses from wireless LTE systems given the inherent operational differences between LTE and high powered broadcast television transmissions. CTIA encouraged the Commission to continue exploring which technical assumptions will best and most accurately model the potential interference environment in the 600 MHz band.”
CTIA emphasized the information generated from this approach must be publicly disclosed “with sufficient granularity to permit a thorough assessment of every grid point.” CTIA further stated: “Such data will be necessary for wireless bidders to be able to assess whether, how, and to what extent it may be possible to mitigate the interference and degree of impairment to the license. Only then can a bidder assign a value to a license area containing the grid points at issue.” CTIA noted that the data must include the number and identity of any interferers; whether the interference is to uplink or downlink LTE blocks; the interference field strength limits for DTV into wireless; to what extent a DTV contour overlaps the license area; and to what extent restricted areas within the license area would be subject to exclusion zones due to the presumed presence of one or more base stations that would cause harmful interference to a TV station.
Would it be unreasonable for broadcasters to ask for a similar analysis for each of the blocks in their coverage area?
If the results are presented in the same form as those currently used in TV-to-TV interference studies, that data would be available. There is, however, one problem: that data is useless.
The wireless operator can determine exactly what interference will be caused by a TV station at any point, as the TV station’s location, antenna pattern, antenna height and operating power level are all known. A TV station can’t accurately determine which viewers would be impacted by LTE interference because the analysis, as proposed, is based on hypothetical LTE base stations that, as the FCC admits, are positioned on a large grid to reduce computation time and, as NAB’s comments show, don’t come close to matching real-world deployments.
The NAB, ABC Television Affiliates Association, FBC Television Affiliates Association, CBS Television Network Affiliates, NBC Television Affiliates, the Association of Public Television Stations, the Corporation for Public Broadcasting, and the Public Broadcasting Service all filed joint comments stating: “The proposed OET Methodology relies on clearly erroneous assumptions and inputs that significantly underestimate inter-service interference.”
Broadcasters point out that the OET methodology assumes wireless base station antennas (receive and transmit) have a height of 100 feet and operate with an ERP of 720 watts, or 120 watts per MHz. However, the FCC’s proposed rules for the 600 MHz band allow heights up to 305 meters with 1,000 watts per MHz--significantly higher than the antenna heights and power levels used for the interference analysis.
The NAB examined the tower heights of existing wireless facilities using American Tower’s National Site List. Of the 704 ATC wireless facilities in Alabama, only 19 are listed at 100 feet or less--the height the FCC would use for interference analysis. More than 500 sites have tower heights exceeding 200 feet above ground, and the average tower height in Alabama is 247.4 feet. The story is the same in other markets: only 15 of 336 wireless facilities in Maryland are 100 feet or less, and almost a quarter of them exceed 200 feet. The average tower height in that state is 192.5 feet. The NAB also cited tower heights in New Jersey and Pennsylvania to show that the vast majority of wireless sites exceed 100 feet.
The NAB stated: “Thus, the OET methodology makes assumptions inconsistent with both the commission’s proposed rules and readily ascertained information about actual wireless service deployment. In both cases, OET’s erroneous assumptions are slanted towards predicting less interference than would likely actually be observed.”
The organization suggested that the commission either align the service rules for 600 MHz operation with the OET methodology assumptions, or adjust the parameters used in the methodology to be consistent with the proposed rules.
This isn’t the only place where the assumptions used in the OET methodology don’t match FCC rules. Broadcasters that want to use multiple transmitters in a distributed transmission system (DTS) must use the root-sum-square (RSS) method to combine the signals from those transmitters when calculating adjacent-channel and co-channel interference to other broadcasters. Even if the signal from each individual transmitter falls below the threshold for causing interference, the signals must be combined to see whether together they will cause interference.
This simple concept has been ignored in the OET methodology; the impact of multiple interfering signals is not considered. The NAB provided an example showing how each of two signals could individually meet the 15 dB D/U threshold for interference, but when combined would result in interference at the receiver.
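As a rough illustration of the NAB’s example, here is a minimal sketch of the RSS combination; the 15 dB co-channel D/U threshold comes from the discussion above, while the desired and undesired field-strength values are illustrative assumptions:

```python
import math

def du_ratio_db(desired_dbu, undesired_dbu_list):
    """Combine multiple undesired field strengths (dBuV/m) with the
    root-sum-square method (equivalent to summing their powers) and
    return the resulting desired-to-undesired ratio in dB."""
    fields = [10 ** (u / 20.0) for u in undesired_dbu_list]
    combined_dbu = 20.0 * math.log10(math.sqrt(sum(f * f for f in fields)))
    return desired_dbu - combined_dbu

# Illustrative values only: a 60 dBuV/m desired DTV signal and two co-channel
# undesired signals of 44 dBuV/m each. Individually each gives D/U = 16 dB,
# which clears a 15 dB threshold; combined they give roughly 13 dB, which fails it.
print(du_ratio_db(60.0, [44.0]))        # 16.0
print(du_ratio_db(60.0, [44.0, 44.0]))  # about 13.0
```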
The NAB stated: “The commission has a statutory responsibility to protect broadcasting and DTV viewers and cannot ignore ‘real world’ interference from multiple wireless operations; thus, any methodology used to predict potential inter-service interference should take this into account. The Joint Broadcasters urge the commission to consider interference from multiple wireless base stations using either a simple direct summation method or the RSS method utilized for calculating interference from multiple DTS transmitters under the current rules.”
The NAB pointed out that the Public Notice requesting comment acknowledges that, in practice, wireless facilities can be located fractions of a kilometer apart in dense urban markets. The OET methodology not only uses heights and powers below those allowed by the proposed 600 MHz service rules and ignores the contribution of multiple transmitters, it bases the interference analysis on a 10 km grid of these under-powered, low-height base stations. The problem with this approach should be obvious.
The NAB noted that, according to the Public Notice, this 10 kilometer spacing was based primarily on computational limitations, not on modeling the practical interference issues associated with real-world deployments.
The NAB commented: “While the Joint Broadcasters are sensitive to computational limitations associated with using the complex OET methodology, the appropriate solution to that problem is to use a simpler methodology--not to make demonstrably false assumptions that will underestimate the potential for inter-service interference.”
The NAB and the other joint broadcasters argue that the OET methodology would “inaccurately and artificially” reduce the predicted impact of interference from wireless operators to television broadcasting by incorrectly applying clutter factors. The methodology would add a 5 to 8 dB clutter loss to the interfering wireless signal, compounding the errors already introduced by assuming low power and low height and by ignoring multiple transmitters.
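To illustrate the arithmetic behind this objection, here is a minimal sketch (the protection threshold and field-strength values are illustrative assumptions, not figures from the filings) of how applying a clutter loss only to the undesired wireless signal can flip a grid cell from predicted interference to no predicted interference:

```python
def interference_predicted(desired_dbu, undesired_dbu, threshold_db, clutter_loss_db=0.0):
    """Return True if the D/U ratio falls below the protection threshold once the
    clutter loss is subtracted from the undesired (interfering) signal only."""
    du_db = desired_dbu - (undesired_dbu - clutter_loss_db)
    return du_db < threshold_db

# Illustrative values only: 50 dBuV/m desired DTV signal, 38 dBuV/m interfering
# LTE signal, 15 dB co-channel protection threshold.
print(interference_predicted(50.0, 38.0, 15.0))                      # True, D/U = 12 dB
print(interference_predicted(50.0, 38.0, 15.0, clutter_loss_db=6.0)) # False, D/U = 18 dB
```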
The group of broadcasters further noted: “To suggest, for example, that all of the trees in an area classified as ‘Forest Land,’ or all of the buildings in an area characterized as ‘Residential’ would be aligned so as to block the undesired interfering signals from multiple wireless base stations to viewers’ DTV antennas, but those same DTV antennas would be completely free of any obstruction to the desired DTV signals, is simply irrational. Clutter should not be considered on the interfering wireless signal to DTV viewers’ home reception.”
The NAB noted that OET staff indicated the use of clutter was based on OET Bulletin 73, which the Public Notice cites as a reference. The problem is that OET-73 was designed for point-to-point analysis to determine whether a specific location is served. Under the proposed interference methodology, the same clutter loss would be applied to interference over the entire 4-square-kilometer area of each cell in the 2 km grid OET proposes to use for determining interference to DTV reception. NAB provided a map showing that clutter can vary significantly even over short distances.
The analysis OET used to determine appropriate D/U ratios for calculating interference between LTE OFDM transmissions and DTV reception is likely to be wrong because it hasn’t been tested. While testing has been done on interference between NTSC and ATSC signals and between two ATSC signals, the NAB noted there is no substantial body of testing on the co-channel characteristics of LTE OFDM-based signals into ATSC receivers.
Depending on receiver design, some parts of the 8-VSB spectrum are more susceptible to interference, meaning the amount of interference won’t be proportional to the amount of overlap. For example, receivers relying on the pilot carrier will be more susceptible to interference in the lower one MHz of the channel.
In its comments, the NAB presented maps showing that the areas gained by using the FCC’s overly complex and inaccurate methodology are minuscule. In one map showing the difference between the OET and contour-based approaches, the NAB noted: “To the extent that certain ‘white spaces’ are present within the contour, these generally represent mountain ranges with little or no population to be served. They certainly do not represent areas where widespread wireless deployment is possible or would be anticipated.”
The NAB said: “These two examples suggest that a straightforward contour calculation does not produce meaningful inefficiencies as compared to the unduly complex and computationally intensive OET methodology. Contrary to OET’s assertions, a separation distance approach is not spectrally inefficient in any meaningful sense, and the proposed methodology will not produce significant additional spectrum.”
There is much more detail and other examples in the Comments of the NAB, ABC Television Affiliates Association, et al.
I don’t see how the OET can justify adoption of such a flawed methodology--one that gives precise but bogus results. I hope the policy makers and economists at the FCC don’t try to put the engineers at the OET, who understand the complexity of the analysis and know the limitations of the input data, in the uncomfortable position of justifying such a flawed methodology when there are alternatives--as the NAB indicates--that are much simpler. If the areas defined for auction are to be trusted by both broadcasters and bidders, they can’t be built on input data that doesn’t reflect the real world.
While the wireless industry supported the OET methodology, its support wasn’t without caveats--and that is after OET weighted the analysis in the industry’s favor, as NAB explained. Rather than waste more time and money trying to refine the proposed methodology to the point it can be trusted, if that is even possible, it would be better to put the effort into an alternative proposal that broadcasters and bidders can accept.
You can view all comments by visiting the FCC’s Search for Filings website and entering “14-14” as the proceeding number.
Doug Lung is one of America's foremost authorities on broadcast RF technology. As vice president of Broadcast Technology for NBCUniversal Local, H. Douglas Lung leads NBC and Telemundo-owned stations’ RF and transmission affairs, including microwave, radars, satellite uplinks, and FCC technical filings. Beginning his career in 1976 at KSCI in Los Angeles, Lung has nearly 50 years of experience in broadcast television engineering. Beginning in 1985, he led the engineering department for what was to become the Telemundo network and station group, assisting in the design, construction and installation of the company’s broadcast and cable facilities. Other projects include work on the launch of Hawaii’s first UHF TV station, the rollout and testing of the ATSC mobile-handheld standard, and software development related to the incentive auction TV spectrum repack. A longtime columnist for TV Technology, Doug is also a regular contributor to IEEE Broadcast Technology. He is the recipient of the 2023 NAB Television Engineering Award. He also received a Tech Leadership Award from TV Tech publisher Future plc in 2021 and is a member of the IEEE Broadcast Technology Society and the Society of Broadcast Engineers.