Effects of Radar Interference on LTE (FDD) eNodeB and UE Receiver Performance in the 3.5 GHz Band

Report ID: NTIA Technical Report TR-14-506
July 01, 2014
Geoffrey A. Sanders, John E. Carroll, Frank H. Sanders, Robert L. Sole

Abstract: In response to proposals to introduce new radio systems into the 3.5 GHz radio spectrum in the United States, the authors have performed measurements and analysis of the effects of interference from a variety of radar waveforms on the performance of a prototype 3.5 GHz Long Term Evolution (LTE) network, consisting of one base station (an eNodeB) and one client (referred to as user equipment, or UE) utilizing frequency-division duplexing (FDD). This work has been prompted by the possibility that LTE receivers may eventually share spectrum with radar operations in this spectrum range. The radar pulse parameters used in this testing spanned the range of both existing and anticipated future radar systems in the 3.5 GHz spectrum range. Effects of radar interference on LTE uplink and downlink throughput, block error rate (BLER), and modulation and coding scheme (MCS) were measured. Additionally, for the uplink tests, resource block (RB) usage and UE transmit power were recorded. Effects on LTE performance are presented as a function of radar pulse parameters and the incident power level of radar pulses into the LTE receivers. The authors do not determine an interference protection criterion for LTE networks. Rather, the data presented can be used by spectrum managers and engineers as a building block in the construction of band-sharing criteria for radar transmitters and LTE receivers, supporting possible future spectrum sharing at 3.5 GHz.

Keywords: Block error rate (BLER); chirped pulses; evolved Node B (eNB); frequency-division duplexing (FDD); Long Term Evolution (LTE); P0N pulses; Q3N pulses; radar; spectrum sharing

The full report is available at: