Abstract
This paper evaluates the performance of a cell-specific reference signal (RS)-assisted residual carrier-frequency offset (CFO) detection scheme for the long-term evolution (LTE) downlink in a time-varying channel. Channel variations severely degrade the performance of orthogonal frequency-division multiplexing (OFDM) by introducing both a complicated multiplicative distortion and additive inter-carrier interference. Taking the effect of time-variant fading channels into account, an analytical closed-form expression for the mean square error (MSE) of the post-FFT CFO detection scheme is derived. The analytical expression is verified through simulations using the parameters of the LTE standard.
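To make the idea concrete, the following sketch illustrates a generic RS-assisted post-FFT residual CFO estimator of the kind the abstract describes: the receiver correlates two FFT-output observations of the same known reference symbols, separated by a few OFDM symbols, and reads the residual CFO from the phase of the correlation sum. All parameters (FFT size, cyclic-prefix length, RS spacing, SNR) are assumed LTE-like values, and the channel is kept quasi-static for simplicity; this is not the paper's exact scheme or its MSE derivation.

```python
import numpy as np

# Hypothetical sketch of RS-assisted post-FFT residual CFO estimation.
rng = np.random.default_rng(0)
N, Ng = 2048, 144      # FFT size and cyclic-prefix length (LTE-like, assumed)
D = 7                  # OFDM-symbol spacing between the two RS-bearing symbols
eps = 0.02             # true residual CFO, normalized to the subcarrier spacing
K = 100                # number of RS subcarriers used (assumed)

# QPSK cell-specific reference symbols, known to the receiver.
rs = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=K) / np.sqrt(2)

# Quasi-static Rayleigh fading per RS subcarrier (the paper treats the harder
# time-varying case; a static channel keeps this sketch minimal).
h = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)

# Residual CFO rotates every subcarrier by a common phase per OFDM symbol,
# accumulating over the D-symbol gap (including the cyclic prefix).
phase = 2 * np.pi * eps * (N + Ng) / N * D

snr_lin = 10 ** (20 / 10)                 # 20 dB SNR (assumed)
sigma = np.sqrt(1 / (2 * snr_lin))
n1 = sigma * (rng.standard_normal(K) + 1j * rng.standard_normal(K))
n2 = sigma * (rng.standard_normal(K) + 1j * rng.standard_normal(K))

Y1 = h * rs + n1                          # RS symbol l after the FFT
Y2 = h * rs * np.exp(1j * phase) + n2     # RS symbol l + D after the FFT

# Correlate the two RS observations; the angle of the sum carries the CFO,
# and the known RS and channel cancel in the product Y2 * conj(Y1).
z = np.sum(Y2 * np.conj(Y1))
eps_hat = np.angle(z) * N / (2 * np.pi * (N + Ng) * D)
print(eps_hat)
```

Note that this estimator is unambiguous only while the accumulated phase stays within (-pi, pi), which bounds the detectable residual CFO by the symbol spacing D; the values above stay well inside that range.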
Acknowledgments
This work was supported by the Seoul R&BD Program (SS100009) and by the Materials & Components Development Program funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea).