Numerical calculation of information rates and capacity of quadrature Gaussian mixture channels
Is part of
2016 IEEE Sixth International Conference on Communications and Electronics (ICCE), 2016, pp. 99-104
Place / Publisher
IEEE
Year of publication
2016
Source
IEEE Electronic Library Online
Descriptions/Notes
This paper presents novel methods to accurately calculate the information rates and capacity of quadrature Gaussian mixture (GM) noise channels without the need for time-consuming Monte Carlo simulations or numerical integrations. The focus is on three important input signals: i) a Gaussian input; ii) a complex input with discrete amplitude and independent uniform phase, which is a capacity-achieving input; and iii) finite-alphabet signaling schemes, such as practical quadrature amplitude modulation (QAM). To this end, a novel piecewise-linear curve fitting (PWLCF) method is first proposed to estimate the entropy of a complex GM random variable to any desired level of accuracy. The result can then be used to calculate the information rate when a Gaussian input is used. For a complex input with discrete amplitude and independent uniform phase, the output entropy is estimated in a similar manner, but using polar coordinates and a kernel function. When a finite-alphabet input is used, we exploit the Laguerre-Gauss quadrature formula for an effective calculation of the output entropy. Combining these output entropies with the noise entropy, we show that in all cases the information rates can be computed accurately.
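As a rough illustration of the quadrature-based entropy calculations described in the abstract (not the paper's PWLCF or Laguerre-Gauss procedures), the sketch below computes the information rate I(X;Y) = h(Y) - h(N) of a 16-QAM input over a two-component quadrature GM noise channel, replacing Monte Carlo averaging with a two-dimensional Gauss-Hermite quadrature per mixture component. The constellation, mixing probabilities, and component variances are invented for the example.

```python
# Sketch: information rate of 16-QAM over 2-D (quadrature) Gaussian-mixture noise,
# I(X;Y) = h(Y) - h(N), with both differential entropies evaluated by tensor-product
# Gauss-Hermite quadrature instead of Monte Carlo simulation.
import numpy as np
from numpy.polynomial.hermite import hermgauss


def gm_pdf(y, means, variances, weights):
    """Density of a circularly symmetric complex Gaussian mixture at points y."""
    dens = np.zeros(np.shape(y), dtype=float)
    for mu, var, w in zip(means, variances, weights):
        dens += w * np.exp(-np.abs(y - mu) ** 2 / var) / (np.pi * var)
    return dens


def gm_entropy(means, variances, weights, order=40):
    """Differential entropy (nats) of a complex GM via 2-D Gauss-Hermite quadrature."""
    t, w = hermgauss(order)            # nodes/weights for the weight function exp(-t^2)
    tr, ti = np.meshgrid(t, t)         # tensor grid over real and imaginary parts
    ww = np.outer(w, w)
    h = 0.0
    for mu, var, wk in zip(means, variances, weights):
        # Quadrature points where mixture component k places its probability mass.
        pts = mu + np.sqrt(var) * (tr + 1j * ti)
        h -= wk / np.pi * np.sum(ww * np.log(gm_pdf(pts, means, variances, weights)))
    return h


# Hypothetical two-component GM noise (e.g. background plus impulsive term).
eps = np.array([0.9, 0.1])     # mixing probabilities
sig2 = np.array([0.1, 1.0])    # per-component variances

# 16-QAM constellation, normalised to unit average energy.
levels = np.array([-3.0, -1.0, 1.0, 3.0])
qam = (levels[:, None] + 1j * levels[None, :]).ravel()
qam /= np.sqrt(np.mean(np.abs(qam) ** 2))

# Output Y = X + N is itself a GM with 16 * 2 components, X equiprobable.
M = qam.size
out_means = np.repeat(qam, eps.size)
out_vars = np.tile(sig2, M)
out_weights = np.tile(eps, M) / M

h_N = gm_entropy(np.zeros(eps.size), sig2, eps)
h_Y = gm_entropy(out_means, out_vars, out_weights)
print("Information rate I(X;Y) ~ %.4f bits/channel use" % ((h_Y - h_N) / np.log(2)))
```

The key point mirrored from the abstract is that the output of a finite-alphabet input over GM noise is again a Gaussian mixture, so its entropy reduces to weighted expectations of log p_Y under each component, which a fixed-node quadrature evaluates deterministically and far faster than sampling.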