Multi-Way Compression for Channel Neural Decoding with Quantization
Is part of
2023 9th International Conference on Computer and Communications (ICCC), 2023, pp. 968-973
Place / Publisher
IEEE
Year of publication
2023
Source
IEEE Electronic Library Online
Descriptions/Notes
The performance of model-driven channel neural decoding has surpassed that of traditional channel decoding algorithms, but at the cost of higher complexity, which makes it difficult to implement on resource-constrained communication hardware. In this paper, we propose a quantization scheme for model-driven channel neural decoding and combine it with TR decomposition and weight-sharing algorithms to form different types of multi-way compression methods. Experimental results on LDPC, BCH, and Hamming codes show that the proposed quantization and multi-way compression methods effectively reduce the complexity of channel neural decoding without significant performance degradation.
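The abstract does not specify the details of the quantization scheme, so the following is only a minimal illustrative sketch of the general idea of weight quantization for a neural decoder: weights are stored as low-bit integers plus a single scale factor, which reduces memory and arithmetic cost. All function names, the bit width, and the matrix size are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: generic uniform weight quantization of the kind
# commonly applied to neural decoder weights. Not the paper's exact scheme.
import numpy as np

def quantize_weights(w: np.ndarray, n_bits: int = 4):
    """Uniformly quantize weights to signed n_bits integer levels."""
    q_max = 2 ** (n_bits - 1) - 1              # e.g. 7 for 4-bit signed
    scale = np.max(np.abs(w)) / q_max          # map the largest weight to q_max
    q = np.clip(np.round(w / scale), -q_max, q_max).astype(np.int8)
    return q, scale                            # integers plus one scale factor

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate floating-point weight matrix."""
    return q.astype(np.float32) * scale

# Toy usage: quantize a random (hypothetical) decoder weight matrix
# and check the resulting approximation error.
w = np.random.randn(64, 64).astype(np.float32)
q, s = quantize_weights(w, n_bits=4)
w_hat = dequantize(q, s)
print("mean abs error:", float(np.mean(np.abs(w - w_hat))))
```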