Neural successive cancellation list decoding of polar codes

Negrini S.; Tralli V.
2020

Abstract

Neural Network Decoders (NNDs) have recently been investigated as an alternative to classical algorithms for decoding Polar Codes. In particular, a class of partitioned decoders has been proposed in which modified Successive Cancellation (SC) or Belief Propagation (BP) algorithms exploit neural network decoding of short sub-blocks. Although NNDs work as one-shot decoders with small decoding latency, their performance still suffers some loss with respect to classic approaches, which limits their application to codes of short/medium length. In this article we introduce the novel Neural Successive Cancellation List (NSCL) decoder, a partitioned version of the SC decoder operating on a list of L decoding candidates built from the decoded sub-blocks produced by a set of NNDs working in parallel at each partition. This decoder significantly improves the BER performance of the non-list version for the same code length. Hence, it reduces the intrinsic performance loss introduced by the use of NNDs and allows NN-based decoding to be applied to longer codewords.
9781728144900
Polar Codes; Neural Network Decoding; Successive Cancellation List Decoding; Successive Cancellation Decoding
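The abstract describes the NSCL decoder only at a high level. Below is a minimal, illustrative Python sketch of the list-decoding idea it outlines: at each partition, a (here placeholder) NND produces scored candidate sub-blocks that extend the surviving paths, and the list is pruned back to L candidates. This is not the authors' implementation; the function nnd_decode_subblock, the scoring rule, and all parameters are assumptions made for illustration.

import itertools
import random

L = 4               # list size (assumed)
NUM_PARTITIONS = 4  # number of sub-blocks handled by parallel NNDs (assumed)
SUBBLOCK_BITS = 2   # bits recovered per sub-block (assumed)

def nnd_decode_subblock(llrs, partition_idx):
    """Placeholder for a trained neural network decoder of one sub-block.
    Returns every candidate bit pattern with a toy score (agreement of the
    hard decisions with the received LLRs)."""
    candidates = []
    for bits in itertools.product([0, 1], repeat=SUBBLOCK_BITS):
        score = sum(l if b == 0 else -l for l, b in zip(llrs, bits))
        candidates.append((list(bits), score))
    return candidates

def nscl_decode(channel_llrs):
    """List decoding over partitions: each surviving path is extended by the
    NND candidates of the next sub-block, then the list is pruned to L paths."""
    paths = [([], 0.0)]  # (decoded bits so far, accumulated path metric)
    for p in range(NUM_PARTITIONS):
        llrs = channel_llrs[p * SUBBLOCK_BITS:(p + 1) * SUBBLOCK_BITS]
        extended = []
        for bits, metric in paths:
            for cand_bits, score in nnd_decode_subblock(llrs, p):
                extended.append((bits + cand_bits, metric + score))
        # keep the L best candidates (largest accumulated metric)
        extended.sort(key=lambda t: t[1], reverse=True)
        paths = extended[:L]
    return paths[0][0]  # best surviving path

if __name__ == "__main__":
    random.seed(0)
    # toy received LLRs for an all-zero codeword over a noisy channel
    llrs = [1.0 + random.gauss(0, 0.8) for _ in range(NUM_PARTITIONS * SUBBLOCK_BITS)]
    print("decoded bits:", nscl_decode(llrs))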
Files in this product:
negrini2020.pdf — Description: publisher's full text; Type: full text (publisher's version); License: NON-PUBLIC - private/restricted access; Size: 176.5 kB; Format: Adobe PDF.
Documents in SFERA are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11392/2437732
Citations
  • PMC: not available
  • Scopus: 1
  • Web of Science: 0