### Abstract

Neural network decoding algorithms were recently introduced by Nachmani et al. to decode high-density parity-check (HDPC) codes. In contrast to iterative decoding algorithms such as the sum-product or min-sum algorithms, in which the weight of each edge is set to 1, in neural network decoding algorithms the weight of every edge depends on its impact on the transmitted codeword. In this paper, we provide a novel feed-forward neural network lattice decoding algorithm suitable for decoding lattices obtained from Construction A whose underlying codes have HDPC matrices. We first establish the concept of a feed-forward neural network for HDPC codes and improve on the decoding algorithms of Nachmani et al. We then apply our proposed decoder to a Construction A lattice with an HDPC underlying code, for which the well-known iterative decoding algorithms show poor performance. The main advantage of our proposed algorithm is that, instead of assigning and training weights for all edges, which turns out to be time-consuming especially for high-density parity-check matrices, we concentrate on the edges that appear in most 4-cycles and whose removal yields a girth-6 Tanner graph. This approach, with the slight modification of using updated LLRs instead of the initial ones, simultaneously accelerates the training process and improves the error performance of our proposed decoding algorithm.
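The abstract's key idea is to single out the Tanner-graph edges that lie on the most 4-cycles. A minimal sketch of that counting step is shown below; the toy parity-check matrix `H` and the function name `four_cycle_counts` are illustrative assumptions, not taken from the paper. In a Tanner graph, a 4-cycle through edge (check `i`, variable `j`) corresponds to a second check `i2` and variable `j2` with ones at all four intersections of rows `i, i2` and columns `j, j2` of `H`.

```python
import numpy as np

def four_cycle_counts(H):
    """For each edge (i, j) with H[i, j] == 1, count the 4-cycles of the
    Tanner graph passing through it: pairs (i2, j2), i2 != i, j2 != j,
    with H[i, j2] == H[i2, j] == H[i2, j2] == 1."""
    m, n = H.shape
    counts = {}
    for i in range(m):
        for j in range(n):
            if H[i, j]:
                c = 0
                for i2 in range(m):
                    if i2 == i or not H[i2, j]:
                        continue
                    for j2 in range(n):
                        if j2 != j and H[i, j2] and H[i2, j2]:
                            c += 1
                counts[(i, j)] = c
    return counts

# Toy dense parity-check matrix (illustrative only, not from the paper):
H = np.array([[1, 1, 1, 0],
              [1, 1, 0, 1],
              [0, 1, 1, 1]])
counts = four_cycle_counts(H)
worst = max(counts, key=counts.get)  # an edge lying on the most 4-cycles
```

Edges with the largest counts would be the natural candidates for the removal step described above; each 4-cycle contributes to exactly four edge counts.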

Original language | English |
---|---|

Title of host publication | 2018 IEEE Information Theory Workshop (ITW) |

Editors | Qin Huang, Shenghao Yang |

Place of Publication | Piscataway NJ USA |

Publisher | IEEE, Institute of Electrical and Electronics Engineers |

Number of pages | 5 |

ISBN (Electronic) | 9781538635995, 9781538635988 |

ISBN (Print) | 9781538636008 |

DOIs | https://doi.org/10.1109/ITW.2018.8613440 |

Publication status | Published - 2018 |

Event | Information Theory Workshop 2018 - Guangzhou, China Duration: 25 Nov 2018 → 29 Nov 2018 http://www.itw2018.org/ |

### Conference

Conference | Information Theory Workshop 2018 |
---|---|

Abbreviated title | ITW 2018 |

Country | China |

City | Guangzhou |

Period | 25/11/18 → 29/11/18 |

Internet address | http://www.itw2018.org/

### Keywords

- Deep learning
- Lattices
- Tanner graph
- Trellis graph

### Cite this


**A neural network lattice decoding algorithm.** / Sadeghi, Mohammad-Reza; Amirzade, Farzane; Panario, Daniel; Sakzad, Amin.

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review
