Automatika
Journal for Control, Measurement, Electronics, Computing and Communications
Volume 63, 2022 - Issue 4
Regular Papers

Inter-node compression with LDPC joint source–channel coding for highly correlated sources

Pages 779-784 | Received 08 Jul 2021, Accepted 27 May 2022, Published online: 13 Jun 2022

Abstract

This paper investigates a new communication system in which two nodes disseminate highly correlated content to a single destination, a setting applicable to densely deployed wireless sensor network applications. Motivated by their capacity-achieving performance and existing practical implementations, the proposed communication scheme is fully based on Low-Density Parity-Check (LDPC) codes for both data compression and channel coding. More specifically, we consider a network of two correlated binary sources with two orthogonal communication phases. Data are encoded at the first source with an LDPC channel code and broadcast in the first phase. Based on the data received from the first source, the second source computes the correlation vector and applies a Joint Source–Channel (JSC) LDPC code, whose output is communicated in the second phase. At the receiver, the whole network is mapped onto a joint factor graph over which an iterative message-passing joint decoder is proposed. The aim of the joint decoder is to exploit the residual correlation between the sources for better estimation. Simulation results are investigated and compared with the theoretical limits and with an LDPC-based distributed coding system in which no inter-node compression is applied.

1. Introduction

In densely deployed wireless sensor or Internet of Things networks, several nodes disseminate physical information to a central node for decision-making. Due to the redundant nature of the collected data, the information from these nodes is generally correlated in space (different camera views, temperature or pressure samples from different sensors, etc.). In several cases, many nodes measure the same target phenomenon; hence, the correlation of the data is very high, which wastes transmission bandwidth and can cause uplink congestion.

In the information and coding theory community, the problem of sending correlated data from non-communicating sources to a unique destination is called distributed source coding, where the uncompressed correlation is exploited at the destination. The information-theoretical lossless compression bounds for distributed source coding of two correlated sources are given by the Slepian-Wolf (SW) theorem [Citation1]. The latter states that two independent and identically distributed (i.i.d.) sources can be jointly compressed down to a total source rate $R_s = H(U_1,U_2)$, where $H(U_1,U_2)$ is the joint entropy of the sources $U_1$ and $U_2$ [Citation1]. This result is combined with the well-known Shannon theorem for the case of the transmission of correlated sources over noisy channels [Citation2].

In this framework, several studies have proposed practical joint source–channel coding and decoding techniques for the transmission of correlated sources [Citation3–6]. In Ref. [Citation3], the authors proposed Turbo-based JSC decoding that exploits the inter-source dependence to perform channel error control over Additive White Gaussian Noise (AWGN) channels. A similar system was investigated in [Citation4] to transmit correlated sources over fading multiple-access channels with three different source coding strategies. In addition, the authors in Ref. [Citation5] proposed a hybrid digital/analog coding scheme to send correlated sources over discrete memoryless two-way channels. Other contributions used channel codes to exploit the inter-source correlation in the framework of distributed source coding [Citation7–11]. LDPC-based iterative joint decoding schemes were proposed in [Citation7, Citation9, Citation10]; these contributions rely on two types of iterations, named local and global iterations. In the local iterations, the LDPC decoding process runs the Sum-Product (SP) algorithm independently for the two sources, while the sources' correlation is estimated during the global iterations. It was also demonstrated in [Citation12] that Turbo codes could reach the SW limit for distributed lossless source encoding.

All the presented contributions assume no inter-node communication. Hence, all the correlation is transmitted and then exploited by adequate JSC decoding. However, in wireless communications with orthogonal signalling, when the first source transmits its data, the latter is naturally broadcast and received by the second source, so the correlation can be evaluated and compressed before transmission in the second phase. This remark motivates the current work where, unlike the described solutions, inter-node compression is applied based on source LDPC codes, and JSC decoding is used to exploit the residual correlation that remains after compression.

The proposed system aims to transmit two correlated binary sources to a destination over two independent AWGN channels. At the first source, an LDPC channel code is used. At the second source, based on the data received from the first source, we apply JSC LDPC [Citation13] coding to compress and protect the correlation vector. The main contributions of this work are as follows. First, an inter-node compression coding system based on JSC LDPC codes is introduced to transmit correlated binary sources. Second, a global network Tanner graph is presented and used to apply a message-passing iterative joint decoder that exploits the residual correlation of the two sources after compression. Simulation results are investigated and compared with the Shannon-SW limits and with an equivalent LDPC-based system with distributed source coding.

This paper is organized as follows. Section 2 gives a detailed overview of the system model. Section 3 describes the message-transfer decoding process. The system efficiency is investigated through simulations in Section 4. Finally, Section 5 concludes the paper.

2. The proposed communication system with inter-node compression based on JSC LDPC codes

As mentioned, we study a joint coding system composed of two memoryless binary sources $S_1, S_2$ and one single destination. Each source communicates its own data to the destination through an independent AWGN channel, based on LDPC codes. As shown in Figure 1, the source $S_1$ generates a binary i.i.d. sequence $b_1$. We assume that the sequence $b_1$ is $n_1$ bits long and is encoded by an LDPC channel code providing a codeword $c_1$ such that
(1) $c_1 = G_{cc1}^{T} \times b_1$
where the generator matrix $G_{cc1}$ has size $(n_1 \times m_1)$. The encoded bit-stream $c_1$ is then broadcast over the AWGN channel using BPSK modulation. We denote by $R_{cc1} = n_1/m_1$ the first source's LDPC channel coding rate. The second source $S_2$ generates binary data $b_2$, which is assumed to be correlated with $b_1$ through a crossover probability $p$. In other words, $b_2 = b_1 \oplus z$, where $z$ is a random variable with probabilities $\Pr(z=1) = p$ and $\Pr(z=0) = 1-p$. The probability $(1-p)$, also denoted by $\rho$, is defined as the correlation ratio between the two sources. The smaller $p$ is, the more correlated the sources are.
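To make the source model concrete, the following Python sketch (illustrative only; all function names and parameters are our own and do not come from the paper) draws a pair of correlated binary sequences with crossover probability p and shows the BPSK-over-AWGN mapping assumed for transmission.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_sources(n1, p, rng):
    """Draw b1 ~ Bernoulli(1/2) i.i.d. and b2 = b1 XOR z with Pr(z = 1) = p."""
    b1 = rng.integers(0, 2, n1)
    z = (rng.random(n1) < p).astype(int)   # crossover noise vector z
    return b1, b1 ^ z

def bpsk_awgn(bits, sigma_n, rng):
    """Map {0,1} -> {+1,-1} and add real Gaussian noise of standard deviation sigma_n."""
    x = 1 - 2 * bits
    return x + sigma_n * rng.normal(size=bits.size)

b1, b2 = correlated_sources(1800, 0.01, rng)   # hypothetical parameters matching Section 4
```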

Figure 1. System model for inter-node correlation compression using LDPC codes.

At the second source, a noisy version of $b_1$ is received after the broadcast transmission phase of $S_1$. This received version is then decoded at the second source to reconstruct an estimate of $b_1$, denoted $\hat{b}_1$. The latter is obtained by BP iterative decoding over the Tanner graph of LDPC channel code 1, constructed from the $((m_1 - n_1) \times m_1)$ parity-check matrix $H_{cc1}$. We then construct the correlation vector $e = \hat{b}_1 \oplus b_2$, the XOR of the vectors $\hat{b}_1$ and $b_2$. The obtained vector $e$ is highly redundant, which motivates the use of an LDPC source encoder [Citation13] with an $(l \times n_2)$ parity-check matrix $H_{sc}$, providing a compressed sequence
(2) $d = H_{sc} \times e$
The compressed sequence $d$ is then encoded by a second LDPC channel code using the generator matrix $G_{cc2}$ of dimensions $(l \times m_2)$:
(3) $c_2 = G_{cc2}^{T} \times d$

The codeword $c_2$ is BPSK-modulated and transmitted to the destination over an AWGN channel. $R_{sc} = l/n_2$ denotes the source coding rate of $S_2$ and $R_{cc2} = l/m_2$ is the corresponding channel coding rate. Since we consider orthogonal signalling, the overall rate of the proposed system is
$R = \dfrac{2\,R_{cc1} R_{cc2}}{R_{cc2} + R_{cc1} R_{sc}}$
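As an illustrative sketch of the encoding chain at $S_2$ and of the overall rate, the following Python fragment (hypothetical H_sc and G_cc2, assumed here to be 0/1 numpy arrays) applies Equations (2) and (3) over GF(2) and evaluates the rate expression above.

```python
import numpy as np

def encode_source2(b1_hat, b2, H_sc, G_cc2):
    """Source-2 encoding sketch: XOR correlation vector, LDPC source compression (Eq. 2),
    then LDPC channel encoding (Eq. 3). All arithmetic is over GF(2)."""
    e = b1_hat ^ b2                 # correlation vector, length n2
    d = H_sc.dot(e) % 2             # compressed sequence, length l   (d = H_sc * e)
    c2 = G_cc2.T.dot(d) % 2         # channel codeword, length m2     (c2 = G_cc2^T * d)
    return e, d, c2

def overall_rate(R_cc1, R_cc2, R_sc):
    """Overall system rate R = 2*R_cc1*R_cc2 / (R_cc2 + R_cc1*R_sc)."""
    return 2 * R_cc1 * R_cc2 / (R_cc2 + R_cc1 * R_sc)

print(overall_rate(0.5, 0.5, 0.5))  # 0.666..., i.e. the R = 2/3 used in Section 4
```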

3. The joint network-mapped iterative decoder

The aim of the joint decoder, applied at the receiver, is to generate the best estimates of the original sources' data through error-correcting LDPC decoding, while exploiting the residual correlation between the two sources that remains after the LDPC correlation compression.

Based on the factor graph representations of the source and channel LDPC codes, the network joint decoder of the correlated sources can be described by two connected elementary decoders, as shown in Figure 2. The global graph components are the channel decoder graph of source 1 and the joint source–channel decoder graph of source 2. The decoding process uses message transfer based on the Belief Propagation (BP) algorithm and is composed of two types of iterations: local iterations $l$ and global iterations $g$. First, local processing is applied, in which the LDPC channel decoder 1 of source 1 and the joint source–channel decoder 2 of source 2 run the BP algorithm up to a maximum number of local iterations, denoted by $l_{max}$. After these $l_{max}$ iterations, the outputs of the elementary decoders provide the estimates of $b_1$ and $e$, denoted respectively by $\hat{b}_1$ and $\hat{e}$, which are inherently related to the second source's original sequence $b_2$. In a second decoding stage, we propose to compute an estimate of the sources' correlation using global iterations, enhancing the performance of the joint decoders by providing extra a priori information. The latter is provided by the correlation update process depicted in Figure 3, where the vector $\hat{b}_2$ is estimated as
(4) $\hat{b}_2 = \hat{b}_1 \oplus \hat{e}$
As shown in Figure 3, the correlation estimate, applied by updating the log-likelihood ratios (LLRs) during the global iterations, provides the messages $L_{est}^{(g)}(\hat{b}_1)$ and $L_{est}^{(g)}(\hat{e})$, passed respectively to channel decoder graph 1 and to the joint source–channel LDPC decoder 2 for further local iterations. We note that at the first global iteration $g = 0$, $L_{est}^{(g)}(\hat{b}_1)$ and $L_{est}^{(g)}(\hat{e})$ are initialized to zero; for the next iterations, they are appended to the systematic variable nodes of each corresponding decoder. The message transfer of the joint global decoder is described in more detail below for a given $l$-th local and $g$-th global iteration.
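The two-stage iteration schedule can be summarized by the following Python skeleton. This is a sketch under our own assumptions: dec1 and dec2 stand for BP decoder objects that are not defined in the paper, and the helpers b2_scalar_llr and correlation_llr correspond to Equations (6), (7) and (11), sketched further below.

```python
import numpy as np

def joint_decode(llr_ch1, llr_ch2, dec1, dec2, G_max, l_max):
    """Skeleton of the joint decoder: local BP iterations in each elementary
    decoder, then a correlation update (Eq. 4) that feeds extra a priori LLRs
    back to both graphs for the next global iteration."""
    L_est_b1 = 0.0                 # a priori LLRs, initialized to zero at g = 0
    L_est_e = 0.0
    for g in range(G_max):
        # local iterations on each elementary factor graph (dec1/dec2 are
        # assumed BP decoders returning a posteriori LLR vectors)
        L_b1 = dec1.run(llr_ch1, prior=L_est_b1, iters=l_max)   # channel decoder 1
        L_e = dec2.run(llr_ch2, prior=L_est_e, iters=l_max)     # JSC decoder 2
        # correlation update: hard-decide and combine, Eq. (4)
        b2_hat = (L_b1 < 0).astype(int) ^ (L_e < 0).astype(int)
        L_b2 = b2_scalar_llr(b2_hat)                            # Eq. (7), see below
        L_est_b1 = correlation_llr(L_b2, L_e)                   # Eq. (6)
        L_est_e = correlation_llr(L_b2, L_b1)                   # Eq. (11)
    return L_b1, L_e
```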

Figure 2. The proposed joint decoder factor graph for S1 and S2 sources reconstruction.

Figure 3. The global iteration joint decoder block diagram with correlation exploitation and LLRs updating.

For the elementary LDPC channel decoder 1, the message sent from a variable node $v = 1, \ldots, m_1$ to a check node $c$ is
(5) $m_{v,c}^{cc1,(l)} = Z_v^{cc1} + \sum_{c' \neq c} m_{c',v}^{cc1,(l-1)} + L_{est}^{(g-1)}(\hat{b}_1)$
where $Z_v^{cc1} = 2 r_v / \sigma_n^2$ are the channel observation LLRs used to initialize the variable nodes, and $L_{est}^{(g-1)}(\hat{b}_1)$ are the LLRs corresponding to the estimates of the components of $\hat{b}_1$ at the previous global iteration. The expression of $L_{est}^{(g)}(\hat{b}_1)$ is
(6) $L_{est}^{(g)}(\hat{b}_1) = 2\tanh^{-1}\!\left(\tanh\!\left(\dfrac{L^{(g)}(\hat{b}_2)}{2}\right) \times \tanh\!\left(\dfrac{L^{(g)}(\hat{e})}{2}\right)\right)$
where $L^{(g)}(\hat{b}_2)$ are the LLRs corresponding to the components of $b_2$, estimated as [Citation9]
(7) $L^{(g)}(\hat{b}_2) = \log\!\left(\dfrac{k - w_H}{w_H}\right)$
with $k$ and $w_H$ being respectively the length and the Hamming weight of the sequence $\hat{b}_2$.
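A minimal numerical rendering of Equations (6) and (7) is given below (a sketch; the function names are ours, and the clipping constant is only there to keep arctanh finite).

```python
import numpy as np

def b2_scalar_llr(b2_hat):
    """Eq. (7): scalar LLR from the empirical statistics of b2_hat,
    L = log((k - w_H) / w_H) with k the length and w_H the Hamming weight."""
    k = b2_hat.size
    w = min(max(int(b2_hat.sum()), 1), k - 1)   # guard against w_H = 0 or k
    return np.log((k - w) / w)

def correlation_llr(L_a, L_b):
    """Tanh rule used in Eqs. (6) and (11):
    L_est = 2 * atanh( tanh(L_a / 2) * tanh(L_b / 2) )."""
    prod = np.tanh(L_a / 2.0) * np.tanh(L_b / 2.0)
    return 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
```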

The messages $L^{(g)}(\hat{e})$ are computed after $l_{max}$ local iterations on the second source's joint graph as
(8) $L^{(g)}(\hat{e}) = Z_v^{sc} + \sum_{c} m_{c,v}^{sc,(l_{max}-1)}$
The messages delivered by the check nodes $c = 1, \ldots, n_1$ of channel decoder 1 to the connected variable nodes are evaluated as
(9) $m_{c,v}^{cc1,(l)} = 2\tanh^{-1}\!\left(\prod_{v' \neq v} \tanh\!\left(\dfrac{m_{v',c}^{cc1,(l)}}{2}\right)\right)$
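Equation (9) is the standard sum-product check-node rule; a direct, non-optimized Python transcription could look as follows, with m_vc holding all incoming variable-to-check messages of one check node (an illustrative sketch, not the authors' implementation).

```python
import numpy as np

def check_node_update(m_vc):
    """Eq. (9): message from one check node to each connected variable node v,
    using the product of tanh terms over all other variable nodes v' != v."""
    m_vc = np.asarray(m_vc, dtype=float)
    t = np.tanh(m_vc / 2.0)
    out = np.empty_like(m_vc)
    for i in range(m_vc.size):
        prod = np.prod(np.delete(t, i))                     # product over v' != v
        out[i] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
    return out
```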

The message-transfer process of the joint source–channel LDPC decoder of the second source behaves in the same way [Citation13]. However, in the proposed system, we append the correlation update messages to the variable nodes of the LDPC source decoding part. Hence, after a fixed number of local iterations, the updated LLR message from a variable node of the LDPC source decoder is calculated as
(10) $m_{v,c}^{sc,(l)} = Z_v^{sc} + \sum_{c' \neq c} m_{c',v}^{sc,(l-1)} + L_{est}^{(g-1)}(\hat{e})$
where $L_{est}^{(g-1)}(\hat{e})$ are the LLRs associated with the vector $\hat{e}$. The LLRs $L_{est}^{(g)}(\hat{e})$ are evaluated as
(11) $L_{est}^{(g)}(\hat{e}) = 2\tanh^{-1}\!\left(\tanh\!\left(\dfrac{L^{(g)}(\hat{b}_2)}{2}\right) \times \tanh\!\left(\dfrac{L^{(g)}(\hat{b}_1)}{2}\right)\right)$
and $L^{(g)}(\hat{b}_1)$ is given by
(12) $L^{(g)}(\hat{b}_1) = Z_v^{cc1} + \sum_{c} m_{c,v}^{cc1,(l_{max}-1)}$
After a fixed number of global iterations of the joint global decoder, the sequence $\hat{b}_1$ is estimated from the following a posteriori LLR:
(13) $L(\hat{b}_1) = Z_v^{cc1} + \sum_{c} m_{c,v}^{cc1,(l_{max})} + L_{est}^{(g)}(\hat{b}_1) \quad \text{for } v = 1, \ldots, n_1$

Similarly, the information vector $\hat{b}_2$ is estimated from the correlation vector, whose LLRs are calculated as
(14) $L(\hat{e}) = Z_v^{sc} + \sum_{c} m_{c,v}^{sc,(l_{max})} + L_{est}^{(g)}(\hat{e}) \quad \text{for } v = 1, \ldots, n_2$
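Equations (5), (10), (13) and (14) share the same structure: channel LLR plus incoming check-to-variable messages plus the correlation a priori term. A compact sketch (helper names are ours) is given below.

```python
import numpy as np

def variable_to_check(Z_v, m_cv, L_est_v):
    """Eqs. (5)/(10): outgoing variable-to-check messages of one variable node,
    excluding each check node's own incoming message (extrinsic principle).
    m_cv holds the incoming check-to-variable messages of that node."""
    m_cv = np.asarray(m_cv, dtype=float)
    total = Z_v + m_cv.sum() + L_est_v
    return total - m_cv

def a_posteriori_llr(Z_v, m_cv, L_est_v):
    """Eqs. (13)/(14): final a posteriori LLR after the last local iteration;
    the hard decision is bit = 1 when the LLR is negative."""
    return Z_v + np.asarray(m_cv, dtype=float).sum() + L_est_v
```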

4. Simulation results

In this section, we investigate the performance of the proposed inter-node compression scheme with JSC LDPC codes. Also, we compare the performance of our proposed system to a distributed coding system [Citation11] where no communication is allowed between the two sources, and the correlation is fully exploited at the decoder (not compressed). The distributed coding system is composed of two binary sources, where data are encoded independently by LDPC channel codes and then sent to the destination through AWGN channels.

We consider a binary i.i.d. sequence with information length $n_1 = 1800$ bits and a regular LDPC channel code of rate $R_{cc1} = 1/2$ with degrees $(d_v = 3, d_c = 6)$ at source $S_1$. At the second source, we assume $n_2 = 1800$ bits and apply regular JSC source and channel LDPC codes with respective rates $R_{sc} = 1/2$ and $R_{cc2} = 1/2$ and (3, 6) degrees. The overall rate of the system is $R = 2/3$. At the decoder, we apply the described message-passing algorithm with a maximum number of local iterations $l_{max} = 100$ and different numbers of global iterations, denoted by GI. We also assume that the two sources are at the same distance from the destination and consequently have the same signal-to-noise ratio. However, we suppose that $S_1$ is very close to $S_2$, so that the inter-node link suffers no communication errors ($\hat{b}_1 = b_1$). The case where the wireless link between the sources is noisy, with a bit error probability $p_e$, was studied in [Citation14]; it can be handled by applying an LLR updating function [Citation14] and is omitted in this paper. For the reference distributed system, we consider channel coding rates $R_{cc1} = 2/3$ and $R_{cc2} = 2/3$ with constant degrees (3, 9), which gives a global rate equal to 2/3.

In Figure 4, we plot the BER performance as a function of $E_{so}/N_0$ for the inter-node compression system (system 1) and the distributed coding system (system 2). The crossover probability $p$ between the two sources used in this experiment is equal to 0.01. In our investigations, to analyse the efficiency of the proposed joint decoder, we also consider the theoretical SW/Shannon bound proposed in [Citation15], where $E_{so}/N_0$ should be greater than
(15) $\left.\dfrac{E_{so}}{N_0}\right|_{\lim} = \dfrac{2^{H(S_1,S_2)\,R} - 1}{2R}$
where $E_{so}$ is the energy per source bit, $N_0 = 2\sigma_n^2$ is the noise power spectral density, $H(S_1,S_2)$ is the sources' joint entropy, and $R$ is the overall coding rate. The joint entropy is given by $H(S_1,S_2) = H(S_1) + h(p)$ and depends on the first source's entropy $H(S_1)$ (equal to 1 here) and on the crossover probability $p$.
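For reference, the limit in Equation (15) together with the joint entropy $H(S_1,S_2) = H(S_1) + h(p)$ can be evaluated with the short Python helper below. This is a minimal sketch assuming the grouping $(2^{H(S_1,S_2)R} - 1)/(2R)$ for Equation (15); the function names are ours.

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy h(p) in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def eso_n0_limit_db(p, R, H_s1=1.0):
    """Eq. (15) under the stated assumption: minimum Eso/N0 in dB for joint
    entropy H(S1,S2) = H(S1) + h(p) and overall coding rate R."""
    H_joint = H_s1 + binary_entropy(p)
    limit = (2.0 ** (H_joint * R) - 1.0) / (2.0 * R)
    return 10.0 * np.log10(limit)
```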

Figure 4. BER performance for equivalent rates compression of inter-source correlations and distributed coding systems with crossover probability p = .01.

From the obtained results, we observe that the performance of system 1 in the waterfall region is better than that of system 2. The proposed system provides a gain of about 2.2 dB at a BER of $10^{-2}$ with no global iterations (GI = 0).

The gap between the proposed system and the SW/Shannon theoretical limit is less than 1 dB. However, while the performance of the distributed system 2 is substantially improved by exploiting the sources' correlation during the global iterations, the latter have no impact on the performance of our proposed system with inter-node compression. This behaviour is explained by the fact that the compression scheme applied at the second source drastically reduces the correlation between the sources, so there is no residual redundancy left to exploit. On the other hand, if no extra computation is allowed at the destination node, the inter-node compression can be beneficial, since fair performance can be reached without global iterations. Finally, we also observe a higher error floor, explained by the compression loss, which depends on the source coding operation applied at the second source and is specific to the joint LDPC source coding process.

We previously demonstrated in [Citation16] that, in the case of point-to-point communication, the joint LDPC system achieves larger improvements when the source is highly correlated, and exhibits lower error floors. In the second set of experiments, we study in Figure 5 the effect of the crossover probability $p$ on the performance of the inter-node compression system in the waterfall and error-floor regions.

Figure 5. BER performance for system 1 with different crossover probability p.

We consider three crossover probabilities: $p = 0.02$ with $H(S_1,S_2) = 1.14$, $p = 0.01$ with $H(S_1,S_2) = 1.08$, and $p = 0.003$ with $H(S_1,S_2) = 1.029$. We also use the theoretical limit on $E_{so}/N_0$ as a benchmark for the different values of $p$. We recall that increasing the number of global iterations yields no improvement, especially for high values of $p$, due to the LDPC correlation compression. We can conclude from the results that a lower parameter $p$ yields better performance in both the waterfall and error-floor regions. We observe a gain of about 0.3 dB with $p = 0.003$ compared with $p = 0.01$ at a BER of $10^{-2}$, together with a reduced error floor. The practical system behaviour also agrees with the theoretical limits, with a gap of less than 1 dB in all cases.
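Continuing the helper sketched after Equation (15), the joint entropies quoted above and the corresponding limits can be reproduced as follows (values rounded; the limit itself depends on the assumed form of Equation (15)).

```python
for p in (0.02, 0.01, 0.003):
    H_joint = 1.0 + binary_entropy(p)          # approx. 1.14, 1.08, 1.03 bits
    print(f"p = {p}: H(S1,S2) = {H_joint:.3f} bits, "
          f"Eso/N0 limit = {eso_n0_limit_db(p, 2/3):.2f} dB")
```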

5. Conclusion

In this paper, we have studied a joint source–channel coding system that compresses the inter-node correlation based on LDPC codes. First, we presented the system model, composed of two binary sources and one destination, with two communication phases. Second, we developed a joint decoder with two stages of iterations to exploit the source correlation. Based on computer simulations, we demonstrated that, without applying any global iterations, the proposed system provides slightly better results in the waterfall region than a distributed coding system exploiting the source correlation, at the cost of a quality loss at high SNRs. The proposed system can be beneficial for uplink wireless sensor network systems, where the computation cost is increased at the intermediate source by the XORing and LDPC compression, but very low processing is needed at the destination since global iterations can be omitted.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work has received French government support granted to the Cominlabs excellence laboratory and managed by the National Research Agency under the Investing for the Future programme, reference ANR-10-LABX-07-01.

References

  • Slepian D, Wolf J. Noiseless coding of correlated information sources. IEEE Trans Inf Theory. 1973;19(4):471–480.
  • Garcia-Frias J, Zhao Y. Near-Shannon/Slepian-Wolf performance for unknown correlated sources over AWGN channels. IEEE Trans Comm. 2005;53(4):555–559.
  • Garcia-Frias J. Joint source-channel decoding of correlated sources over noisy channels. Data Compression Conf.; 2001. p. 283–292.
  • Argyriou A, Alay O, Palantas P. Modeling the lossy transmission of correlated sources in multiple access fading channels. Phy Comm. 2017;24:34–45.
  • Weng JJ, Alajaji F, Linder T. Joint source-channel coding for the transmission of correlated sources over two-way channels. IEEE ISIT; 2019. p. 1322–1329.
  • Huo Y, Zhu C, Hanzo L. Spatio-temporal iterative source–channel decoding aided video transmission. IEEE Trans Veh Technol. 2013;62(4):1597–1609.
  • Daneshgaran F, Laddomada M, Mondin M. LDPC-based channel coding of correlated sources with iterative joint decoding. IEEE Trans Comm. 2006;54(4):577–582.
  • Aljohani AJ, Ng SX. Distributed joint source-channel coding-based adaptive dynamic network coding. IEEE Access. 2020;8:86715–86731.
  • Asvadi R, Matsumoto T, Juntti M. Joint distributed source-channel decoding for LDPC-coded binary Markov sources. IEEE PIMRC; 2013. p. 807–811.
  • Khas M, Saeedi H, Asvadi R. LDPC code design for correlated sources using EXIT charts. IEEE ISIT; 2017. p. 2945–2949.
  • Nangir M, Asvadi R, Ahmadian-Attari M, et al. Analysis and code design for the binary CEO problem under logarithmic loss. IEEE Trans Commun. 2018;66(12):6003–6014.
  • Aaron A, Girod B. Compression with side information using turbo codes. Data Compression Conf.; 2002. p. 252–261.
  • Fresia M, Perez-Cruz F, Poor HV, et al. Joint source and channel coding. IEEE Signal Process Mag. 2010;27(6):104–113.
  • Abdessalem MB, Zribi A, Matsumoto T, et al. LDPC-based joint source channel coding and decoding strategies for single relay cooperative communications. Phys Commun. 2020;38:100947.
  • Garcia-Frias J. Joint source-channel decoding of correlated sources over noisy channels. Data Compression Conf.; 2001. p. 283–292.
  • Ben Abdessalem M, Zribi A, Bouallegue A. Analysis of joint source channel low-density parity-check (LDPC) coding for correlated sources transmission over noisy channels. Intl. Conf. Communications, Networking and Mobile Computing; 2017.