Turbo Source Coding. Laurent Schmalen and Peter Vary. FlexCode Public Seminar June 16, 2008

Institute of Communication Systems and Data Processing, Prof. Dr.-Ing. Peter Vary. Turbo Source Coding. Laurent Schmalen and Peter Vary. FlexCode Public Seminar, June 16, 2008.

Outline: FlexCode in a nutshell; entropy coding; Turbo coding and decoding; application of Turbo codes as source codes; a joint source-channel coding scheme with iterative decoding for compression; possible application to FlexCode.

Who? Ericsson, KTH, Nokia, RWTH Aachen, Orange (France Telecom).

The Problem: The heterogeneity of networks is increasing, and networks are inherently variable (mobile users). But: coders are not designed for the specific environment, coders are inflexible (codebooks and FEC), and the feedback channel is underutilized.

Adaptation and Coding: [chart positioning speech coders between rigid and adaptive adaptation and between fixed and flexible coding] PCM: fixed model, fixed quantizer, rigid. AMR (CELP with FEC): set of models, fixed quantizer. FlexCode: adaptive and flexible coding.

Conventional Transmission: [block diagram of transmitter and receiver].

Lossless Source Coding. Prominent examples of lossless entropy coders: Huffman coding, Lempel-Ziv coding, arithmetic coding. Example: Huffman code with the following symbol probabilities and bit patterns: A: P(A) = 0.5, pattern 0; B: P(B) = 0.3, pattern 10; C: P(C) = 0.15, pattern 110; D: P(D) = 0.05, pattern 111. No bit pattern is the prefix of another, so decoding is unambiguous; e.g., AACABA is encoded as 0 0 110 0 10 0.
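
The prefix-free property shown above can be reproduced with a few lines of code. The following minimal sketch (not part of the slides) builds a Huffman code for the example probabilities and encodes AACABA; the 0/1 labels may differ from the slide, but the code lengths are the same.

    import heapq

    def huffman_code(probs):
        """Build a Huffman code; returns a dict {symbol: bit string}."""
        # Heap entries: (probability, tie-breaker, {symbol: partial codeword})
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)          # two least probable subtrees
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in c1.items()}
            merged.update({s: "1" + c for s, c in c2.items()})
            heapq.heappush(heap, (p1 + p2, counter, merged))
            counter += 1
        return heap[0][2]

    probs = {"A": 0.5, "B": 0.3, "C": 0.15, "D": 0.05}
    code = huffman_code(probs)
    print(code)                                   # code lengths 1, 2, 3, 3 bits, as in the table
    print("".join(code[s] for s in "AACABA"))     # a 9-bit sequence, like 0 0 110 0 10 0 on the slide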

Lossless Source Coding. Usability of lossless source codes: high sensitivity to transmission errors. For a Huffman code, a single bit error causes a loss of synchronization; for an arithmetic code, the selection of a wrong interval leads to complete decoding failure. Consequently, very strong channel codes are required, and an error floor remains, i.e., seldom bit errors still lead to decoding failures. Iterative source-channel decoding schemes for entropy codes exist, but they have high complexity and are difficult to apply to arithmetic coding [Guionnet 04]. [Plot: BER vs. channel quality E_b/N_0]

Wanted: Lossless Source Coding. A flexible compression scheme (entropy coder) which has a performance similar to known compression schemes, is robust against transmission errors, and can instantaneously adapt to varying channel conditions by trading compression ratio against error robustness. There is an analogy between channel codes and source codes: a good channel code is often also a good source code, and LDPC codes have been used for compression [Caire 03]. Can Turbo codes be used for compression?

Turbo Codes, Concept. Concatenated encoding: parallel scheme [Berrou 93], serial scheme [Benedetto 98]. Iterative Turbo decoding: advantages by iterative feedback between the constituent decoders!

Turbo Coding & Decoding. Turbo code encoder and decoder [Berrou 93]: [block diagram with the transmission channel; one block denotes an interleaver, another an (almost) arbitrary channel encoder].

Turbo Coding & Decoding. Turbo decoding, 1st iteration, 1st step: the data symbols are transmitted over an AWGN channel and arrive as noisy symbols (additive Gaussian noise); Decoder 1 processes the received values and hands its result to Decoder 2. [Animation frames: probability distributions at the decoder inputs and outputs]

Turbo Coding & Decoding. Turbo decoding, 1st iteration, 2nd step: Decoder 2 processes the output of Decoder 1.

Turbo Coding & Decoding. Turbo decoding, 2nd iteration, 1st and 2nd step: each decoder additionally exploits the result of the previous iteration; the distributions become more peaked.

Turbo Coding & Decoding. Turbo decoding, n-th iteration, 1st and 2nd step: after several iterations the distributions are concentrated on the decoded values.

Turbo Principle - Interleaver Design. Block interleaver vs. random interleaver. Example: propagation of a single piece of information through a (5 x 5) block interleaver and through a random interleaver. Random interleavers distribute the information better; careful interleaver design is required (see the sketch below).
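
As an illustration (not from the slides), the following minimal sketch contrasts how a (5 x 5) block interleaver and a pseudo-random interleaver permute a block of 25 indices; the block size and the seed are arbitrary choices.

    import random

    rows, cols = 5, 5
    N = rows * cols

    # Block interleaver: write row by row into a 5 x 5 array, read column by column.
    block_perm = [r * cols + c for c in range(cols) for r in range(rows)]

    # Pseudo-random interleaver: a random permutation whose seed the receiver also knows.
    rng = random.Random(42)
    random_perm = list(range(N))
    rng.shuffle(random_perm)

    data = list(range(N))
    print([data[i] for i in block_perm])   # regular pattern: 0, 5, 10, 15, 20, 1, 6, ...
    print([data[i] for i in random_perm])  # neighbouring indices are spread irregularly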

Turbo Source Coding. Can Turbo codes be used for entropy coding? Yes: Turbo codes can compress binary memoryless sources [Garcia-Frias 02] by transmitting only a fraction of the output bits, such that the overall coding rate is greater than 1. The decoder has to take the statistics of the source (the unequal distribution of the bits) into account; these statistics can also be estimated at the decoder.

Transmission with Turbo Source Coding: [block diagram of transmitter and receiver]. Mapping of quantizer reproduction levels to bit patterns, e.g., 0 -> 000, 1 -> 001, 2 -> 010, ..., 7 -> 111.

Turbo Source Coding. Turbo coder with channel coding rate 1/3 (1 bit -> 3 bits). Before puncturing: [figure showing all coded bits]. After puncturing (compression ratio 0.5, 1 bit -> 0.5 bit): [figure showing the remaining transmitted bits].

Turbo Source Coding. The puncturing unit corresponds to a binary erasure channel. The puncturing is adapted with respect to the source statistics; the theoretical minimum rate is the source entropy. Realization of the puncturing: regular puncturing or pseudo-random puncturing, where the puncturing pattern has to be known at the receiver. The number of transmitted bits can be increased adaptively with increasing channel noise (a minimal puncturing sketch follows below).
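
As a rough illustration of pseudo-random puncturing (not the exact scheme of the slides), the sketch below keeps only a fraction of the coded bits of a hypothetical rate-1/3 encoder, here 0.5 transmitted bits per information bit; the seed plays the role of the puncturing pattern shared with the receiver.

    import random

    def pseudo_random_puncture(coded_bits, keep_fraction, seed):
        """Keep a pseudo-random subset of the coded bits; punctured positions act as erasures."""
        rng = random.Random(seed)
        n_keep = round(keep_fraction * len(coded_bits))
        kept_positions = sorted(rng.sample(range(len(coded_bits)), n_keep))
        return kept_positions, [coded_bits[i] for i in kept_positions]

    # Hypothetical example: 100 information bits, 300 coded bits (rate 1/3).
    info_len = 100
    coded = [random.randint(0, 1) for _ in range(3 * info_len)]
    # Keep 50 of the 300 coded bits -> 0.5 transmitted bits per information bit.
    positions, transmitted = pseudo_random_puncture(coded, keep_fraction=50 / 300, seed=7)
    print(len(transmitted) / info_len)  # 0.5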

Turbo Source Coding. Simulation results for a binary source [Garcia-Frias 02], compared with the standard Unix compression tools compress, gzip, and bzip2. Source entropy H(X) and achieved compression ratios:

H(X)     Turbo   compress   gzip   bzip2
0.0808   0.16    0.16       0.16   0.15
0.2864   0.38    0.41       0.41   0.44
0.4690   0.58    0.65       0.60   0.67
0.6098   0.75    0.83       0.72   0.83
0.7219   0.87    0.98       0.80   0.96
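
The H(X) column is the binary entropy of the source bits, i.e., the theoretical lower bound on the achievable compression ratio. A tiny helper (not from the slides) to compute it; the listed entropies are consistent with bit probabilities p = 0.01, 0.05, 0.1, 0.15, and 0.2.

    import math

    def binary_entropy(p):
        """Entropy in bits of a Bernoulli(p) source (lower bound on the compression ratio)."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    for p in (0.01, 0.05, 0.1, 0.15, 0.2):
        print(f"p = {p:.2f}: H(X) = {binary_entropy(p):.4f}")
    # prints 0.0808, 0.2864, 0.4690, 0.6098, 0.7219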

Turbo Source Coding. Advantages: robustness against transmission errors; if the channel quality drops, the puncturing can be changed in order to transmit more parity bits; high flexibility by adapting the puncturing on the fly. Disadvantages: higher computational cost than conventional entropy coding schemes; lossless compression is not guaranteed, since the Turbo decoder might not decode every bit correctly; difficult to adapt to varying parameter statistics for use in speech and audio codecs.

Lossless Turbo Source Coding [Hagenauer 04]: adapt the puncturing such that lossless decoding is possible. The encoder works in an analysis-by-synthesis fashion: it changes the puncturing and tests whether the block is decodable without error. Only a small amount of side information is required (a sketch of this loop follows below).
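
A minimal sketch of such an analysis-by-synthesis loop, written against hypothetical encode/puncture/decode helpers; it is not the exact algorithm of [Hagenauer 04], only the idea of releasing additional coded bits until the local decoder reproduces the source exactly.

    def lossless_turbo_encode(info_bits, encode, puncture, decode, step=8):
        """Analysis-by-synthesis encoding sketch (hypothetical helper functions).

        encode(info_bits)               -> all coded bits of the Turbo encoder
        puncture(coded_bits, n_keep)    -> (positions, kept_bits)
        decode(positions, kept_bits, n) -> n decoded information bits
        """
        coded = encode(info_bits)
        n_keep = step  # start with very strong puncturing (few transmitted bits)
        while n_keep <= len(coded):
            positions, kept = puncture(coded, n_keep)
            if decode(positions, kept, len(info_bits)) == info_bits:
                # Decodable without error: transmit the kept bits, signal n_keep as side info.
                return n_keep, kept
            n_keep += step  # release a few more bits and try again
        return len(coded), coded  # worst case: no compression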

Non-Binary Sources. Turbo source coding as described works only for binary sources and is feasible only if the bit pattern after source coding has low entropy, H(X) < 1. Extensions towards non-binary sources: non-binary Turbo codes [Zhao 02], special binary LDPC codes [Zhong 05], non-binary LDPC codes [Potluri 07]. This motivates a joint source-channel coding approach: "However, any redundancy in the source will usually help if it is utilized at the receiving point. [...] redundancy will help combat noise." (Shannon, 1948)

Iterative Source-Channel Decoding. Approach: enhance the robustness of the transmission of variable length codes. Iterative source-channel decoding (ISCD) of variable length codes (VLC) [Guyader 01] combats the adverse effects of channel noise, exploits the structure and redundancy of variable length codes, and achieves near-capacity system performance. It works considerably well for Huffman codes, but it is difficult to adapt to arithmetic codes, has a very high computational complexity, and offers limited flexibility!

Iterative Source-Channel Decoding. Iterative source-channel decoding (ISCD) [Adrat 01] for fixed-length codes (FLC) combats the adverse effects of channel noise, exploits residual source redundancy, and achieves near-capacity overall system performance [Clevorn 06]. It can also be used effectively for compression [Thobaben 08]: leave all redundancy in the source symbols, use redundant bit mappings, and puncture the output of the convolutional code in order to obtain compression.

Turbo Codes, Concept (recap). Concatenated encoding: parallel scheme [Berrou 93], serial scheme [Benedetto 98]. Iterative Turbo decoding: advantages by iterative feedback!

Transmission with ISCD: [block diagram of transmitter and receiver]. Mapping of quantizer reproduction levels to bit patterns with additional redundancy, e.g., by a parity check bit: 0 -> 000 0, 1 -> 001 1, 2 -> 010 1, ..., 7 -> 111 1 (see the sketch below).
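
A minimal sketch of such a redundant index assignment, assuming a 3-bit natural binary mapping extended by an even parity bit (the concrete mapping used in the slides may differ):

    def redundant_bit_mapping(level, bits=3):
        """Map a quantizer reproduction level to its bit pattern plus an even parity bit."""
        pattern = [(level >> i) & 1 for i in reversed(range(bits))]
        parity = sum(pattern) % 2   # artificial redundancy the decoder can exploit
        return pattern + [parity]

    for level in range(8):
        print(level, "".join(str(b) for b in redundant_bit_mapping(level)))
    # 0 0000, 1 0011, 2 0101, ..., 7 1111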

Soft Decision Source Decoding. Quantization and bit mapping: residual redundancy is exploited for quality improvement, namely 1D a priori knowledge (parameter distribution) and 2D a priori knowledge (parameter correlation). The channel decoder exploits the bit pattern redundancy (e.g., parity check bits). The soft decision source decoder calculates extrinsic feedback information using the source statistics of order 0 and 1 and the bit mapping redundancy [Adrat 01]. A simplified numerical sketch follows below.
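
A heavily simplified numerical sketch of the underlying idea (not the ISCD algorithm of [Adrat 01]): per-bit channel log-likelihood ratios are combined with the 1D a priori distribution of the quantizer indices to obtain a posteriori index probabilities; the prior, the bit mapping, and the LLR values below are assumptions for illustration.

    import numpy as np

    def index_posteriors(bit_llrs, bit_patterns, index_prior):
        """A posteriori index probabilities from bit LLRs and a 1D a priori distribution.

        bit_llrs:     channel LLRs L(b) = log p(b=0)/p(b=1), one per bit
        bit_patterns: array of shape (n_indices, n_bits) with the bit mapping
        index_prior:  a priori probabilities of the quantizer indices
        """
        p_bit1 = 1.0 / (1.0 + np.exp(bit_llrs))   # p(b = 1) from the LLRs
        # Likelihood of each candidate bit pattern, assuming independent bit observations.
        likelihood = np.prod(np.where(bit_patterns == 1, p_bit1, 1.0 - p_bit1), axis=1)
        post = likelihood * index_prior
        return post / post.sum()

    patterns = np.array([[(i >> k) & 1 for k in reversed(range(3))] for i in range(8)])
    prior = np.array([0.30, 0.20, 0.15, 0.10, 0.10, 0.07, 0.05, 0.03])  # assumed distribution
    llrs = np.array([1.2, -0.4, 0.3])                                   # assumed channel output
    print(index_posteriors(llrs, patterns, prior))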

ISCD Source Compression. Simulation example. System components: scalar quantization with Q levels, a single parity check code index assignment, and a rate R > 1 convolutional code obtained by random puncturing [Thobaben 07]. [Block diagram: K parameters -> bits -> punctured bits]

Iterative Source-Channel Decoding. Simulation results for a simple experiment: Gauss-Markov source (AR(1) process), Lloyd-Max quantization with Q = 16 levels, 25 decoding iterations, unoptimized standard system components! Source correlation ρ, entropy H(U_t | U_t-1) / 4, and achieved compression ratios:

ρ      H(U_t | U_t-1)/4   ISCD   compress   gzip   bzip2
0.80   0.76               0.83   0.99       0.86   0.82
0.85   0.72               0.78   0.94       0.83   0.78
0.90   0.66               0.74   0.87       0.80   0.72
0.95   0.55               0.62   0.73       0.71   0.62
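
For reference (this is not the simulation code behind the table), a small sketch that generates a Gauss-Markov (AR(1)) source, quantizes it, and estimates the normalized conditional index entropy; a uniform 16-level quantizer stands in for the Lloyd-Max codebook of the slides, so the value is only roughly comparable to the table.

    import numpy as np

    rho, Q, N = 0.9, 16, 200_000
    rng = np.random.default_rng(0)

    # Gauss-Markov source: u[t] = rho * u[t-1] + sqrt(1 - rho^2) * n[t], unit variance.
    noise = rng.standard_normal(N) * np.sqrt(1.0 - rho**2)
    u = np.zeros(N)
    for t in range(1, N):
        u[t] = rho * u[t - 1] + noise[t]

    # Uniform 16-level quantization (the slides use Lloyd-Max quantization instead).
    edges = np.linspace(-3.0, 3.0, Q - 1)
    idx = np.digitize(u, edges)

    # Estimate H(U_t | U_{t-1}) from the joint index histogram and normalize by 4 bits.
    joint = np.histogram2d(idx[:-1], idx[1:], bins=(Q, Q))[0] / (N - 1)
    p_prev = joint.sum(axis=1, keepdims=True) + 1e-12
    ratio = np.where(joint > 0, joint / p_prev, 1.0)   # p(u_t | u_{t-1}); 1 where joint = 0
    print(-np.sum(joint * np.log2(ratio)) / 4.0)       # normalized conditional entropy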

Possible Application in FlexCode. Entropy coding for constrained-entropy quantization: the quantizer reproduction levels are entropy coded, e.g., with an arithmetic coder, which can be replaced by the presented compression scheme.

Possible Application in FlexCode. Entropy coding for constrained-entropy quantization: flexible adaptation to varying channel conditions by adapting the puncturing.

Conclusions. Channel codes can be used for compression: Turbo codes for compressing binary sources, and iterative source-channel decoding for fixed-length codes. Joint source-channel coding with iterative decoding for source compression shows promising results already with unoptimized system components.

References

[Guionnet 04] T. Guionnet and C. Guillemot, "Soft and joint source-channel decoding of quasi-arithmetic codes," EURASIP Journal on Applied Signal Processing, vol. 2004, no. 3, pp. 393-411, Mar. 2004.
[Caire 03] G. Caire, S. Shamai, and S. Verdú, "Universal Data Compression with LDPC Codes," Third International Symposium on Turbo Codes and Related Topics, Brest, France, September 1-5, 2003.
[Berrou 93] C. Berrou, A. Glavieux, and P. Thitimajshima, "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes," International Conference on Communications, Geneva, Switzerland, May 1993.
[Benedetto 98] S. Benedetto, D. Divsalar, G. Montorsi, and F. Pollara, "Analysis, Design, and Iterative Decoding of Double Serially Concatenated Codes with Interleavers," IEEE Journal on Selected Areas in Communications, vol. 16, no. 2, February 1998.
[Garcia-Frias 02] J. Garcia-Frias and Y. Zhao, "Compression of Binary Memoryless Sources Using Punctured Turbo Codes," IEEE Communications Letters, vol. 6, no. 9, September 2002.
[Hagenauer 04] J. Hagenauer, J. Barros, and A. Schaefer, "Lossless Turbo Source Coding with Decremental Redundancy," ITG Conference on Source and Channel Coding (SCC), Erlangen, 2004.
[Zhao 02] Y. Zhao and J. Garcia-Frias, "Data Compression of Correlated Non-Binary Sources Using Punctured Turbo Codes," IEEE Data Compression Conference (DCC), 2002.
[Zhong 05] W. Zhong and J. Garcia-Frias, "Compression of Non-Binary Sources Using LDPC Codes," CISS, Baltimore, Maryland, March 2005.
[Potluri 07] M. Potluri, S. Chilumuru, S. Kambhampati, and K. R. Namuduri, "Distributed Source Coding Using Non-Binary LDPC Codes for Sensor Network Applications," Canadian Workshop on Information Theory, June 2007.
[Guyader 01] A. Guyader, E. Fabre, C. Guillemot, and M. Robert, "Joint source-channel turbo decoding of entropy-coded sources," IEEE Journal on Selected Areas in Communications, vol. 19, no. 9, pp. 1680-1696, Sept. 2001.
[Adrat 01] M. Adrat, J.-M. Picard, and P. Vary, "Soft-Bit Source Decoding Based on the Turbo Principle," IEEE Vehicular Technology Conference (VTC-Fall), Atlantic City, Oct. 2001.

References (continued)

[Clevorn 06] T. Clevorn, L. Schmalen, and P. Vary, "On the Optimum Performance Theoretically Attainable for Scalarly Quantized Correlated Sources," International Symposium on Information Theory and its Applications (ISITA), Seoul, Korea, Oct. 2006.
[Thobaben 08] R. Thobaben, L. Schmalen, and P. Vary, "Joint Source-Channel Coding with Inner Irregular Codes," International Symposium on Information Theory (ISIT), Toronto, Canada, July 2008.
[Thobaben 07] R. Thobaben, "A New Transmitter Concept for Iteratively-Decoded Source-Channel Coding Schemes," IEEE SPAWC, Helsinki, June 2007.

Institute of Communication Systems and Data Processing, Prof. Dr.-Ing. Peter Vary. Turbo Source Coding. Laurent Schmalen and Peter Vary. http://www.flexcode.eu