Lectures


Spatially coupled LDPC codes: from theory to practice
Daniel J. Costello, Jr. (University of Notre Dame)

 

Abstract

Following a review of the basic concepts of low-density parity-check (LDPC) block codes, we examine spatially coupled LDPC codes from several different perspectives. First, asymptotic ensemble-average properties are presented, including both iterative decoding thresholds and minimum distance growth rates. Protograph-based code ensembles form the basis for this discussion. Next, finite-length code properties, such as bit-error-rate (BER) performance, decoding latency, and decoding complexity, are reviewed. With sliding window decoding, spatially coupled codes are seen to achieve a BER advantage over block codes at fixed decoding latency and practical block (window) lengths, and decoding complexity results are presented for both binary and non-binary codes. Finally, the BER performance advantages of spatially coupled codes are shown to also hold when puncturing is employed to achieve higher code rates.
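The coupling construction behind these protograph-based ensembles can be sketched concretely. The toy example below (the base matrix, the edge-spreading split, and the coupling length are illustrative assumptions, not the specific ensembles of the lecture) builds a spatially coupled base matrix from a (3,6)-regular protograph by edge spreading:

```python
# Illustrative edge spreading: split the (3,6)-regular base matrix
# B = [[3, 3]] into w = 3 components with B0 + B1 + B2 = B, then couple
# L copies of the protograph along a diagonal. All parameters here are
# assumptions chosen for illustration.

def couple(components, L):
    rows = len(components[0])
    cols = len(components[0][0])
    w = len(components)
    # The coupled base matrix has (L + w - 1) * rows rows, L * cols columns.
    coupled = [[0] * (L * cols) for _ in range((L + w - 1) * rows)]
    for t in range(L):                       # position along the chain
        for i, Bi in enumerate(components):  # spread edges to positions t..t+w-1
            for r in range(rows):
                for c in range(cols):
                    coupled[(t + i) * rows + r][t * cols + c] += Bi[r][c]
    return coupled

B_components = [[[1, 1]], [[1, 1]], [[1, 1]]]  # B0 + B1 + B2 = [[3, 3]]
SC = couple(B_components, L=6)

# Every variable node keeps degree 3, while check nodes at the chain ends
# have reduced degree -- the structural feature underlying the improved
# iterative decoding thresholds of spatially coupled ensembles.
col_degrees = [sum(row[c] for row in SC) for c in range(len(SC[0]))]
row_degrees = [sum(row) for row in SC]
print(col_degrees[0], row_degrees[0], row_degrees[2])  # prints: 3 2 6
```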

Biography

Daniel J. Costello, Jr. was born in Seattle, WA, on August 9, 1942. He received the B.S.E.E. degree from Seattle University, Seattle, WA, in 1964, and the M.S. and Ph.D. degrees in Electrical Engineering from the University of Notre Dame, Notre Dame, IN, in 1966 and 1969, respectively.

Dr. Costello joined the faculty of the Illinois Institute of Technology, Chicago, IL, in 1969. In 1985 he became Professor of Electrical Engineering at the University of Notre Dame, and from 1989 to 1998 he served as Chair of the Department of Electrical Engineering. In 2000 he was named the Leonard Bettex Professor of Electrical Engineering at Notre Dame, and in 2009 he became Bettex Professor Emeritus.

In 1991, he was selected as one of 100 Seattle University alumni to receive the Centennial Alumni Award, which recognizes alumni who have displayed outstanding service to others, exceptional leadership, or uncommon achievement. In 1999, he received a Humboldt Research Prize from the Alexander von Humboldt Foundation in Germany. Dr. Costello has been a member of the IEEE since 1969 and was elected Fellow in 1985. He served 18 years as a member of the Information Theory Society Board of Governors, and in 1986 he was President of the Board of Governors. In 2000, the IEEE Information Theory Society selected him as a recipient of a Third Millennium Medal. In 2009, he was a co-recipient of the IEEE Donald G. Fink Prize Paper Award, which recognizes an outstanding survey, review, or tutorial paper in any IEEE publication issued during the previous calendar year. In 2012, he was a co-recipient of the joint IEEE Information Theory Society/Communications Society Prize Paper Award, which recognizes an outstanding research paper in the IT or COM Transactions during the previous two calendar years. In 2013, he received the Aaron D. Wyner Distinguished Service Award from the IEEE Information Theory Society, which recognizes outstanding leadership in, and long-standing exceptional service to, the Information Theory community. In 2015, he received the IEEE Leon K. Kirchmayer Graduate Teaching Award, which recognizes inspirational teaching of graduate students in the IEEE fields of interest.

Dr. Costello's research interests are in digital communications, with special emphasis on error control coding and coded modulation. He has numerous technical publications in his field, and in 1983 he co-authored the textbook "Error Control Coding: Fundamentals and Applications", whose second edition was published in 2004.

Personal webpage

Channel reliability: from ordinary to zero-error capacity
Marco Dalai (University of Brescia)

 

Abstract

Bounds on the error probability of optimal fixed-length coding schemes have a long history in information theory. In their simplest form, such bounds allow one to determine the capacity C of a channel, the highest rate at which the probability of error can be made to vanish as the block length grows. At fixed rates below capacity, the probability of error is known to decrease exponentially in the block length, and a central question is understanding precisely how fast, that is, determining the so-called error exponent.
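In standard notation (the symbols here are conventional, not taken from the lecture), writing $P_e(n, R)$ for the smallest error probability achievable by block codes of length $n$ and rate $R$:

```latex
C \;=\; \sup\bigl\{\, R \;:\; P_e(n,R) \to 0 \ \text{as}\ n \to \infty \,\bigr\},
\qquad
E(R) \;=\; \lim_{n \to \infty} \, -\tfrac{1}{n} \log P_e(n,R),
```

with $E(R) > 0$ for $R < C$. Strictly speaking, the limit defining $E(R)$ is not known to exist at all rates, so upper and lower versions defined via $\limsup$ and $\liminf$ are sometimes distinguished.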

While we have a good understanding of error exponents near capacity, the low-rate region remains mysterious, even for a channel as simple as the binary symmetric channel. The situation is even more baffling when the channel has a positive zero-error capacity C_0, that is, when it allows communication at positive rates with probability of error exactly zero. Determining the zero-error capacity is itself perhaps one of the hardest open problems in information theory; moreover, whenever C_0 is positive, even in cases where it is easy to compute, we have very little understanding of error exponents at slightly higher rates. Our knowledge becomes even more incomplete when we consider variations on the problem such as list decoding or so-called classical-quantum channels.
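A classical illustration of positive zero-error capacity (well known, though not stated in the abstract) is the pentagon channel C_5, with five inputs each confusable only with its two cyclic neighbors. Shannon showed C_0 >= (1/2) log 5 using five codewords of length two, and Lovász later proved equality:

```latex
C_0(C_5) \;=\; \tfrac{1}{2}\log 5 ,
```

so even this five-input channel already requires nontrivial machinery (the Lovász theta function) just to compute C_0.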

In this lecture, we will review some of the most important classical results on error exponents and zero-error capacity, present some recent advances, and discuss some fundamental open problems.

Biography

Marco Dalai is an assistant professor in the Department of Information Engineering at the University of Brescia, Italy. He received his Laurea degree in Electronics Engineering and his Ph.D. in Information Engineering, both from the University of Brescia, in 2003 and 2007, respectively. His research interests include information theory, signal processing, and statistical inference. He received the 2014 IEEE Information Theory Society Paper Award.

Personal webpage


Fundamental limits in asynchronous communication
Aslan Tchamkerten (Telecom ParisTech)

 

Abstract

The traditional information-theoretic approach to reliable communication ignores issues related to synchronization and assumes that the receiver knows the message transmission period. This is justified when messages are long and sent continuously, since the synchronization overhead is amortized over many information bits. By contrast, when messages are relatively short, synchronization can no longer be ignored. What are the fundamental limitations imposed by the lack of a priori synchrony between the transmitter and the receiver? In this tutorial we will develop a theory of asynchronous communication and address basic questions such as "Is it always possible to communicate asynchronously?", "Does asynchronism always impact communication?", and "What is the energy needed to convey one bit asynchronously?".

Biography

Dr. Tchamkerten received his physics diploma and his Ph.D. degree in information theory from EPFL in 2000 and 2005, respectively. From 2005 to 2008 he was a postdoctoral researcher at MIT. Since 2008 he has been on the faculty of the Communications and Electronics department at Telecom ParisTech, where he is currently Associate Professor. In 2009 he was awarded a junior excellence chair grant from the French National Research Agency (ANR), in 2011 he received his habilitation, and during the 2014-2015 academic year he was on sabbatical at Stanford. In 2016 he was the general chair of the thematic program "Nexus of Information and Computation Theories" held at the Institut Henri Poincaré. Dr. Tchamkerten currently serves as an Associate Editor for Shannon Theory for the IEEE Transactions on Information Theory.

Personal webpage


Information-theoretic signal processing
Ram Zamir (Tel Aviv University)

 

Abstract

Information and spectrum are parallel concepts in information theory and signal processing. Entropy and spectral bandwidth play similar roles in measuring the richness of a signal, while compression and sampling aim at reducing a signal to its minimal representation. On the practical level, signal processing techniques can simplify the structure of information theoretic solutions. For example, Wiener estimation can replace "joint typicality decoding" on the AWGN channel, while linear prediction can reduce the realization of the rate-distortion function of a colored Gaussian process to a scalar mutual information. A key element in these paradigms is "dithering" (codebook randomization), which elegantly smooths the transition between the analog regime (of signal processing) and digital regime (of information theory).
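The dithering idea mentioned above can be sketched in a few lines. The following toy example (function names and parameters are illustrative, not from the lecture) implements subtractive dither for a scalar uniform quantizer; with the dither known at both ends, the quantization error becomes uniform and independent of the input, so the quantizer behaves like an additive noise channel:

```python
import random

def dithered_quantize(x, step, dither):
    """Subtractive dither: add the dither before uniform quantization, subtract it after."""
    q = step * round((x + dither) / step)
    return q - dither

random.seed(0)
step = 1.0
errors = []
for _ in range(10_000):
    x = random.gauss(0.0, 3.0)               # arbitrary "analog" source sample
    d = random.uniform(-step / 2, step / 2)  # dither shared by encoder and decoder
    y = dithered_quantize(x, step, d)
    errors.append(y - x)

# The error y - x is confined to [-step/2, step/2] regardless of x,
# smoothing the transition from the analog to the digital regime.
print(max(abs(e) for e in errors) <= step / 2)  # prints: True
```

This "additive noise" view is what lets estimation-theoretic tools (e.g. Wiener filtering) be applied to coded systems.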

After surveying these basic ideas in the first part of my lecture, I will turn to the problem of multiple description source coding. This problem poses interesting challenges both on the information-theoretic side and in applying signal processing techniques toward a simple solution. We shall explore various elements, such as oversampling and interpolation, noise shaping, reconstruction from non-uniform sampling, analog coding, good frames, and more.

Biography

Ram Zamir was born in Ramat-Gan, Israel, in 1961. He received the B.Sc., M.Sc. (summa cum laude), and D.Sc. (with distinction) degrees from Tel Aviv University, Israel, in 1983, 1991, and 1994, respectively, all in electrical engineering. From 1994 to 1996 he held postdoctoral positions at Cornell University, Ithaca, NY, and at the University of California, Santa Barbara. In 2002 he spent a sabbatical year at MIT, and in 2008 and 2009 short sabbaticals at ETH and MIT. Since 1996 he has been with the Department of Electrical Engineering - Systems at Tel Aviv University.

Ram Zamir has consulted in the areas of radar and communications (DSL and WiFi), where he was involved with companies such as Orckit and Actelis. From 2005 to 2014 he was the Chief Scientist of Celeno Communications. He has taught information theory, data compression, random processes, communications systems, and communications circuits at Tel Aviv University. He has been an IEEE Fellow since 2010. He served as an Associate Editor for Source Coding of the IEEE Transactions on Information Theory (2001-2003), headed the Information Theory Chapter of the Israeli IEEE Society (2000-2005), and was a member of the Society's Board of Governors (2013-2015). His research interests include information theory (in particular, lattice codes for multi-terminal problems), source coding, communications, and statistical signal processing. His book "Lattice Coding for Signals and Networks" was published in 2014.

Personal webpage