Information Theory of Cooperative Communication

Speaker: Dr. Michael Gastpar, University of California, Berkeley
Abstract: Wireless communication networks inherently offer many possibilities for nodes to cooperate, owing to the fact that any node can overhear all transmissions within a certain geographical range. The challenge for the communications engineer is to understand under what conditions such cooperation offers substantial benefits, and to design strategies that make it possible to exploit this potential. What can information theory tell us about this? In this talk, we will use something old (max-flow min-cut bounds) to obtain something new (capacity theorems for relay situations), and we extend something borrowed (dependence-balance arguments) to obtain capacity results for the many-user Gaussian MAC with feedback and bounds for noisy feedback; but in the end, of course, we still feel kind of blue (the union between information theory and communication networks seems to remain unconsummated). This talk is based on joint work with Gerhard Kramer (Bell Labs) and Martin Vetterli (EPFL).
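For context, the "something old" mentioned above is the classical cut-set (max-flow min-cut) bound for the three-node relay channel; the following statement is the standard textbook form, not a result specific to this talk:

    C <= max_{p(x, x1)} min{ I(X, X1; Y), I(X; Y, Y1 | X1) }

Here X is the source input, X1 the relay input, Y1 the relay's observation, and Y the destination's observation. The first mutual-information term corresponds to the multiple-access cut around the destination, and the second to the broadcast cut around the source.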
Biography: Michael Gastpar received his Engineering degree from ETH Zurich (1997), his M.S. from the University of Illinois at Urbana-Champaign (1999), and his Ph.D. from EPFL in Lausanne (2002). He was also a student in engineering and philosophy at the Universities of Edinburgh and Lausanne, and a summer researcher in the Mathematics of Communications Department at Bell Labs, Lucent Technologies, Murray Hill, NJ. He is now an Assistant Professor at the University of California, Berkeley. His research interests are in network information theory and related coding and signal processing techniques, with applications to sensor networks and neuroscience. He won the 2002 EPFL Best Thesis Award and an NSF CAREER award in 2004.
Presentation On: Friday, 28 April 2006,
11:00 a.m. in room 1115, CSIC
Videotape: Not available.