Shared Information

Presenter
September 29, 2015
Keywords:
  • Communication theory
MSC:
  • 94A05
Abstract
Shannon’s mutual information between two random variables is a fundamental and venerable concept in information and communication theory, statistics and beyond. What is a measure of mutual dependence among an arbitrary number of random variables? A notion of "shared information" among multiple terminals that observe correlated random variables and communicate interactively among themselves is shown to play a useful role in certain problems of distributed processing and secure computation. Whether shared information, which for two terminals particularizes to mutual information, plays a larger role is an open question. This talk is based on joint works with Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi and Shun Watanabe.
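
For reference, in the two-terminal special case mentioned above, Shannon's mutual information between random variables $X$ and $Y$ with joint distribution $p_{XY}$ is the standard quantity
$$
I(X;Y) \;=\; H(X) + H(Y) - H(X,Y) \;=\; \sum_{x,y} p_{XY}(x,y)\,\log\frac{p_{XY}(x,y)}{p_X(x)\,p_Y(y)},
$$
where $H(\cdot)$ denotes Shannon entropy. The talk concerns a multi-terminal generalization of this measure of dependence.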