6 July 2016

Understanding Interdependency Through Complex Information Sharing

See on Scoop.it: Information Processing in Complex Systems

The interactions between three or more random variables are often nontrivial and poorly understood, yet they are paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. Towards this end, and in contrast to most of the literature, which focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the various ways in which random variables can share information. Our framework distinguishes between interdependencies where the information is shared redundantly and synergistic interdependencies where the sharing structure exists in the whole but not between the parts. The key contribution of our approach is to focus on symmetric properties of this sharing, which do not depend on a specific point of view for differentiating roles between its components. We show that our axioms determine unique formulas for all the terms of the proposed decomposition for systems of three variables in several cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.

See on mdpi.com
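The redundancy/synergy distinction in the abstract can be made concrete with a classical toy calculation. The sketch below is not the paper's decomposition; it uses the sign of the co-information, I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z) (sign conventions for this quantity vary in the literature), as a rough indicator: positive values suggest redundant sharing, negative values suggest synergy. The two example systems (XOR and a copied bit) are standard illustrations, not taken from the paper.

```python
# Illustrative sketch only: co-information as a crude redundancy/synergy probe.
from collections import Counter
from itertools import product
import math

def entropy(samples):
    """Shannon entropy (bits) of a list of equally likely hashable outcomes."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def co_information(triples):
    """I(X;Y;Z) computed by inclusion-exclusion over marginal entropies."""
    xs = [t[0] for t in triples]
    ys = [t[1] for t in triples]
    zs = [t[2] for t in triples]
    return (entropy(xs) + entropy(ys) + entropy(zs)
            - entropy(list(zip(xs, ys)))
            - entropy(list(zip(xs, zs)))
            - entropy(list(zip(ys, zs)))
            + entropy(triples))

# Synergistic system: Z = X XOR Y with X, Y independent fair bits.
# Any single variable tells you nothing about another; the whole does.
xor = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

# Redundant system: all three variables are copies of one fair bit.
copy = [(b, b, b) for b in [0, 1]]

print(co_information(xor))   # -1.0 bit: purely synergistic
print(co_information(copy))  #  1.0 bit: purely redundant
```

Note that the lists above enumerate the equally likely joint outcomes, so the empirical entropy over them equals the true entropy of each system.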







Alessandro Cerboni

