ORCID

https://orcid.org/0000-0002-5565-5747

Date of Award

Spring 5-11-2024

Author's School

McKelvey School of Engineering

Author's Department

Electrical & Systems Engineering

Degree Name

Master of Science (MS)

Degree Type

Thesis

Abstract

Mutual information between two random variables is a well-studied notion whose understanding is fairly complete. Mutual information between one random variable and a pair of other random variables, however, is a far more involved notion. Specifically, Shannon's mutual information does not capture fine-grained interactions among these three variables, resulting in limited insight into complex systems. To capture these fine-grained higher-order interactions among variables, Williams and Beer proposed a framework called Partial Information Decomposition (PID), which decomposes this mutual information into information atoms, called unique, redundant, and synergistic, and proposed several operational axioms that these atoms must satisfy. This conceptual framework offers a potential data-driven approach to revealing higher-order relationships between multiple source variables and a target variable, but it still faces open problems, such as the lack of an explicit numerical calculation and its restriction to decomposing multivariate mutual information rather than the entropy of the whole system. In this report, we present two works completed in the past semesters: one solves the numerical calculation problem, and the other extends the decomposition to the system scale (the whole entropy). Together, they open the opportunity to implement data-driven methods that reveal higher-order interactions.
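As a concrete illustration of the limitation described above, consider the XOR system, a standard example from the PID literature (not a result specific to this thesis): with two fair, independent source bits X1, X2 and target Y = X1 XOR X2, each source alone carries zero information about the target, yet the pair determines it completely, so all one bit of I(X1,X2;Y) is synergistic. The sketch below computes these mutual informations directly from the joint distribution; the helper functions are illustrative, not part of the thesis's formula.

```python
import itertools
from math import log2

# Joint distribution of (X1, X2, Y) with Y = X1 XOR X2 and X1, X2 fair coins.
joint = {}
for x1, x2 in itertools.product([0, 1], repeat=2):
    joint[(x1, x2, x1 ^ x2)] = 0.25

def marginal(dist, idxs):
    # Marginalize the joint distribution onto the given coordinate indices.
    m = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idxs)
        m[key] = m.get(key, 0.0) + p
    return m

def mutual_info(dist, a_idxs, b_idxs):
    # I(A;B) = sum_{a,b} p(a,b) * log2( p(a,b) / (p(a) * p(b)) )
    pab = marginal(dist, a_idxs + b_idxs)
    pa = marginal(dist, a_idxs)
    pb = marginal(dist, b_idxs)
    total = 0.0
    for outcome, p in pab.items():
        a, b = outcome[:len(a_idxs)], outcome[len(a_idxs):]
        if p > 0:
            total += p * log2(p / (pa[a] * pb[b]))
    return total

i1 = mutual_info(joint, [0], [2])      # I(X1;Y) -> 0 bits
i2 = mutual_info(joint, [1], [2])      # I(X2;Y) -> 0 bits
i12 = mutual_info(joint, [0, 1], [2])  # I(X1,X2;Y) -> 1 bit
```

Since I(X1;Y) = I(X2;Y) = 0 while I(X1,X2;Y) = 1 bit, any PID consistent with Williams and Beer's axioms must assign the entire bit to the synergistic atom; Shannon's pairwise quantities alone cannot distinguish this from, say, redundant sharing.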

The first work is an explicit formula for Partial Information Decomposition. In spite of numerous efforts, a general formula that satisfies all the axioms of PID had yet to be found. Inspired by Judea Pearl's do-calculus, we resolve this open problem by introducing the do-operation, an operation over the variable system that sets a certain marginal to a desired value and is distinct from any existing approach. Using this operation, we provide the first explicit formula for calculating the information atoms such that Williams and Beer's axioms are satisfied, along with additional properties proposed in subsequent studies in the field.

The second work is a framework called System Information Decomposition. Diverging from the PID framework, which concentrates on the directional interactions from an array of source variables to a single target variable, we introduce a novel framework termed System Information Decomposition (SID). By proving that all information atoms are symmetric, we show that the framework can further decompose the whole entropy of the system and thereby capture all interactions among its variables. This positions SID as a promising framework with the potential to foster a deeper understanding of higher-order relationships within complex systems across disciplines.

Language

English (en)

Chair

Andrew Clark

Committee Members

Andrew Clark, Xudong Chen, Netanel Raviv
