{"id":198691,"date":"2024-11-02T08:22:46","date_gmt":"2024-11-02T13:22:46","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2024\/11\/decomposing-causality-into-its-synergistic-unique-and-redundant-components"},"modified":"2024-11-02T08:22:46","modified_gmt":"2024-11-02T13:22:46","slug":"decomposing-causality-into-its-synergistic-unique-and-redundant-components","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2024\/11\/decomposing-causality-into-its-synergistic-unique-and-redundant-components","title":{"rendered":"Decomposing causality into its synergistic, unique, and redundant components"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/decomposing-causality-into-its-synergistic-unique-and-redundant-components2.jpg\"><\/a><\/p>\n<p>Information theory, the science of message communication<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 44\" title=\"Shannon, C. E. A mathematical theory of communication. Bell Labs Tech. J. 27379&ndash;423 (1948).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR44\" id=\"ref-link-section-d125689063e764\">44<\/a><\/sup>, has also served as a framework for model-free causality quantification. The success of information theory relies on the notion of information as a fundamental property of physical systems, closely tied to the restrictions and possibilities of the laws of physics<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 45\" title=\"Lozano-Dur\u00e1n, A. & Arranz, G. Information-theoretic formulation of dynamical systems: causality, modeling, and control. Phys. Rev. Res. 
4, 023195 (2022).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR45\" id=\"ref-link-section-d125689063e768\">45<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 46\" title=\"Yuan, Y. & Lozano-Dur\u00e1n, A. Limits to extreme event forecasting in chaotic systems. Phys. D 467, 134246 (2024).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR46\" id=\"ref-link-section-d125689063e771\">46<\/a><\/sup>. The grounds for causality as information are rooted in the intimate connection between information and the arrow of time. Time-asymmetries present in the system at a macroscopic level can be leveraged to measure the causality of events using information-theoretic metrics based on the Shannon entropy<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 44\" title=\"Shannon, C. E. A mathematical theory of communication. Bell Labs Tech. J. 27, 379&ndash;423 (1948).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR44\" id=\"ref-link-section-d125689063e775\">44<\/a><\/sup>. The initial applications of information theory for causality were formally established through the use of conditional entropies, employing what is known as directed information<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 47\" title=\"Massey, J. Causality, feedback and directed information. In Proc. 1990 Int. Symp. on Inform. Theory and its Applications, 27&ndash;30 (1990).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR47\" id=\"ref-link-section-d125689063e779\">47<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 48\" title=\"Kramer, G. 
Directed information for channels with feedback. PhD Thesis, ETH Z\u00fcrich (1998).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR48\" id=\"ref-link-section-d125689063e782\">48<\/a><\/sup>. Among the most recognized contributions is transfer entropy (TE)<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 49\" title=\"Schreiber, T. Measuring information transfer. Phys. Rev. Lett. 85, 461 (2000).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR49\" id=\"ref-link-section-d125689063e786\">49<\/a><\/sup>, which measures the reduction in uncertainty (entropy) about the future state of a variable gained by knowing the past states of another. Various improvements have been proposed to address the inherent limitations of TE. Among them, we can cite conditional transfer entropy (CTE)<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Verdes, P. Assessing causality from multivariate time series. Phys. Rev. E 72, 026222 (2005).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR50\" id=\"ref-link-section-d125689063e791\">50<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Lizier, J. T., Prokopenko, M. & Zomaya, A. Y. Local information transfer as a spatiotemporal filter for complex systems. Phys. Rev. E 77, 026110 (2008).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR51\" id=\"ref-link-section-d125689063e791_1\">51<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Lizier, J. T., Prokopenko, M. & Zomaya, A. Y. Information modification and particle collisions in distributed computation. 
Chaos 20, 037109 (2010).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR52\" id=\"ref-link-section-d125689063e791_2\">52<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 53\" title=\"Bossomaier, T., Barnett, L., Harr\u00e9, M. & Lizier, J. T. An Introduction to Transfer Entropy: Information Flow in Complex Systems 1st edn (Springer International Publishing, 2016).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR53\" id=\"ref-link-section-d125689063e794\">53<\/a><\/sup>, which stands as the nonlinear, nonparametric extension of conditional GC<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 27\" title=\"Barnett, L., Barrett, A. B. & Seth, A. K. Granger causality and transfer entropy are equivalent for Gaussian variables. Phys. Rev. Lett. 103, 238701 (2009).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR27\" id=\"ref-link-section-d125689063e798\">27<\/a><\/sup>. Subsequent advancements of the method include multivariate formulations of CTE<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 45\" title=\"Lozano-Dur\u00e1n, A. & Arranz, G. Information-theoretic formulation of dynamical systems: causality, modeling, and control. Phys. Rev. Res. 4, 023195 (2022).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR45\" id=\"ref-link-section-d125689063e802\">45<\/a><\/sup> and momentary information transfer<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 54\" title=\"Pompe, B. & Runge, J. Momentary information transfer as a coupling measure of time series. Phys. Rev. 
E 83, 051122 (2011).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR54\" id=\"ref-link-section-d125689063e806\">54<\/a><\/sup>, which extends TE by examining the transfer of information at each time step. Other information-theoretic methods, derived from dynamical systems theory<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Liang, X. S. & Kleeman, R. Information transfer between dynamical system components. Phys. Rev. Lett. 95, 244101 (2005).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR55\" id=\"ref-link-section-d125689063e810\">55<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Liang, X. S. Information flow and causality as rigorous notions ab initio. Phys. Rev. E 94, 052201 (2016).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR56\" id=\"ref-link-section-d125689063e810_1\">56<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" title=\"Liang, X. S. Information flow within stochastic dynamical systems. Phys. Rev. E 78, 031113 (2008).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR57\" id=\"ref-link-section-d125689063e810_2\">57<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 58\" title=\"Liang, X. S. The Liang-Kleeman information flow: theory and applications. Entropy 15, 327&ndash;360 (2013).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR58\" id=\"ref-link-section-d125689063e813\">58<\/a><\/sup>, quantify causality as the amount of information that flows from one process to another as dictated by the governing equations.<\/p>\n<p>Another family of methods for causal inference relies on conducting conditional independence tests. 
This approach was popularized by the Peter-Clark algorithm (PC)<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 59\" title=\"Spirtes, P. & Glymour, C. An algorithm for fast recovery of sparse causal graphs. Soc. Sci. Comput. Rev. 9, 62&ndash;72 (1991).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR59\" id=\"ref-link-section-d125689063e820\">59<\/a><\/sup>, with subsequent extensions incorporating tests for momentary conditional independence (PCMCI)<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 23\" title=\"Runge, J., Nowack, P., Kretschmer, M., Flaxman, S. & Sejdinovic, D. Detecting and quantifying causal associations in large nonlinear time series datasets. Sci. Adv. 5, eaau4996 (2019).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR23\" id=\"ref-link-section-d125689063e824\">23<\/a>,<a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 60\" title=\"Runge, J. Conditional independence testing based on a nearest-neighbor estimator of conditional mutual information. In Proc. Twenty-First International Conference on Artificial Intelligence and Statistics, Vol. 84 (eds Storkey, A. & Perez-Cruz, F.) 938&ndash;947 (PMLR, 2018).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR60\" id=\"ref-link-section-d125689063e827\">60<\/a><\/sup>. PCMCI aims to optimally identify a reduced conditioning set that includes the parents of the target variable<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 61\" title=\"Runge, J. Modern causal inference approaches to investigate biodiversity-ecosystem functioning relationships. Nat. Commun. 
14, 1917 (2023).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR61\" id=\"ref-link-section-d125689063e831\">61<\/a><\/sup>. This method has been shown to be effective in accurately detecting causal relationships while controlling for false positives<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 23\" title=\"Runge, J., Nowack, P., Kretschmer, M., Flaxman, S. & Sejdinovic, D. Detecting and quantifying causal associations in large nonlinear time series datasets. Sci. Adv. 5, eaau4996 (2019).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR23\" id=\"ref-link-section-d125689063e835\">23<\/a><\/sup>. Recently, new PCMCI variants have been developed for identifying contemporaneous links<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 62\" title=\"Runge, J. Discovering contemporaneous and lagged causal relations in autocorrelated nonlinear time series datasets. In Conference on Uncertainty in Artificial Intelligence 1388&ndash;1397 (PMLR, 2020).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR62\" id=\"ref-link-section-d125689063e839\">62<\/a><\/sup>, latent confounders<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 63\" title=\"Gerhardus, A. & Runge, J. High-recall causal discovery for autocorrelated time series with latent confounders. In Advances in Neural Information Processing Systems Vol. 33 (eds Larochelle, H. et al.) 
12615&ndash;12625 (Curran Associates, Inc., 2020).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR63\" id=\"ref-link-section-d125689063e844\">63<\/a><\/sup>, and regime-dependent relationships<sup><a data-track=\"click\" data-track-action=\"reference anchor\" data-track-label=\"link\" data-test=\"citation-ref\" aria-label=\"Reference 64\" title=\"Saggioro, E., de Wiljes, J., Kretschmer, M. & Runge, J. Reconstructing regime-dependent causal relationships from observational time series. Chaos 30, 113115 (2020).\" href=\"https:\/\/www.nature.com\/articles\/s41467-024-53373-4#ref-CR64\" id=\"ref-link-section-d125689063e848\">64<\/a><\/sup>.<\/p>\n<p>The methods for causal inference discussed above have significantly advanced our understanding of cause-effect interactions in complex systems. Despite the progress, current approaches face limitations in the presence of nonlinear dependencies, stochastic interactions (i.e., noise), self-causation, mediator, confounder, and collider effects, to name a few. Moreover, they are not capable of classifying causal interactions as redundant, unique, and synergistic, which is crucial to identify the fundamental relationships within the system. Another gap in existing methodologies is their inability to quantify causality that remains unaccounted for due to unobserved variables. To address these shortcomings, we propose SURD: Synergistic-Unique-Redundant Decomposition of causality. SURD offers causal quantification in terms of redundant, unique, and synergistic contributions and provides a measure of the causality from hidden variables. The approach can be used to detect causal relationships in systems with multiple variables, dependencies at different time lags, and instantaneous links. 
We demonstrate the performance of SURD across a large collection of scenarios that have proven challenging for causal inference and compare the results to previous approaches.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Information theory, the science of message communication44, has also served as a framework for model-free causality quantification. The success of information theory relies on the notion of information as a fundamental property of physical systems, closely tied to the restrictions and possibilities of the laws of physics45,46. The grounds for causality as information are rooted [\u2026]<\/p>\n","protected":false},"author":661,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20,41],"tags":[],"class_list":["post-198691","post","type-post","status-publish","format-standard","hentry","category-futurism","category-information-science"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/198691","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/661"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=198691"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/198691\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=198691"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=198691"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=198691"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}