Abstract (English)
Identifying causal relationships in complex systems is an important step toward determining the direction of information flow, and transfer entropy is a widely used measure for detecting it. Traditional transfer entropy estimators, however, require the history length to be fixed in advance, and different history lengths can yield contradictory results. This is particularly problematic for anticipatory systems, where transfer entropy needs long history lengths to recover the correct direction of information flow, yet excessively long histories introduce substantial estimation bias and make the results difficult to interpret. In this study, we propose a new method, "compressed transfer entropy," which estimates transfer entropy using compression algorithms and therefore detects the direction of information flow without a pre-specified history length. Two models with unidirectional information flow, together with real-world zebrafish data, were used to evaluate the detection capability of compressed transfer entropy against traditional transfer entropy estimators. The results demonstrate that compressed transfer entropy yields accurate results without requiring a history length to be specified.
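The core idea of compression-based entropy estimation can be illustrated with a minimal sketch. The fragment below is only an illustration of the general technique, not the thesis's actual estimator: the function names and the concatenation-based conditioning (approximating C(target | context) as C(context + target) − C(context)) are assumptions, and Python's `zlib` (DEFLATE) stands in for whichever compressor the method uses.

```python
import zlib


def compressed_size(data: bytes) -> int:
    """Length of the DEFLATE-compressed data: a proxy for description length."""
    return len(zlib.compress(data, 9))


def conditional_complexity(target: bytes, context: bytes) -> int:
    """Approximate C(target | context) as C(context + target) - C(context):
    the extra compressed bytes needed to describe the target once the
    context has already been seen by the compressor."""
    return compressed_size(context + target) - compressed_size(context)


def compressed_te(source: bytes, target: bytes) -> int:
    """Crude compression analogue of TE(source -> target): how much the
    source sequence shrinks the description of the target's next symbols
    beyond what the target's own past already provides."""
    c_self = conditional_complexity(target[1:], target[:-1])
    c_joint = conditional_complexity(target[1:], target[:-1] + source[:-1])
    return c_self - c_joint
```

Because DEFLATE uses a bounded match window and byte-level substring matching, such a sketch is only indicative; a practical estimator depends on how the time series are symbolized and how the conditioning sequences are constructed, and no history-length parameter appears anywhere in the code, which is the point of the compression-based approach.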