Thesis Record 108222001: Detailed Information




Name: Ko, Is-Shih (柯宜室)    Department: Department of Physics
論文名稱 使用壓縮演算法檢測兩時間序列之間資訊流方向的新方法
(A new method to detect the direction of Information Flow between two-time series using a compression algorithm)
Related Theses
★ Studying the connectivity and synchronized firing behavior of neuronal networks by laser ablation
★ The role of glial cells in synchronized neuronal activity and calcium wave propagation
★ A study of locomotion models of slime molds
★ Nonlinear behavior of ion-channel current fluctuations
★ A study of DNA conformation and DNA fragment distribution under the influence of spermidine
★ Thermophoretic behavior of DNA in microchannels
★ Effects of temperature and calcium dynamics on the heart rate of isolated hearts
★ Suppressing cardiac alternans in isolated hearts by nonlinear control, and the effects of temperature and contractile force on heart-rate variability
★ Thermo-diffusiophoresis and their Thermodynamics
★ Predicting Self-terminating Ventricular Fibrillation by Bivariate Data Analysis and Controlling Cardiac Alternans by Chaotic Attractors
★ Effects of periodic and sustained stretching on cardiac culture
★ Nonlinear dynamical analysis of a damped magnetic needle in an externally applied oscillating magnetic field
★ Controlling the firing time of a single neuron
★ Effects of nonlinear control on cardiac bifurcation phenomena
★ The influence of glial cells on synchronized bursting in neuronal networks
★ A Study of Synchronized Burst Mechanisms in Neuronal Cultures
Files: full-text file missing from the system
Abstract (Chinese): Detecting the direction of information flow is an important part of understanding causal relationships in complex systems. Transfer entropy is regarded as one method for detecting information flow. In the traditional approach, however, using transfer entropy requires fixing a history length in advance, and different history lengths can lead to completely opposite results. For anticipatory systems in particular, transfer entropy needs a longer history length to recover the correct direction of information flow, yet an overly long history length can introduce a large bias into the transfer entropy and make the results hard to interpret. In this study we propose a new method for estimating transfer entropy based on a compression algorithm, which we call compression transfer entropy. It can detect the direction of information flow without a history length having to be specified beforehand. We built two models with a single direction of information flow and also tested on real zebrafish data to evaluate the detection ability of compression transfer entropy, comparing it against the traditional transfer entropy method. The results show that, unlike the traditional method, compression transfer entropy obtains accurate results without any history length having to be specified.
Abstract (English): Detecting the direction of information flow is an important aspect of understanding causal relationships in complex systems. Transfer entropy is regarded as a method for detecting information flow. However, traditional approaches using transfer entropy require the history length to be determined beforehand, and different history lengths can lead to contradictory results. This is particularly challenging for anticipatory systems, as transfer entropy requires longer history lengths to obtain the correct direction of information flow, while excessively long history lengths introduce significant bias into transfer entropy, making the results difficult to interpret. In this study, we propose a new method, called "compressed transfer entropy" (cTE), that estimates transfer entropy using compression algorithms. This method detects the direction of information flow without the need to specify a history length in advance. Two models with unidirectional information flow, along with real-world zebrafish data, were used to evaluate the detection capability of compressed transfer entropy and to compare it with the traditional transfer entropy method. The results demonstrate that compressed transfer entropy provides accurate results without requiring a history length to be specified.
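The key ingredient of the method is an entropy-rate estimate obtained from a general-purpose compressor (Sec. 2.6; "Compression Entropy Rate" in the appendices). Below is a minimal Python sketch of such an estimator, assuming the time series has already been discretized into a small integer alphabet by the binning method; the function name and the half-sequence subtraction used to cancel the compressor's fixed overhead are illustrative choices, not necessarily the construction in the thesis's Appendix C.

```python
import lzma
import random

def compression_entropy_rate(symbols):
    """Estimate the entropy rate (bits per symbol) of a discrete sequence
    from its LZMA-compressed size. `symbols` is assumed to be a sequence of
    small non-negative integers, e.g. a binned time series. Taking the
    incremental size between the half and the full sequence cancels the
    compressor's fixed header overhead."""
    n = len(symbols)
    full_bits = 8 * len(lzma.compress(bytes(symbols)))
    half_bits = 8 * len(lzma.compress(bytes(symbols[: n // 2])))
    return (full_bits - half_bits) / (n - n // 2)

# Sanity check: an i.i.d. uniform 4-symbol source has entropy 2 bits/symbol.
random.seed(0)
s = [random.randrange(4) for _ in range(100000)]
print(compression_entropy_rate(s))  # approaches 2 for long sequences
```

Note that no history length appears anywhere: the compressor's internal dictionary plays the role of the past, which is what removes that parameter from the estimate.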
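For contrast, the traditional transfer entropy the abstract refers to can be written as a plug-in (histogram) estimate with an explicit history length k, the very parameter cTE is designed to avoid. The sketch below is the textbook estimator, not the thesis's own implementation; the clone model (Sec. 3.2.1), in which y is a delayed copy of x, provides a known unidirectional flow for checking it.

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y, k=1):
    """Plug-in estimate of transfer entropy X -> Y in bits, with history
    length k; x and y are equal-length sequences of discrete symbols."""
    joint = Counter()   # counts of (y_t, y-history, x-history)
    hist2 = Counter()   # counts of (y-history, x-history)
    pred = Counter()    # counts of (y_t, y-history)
    hist1 = Counter()   # counts of (y-history,)
    for t in range(k, len(y)):
        yh, xh = tuple(y[t - k:t]), tuple(x[t - k:t])
        joint[(y[t], yh, xh)] += 1
        hist2[(yh, xh)] += 1
        pred[(y[t], yh)] += 1
        hist1[yh] += 1
    m = len(y) - k
    return sum(
        (c / m) * log2(c * hist1[yh] / (hist2[(yh, xh)] * pred[(yt, yh)]))
        for (yt, yh, xh), c in joint.items()
    )

# Clone model: y is a one-step-delayed copy of x, so information flows x -> y.
random.seed(0)
x = [random.randrange(4) for _ in range(100000)]
y = [0] + x[:-1]
print(transfer_entropy(x, y, k=1))  # ~2 bits: x's past determines y
print(transfer_entropy(y, x, k=1))  # ~0 bits, up to small estimation bias
```

The number of histogram bins grows exponentially with k, which is the "sampling disaster" of Sec. 2.5.4 and the reason long history lengths bias the estimate.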
Keywords (Chinese): ★ transfer entropy
★ information theory
★ causality
Keywords (English): ★ transfer entropy
★ information theory
★ causality
Table of Contents

Abstract (Chinese)
Abstract (English)
Acknowledgement
Contents
List of Figures
Glossary

1 Introduction

2 Information Tools
2.1 Shannon Entropy
2.2 Mutual Information
2.3 Time-Delayed Mutual Information
2.4 Binning Method
2.5 Transfer Entropy
2.5.1 Entropy Rate
2.5.2 Active Information Storage
2.5.3 Components of Entropy
2.5.4 Sampling Disaster
2.6 Data Compression

3 Materials and Methods
3.1 Experimental Setup
3.1.1 Animals
3.1.2 Hardware
3.1.3 Software
3.1.4 Data Recording
3.2 Simulation Models
3.2.1 Clone Series
3.2.2 Logistic Map
3.3 Information Analysis
3.3.1 Binning Method
3.3.2 Time-Delayed Mutual Information (TDMI)
3.3.3 Entropy Rate
3.3.4 Transfer Entropy

4 Results
4.1 Compression Transfer Entropy
4.1.1 Negative Values
4.1.2 Entropy Rate
4.2 Simulation
4.2.1 Clone
4.2.2 Logistic Map
4.3 Experiment
4.3.1 Pair A
4.3.2 Pair B
4.3.3 Pair C

5 Conclusion and Discussion
5.1 Direction of Information Flow
5.2 Simulation
5.3 Experiment
5.4 Discussion: Negative Values of cTE
5.5 Discussion: Comparison Between TE Methods
5.5.1 Future Work

Bibliography

A Logistic Map without Noise
A.1 Compression Entropy Rate
A.2 Transfer Entropy

B Logistic Map with Noise
B.1 Compression Entropy Rate
B.2 Transfer Entropy

C Python Code
Advisor: Chan, Chi-Keung (陳志強)    Date of Approval: 2023-06-30
