Thesis 101282605: Detailed Record




Name: Andreas Felix Goetze (柯佐安)   Department: Department of Physics
Thesis Title: Sorted local transfer entropy and its application to reconstructing neural networks and the inverse Ising problem
Related Theses
★ Case study of an extended Fitzhugh-Nagumo model with chemical synaptic coupling and application to C. elegans functional neural circuits
★ Clustering phenomena of two-dimensional inelastic granular particles
★ Computer simulation study of long helical polymer chains under tension
★ Study of complex flows of granular materials
★ Two-dimensional Monte Carlo simulation study of polymers in binary mixed solvents
★ Study of charged polymers adsorbed on charged surfaces
★ Physics of self-entangled knotted polymers
★ Study of polymer chains under strong extensional flow
★ Studying neural network connectivity and synchronized firing behavior by laser ablation
★ Theoretical study of optimized network growth models
★ Behavior of polymer chains under AC electric fields or flow fields
★ Numerical simulation of bursting neurons
★ Thermophoretic behavior of DNA in microchannels
★ Simulation of skin cell proliferation and tumor growth
★ Effects of coupling in nonlinear systems: model studies and nonlinear analysis
★ Analyzing network properties from node time series, with application to cultured neural and cardiac cells in vitro
File Access
  1. This electronic thesis is approved for immediate open access.
  2. The open-access full text is licensed to users solely for personal, non-profit retrieval, reading, and printing for the purpose of academic research.
  3. Please comply with the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract (Chinese) Recent studies have used transfer entropy to measure the effective connectivity among large populations of neurons. Analysis of these networks has provided novel insight into information transfer in neural networks. Information transfer is quantified by estimating transfer entropy, a model-free measure of the directed linear and nonlinear interactions between neurons. High information transfer between two spike trains is evidence of an excitatory synapse between the neurons, but inhibitory synapses also show significant information transfer. We extend the effective-connectivity analysis by revealing whether the information transfer comes from an excitatory or an inhibitory synapse. To distinguish these interaction types, we analyze the local transfer entropies, which are oppositely signed for each transfer type, allowing us to define the sorted local transfer entropy as the discriminating quantity. We further explore dynamic-state conditioning for estimating transfer entropy, in order to remove network effects during highly synchronized bursting events in the neural population. As in previous studies, we apply these methods to spike trains simulated from networks of Izhikevich neurons with random synaptic delays and plasticity, and show that inhibitory and excitatory synapses can be inferred and the network reconstruction improved. Furthermore, we show that sorted local transfer entropy is also useful for solving the inverse Ising problem. Our simulations show that, for a randomly connected Ising network, we can infer the strengths of both positive and negative interactions by estimating the pairwise sorted local transfer entropies.
Abstract (English) Recent studies have used transfer entropy to measure the effective connectivity among large populations of neurons. Analyzing these networks has given novel insight into the information transfer in neural networks [1]. Information transfer is quantified by estimating transfer entropy [2], a model-free measure of the directed linear and non-linear interactions between neurons. High information transfer between two spike trains is evidence of an underlying excitatory synapse between the neurons. However, inhibitory synapses also show significant information transfer. We extend the effective connectivity analysis by revealing whether the information transfer comes from an excitatory or an inhibitory synapse. To distinguish these types of interactions we analyze the local transfer entropies [3], which are oppositely signed for each interaction type, allowing us to define the sorted local transfer entropy as the discriminating quantity. We further explore dynamic state conditioning for estimating transfer entropy [4] in order to remove the network effects during highly synchronized bursting events in the neural population, which are not indicative of a direct synaptic interaction. Applying these techniques to the spike trains of simulated networks of Izhikevich neurons with random synaptic delays and connection weights evolved under spike-timing-dependent plasticity, as in a previous study [5], we show that inhibitory and excitatory synapses can be inferred and the network reconstruction improved.
Furthermore, we show that sorted local transfer entropy is also useful for solving the inverse Ising problem [6]. In our simulations we show that, for a system of randomly connected Ising nodes, we can infer the interaction strength for both positive and negative interactions by estimating the pairwise sorted local transfer entropies.
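The central quantity of the abstract can be sketched in a few lines. Below is a minimal plug-in estimator of the local transfer entropy for binary time series with history length one, applied to a toy coupled pair; the function name, coupling probabilities, and random seed are illustrative assumptions, not the thesis's actual simulation setup, and the thesis's exact sorting scheme for the sorted local transfer entropy is not reproduced here.

```python
import numpy as np

def local_transfer_entropy(x, y, base=2.0):
    """Plug-in estimate of the local transfer entropy t_{y->x}(n) for two
    binary time series, using history length k = l = 1:
        t(n) = log p(x_{n+1} | x_n, y_n) - log p(x_{n+1} | x_n)
    The average of the local values is the usual (non-negative) transfer
    entropy; individual local values can be negative."""
    x, y = np.asarray(x, dtype=int), np.asarray(y, dtype=int)
    xf, xn, yn = x[1:], x[:-1], y[:-1]   # target future, target past, source past
    counts = np.zeros((2, 2, 2))         # joint counts over (x_{n+1}, x_n, y_n)
    np.add.at(counts, (xf, xn, yn), 1)
    with np.errstate(divide="ignore", invalid="ignore"):
        p_cond_xy = counts / counts.sum(axis=0, keepdims=True)   # p(x_{n+1}|x_n,y_n)
        c_xx = counts.sum(axis=2)
        p_cond_x = c_xx / c_xx.sum(axis=0, keepdims=True)        # p(x_{n+1}|x_n)
    return (np.log(p_cond_xy[xf, xn, yn]) - np.log(p_cond_x[xf, xn])) / np.log(base)

# Toy coupled pairs (illustrative parameters): the target copies
# (excitatory-like) or negates (inhibitory-like) the source one time
# step later, with a 10% flip probability.
rng = np.random.default_rng(42)
n = 20000
src = rng.integers(0, 2, n)
flip = (rng.random(n - 1) < 0.1).astype(int)
tgt_exc = np.concatenate(([0], src[:-1] ^ flip))         # follows the source
tgt_inh = np.concatenate(([0], (1 - src[:-1]) ^ flip))   # opposes the source

lte_exc = local_transfer_entropy(tgt_exc, src)
lte_inh = local_transfer_entropy(tgt_inh, src)
te_ind = local_transfer_entropy(rng.integers(0, 2, n), src).mean()

# Both coupling types carry the same average transfer entropy, so the mean
# alone cannot tell them apart; the local values at source-aligned events
# (x_{n+1} = y_n) have opposite signs, which is the asymmetry that a
# sorted local transfer entropy can exploit.
print(lte_exc.mean(), lte_inh.mean(), te_ind)
```

Averaging `lte_exc` or `lte_inh` recovers a clearly positive transfer entropy for both coupling types, while the independent pair gives a value near zero; restricting the local values to source-aligned events separates the two coupling signs.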
Keywords (Chinese) ★ transfer entropy
★ neural networks
★ effective connectivity
★ Ising model
Keywords (English) ★ transfer entropy
★ neural networks
★ effective connectivity
★ Ising model
Table of Contents
Chinese Abstract i
Abstract ii
Acknowledgements iv
1 Introduction 1
1.1 Background 1
1.2 Summary of chapters 4
2 Background on Information Theory 6
2.1 Information theory basics 6
2.2 Information theory basics for a discrete dynamical process 8
2.2.1 Dynamic complex system of interacting components 9
2.3 Pointwise information theory 10
2.3.1 Canonic example: two binary variables 12
2.3.2 Sorted pointwise mutual information 13
3 Network Reconstruction in Neural Networks 15
3.1 Methodology 15
3.1.1 Transfer entropy 15
3.1.2 Local transfer entropy 17
3.1.3 Sorted local transfer entropy 17
3.1.4 Surrogate tests 19
3.1.5 Interaction delay 19
3.1.6 Multivariate transfer entropy 20
3.1.7 State-conditioned transfer entropy 22
3.1.8 GPU computing 23
3.2 Fitzhugh-Nagumo motif simulations 23
3.3 Izhikevich network simulations 27
3.4 Reconstruction 30
3.5 Results 32
3.5.1 Fitzhugh-Nagumo neurons 32
3.5.2 Firing dynamics in large neural networks 32
3.5.3 Izhikevich simulations with regularly firing neurons 35
3.5.4 Izhikevich simulations with bursting dynamics 35
3.5.5 Performance summary 43
3.5.6 Clustering in the interaction classification 44
4 Reconstructing Positive and Negative Couplings in Ising Spin Networks by Sorted Local Transfer Entropy 46
4.1 Introduction 47
4.2 Transfer Entropy and Sorted Local Transfer Entropy 50
4.3 Ising Spin Network Reconstruction Using SLTE 53
4.3.1 Glauber dynamics: asynchronous update 54
4.3.2 Glauber dynamics: synchronous update 57
4.3.3 Kawasaki spin exchange dynamics 62
4.4 Summary and Outlook 63
5 Conclusion 67
References 69
References
[1] S. Nigam, M. Shimono, S. Ito, F.-C. Yeh, N. Timme, M. Myroshnychenko, C. C. Lapish, Z. Tosi, P. Hottowy, W. C. Smith, and others, The Journal of Neuroscience 36, 670 (2016).
[2] T. Schreiber, Physical Review Letters 85, 461 (2000).
[3] J. T. Lizier, M. Prokopenko, and A. Y. Zomaya, Phys. Rev. E 77, 026110 (2008).
[4] O. Stetter, D. Battaglia, J. Soriano, and T. Geisel, PLoS Computational Biology 8, (2012).
[5] S. Ito, M. E. Hansen, R. Heiland, A. Lumsdaine, A. M. Litke, and J. M. Beggs, PLoS ONE 6, (2011).
[6] F. Goetze and P.-Y. Lai, Physical Review E 100, (2019).
[7] L. Barnett, Physical Review Letters 103, (2009).
[8] M. Wibral, R. Vicente, and J. T. Lizier, Directed Information Measures in Neuroscience (2014).
[9] O. Kwon and J.-S. Yang, EPL 82, 68003 (2008).
[10] E. Crosato, L. Jiang, V. Lecheval, J. T. Lizier, X. R. Wang, P. Tichit, G. Theraulaz, and M. Prokopenko, Swarm Intell 12, 283 (2018).
[11] J. Runge, J. Heitzig, N. Marwan, and J. Kurths, Phys. Rev. E 86, 061121 (2012).
[12] M. Pellicoro and S. Stramaglia, Physica A: Statistical Mechanics and Its Applications 389, 4747 (2010).
[13] L. Barnett, J. T. Lizier, M. Harré, A. K. Seth, and T. Bossomaier, Phys. Rev. Lett. 111, 177203 (2013).

[14] L. Novelli, P. Wollstadt, P. Mediano, M. Wibral, and J. T. Lizier, (2019).
[15] C. E. Shannon, Bell System Technical Journal 27, 623 (1948).
[16] D. J. MacKay, Information Theory, Inference, and Learning Algorithms (Citeseer, 2003).
[17] J. T. Lizier, The Local Information Dynamics of Distributed Computation in Complex Systems (Springer Science & Business Media, 2012).
[18] C. D. Manning, C. D. Manning, and H. Schütze, Foundations of Statistical Natural Language Processing (MIT Press, 1999).
[19] B. Gourevitch and J. J. Eggermont, Journal of Neurophysiology 97, 2533 (2007).
[20] E. M. Izhikevich, Neural Computation 18, 245 (2006).
[21] M. Wibral, N. Pampu, V. Priesemann, F. Siebenhühner, H. Seiwert, M. Lindner, J. T. Lizier, and R. Vicente, PloS One 8, (2013).
[22] J. Runge, Chaos: An Interdisciplinary Journal of Nonlinear Science 28, 075310 (2018).
[23] A. Palmigiano, T. Geisel, F. Wolf, and D. Battaglia, Nat Neurosci advance online publication, (2017).
[24] A. Borisyuk, A. Friedman, B. Ermentrout, and D. Terman, Tutorials in Mathematical Biosciences I (Springer Berlin Heidelberg, Berlin, Heidelberg, 2005).
[25] R. FitzHugh, Biophysical Journal 1, 445 (1961).
[26] A. L. Hodgkin and A. F. Huxley, The Journal of Physiology 117, 500 (1952).
[27] E. Izhikevich, IEEE Transactions on Neural Networks 14, 1569 (2003).
[28] S. Song, K. D. Miller, and L. F. Abbott, Nat Neurosci 3, 919 (2000).
[29] T. Fawcett, Pattern Recognition Letters 27, 861 (2006).
[30] M. Timme, Europhysics Letters (EPL) 76, 367 (2006).
[31] W. Wang, Y.-C. Lai, and C. Grebogi, Physics Reports 644, 1 (2016).
[32] M. Nitzan, J. Casadiego, and M. Timme, Science Advances 3, (2017).
[33] D. Yu, M. Righero, and L. Kocarev, Physical Review Letters 97, (2006).
[34] M. Timme, Physical Review Letters 98, (2007).
[35] S. G. Shandilya and M. Timme, New Journal of Physics 13, 013004 (2011).
[36] Z. Levnajić and A. Pikovsky, Physical Review Letters 107, (2011).
[37] Z. Levnajić and A. Pikovsky, Scientific Reports 4, (2015).
[38] E. S. C. Ching, P.-Y. Lai, and C. Y. Leung, Physical Review E 88, (2013).
[39] E. S. C. Ching, P.-Y. Lai, and C. Y. Leung, Physical Review E 91, (2015).
[40] E. S. C. Ching and H. C. Tam, Physical Review E 95, (2017).
[41] P.-Y. Lai, Physical Review E 95, (2017).
[42] H. Tam, E. S. Ching, and P.-Y. Lai, Physica A: Statistical Mechanics and Its Applications (2018).
[43] H. J. Kappen and F. B. Rodríguez, Neural Computation 10, 1137 (1998).
[44] Y. Roudi, J. Tyrcha, and J. Hertz, Physical Review E 79, (2009).
[45] Y. Roudi and J. A. Hertz, Physical Review Letters 106, (2011).
[46] H.-L. Zeng, E. Aurell, M. Alava, and H. Mahmoudi, Physical Review E 83, (2011).
[47] H.-L. Zeng, M. Alava, E. Aurell, J. Hertz, and Y. Roudi, Physical Review Letters 110, 210601 (2013).
[48] P. Zhang, Journal of Statistical Physics 148, 502 (2012).
[49] E. Aurell and M. Ekeberg, Physical Review Letters 108, (2012).
[50] S. L. Dettmer, H. C. Nguyen, and J. Berg, Physical Review E 94, 052116 (2016).
[51] J. Albert and R. H. Swendsen, Physics Procedia 57, 99 (2014).
[52] J. Albert and R. H. Swendsen, Physica A: Statistical Mechanics and Its Applications 483, 293 (2017).
[53] T. Bossomaier, L. Barnett, M. Harré, and J. T. Lizier, An Introduction to Transfer Entropy (Springer International Publishing, Cham, 2016).
[54] F. Doria, R. Erichsen Jr., D. Dominguez, M. González, and S. Magalhaes, Physica A: Statistical Mechanics and Its Applications 422, 58 (2015).
[55] M. Li, Y. Fan, J. Wu, and Z. Di, International Journal of Modern Physics B 27, 1350146 (2013).
[56] H. Lau and P. Grassberger, Physical Review E 87, (2013).
[57] Z. Deng, J. Wu, and W. Guo, Physical Review E 90, (2014).
[58] J. G. Orlandi, O. Stetter, J. Soriano, T. Geisel, D. Battaglia, and J. Garcia-Ojalvo, PLoS ONE 9, (2014).
[59] H. P. Robinson, M. Kawahara, Y. Jimbo, K. Torimitsu, Y. Kuroda, and A. Kawana, Journal of Neurophysiology 70, 1606 (1993).
[60] L. C. Jia, M. Sano, P.-Y. Lai, and C. K. Chan, Physical Review Letters 93, (2004).
[61] P.-Y. Lai, L. C. Jia, and C. K. Chan, Physical Review E 73, (2006).
[62] H. Song, C.-C. Chen, J.-J. Sun, P.-Y. Lai, and C. K. Chan, Physical Review E 90, 012703 (2014).
[63] M. Prokopenko, J. Lizier, and D. Price, Entropy 15, 524 (2013).
[64] M. Prokopenko and I. Einav, Physical Review E 91, (2015).
[65] G. V. Steeg and A. Galstyan, (2011).
[66] G. Ver Steeg and A. Galstyan, CoRR (2012).
[67] T. Tomokiyo and M. Hurst, in Proceedings of the ACL 2003 Workshop on Multiword Expressions: Analysis, Acquisition and Treatment-Volume 18 (Association for Computational Linguistics, 2003), pp. 33–40.
[68] F. Goetze, P.-Y. Lai, and C. Chan, BMC Neuroscience 16, (2015).
[69] G. Bouma, in Proceedings of the Biennial GSCL Conference (2009).
[70] C. Finn and J. T. Lizier, arXiv Preprint arXiv:1801.09223 (2018).
[71] B. Gourévitch and J. J. Eggermont, Journal of Neurophysiology 97, 2533 (2007).
[72] P. Erdős and A. Rényi, Publ. Math. Inst. Hung. Acad. Sci 5, 17 (1960).
[73] B. Bollobás, Random Graphs (Academic, London, 1985).
[74] R. J. Glauber, Journal of Mathematical Physics 4, 294 (1963).
[75] P. Wollstadt, M. Martínez-Zarzuela, R. Vicente, F. J. Díaz-Pernas, and M. Wibral, arXiv Preprint arXiv:1401.4068 (2014).
Advisor: Pik-Yin Lai (黎璧賢)   Date of Approval: 2020-01-20

For questions regarding this thesis, please contact the Promotion Services Division of the National Central University Library, TEL: (03)422-7151 ext. 57407, or by e-mail.