Thesis 107522622: Detailed Record




Author: 王佳薇 (Nathaporn Wanchainawin)    Department: Computer Science and Information Engineering
Thesis Title: 結合圖形卷積與遞迴神經網路的關聯圖預測模型
(GC-RNN: A Novel Relational Graph Prediction Model Based on the Fusion of Graph Convolution Network and Recurrent Neural Network)
Related Theses
★ A Grouping Mechanism Based on Social Relations in edX Online Discussion Boards
★ A 3D Visualized Facebook Interaction System Built with Kinect
★ An Assessment System for Smart Classrooms Built with Kinect
★ An Intelligent Metropolitan Route Planning Mechanism for Mobile Device Applications
★ Dynamic Texture Transfer Based on Analysis of Key Motion Correlations
★ A Seam Carving System that Preserves Straight-Line Structures in Images
★ A Community Recommendation Mechanism Built on an Open Online Community Learning Environment
★ System Design of an Interactive Situated Learning Environment for English as a Foreign Language
★ An Emotional Color Transfer Mechanism with Skin-Color Preservation
★ A Gesture Recognition Framework for Virtual Keyboards
★ Error Analysis of the Fractional-Power Grey Generating Prediction Model and Development of a Computer Toolbox
★ Real-Time Human Skeleton Motion Construction Using Inertial Sensors
★ Real-Time 3D Modeling Based on Multiple Cameras
★ A Grouping Mechanism for Genetic Algorithms Based on Complementarity and Social Network Analysis
★ A Virtual Musical Instrument Performance System with Real-Time Hand Tracking
★ A Real-Time Virtual Musical Instrument Performance System Based on Neural Networks
  1. This electronic thesis is licensed for immediate open access.
  2. The open-access full text is authorized only for users' personal, non-profit retrieval, reading, and printing for the purpose of academic research.
  3. Please comply with the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization.

Abstract (Chinese) In the real world, much data over high-dimensional, irregular domains can be represented as graphs, such as social networks, brain connectomes, and word embeddings. A number of studies have explored how to build models on graph structures, such as the Graph Neural Network, Graph Convolutional Network, and Graph Attention Network; these networks operate on static graphs and are mostly used for classification tasks.
In this study we focus on prediction over dynamic, sequential data, such as the year-by-year changes of a co-author relation graph, or how players' interactions evolve over time. Our model combines the Variational Graph Auto-Encoder (VGAE), the Graph Convolutional Network (GCN), and Long Short-Term Memory (LSTM), a widely used recurrent neural network. We evaluate the model on five datasets: the High-energy physics theory citation network, Dynamic Face-to-Face Interaction Networks, CollegeMsg temporal network, Email-Eu-core temporal network, and the DBLP collaboration network and ground-truth communities.
Abstract (English) A great deal of real-world data over high-dimensional, irregular domains, such as social networks, brain connectomes, or word embeddings, can be represented by graphs. Various studies have developed models that operate on graph-structured data, such as the Graph Neural Network, Graph Convolutional Network, and Graph Attention Network. These networks operate on static graphs and are mostly used for classification tasks.
In this research, we are interested in the prediction of dynamic and sequential data, such as how a co-author relationship graph changes from year to year, or how players interact with each other over time. We build a model by combining the Variational Graph Auto-Encoder (VGAE), a variant of the Graph Convolutional Network (GCN), with Long Short-Term Memory (LSTM), a widely used recurrent neural network, and evaluate its performance on five datasets: the High-energy physics theory citation network, Dynamic Face-to-Face Interaction Networks, CollegeMsg temporal network, Email-Eu-core temporal network, and the DBLP collaboration network and ground-truth communities.
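To make the architecture described in the abstract concrete, the sketch below shows one plausible way to combine a GCN encoder with an LSTM for dynamic graph prediction: each snapshot is encoded with two graph-convolution layers, the per-node embeddings are fed through an LSTM over time, and the next snapshot's adjacency is decoded with a VGAE-style inner-product decoder. This is a minimal illustration written in PyTorch under assumptions of our own (the class names, layer sizes, and the single-LSTM readout are hypothetical), not the thesis's actual GC-RNN implementation.

```python
# Minimal sketch: GCN encoder per snapshot -> LSTM over time -> inner-product
# decoder for the next snapshot's adjacency. Illustrative only; not the
# thesis's exact GC-RNN model.
import torch
import torch.nn as nn


def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize A + I, as in Kipf & Welling's GCN."""
    a_hat = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)


class GCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # Propagation rule: ReLU(A_hat X W)
        return torch.relu(self.linear(adj_norm @ x))


class GCRNNSketch(nn.Module):
    """Encode each graph snapshot with GCN layers, evolve node states with an LSTM."""

    def __init__(self, feat_dim: int, hid_dim: int = 32, z_dim: int = 16):
        super().__init__()
        self.gcn1 = GCNLayer(feat_dim, hid_dim)
        self.gcn2 = GCNLayer(hid_dim, z_dim)
        self.lstm = nn.LSTM(z_dim, z_dim, batch_first=True)

    def forward(self, feats: torch.Tensor, adjs: torch.Tensor) -> torch.Tensor:
        # feats: (T, N, F) node features; adjs: (T, N, N) adjacency snapshots.
        z_seq = []
        for x, adj in zip(feats, adjs):
            a = normalize_adj(adj)
            z_seq.append(self.gcn2(self.gcn1(x, a), a))
        z_seq = torch.stack(z_seq, dim=1)      # (N, T, z_dim): one sequence per node
        h, _ = self.lstm(z_seq)
        z_last = h[:, -1, :]                   # per-node state after the last snapshot
        # Inner-product decoder: probability of each edge in the next snapshot.
        return torch.sigmoid(z_last @ z_last.t())


# Toy usage: 3 snapshots of a 5-node graph with 4-dimensional node features.
T, N, F = 3, 5, 4
feats = torch.rand(T, N, F)
adjs = (torch.rand(T, N, N) > 0.5).float()
adjs = ((adjs + adjs.transpose(1, 2)) > 0).float()   # make each snapshot symmetric
model = GCRNNSketch(feat_dim=F)
pred_adj = model(feats, adjs)                        # (N, N) edge probabilities
print(pred_adj.shape)
```

A full VGAE encoder would additionally output a mean and a variance and sample the latent node embeddings; the deterministic encoder above is kept only to keep the example short.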
Keywords (Chinese) ★ 圖卷積網路 (Graph Convolutional Network)
★ 長短期記憶 (Long Short-Term Memory)
★ 動態圖 (Dynamic graph)
★ GCN
★ LSTM
Keywords (English) ★ Graph Convolutional Network
★ Long Short-Term Memory
★ Dynamic graph
★ GCN
★ LSTM
Table of Contents
摘要 (Abstract in Chinese)
Abstract
Acknowledgements
Table of Contents
List of Figures
List of Tables
Explanation of Symbols
Chapter 1 Introduction
1.1 Introduction
1.2 Contribution
1.3 Structure of Thesis
Chapter 2 Related Works
2.1 Recurrent Neural Network
2.1.1 Long Short-Term Memory Networks
2.1.2 Variants of Long Short-Term Memory Architectures
2.2 Neural Networks on Graph
2.2.1 Graph Neural Network
2.2.2 Graph Convolutional Network
2.2.3 Variational Graph Auto-Encoder
2.2.4 Gated Graph Neural Networks
Chapter 3 Proposed Method
Chapter 4 Experiment
4.1 Dataset
4.1.1 Email-Eu-core temporal network
4.1.2 Dynamic Face-to-Face Interaction Networks
4.1.3 CollegeMsg temporal network
4.1.4 High-energy physics theory citation network
4.1.5 DBLP collaboration network and ground-truth communities
4.2 Experimental setup
4.2.1 Proposed Model Setup
4.2.2 Baselines
4.2.3 Evaluation Metrics
4.4 Experimental Results
Chapter 5 Conclusions and Future Works
References
References
[1] Hochreiter, Sepp, and Jürgen Schmidhuber. "Long short-term memory." Neural computation 9.8 (1997): 1735-1780.
[2] Gers, Felix A., and Jürgen Schmidhuber. "Recurrent nets that time and count." Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium. Vol. 3. IEEE, 2000.
[3] Graves, Alex. "Generating sequences with recurrent neural networks." arXiv preprint arXiv:1308.0850 (2013).
[4] Shi, Xingjian, et al. "Convolutional LSTM network: A machine learning approach for precipitation nowcasting." Advances in neural information processing systems. 2015.
[5] Scarselli, Franco, et al. "The graph neural network model." IEEE Transactions on Neural Networks 20.1 (2008): 61-80.
[6] Henaff, Mikael, Joan Bruna, and Yann LeCun. "Deep convolutional networks on graph-structured data." arXiv preprint arXiv:1506.05163 (2015).
[7] Kipf, Thomas N., and Max Welling. "Variational graph auto-encoders." arXiv preprint arXiv:1611.07308 (2016).
[8] Li, Yujia, et al. "Gated graph sequence neural networks." arXiv preprint arXiv:1511.05493 (2015).
[9] Paranjape, Ashwin, Austin R. Benson, and Jure Leskovec. "Motifs in temporal networks." Proceedings of the Tenth ACM International Conference on Web Search and Data Mining. 2017.
[10] C. Bai, S. Kumar, J. Leskovec, M. Metzger, J. F. Nunamaker, and V. S. Subrahmanian. "Predicting Visual Focus of Attention in Multi-person Discussion Videos." International Joint Conference on Artificial Intelligence (IJCAI), 2019.
[11] Pietro Panzarasa, Tore Opsahl, and Kathleen M. Carley. "Patterns and dynamics of users′ behavior and interaction: Network analysis of an online community." Journal of the American Society for Information Science and Technology 60.5 (2009): 911-932.
[12] J. Leskovec, J. Kleinberg and C. Faloutsos. Graphs over Time: Densification Laws, Shrinking Diameters and Possible Explanations. ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2005.
[13] J. Yang and J. Leskovec. Defining and Evaluating Network Communities based on Ground-truth. ICDM, 2012.
[14] Seo, Youngjoo, et al. "Structured sequence modeling with graph convolutional recurrent networks." International Conference on Neural Information Processing. Springer, Cham, 2018.
[15] Cao, Shaosheng, Wei Lu, and Qiongkai Xu. "Deep neural networks for learning graph representations." Thirtieth AAAI conference on artificial intelligence. 2016.
[16] Defferrard, Michaël, Xavier Bresson, and Pierre Vandergheynst. "Convolutional neural networks on graphs with fast localized spectral filtering." Advances in neural information processing systems. 2016.
[17] Kipf, Thomas N., and Max Welling. "Semi-supervised classification with graph convolutional networks." arXiv preprint arXiv:1609.02907 (2016).
[18] Schlichtkrull, Michael, et al. "Modeling relational data with graph convolutional networks." European Semantic Web Conference. Springer, Cham, 2018.
[19] Ying, Rex, et al. "Graph convolutional neural networks for web-scale recommender systems." Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 2018.
[20] Zhou, Jie, et al. "Graph neural networks: A review of methods and applications." arXiv preprint arXiv:1812.08434 (2018).
[21] J. Gehrke, P. Ginsparg, J. M. Kleinberg. Overview of the 2003 KDD Cup. SIGKDD Explorations 5(2): 149-151, 2003
Advisor: 施國琛 (Timothy K. Shih)    Approval Date: 2020-07-17