Master's/Doctoral Thesis 106525009: Detailed Record




Author: Chih-Jung Hsu (徐志榮)    Department: Graduate Institute of Software Engineering
Title: Predicting Transportation Demand based on AR-LSTMs Model with Multi-Head Attention
(Chinese title: 預測交通需求之分佈與數量—基於多重式注意力機制之 AR-LSTMs 模型)
Related theses (titles in Chinese):
★ 透過網頁瀏覽紀錄預測使用者之個人資訊與性格特質
★ 透過矩陣分解之多目標預測方法預測使用者於特殊節日前之瀏覽行為變化
★ 動態多模型融合分析研究
★ 擴展點擊流:分析點擊流中缺少的使用者行為
★ 關聯式學習:利用自動編碼器與目標傳遞法分解端到端倒傳遞演算法
★ 融合多模型排序之點擊預測模型
★ 分析網路日誌中有意圖、無意圖及缺失之使用者行為
★ 基於自注意力機制產生的無方向性序列編碼器使用同義詞與反義詞資訊調整詞向量
★ 探索深度學習或簡易學習模型在點擊率預測任務中的使用時機
★ 空氣品質感測器之故障偵測--基於深度時空圖模型的異常偵測框架
★ 以同反義詞典調整的詞向量對下游自然語言任務影響之實證研究
★ 利用輔助語句與BERT模型偵測詞彙的上下位關係
★ 結合時空資料的半監督模型並應用於PM2.5空污感測器的異常偵測
★ 利用 SCPL 分解端到端倒傳遞演算法
★ 藉由權重之梯度大小調整DropConnect的捨棄機率來訓練神經網路
★ 使用圖神經網路偵測 PTT 的低活躍異常帳號
  1. The author has consented to immediate open access for this electronic thesis.
  2. The open-access electronic full text is licensed for academic research only: personal, non-commercial retrieval, reading, and printing.
  3. Please comply with the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract (Chinese, translated): Smart transportation has become a key component of smart cities, and taxi demand prediction is an important problem within it. Effectively predicting the distribution of passenger demand at the next time step can reduce drivers' vacant cruising time, shorten passengers' waiting time, and increase the number of profitable trips, thereby maximizing the taxi industry's profit while mitigating the energy consumption and pollution caused by cruising for passengers.

This thesis combines taxi trip-record data with a deep learning architecture to propose an effective taxi demand prediction model. The model builds on the Long Short-Term Memory (LSTM) network, which is well suited to time-series data. Because traffic data exhibit long-period temporal patterns, previous approaches struggled to predict the transitions between peak and off-peak hours; we therefore use an attention mechanism to strengthen the handling of long-period traffic information, and design a multi-layer deep network architecture to improve prediction accuracy. We also define a custom loss function that considers both the mean squared error and the mean percentage error, because the mean squared error tends to under-estimate the number of ride requests in low-demand regions, while the mean percentage error tends to mis-estimate it in high-demand regions.
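The exact AR-LSTMs architecture is defined in Chapter 3 of the thesis; as a rough illustration of the attention components named above (scaled dot-product and multi-head attention in the sense of Vaswani et al. [7]), the following is a minimal NumPy sketch over hypothetical LSTM outputs. The shapes are invented for illustration, and the learned per-head projection matrices of a real model are omitted:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)   # (batch, T, T)
    scores -= scores.max(axis=-1, keepdims=True)       # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ V                                 # (batch, T, d_k)

def multi_head_attention(X, n_heads):
    """Split the feature dimension into n_heads, attend per head, concatenate.
    (Learned projection matrices omitted; a real model trains them.)"""
    batch, T, d_model = X.shape
    assert d_model % n_heads == 0
    d_k = d_model // n_heads
    heads = []
    for h in range(n_heads):
        Xh = X[:, :, h * d_k:(h + 1) * d_k]            # per-head slice
        heads.append(scaled_dot_product_attention(Xh, Xh, Xh))
    return np.concatenate(heads, axis=-1)              # (batch, T, d_model)

# Hypothetical shapes: 32 regions (batch), 24 past time steps, 64 hidden units.
lstm_outputs = np.random.randn(32, 24, 64)
context = multi_head_attention(lstm_outputs, n_heads=8)
print(context.shape)  # (32, 24, 64)
```

Re-weighting every past time step against every other lets the model pick up long-period patterns (e.g., the same peak hour on previous days) that a plain recurrent pass tends to wash out.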

To verify the generality of the model, we validate it on two datasets: the New York City taxi trip records and Taiwan Taxi's ride-hailing records in Taipei. In the experiments we compare traditional forecasting methods, shallow machine learning models, and deep learning models for predicting the taxi demand distribution. The results show that the proposed multi-head AR-LSTMs prediction model effectively improves prediction accuracy.
Abstract (English): Smart transportation is a crucial component of a smart city, and taxi demand forecasting is one of the important topics in smart transportation. If we can effectively predict taxi demand in the near future, we may be able to reduce the taxi vacancy rate, shorten passengers' waiting time, increase the number of trips per taxi, raise drivers' income, and diminish the energy consumption and pollution caused by cruising for passengers.

This paper proposes an efficient taxi demand prediction model based on a state-of-the-art deep learning architecture. Specifically, we use the LSTM model as the foundation because it is effective at modeling time series. We enhance the LSTM model with an attention mechanism so that traffic during both peak hours and off-peak periods can be better predicted, and we leverage a multi-layer architecture to increase prediction accuracy. Additionally, we design a loss function that incorporates both the absolute mean-square error (which tends to under-estimate demand in low-demand areas) and the relative mean-square error (which tends to mis-estimate demand in high-demand areas).
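The exact formula for this combined loss is given in Section 3.4 of the thesis; as a hedged sketch of the idea, the following assumes an unweighted sum of a squared-error term and a percentage-error term, with a small offset to guard against zero-demand regions (both the equal weighting and the offset value are illustrative assumptions):

```python
import numpy as np

def combined_loss(y_true, y_pred, eps=1.0):
    """Sketch of a combined loss: the squared-error term is dominated by
    high-demand regions, while the percentage-error term is dominated by
    low-demand regions, so summing them penalizes errors at both extremes.
    (Weighting and eps are assumptions; see Section 3.4 for the thesis's form.)"""
    mse = np.mean((y_true - y_pred) ** 2)
    mape = np.mean(np.abs(y_true - y_pred) / (y_true + eps))  # eps guards zero demand
    return mse + mape

# Hypothetical demand counts for three regions.
y_true = np.array([2.0, 3.0, 150.0])
y_pred = np.array([0.0, 4.0, 140.0])
print(combined_loss(y_true, y_pred))
```

With only the squared-error term, the 10-trip miss in the 150-trip region dwarfs the 2-trip miss in the 2-trip region; the percentage term restores weight to the latter, which is the imbalance the abstract describes.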

To validate our model, we conduct experiments on two real datasets: the NYC taxi demand dataset and the Taiwan Taxi demand dataset for Taipei City. We compare the proposed model with non-machine-learning baselines, traditional machine learning models, and deep learning models. Experimental results show that the proposed model outperforms the baseline models.
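The metrics used for these comparisons are specified in Section 4.4 of the thesis. Assuming the usual pair for taxi-demand prediction, RMSE and MAPE (an assumption on our part, mirroring the two error terms in the loss), a minimal sketch is:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error: emphasizes large absolute misses."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mape(y_true, y_pred, eps=1.0):
    """Mean absolute percentage error, in percent.
    eps avoids division by zero in regions with no demand (assumed offset)."""
    return np.mean(np.abs(y_true - y_pred) / (y_true + eps)) * 100.0

# Hypothetical predictions for three regions in one time slot.
y_true = np.array([10.0, 0.0, 55.0])
y_pred = np.array([12.0, 1.0, 50.0])
print(rmse(y_true, y_pred), mape(y_true, y_pred))
```

Reporting both metrics matters for the same reason the loss combines both terms: a model can score well on RMSE while badly mis-predicting low-demand regions, and vice versa.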
Keywords (Chinese, translated) ★ Taxi demand prediction
★ Deep learning
★ Recurrent neural network
★ Long short-term memory
★ Attention model
Keywords (English) ★ Taxi Demand Prediction
★ Deep Learning
★ Recurrent Neural Networks
★ Long Short-Term Memory Network
★ Attention
Table of Contents
Abstract (Chinese)
Abstract (English)
Contents
List of Figures
List of Tables
1. Introduction
1.1 Motivation
1.2 Objectives
1.3 Contributions
1.4 Thesis Organization
2. Preliminaries
2.1 Recurrent Neural Network (RNN)
2.2 Long Short-Term Memory (LSTM)
3. Model and Methods
3.1 Encoding Raw Coordinate Data
3.2 Residual Connections
3.3 Attention Mechanism
3.3.1 Scaled Dot-Product Attention
3.3.2 Multi-Head Attention
3.4 Loss Function
3.5 Residual-LSTMs Prediction Model
3.6 Attention-Residual-LSTMs Prediction Model
4. Experimental Results and Analysis
4.1 Datasets
4.1.1 NYC Taxi
4.1.2 Taiwan Taxi
4.2 Experimental Environment
4.3 Baseline Models and Methods
4.3.1 Historical Average
4.3.2 ARIMA
4.3.3 XGBoost
4.3.4 Linear Regression
4.3.5 DMVST-Net
4.4 Evaluation Metrics
4.5 Experimental Results
4.5.1 Analysis of Experimental Results
4.5.2 Comparison across Time Periods
5. Related Work
5.1 LSTM-MDN-Conditional
5.2 DMVST-Net
6. Conclusions and Future Work
6.1 Conclusions
6.2 Future Work
References
Appendix 1
References
[1] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," pp. 1097–1105, 2012.
[2] K. Simonyan and A. Zisserman, "Very deep convolutional networks for large-scale image recognition," arXiv preprint arXiv:1409.1556, 2014.
[3] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," pp. 770–778, 2016.
[4] S. Hochreiter and J. Schmidhuber, "Long short-term memory," Neural Computation, vol. 9, no. 8, pp. 1735–1780, 1997.
[5] Y. Wu, M. Schuster, Z. Chen, Q. V. Le, M. Norouzi, W. Macherey, M. Krikun, Y. Cao, Q. Gao, K. Macherey, et al., "Google's neural machine translation system: Bridging the gap between human and machine translation," arXiv preprint arXiv:1609.08144, 2016.
[6] R. Pascanu, T. Mikolov, and Y. Bengio, "Understanding the exploding gradient problem," CoRR, abs/1211.5063, vol. 2, 2012.
[7] A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, Ł. Kaiser, and I. Polosukhin, "Attention is all you need," pp. 5998–6008, 2017.
[8] H. Yao, F. Wu, J. Ke, X. Tang, Y. Jia, S. Lu, P. Gong, J. Ye, and Z. Li, "Deep multi-view spatial-temporal network for taxi demand prediction," 2018.
[9] C.-J. Hsu (徐志榮) and H.-H. Chen (陳弘軒), "多層式短中長期記憶模型之即時計程車需求預測," Conference on Technologies and Applications of Artificial Intelligence, 2018.
[10] J. Xu, R. Rahmatizadeh, L. Bölöni, and D. Turgut, "Real-time prediction of taxi demand using recurrent neural networks," IEEE Transactions on Intelligent Transportation Systems, vol. 19, no. 8, pp. 2572–2581, 2017.
[11] NYC Taxi and Limousine Commission et al., "Taxi and Limousine Commission (TLC) trip record data."
[12] D. P. Kingma and J. Ba, "Adam: A method for stochastic optimization," arXiv preprint arXiv:1412.6980, 2014.
[13] R. J. Hyndman, Y. Khandakar, et al., "Automatic time series forecasting: the forecast package for R," no. 6/07, 2007.
[14] L. Moreira-Matias, J. Gama, M. Ferreira, J. Mendes-Moreira, and L. Damas, "Predicting taxi–passenger demand using streaming data," IEEE Transactions on Intelligent Transportation Systems, vol. 14, no. 3, pp. 1393–1402, 2013.
Advisor: Hung-Hsuan Chen (陳弘軒)    Date of Approval: 2019-7-2
