Master's/Doctoral Thesis 995203034: Detailed Record




Name: Yi-Chun Wei (魏憶君)    Department: Communication Engineering
Thesis Title: Multiple Object Tracking Using Compensated Motion Model for Mobile Cameras
(以運動補償模型為基礎之移動式相機多物件追蹤)
Related Theses
★ Illumination-Adaptive Video Encoder Design for In-Vehicle Video
★ An Improved Head Tracking System Based on Particle Filtering
★ Fast Mode Decision Algorithms for Spatial and CGS Scalable Video Encoders
★ A Robust Active Appearance Model Search Algorithm for Facial Expression Recognition
★ Multi-View Video Coding Combining Epipolar-Geometry-Based Inter-View Prediction and Fast Inter Prediction Direction Decision
★ A Stereo Matching Algorithm Based on Improved Belief Propagation in Homogeneous Regions
★ Baseball Trajectory Recognition Based on Hierarchical Boosting
★ Fast Reference Frame Direction Decision for Multi-View Video Coding
★ Fast Mode Decision for CGS Scalable Encoders Based on Online Statistics
★ An Improved Active Shape Model Matching Algorithm for Lip-Shape Recognition
★ Object Tracking on Mobile Platforms Based on a Compensated Motion Model
★ Matching-Cost-Based Occlusion Detection for Asymmetric Stereo Matching
★ Motion-Based Fast Mode Decision for Multi-View Video Coding
★ A Fast Local L-SVMs Ensemble Classifier for Place Image Recognition
★ Fast Depth Video Coding Mode Decision Oriented Toward High-Quality Synthesized Views
★ A Study of Occlusion Detection Based on Matching Cost Curve Features
  1. This electronic thesis is approved for immediate open access.
  2. The open-access full text is licensed for personal, non-commercial academic research only: searching, reading, and printing.
  3. Please comply with the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast this work without authorization.

Abstract (Chinese): Demand is growing for multiple object tracking with cameras mounted on mobile platforms such as mobile phones and vehicles. Unlike single object tracking, multiple object tracking must handle the mutual interference among the measurements of different objects; on a mobile platform, it must additionally estimate camera motion accurately despite a changing background to achieve high tracking accuracy. This thesis therefore proposes a multiple object tracking algorithm that combines particle filter (PF) estimates, used as measurements, with Kalman-filter-based joint probabilistic data association (JPDA). In the PF correction stage, the measurements of candidate particles are screened by a validation condition, and the association probabilities between objects and measurements are then computed via KF-based JPDA. For the motion model, this thesis combines SURF feature-point matching with an affine model to estimate camera motion, which drives a motion-compensated model that compensates object motion in the 2-D image. Experimental results show that the proposed algorithm also achieves good tracking performance for multiple object tracking with mobile cameras.
Abstract (English): The need for multiple object tracking on mobile platforms such as mobile phones and vehicles is growing. Different from single object tracking, multiple object tracking needs to consider the data association problem between objects and measurements. Moreover, for mobile platforms, how to estimate camera motion accurately plays a key role in the success of object tracking. Therefore, this thesis proposes a multiple object tracking algorithm that combines particle filter (PF) based measurements and Kalman filter (KF) based joint probabilistic data association (JPDA). Before the correction stage of the PF, particles may be filtered out based on the validation region of each object. In addition, this thesis combines SURF matching and an affine model to estimate camera motion for the motion-compensated model. Experimental results show that the proposed multiple object tracking algorithm performs well on mobile cameras.
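The abstract above describes estimating global camera motion by fitting an affine model to SURF feature-point matches. As a rough illustration only (not the thesis's implementation; the function name `estimate_affine` and the NumPy least-squares formulation are assumptions of this sketch), the six affine parameters can be recovered from matched point pairs by linear least squares:

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares fit of a 2-D affine model dst ≈ A @ src + t
    from matched point pairs (e.g., SURF correspondences).
    src, dst: (N, 2) arrays of matched coordinates, N >= 3."""
    n = src.shape[0]
    # Design matrix for the parameter vector [a11 a12 a21 a22 tx ty]:
    # even rows encode x' = a11*x + a12*y + tx,
    # odd rows encode  y' = a21*x + a22*y + ty.
    M = np.zeros((2 * n, 6))
    M[0::2, 0:2] = src
    M[0::2, 4] = 1.0
    M[1::2, 2:4] = src
    M[1::2, 5] = 1.0
    b = dst.reshape(-1)  # interleaved [x'0, y'0, x'1, y'1, ...]
    p, *_ = np.linalg.lstsq(M, b, rcond=None)
    A = p[:4].reshape(2, 2)
    t = p[4:]
    return A, t
```

In a tracking loop, the fitted (A, t) would warp each object's predicted position into the current frame, compensating camera motion before the filter update. A robust estimator (e.g., RANSAC over the matches) would normally guard against outlier correspondences from moving objects.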
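The abstract also mentions filtering out particles that fall outside each object's validation region before the PF correction stage. A common way to realize such a gate (this is a generic sketch, not the thesis's code; the chi-square threshold and function name are illustrative assumptions) is a Mahalanobis-distance test against the Kalman-predicted measurement:

```python
import numpy as np

def in_validation_region(z, z_pred, S, gamma=9.21):
    """Gate test: accept measurement z (e.g., a particle's measurement)
    only if its squared Mahalanobis distance to the predicted
    measurement z_pred, under innovation covariance S, is within
    the gate. gamma = chi-square threshold (9.21 ~ 99% for 2-D)."""
    nu = z - z_pred                   # innovation
    d2 = nu @ np.linalg.inv(S) @ nu   # squared Mahalanobis distance
    return d2 <= gamma
```

Measurements passing this gate for more than one object are the ones over which JPDA computes joint association probabilities; gated-out particles are simply excluded from the update.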
Keywords (Chinese) ★ mobile cameras (移動式相機)
★ tracking (追蹤)
★ multiple objects (多物件)
Keywords (English) ★ mobile cameras
★ tracking
★ multiple object
Table of Contents
Abstract (Chinese)
Abstract (English)
Acknowledgements
Table of Contents
List of Figures
List of Tables
Chapter 1: Introduction
1.1 Overview
1.2 Motivation
1.3 Research Approach
1.4 Thesis Organization
Chapter 2: Overview of Object Tracking Techniques
2.1 Recursive State Estimation
2.1.1 Bayesian Filter
2.1.2 Kalman Filter
2.1.3 Particle Filter
2.2 Object Tracking on Mobile Platforms
2.2.1 Object Tracking by Detection
2.2.2 Object Tracking Based on Camera Motion Estimation
2.3 Summary
Chapter 3: Overview of Multiple Object Tracking with Joint Probabilistic Data Association
3.1 Probabilistic Data Association (PDA)
3.2 Joint Probabilistic Data Association (JPDA)
3.3 Multiple Object Tracking Combining Particle Filters with JPDA
3.4 Summary
Chapter 4: Multiple Object Tracking Based on JPDA with Particle-Filter-Estimated Measurements
4.1 System Architecture
4.2 Global Motion Estimation Combining Speeded-Up Robust Features with the Affine Model
4.3 Multiple Object Tracking Algorithm Combining Particle-Filter-Estimated Measurements with JPDA
4.4 Summary
Chapter 5: Experimental Results and Discussion
5.1 Experimental Parameters and Test Sequences
5.2.1 Tracking Accuracy
5.2.2 Tracking Robustness
5.2.3 Estimation Differences Among Motion Models in Multiple Object Tracking
5.2.4 Computational Complexity
5.3 Summary
Chapter 6: Conclusions and Future Work
References
References
[1] L. Ling, E. Cheng, and I. S. Burnett, “Eight solutions of the essential matrix for continuous camera motion tracking in video augmented reality,” in Proc. IEEE International Conference on Multimedia and Expo, July 2011.
[2] U. Neumann and S. You, “Integration of region tracking and optical flow for image motion estimation,” in Proc. IEEE International Conference on Image Processing, Oct. 1998.
[3] S.-W. Yang and C.-C. Wang, “Multiple-model RANSAC for ego-motion estimation in highly dynamic environments,” in Proc. IEEE International Conference on Robotics and Automation, May 2009.
[4] I. J. Cox and S. L. Hingorani, “An efficient implementation of Reid’s multiple hypothesis tracking algorithm and its evaluation for the purpose of visual tracking,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 18, No. 2, pp. 138-150, Feb. 1996.
[5] Y. Bar-Shalom, F. Daum, and J. Huang, “The probabilistic data association filter,” IEEE Control Systems Magazine, Vol. 29, No. 6, pp. 82-100, Dec. 2009.
[6] A. Yilmaz, O. Javed, and M. Shah, “Object tracking: A Survey,” ACM Computing Surveys, Vol. 38, No. 4, pp. 1-45, Dec. 2006.
[7] G. Welch and G. Bishop, “An introduction to the Kalman filter,” Technical Report TR 95-041, University of North Carolina, Department of Computer Science, 1995.
[8] N. Gordon, D. Salmond, and A. Smith, “Novel approach to nonlinear/non-Gaussian Bayesian state estimation,” IEE Proceedings F (Radar and Signal Processing), Vol. 140, No. 2, pp. 107-113, Apr. 1993.
[9] K. Nummiaro, E. Koller-Meier, and L. V. Gool, “An adaptive color-based particle filter,” Image and Vision Computing, Vol. 21, No. 1, pp. 99-110, January 2003.
[10] C. R. Blanco, F. Jaureguizar, and N. Garcia, “Bayesian visual surveillance: A model for detecting and tracking a variable number of moving objects,” in Proc. IEEE International Conference on Image Processing, 2011.
[11] M. Meuter, U. Iurgel, S. Park, and A. Kummert, “The un-scented Kalman filter for pedestrian tracking from a moving host,” in Proc. IEEE Symposium on Intelligent Vehicles, June 2008.
[12] C.-C. Lin and W. Wolf, “MCMC-based feature-guided particle filtering for tracking moving objects from a moving platform,” in Proc. IEEE International Conference on Computer Vision, Oct. 2009.
[13] R. Vidal, “Multi-subspace methods for motion segmentation from affine, perspective and central panoramic cameras,” in Proc. IEEE International Conference on Robotics and Automation, pp. 1216-1221, Apr. 2005.
[14] J. Kang, K. Gajera, I. Cohen, and G. Medioni, “Detection and tracking of moving objects from overlapping EO and IR sensors,” in Proc. IEEE Conference on Computer Vision and Pattern Recognition, June 2004.
[15] M. D. Breitenstein, F. Reichlin, B. Leibe, E. K. Meier, and L. V. Gool, “Robust tracking-by-detection using a detector confidence particle filter,” in Proc. IEEE International Conference on Computer Vision, Oct. 2009.
[16] M. Ebrahimi and W. W. Mayol-Cuevas, “Adaptive sampling for feature detection, tracking, and recognition on mobile platforms,” IEEE Transactions on Circuits and Systems for Video Technology, Vol. 21, No. 10, pp. 1467-1475, Oct. 2011.
[17] C.-M. Huang, Y.-R. Chen, and L.-C. Fu, “Real-time object detection and tracking on a moving camera platform,” in Proc. ICROS-SICE International Joint Conference, Aug. 2009.
[18] J. Kang, I. Cohen, G. Medioni, and C. Yuan, “Detection and tracking of moving objects from a moving platform in presence of strong parallax,” in Proc. IEEE International Conference on Computer Vision, Oct. 2005.
[19] A. Ess, B. Leibe, K. Schindler, and L. V. Gool, “A mobile vision system for robust multi-person tracking,” in Proc. IEEE Conference on Computer Vision and Pattern Recognition, June 2008.
[20] J.-Y. Lu, Y.-C. Wei, and C.-W. Tang, “Visual tracking using compensated motion model for mobile cameras,” in Proc. IEEE International Conference on Image Processing, Sep. 2011.
[21] A. Seki and M. Okutomi, “Ego-Motion estimation by matching dewarped road regions using stereo images,” in Proc. IEEE International Conference on Image Processing, Sep. 2006.
[22] B. Jung and G. S. Sukhatme, “Real-time motion tracking from a mobile robot,” International Journal of Social Robotics, Vol. 2, No. 1, pp. 63-78, March 2010.
[23] C. R. Blanco, F. Jaureguizar, L. Salgado, and N. Garcia, “Target detection through motion segmentation and tracking restriction in aerial FLIR images,” in Proc. IEEE International Conference on Image Processing, Oct. 2007.
[24] Z. G. Liu, Y. F. Li, and P. Bao, “Stereo-based head tracking with motion compensation model,” in Proc. IEEE International Conference on Robotics and Biomimetics, Oct. 2004.
[25] Y. Jin, “Beyond ICONDENSATION: AICONDENSATION and AFCONDENSATION for visual tracking with low-level and high-level cues,” in Proc. IEEE International Conference on Image Processing, Nov. 2009.
[26] M. Kristan, “A local-motion-based probabilistic model for visual tracking,” Pattern Recognition, Vol. 42, No. 9, pp. 2160-2168, 2009.
[27] H. Bay, A. Ess, T. Tuytelaars, and L. Van Gool, “SURF: Speeded up robust features,” Computer Vision and Image Understanding, Vol. 110, No. 3, pp. 346–359, 2008.
[28] J. Shi and C. Tomasi, “Good features to track,” in Proc. IEEE Conference on Computer Vision and Pattern Recognition, pp. 593–600, June 1994.
[29] S. Challa, M. R. Morelande, D. Musicki, and R. J. Evans, Fundamentals of Object Tracking, Cambridge University Press, Cambridge, UK, 2011, pp. 115-211.
[30] C. Rasmussen and G. D. Hager, “Probabilistic data association methods for tracking complex visual objects,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23, No. 6, pp. 560-575, June 2001.
[31] K. C. Chang and Y. Bar-Shalom, “Joint probabilistic data association for multitarget tracking with possibly unresolved measurements and maneuvers,” IEEE Transactions on Automatic Control, Vol. 29, No. 7, pp. 585-594, July 1984.
[32] I. J. Cox and S. L. Hingorani, “An efficient implementation of Reid’s multiple hypothesis tracking algorithm and its evaluation for the purpose of visual tracking,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 18, No. 2, pp. 138-150, Feb. 1996.
[33] B. Chen and J. K. Tugnait, “Tracking of multiple maneuvering targets in clutter using IMM/JPDA filtering and fixed-lag smoothing,” Automatica, Vol. 37, No. 2, pp. 239-249, Feb. 2001.
[34] K. Bai, “Particle filter tracking with mean shift and joint probability data association,” in Proc. IEEE International Conference on Image Analysis and Image Processing, Apr. 2010.
[35] X. Song, B. Wen, J. Cui, H. Zhao, X. Shao, R. Shibasaki, and H. Zha, “A boosted JPDA particle filter for multi-target tracking,” in Proc. Asian Workshop on Sensing and Visualization of City-Human Interaction (AWSVCI), pp. 1-4, 2009.
[36] Y. Cai, N. de Freitas, and J. J. Little, “Robust visual tracking for multiple targets,” in Proc. European Conference on Computer Vision, 2006.
[37] M. S. Djouadi, Y. Morsly, and D. Berkani, “JPDA-IMM based particle filter algorithm for tracking highly maneuvering targets,” IEEE Transactions on Aerospace and Electronic Systems, Vol. 43, No. 1, pp. 23-35, Jan. 2007.
[38] N. T. Pham, K. Leman, M. Wong, and F. Gao, “Combining JPDA and particle filter for visual tracking,” in Proc. IEEE International Conference on Multimedia & Expo, 2010.
[39] S. Nikitidis, S. Zafeiriou, and I. Pitas, “Camera motion estimation using a novel online vector field model in particle filters,” IEEE Transactions on Circuits and Systems for Video Technology, Vol. 18, No. 8, pp. 1028-1039, Aug. 2008.
[40] J. L. Yang, D. Schonfeld, and M. Mohamed, “Robust video stabilization based on particle filter tracking of projected camera motion,” IEEE Transactions on Circuits and Systems for Video Technology, Vol. 19, No. 7, July 2009.
[41] PETS Database, 2001. http://www.hitech-projects.com/euprojects/cantata/datasets_cantata/dataset.html
Advisor: Chih-Wei Tang (唐之瑋)    Date of Approval: 2012-07-18

For questions about this thesis, please contact the Extension Services Division, National Central University Library, TEL: (03)422-7151 ext. 57407, or by e-mail.