Master's/Doctoral Thesis 103522093 — Detailed Record




Author: Bo-Ren Zheng (鄭博仁)    Department: Computer Science and Information Engineering
Thesis Title: Implementation of an Eye-Tracking System Using the Kernelized Correlation Filter Object-Tracking Method
(Implementation of an eye-tracking system using Kernelized Correlation Filter)
Related Theses
★ Multi-type headache classification with extreme learning machines based on iris color space
★ Iris image quality assessment via weighted multi-score fusion
★ Deep-learning-based intelligent machine vision for industry: a case study on text localization and recognition
★ A real-time blood-pressure estimation algorithm based on deep learning
★ Deep-learning-based intelligent machine vision for industry: a case study on solder-joint quality inspection
★ A conditional iris-image generation framework based on the pix2pix deep learning model
★ Validation and calibration of a laser Doppler blood-flow prototype
★ Generating purpose-specific images with generative adversarial networks: iris images as a case study
★ A fast iris segmentation algorithm based on Faster R-CNN
★ Classifying diabetic retinopathy symptoms using deep learning, support vector machines, and teaching-learning-based optimization
★ Iris mask estimation using convolutional neural networks
★ Collaborative Drama-based EFL Learning with Mobile Technology Support in Familiar Context
★ A web service for automatic training of deep learning networks
★ A high-accuracy deep-learning-based algorithm for detecting cosmetic contact lenses
★ A CNN-based model for distinguishing real and fake faces
★ Deep learning foundation models and self-supervised learning
  1. The access permission for this electronic thesis is: consent to immediate open access.
  2. For theses that have reached their open-access date, users are authorized to search, read, and print the full electronic text for personal, non-profit academic research purposes only.
  3. Please observe the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract (Chinese) In recent years, eye trackers have become standard equipment in fields such as psychological analysis, disease analysis, and advertisement-placement analysis. In this study we built a wearable eye-tracking system: a Microsoft HD-6000 camera was modified into an infrared camera that clearly captures the iris-pupil boundary, and a dedicated mount attaches it to a glasses frame so that the camera records infrared images of the eye. The core function of an eye tracker is to detect the pupil position correctly. Our method uses the Kernelized Correlation Filter object-tracking algorithm to implement pupil tracking: the tracker finds the approximate pupil center, a circle-fitting method then determines the precise pupil center and radius, and a projective transform maps the pupil position to the gaze position on the screen being viewed. In our experiments, the average deviation from manually annotated pupil centers was only 2.02 pixels and the radius error was 1 pixel. Processing one frame takes only 0.0295 seconds, i.e., about 33.9 frames per second, which exceeds the 30 frames per second a typical camera provides. The result is a fast and accurate eye-tracking system.
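The abstract's final step, mapping pupil coordinates to on-screen gaze coordinates (Section 4-2 of the thesis lists an affine-transform and a quadratic-regression variant), can be sketched as a least-squares affine fit over calibration pairs. The coordinates below are hypothetical, and this is an illustrative sketch rather than the thesis's implementation:

```python
import numpy as np

def fit_affine(pupil_pts, screen_pts):
    # Least-squares estimate of the 2x3 affine map M with
    # screen = M @ [px, py, 1]; needs >= 3 non-collinear calibration pairs.
    P = np.column_stack([pupil_pts, np.ones(len(pupil_pts))])
    M, *_ = np.linalg.lstsq(P, screen_pts, rcond=None)
    return M.T

def apply_affine(M, pupil_pt):
    return M @ np.append(pupil_pt, 1.0)

# Hypothetical calibration: the user fixates four known screen targets
# while the tracker records the corresponding pupil centers.
pupil = np.array([[310.0, 240.0], [350.0, 240.0], [310.0, 270.0], [350.0, 270.0]])
screen = np.array([[0.0, 0.0], [1920.0, 0.0], [0.0, 1080.0], [1920.0, 1080.0]])
M = fit_affine(pupil, screen)
print(apply_affine(M, [330.0, 255.0]))  # midpoint of the pupil range -> screen center
```

With exactly three non-collinear targets the affine map is fully determined; extra targets are absorbed by the least-squares fit, which also dampens measurement noise in the recorded pupil centers.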
Abstract (English) In recent years, eye tracking has been used in areas such as psychology, human-computer interfaces, and e-learning. In this study we made a wearable eye-tracking system. We hand-made an IR camera by modifying a commercially available webcam (Microsoft HD-6000) and mounted it on a customized glasses frame. The resulting device is a wearable eye tracker that records a clear video of eye movement while the user wears the frame. The most important feature of an effective eye tracker is locating and tracking the pupil correctly in real time. In this research, we used the Kernelized Correlation Filter (KCF) to implement pupil tracking. By combining KCF with a self-developed circle-fitting algorithm, we detect and track the pupil location accurately. In the experimental results, comparing manual and automatic detection of the pupil center, the average center error is 2.02 pixels and the average radius error is 1.1 pixels. In terms of execution speed, processing one image takes only 0.0295 seconds, equivalent to 33.9 FPS (frames per second). Our eye-tracking system is therefore fast enough to fulfill the real-time requirement and is ready to be used in many practical situations.
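The abstract pairs KCF tracking with a self-developed circle-fitting step that refines the rough pupil center into a precise center and radius. That routine is not reproduced here; as a stand-in, the algebraic least-squares (Kåsa) circle fit below shows how boundary points can be turned into a center and radius in closed form. The synthetic boundary points are assumptions for illustration:

```python
import numpy as np

def fit_circle(xs, ys):
    # Kasa fit: rewrite x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    # as a linear least-squares problem in (2*cx, 2*cy, c).
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    b = xs ** 2 + ys ** 2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = sol[0] / 2.0, sol[1] / 2.0
    r = np.sqrt(sol[2] + cx ** 2 + cy ** 2)
    return cx, cy, r

# Synthetic pupil-boundary points on a circle of radius 20 centered at (100, 80).
theta = np.linspace(0.0, 2.0 * np.pi, 50, endpoint=False)
cx, cy, r = fit_circle(100 + 20 * np.cos(theta), 80 + 20 * np.sin(theta))
print(round(cx, 2), round(cy, 2), round(r, 2))  # → 100.0 80.0 20.0
```

Because the fit is linear, it runs in a fixed, tiny amount of time per frame, which is consistent with the sub-pixel center accuracy and real-time speed reported above.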
Keywords (Chinese) ★ eye tracker
★ correlation filter
★ kernel method
Keywords (English) ★ eye-tracking
★ correlation filter
★ Kernel method
Table of Contents
Chinese Abstract . . . i
English Abstract . . . ii
Table of Contents . . . iii
List of Figures . . . vi
List of Tables . . . viii
Chapter 1: Introduction . . . 1
1-1 Research Background . . . 1
1-2 Motivation . . . 3
1-3 Thesis Organization . . . 4
Chapter 2: Eye-Tracker Overview . . . 5
2-1 Eye Trackers and Their Algorithms . . . 5
2-2 Comparison of Eye-Tracking Systems . . . 6
2-3 Hardware . . . 8
2-3-1 Removing the IR-Cut Filter . . . 9
2-3-2 Building the IR Light Source . . . 10
Chapter 3: Core Algorithms of the Eye-Tracking System . . . 13
3-1 Algorithm Overview . . . 13
3-1-1 Kernel Types . . . 14
1. Gaussian Kernel . . . 15
2. Polynomial Kernel . . . 15
3-1-2 Feature Types . . . 16
1. Histogram of Oriented Gradients . . . 16
3-2 Kernelized Correlation Filters . . . 19
3-2-1 Linear Regression . . . 19
3-2-2 Circulant Shifts and Circulant Matrices . . . 20
3-2-3 Kernel Trick . . . 22
3-2-4 Detection . . . 24
3-2-5 Kernel Correlation . . . 25
3-3 Pupil Size Detection . . . 26
3-4 Circle Fitting . . . 27
3-5 Program Flow . . . 28
Chapter 4: System Description . . . 32
4-1 Calibration . . . 32
4-2 Screen Projection . . . 32
4-2-1 Affine Transform . . . 33
4-2-2 Quadratic Regression . . . 35
4-3 System Implementation and Interface Design . . . 35
Chapter 5: Experimental Results . . . 40
5-1 Database . . . 40
5-2 Pupil Localization Analysis . . . 41
5-2-1 Ground-Truth Error . . . 41
5-2-2 K-means Integration of Radial Difference (KIRD) . . . 42
5-2-3 Analysis of Pupil Localization Results . . . 45
5-3 Screen Projection . . . 48
Chapter 6: Conclusions and Future Work . . . 51
6-1 Conclusions . . . 51
6-2 Future Work . . . 51
Index . . . 53
References . . . 53
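In the linear-kernel case, the KCF machinery outlined in Chapter 3 (ridge regression over circulant shifts, trained and evaluated in the Fourier domain) collapses to a closed-form correlation filter. The 1D sketch below, using a synthetic Gaussian "pupil" blob, illustrates the train/detect cycle; the window size, features, and regularization value are simplifying assumptions, not the thesis's settings:

```python
import numpy as np

def train_filter(x, y, lam=1e-2):
    # Closed-form ridge regression in the Fourier domain:
    # W = conj(X) * Y / (|X|^2 + lam), one scalar division per frequency bin.
    X, Y = np.fft.fft(x), np.fft.fft(y)
    return np.conj(X) * Y / (X * np.conj(X) + lam)

def detect(W, z):
    # Correlation response over all cyclic shifts of z at once;
    # the argmax of the response is the detected target position.
    return np.real(np.fft.ifft(W * np.fft.fft(z)))

n = 64
grid = np.arange(n)
x = np.exp(-0.5 * ((grid - 20) / 3.0) ** 2)  # training frame: blob at index 20
y = np.exp(-0.5 * ((grid - 20) / 2.0) ** 2)  # desired response: peak at index 20

W = train_filter(x, y)
z = np.roll(x, 5)                            # next frame: blob shifted to index 25
print(int(np.argmax(detect(W, z))))          # → 25
```

Because every cyclic shift is scored by a single FFT/IFFT pair, detection costs O(n log n) rather than O(n²) per frame, which is what lets a KCF-based tracker run faster than the camera's frame rate, as the abstract reports.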
Advisor: Yung-Hui Li (栗永徽)    Date of Approval: 2016-08-18

For questions about this thesis, please contact the Extension Services Division, National Central University Library, TEL: (03)422-7151 ext. 57407, or by e-mail.