NCU Institutional Repository (中大機構典藏) - Item 987654321/61021


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/61021


    Title: 適應亮度與天候變化的前車碰撞警示系統;Light-and-Weather-Adapted Forward Collision Warning System
    Authors: 蔡昀昇;Cai, Yun-Sheng
    Contributors: 資訊工程學系 (Department of Computer Science and Information Engineering)
    Keywords: forward vehicle detection;collision warning;computer vision;vehicle safety;intelligent vehicles;Forward Collision Warning;vehicle detection
    Date: 2013-07-25
    Issue Date: 2013-08-22 12:09:57 (UTC+8)
    Publisher: 國立中央大學 (National Central University)
    Abstract: Advanced driver assistance systems (ADAS) have become an important research topic in recent years: advanced technology assists the driver, making driving safer and substantially reducing traffic accidents. Current driver assistance systems cover many functions, such as lane departure warning, pedestrian detection with braking, and forward vehicle detection with collision warning. This study analyzes the brightness variations that come with different weather, proposes a forward vehicle detection technique that adapts to them together with criteria for stable detection, and develops these into a forward collision warning system that adapts to changes in brightness and weather.
    First, we propose a criterion for distinguishing day from night, and a different algorithm is used for each. For this criterion, we sample a fixed number of pixels at fixed positions in the 320×240 image to represent the whole frame, compute their average intensity and the proportion of sampled pixels whose intensity falls below a brightness threshold, and use a threshold on this proportion to separate day from night.
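    For illustration, a minimal sketch of such a day/night classifier is given below. It is our own reconstruction, not the code from the thesis: the sampling grid, the dark-pixel intensity threshold, and the dark-ratio cut-off are assumed values.

```python
import numpy as np

# Assumed parameters; the thesis does not publish its exact thresholds.
DARK_PIXEL_THRESHOLD = 60   # intensity below which a sampled pixel counts as "dark"
DARK_RATIO_THRESHOLD = 0.5  # dark-pixel fraction above which the frame is called "night"
SAMPLE_STEP = 8             # sample a fixed grid of pixels instead of the full image

def is_night(gray_frame: np.ndarray) -> bool:
    """Classify a 320x240 grayscale frame as night (True) or day (False)."""
    # Fixed number of pixels at fixed positions representing the whole image.
    samples = gray_frame[::SAMPLE_STEP, ::SAMPLE_STEP].astype(np.float32)
    mean_intensity = samples.mean()
    dark_ratio = (samples < DARK_PIXEL_THRESHOLD).mean()
    # Night: low average brightness and a large share of dark samples.
    return dark_ratio > DARK_RATIO_THRESHOLD and mean_intensity < 2 * DARK_PIXEL_THRESHOLD

# Usage example on a synthetic dark frame, which is classified as night.
if __name__ == "__main__":
    frame = np.full((240, 320), 30, dtype=np.uint8)
    print("night" if is_night(frame) else "day")
```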
    In the daytime forward vehicle detection algorithm, we first define a detection region from the detected lane markings so that objects off the road do not affect detection, and apply a gray-level second-difference operator within this region to extract vertical and horizontal edges of different strengths. To adapt to different weather conditions, we relate the edge-strength statistics to the binarization threshold, derive a dynamic threshold, and detect edge points in the image. The edge points are then linked into horizontal and vertical edge lines, with horizontal edge lines divided into negative and positive horizontal edges. For each horizontal edge we check, at its two end points, whether there is a significant vertical edge or a left-right symmetric pair of vertical edges. If a horizontal edge satisfies this condition, we further check whether the ratio of its width to the lane width lies within the road-width ratio we define; if so, a candidate vehicle is generated.
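    The core daytime steps can be sketched as follows. This is a simplified reconstruction, not the thesis implementation: it assumes a 1-D second-difference mask [-1, 2, -1] and picks the dynamic binarization threshold as an assumed percentile of the edge-strength histogram, and it omits edge linking, the symmetry check, and the lane-width test. The test image name is hypothetical.

```python
import numpy as np
import cv2

def second_difference_edges(gray: np.ndarray):
    """Horizontal and vertical edge responses from a 1-D second-difference mask [-1, 2, -1]."""
    kernel = np.array([-1.0, 2.0, -1.0], dtype=np.float32)
    src = gray.astype(np.float32)
    # Second difference along x highlights vertical edges; along y, horizontal edges.
    vertical = cv2.filter2D(src, -1, kernel.reshape(1, 3))
    horizontal = cv2.filter2D(src, -1, kernel.reshape(3, 1))
    return horizontal, vertical

def dynamic_threshold(edge_response: np.ndarray, keep_fraction: float = 0.05) -> np.ndarray:
    """Binarize an edge response with a threshold taken from its strength histogram.

    keep_fraction (assumed value) keeps roughly the strongest 5% of responses, so the
    threshold rises in high-contrast scenes and falls in dim or hazy ones.
    """
    strengths = np.abs(edge_response)
    threshold = np.quantile(strengths.ravel(), 1.0 - keep_fraction)
    return (strengths >= threshold).astype(np.uint8)

if __name__ == "__main__":
    gray = cv2.imread("road_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical test image
    if gray is not None:
        h_edges, v_edges = second_difference_edges(gray)
        h_mask, v_mask = dynamic_threshold(h_edges), dynamic_threshold(v_edges)
        print("horizontal edge pixels:", int(h_mask.sum()), "vertical edge pixels:", int(v_mask.sum()))
```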
    When bad weather blurs the image, we detect vehicles by their taillights instead. For example, in heavy rain the preceding vehicle is so blurred in the image that edge information cannot be used to detect it. In such conditions the driver usually turns on the taillights to alert the vehicles behind, so we extract the red taillight features to detect vehicles. In the daytime algorithm, once a preceding vehicle is detected we estimate its distance and compute the time to collision (TTC) to judge whether it endangers the driver and to provide a warning.
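    A hedged sketch of the red-taillight cue used in bad weather is shown below; it is our own illustration, and the HSV ranges and minimum blob area are assumed values (red hue wraps around 0 in OpenCV's hue scale, so two bands are combined).

```python
import numpy as np
import cv2

def red_taillight_mask(bgr_frame: np.ndarray) -> np.ndarray:
    """Binary mask of strongly red regions (candidate taillights) in a BGR frame."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    # Red hue wraps around 0 in OpenCV's 0-179 hue range, so two bands are combined.
    lower = cv2.inRange(hsv, np.array((0, 100, 100)), np.array((10, 255, 255)))
    upper = cv2.inRange(hsv, np.array((170, 100, 100)), np.array((179, 255, 255)))
    return cv2.bitwise_or(lower, upper)

def taillight_candidates(bgr_frame: np.ndarray, min_area: int = 20):
    """Bounding boxes of red blobs large enough to be taillights (min_area is assumed)."""
    mask = red_taillight_mask(bgr_frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```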
    In nighttime forward vehicle detection, we detect taillights from the features they exhibit in the image, and form candidate vehicles by pairing taillights on four features: vertical distance, horizontal distance, motion trajectory, and the correlation between the two lights. Finally, we compute the time to collision with the preceding vehicle from the change in the width of the taillight pair over time and warn the driver. Both nighttime detection and daytime detection in bad weather rely on taillight information: because our camera has automatic exposure control and the average brightness at night is low, the camera raises the image brightness and the taillights become overexposed, and at night we mainly use these overexposed points to detect the taillights.
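    The nighttime stage can be illustrated roughly as below. This is a reconstruction under assumed thresholds, not the thesis's code: the intensity level for "overexposed" pixels and the pairing tolerances are assumed values, and the trajectory and correlation checks described above are omitted for brevity.

```python
import cv2

def bright_blobs(gray, overexposed: int = 250, min_area: int = 10):
    """Centroids and sizes of overexposed blobs (thresholds are assumed values)."""
    _, mask = cv2.threshold(gray, overexposed, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        x, y, w, h = cv2.boundingRect(c)
        blobs.append((x + w / 2.0, y + h / 2.0, w, h))
    return blobs

def pair_taillights(blobs, max_dy=10, min_dx=20, max_dx=200, max_size_diff=0.5):
    """Pair blobs that are roughly level, a plausible distance apart, and similar in size."""
    pairs = []
    for i in range(len(blobs)):
        for j in range(i + 1, len(blobs)):
            (x1, y1, w1, h1), (x2, y2, w2, h2) = blobs[i], blobs[j]
            dx, dy = abs(x1 - x2), abs(y1 - y2)
            size_diff = abs(w1 - w2) / max(w1, w2)
            if dy <= max_dy and min_dx <= dx <= max_dx and size_diff <= max_size_diff:
                pairs.append((blobs[i], blobs[j]))
    return pairs
```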
    Finally, the thesis describes our experimental equipment and development environment and reports the detection rate and detection results of daytime vehicle detection under various weather conditions. The data show that, in normal weather, the rate of correctly detecting a preceding vehicle when one is present exceeds 90%. We also analyze system performance: on a PC with an Intel(R) Core(TM)2 Duo 7400 CPU @ 2.80 GHz and 4.0 GB of DDR2 RAM, the whole system runs at about 50 frames per second, while the camera in our experiments captures 30 frames per second, so the system achieves real-time vehicle detection.
    Advanced driver assistance systems (ADAS) have become an important research topic in recent years: advanced technology assists motorists, makes driving more secure, and substantially reduces traffic accidents. Typical driver assistance functions include lane departure warning, pedestrian detection, and forward collision warning. This thesis analyzes brightness changes across weather conditions and proposes a vehicle detection algorithm that adapts to them, together with a stable detection criterion; from these we develop a forward collision warning system that adapts to brightness and weather.
    First, we integrate vehicle detection for day and night, using a different algorithm for each. We begin by defining a criterion that judges whether a frame belongs to day or night, and this criterion decides which algorithm is used. To build the criterion, we take a fixed number of frames from different videos and sample pixels at fixed positions to represent the whole 320×240 image; we then compute the average intensity of the sampled pixels and the percentage of them that fall below an intensity threshold. From these values each frame is classified as day or night.
    In the daytime forward collision warning system, we define the detection area from the detected lane markings to keep objects off the road from affecting detection, and obtain vertical and horizontal edge response images with a second-difference mask. To adapt to different weather, we use the edge-strength histogram and a binarization threshold function to determine the threshold dynamically and generate a bi-level gradient image. We then link continuous pixels into horizontal edges, which are divided into positive and negative edges representing the bottom and the body of a vehicle, respectively. At the end points of each horizontal edge we check whether there is a sufficiently strong vertical edge or a symmetric pair of vertical edges. If a horizontal edge passes this check, we compare its width with the lane width; if the ratio is acceptable, a candidate vehicle is generated. In bad weather, vehicles are detected by their taillights instead. For example, in heavy rain the preceding vehicle is very blurred and edge information cannot be used, but the driver usually turns on the taillights to warn the vehicles behind, so we use the red taillight features to detect vehicles. Once a vehicle is detected in the daytime system, we estimate its distance and compute the time to collision (TTC) to judge whether it endangers the driver and to provide a warning.
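    As a worked example of the daytime TTC step, the sketch below estimates distance from the image row of the vehicle's bottom edge under a flat-road pinhole-camera model and derives TTC from two consecutive distance estimates. The model, focal length, camera height, horizon row, and example numbers are our assumptions; the thesis does not publish its exact formulation here.

```python
def distance_from_bottom_row(bottom_row: float, horizon_row: float,
                             focal_length_px: float, camera_height_m: float) -> float:
    """Flat-road pinhole model: distance to the road point imaged at bottom_row."""
    # Rows below the horizon map to ground-plane distance Z = f * H / (row - horizon).
    return focal_length_px * camera_height_m / max(bottom_row - horizon_row, 1e-6)

def time_to_collision(dist_prev_m: float, dist_curr_m: float, frame_interval_s: float) -> float:
    """TTC in seconds from two consecutive distance estimates; inf if the gap is not closing."""
    closing_speed = (dist_prev_m - dist_curr_m) / frame_interval_s
    return dist_curr_m / closing_speed if closing_speed > 0 else float("inf")

# Example with hypothetical numbers: focal length 800 px, camera 1.2 m above the road,
# vehicle bottom edge observed at rows 180 then 181 (horizon at row 120), 30 fps camera.
if __name__ == "__main__":
    d_prev = distance_from_bottom_row(180, 120, 800, 1.2)
    d_curr = distance_from_bottom_row(181, 120, 800, 1.2)
    print(round(time_to_collision(d_prev, d_curr, 1 / 30.0), 2), "s")  # prints 2.0 s
```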
    In the nighttime forward collision warning system, we detect the taillights of preceding vehicles and pair them using the horizontal distance, vertical height, trajectory, and correlation of a pair of lights. Finally, we estimate the time to collision from the verified light pair and warn the driver. Both nighttime detection and daytime detection in bad weather rely on taillights: our camera has automatic exposure control, and because the average brightness at night is low the camera raises the image brightness, overexposing the taillights; at night we detect taillights mainly through these overexposed points.
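    The nighttime TTC can be illustrated with the standard scale-change form TTC ≈ w·Δt/Δw, which matches the idea of using the change in the taillight-pair width between time points but is not necessarily the exact formula used in the thesis:

```python
def ttc_from_pair_width(width_prev_px: float, width_curr_px: float,
                        frame_interval_s: float) -> float:
    """TTC in seconds from the pixel distance between a verified pair of taillights.

    Under a pinhole camera the pair width scales as 1/distance, so
    TTC ~= w_curr * dt / (w_curr - w_prev); inf if the pair is not expanding.
    """
    expansion = width_curr_px - width_prev_px
    if expansion <= 0:
        return float("inf")
    return width_curr_px * frame_interval_s / expansion

# Example with hypothetical measurements: pair width grows from 60 px to 61 px at 30 fps.
if __name__ == "__main__":
    print(round(ttc_from_pair_width(60.0, 61.0, 1 / 30.0), 2), "s")
```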
    Finally, the thesis introduces our experimental equipment and development environment and reports the detection rate of daytime vehicle detection under various weather conditions. The data show that the detection rate for vehicles in front reaches 90% or more in normal weather. We also analyze system performance: in our experiments the system runs at approximately 50 frames per second while the camera captures 30 frames per second, so the system achieves real-time vehicle detection.
    Appears in Collections: [Graduate Institute of Computer Science and Information Engineering] Electronic Thesis & Dissertation

    Files in This Item:

    File        Size  Format
    index.html  0 KB  HTML

    All items in NCUIR are protected by copyright, with all rights reserved.
