|Title: ||Vision-based Forward Collision Warning and Parking Guiding Techniques for Advanced Safety Vehicles|
|Keywords: ||Advanced Safety Vehicle;Forward Collision Warning;Parking Guiding;Principal Component Analysis|
|Issue Date: ||2014-10-15 17:07:56 (UTC+8)|
|Abstract: ||In recent years, many deaths have been caused by rear-end traffic collisions. To prevent these fatalities, forward collision warning (FCW) systems have been proposed to protect drivers from dangers caused by inattention to the forward traffic situation. A standard FCW system includes two major parts: preceding-vehicle detection and verification. The preceding-vehicle detection extracts vehicle candidates from the image, and the verification procedure then measures each candidate's likelihood of being a vehicle to reduce wrong detections. Besides FCW systems, many other advanced driver assistance systems (ADASs), such as lane departure warning (LDW), blind spot detection (BSD), around view monitoring (AVM), pedestrian collision warning (PCW), and parking guidance assistance (PGA) systems, have been actively developed to assist drivers.
In this dissertation, we present a weather-adaptive forward collision warning system that applies local features for vehicle detection and global features for vehicle verification, helping drivers avoid unintended collisions with preceding vehicles or obstacles. In addition, an image-based parking guidance system is proposed to assist drivers with parking.
In the proposed FCW system, local features, namely horizontal and vertical edges, are first computed. The edge maps are then bi-leveled using a learning-based thresholding method that adapts to the intensity variations of the captured images, so that the extraction of edge points is less influenced by bad weather conditions. Third, the preserved edge points are used to generate possible objects. Fourth, the objects are filtered by the edge response, location, and symmetry of the candidates to generate vehicle candidates. Three candidate-generation schemes are hierarchically designed to extract vehicle candidates under various weather conditions. Finally, a method based on principal component analysis (PCA) is proposed to verify the vehicle candidates. PCA is a technique for extracting the important features of a set of vehicle images; each extracted feature describes a characteristic of vehicle appearance and is defined as a global feature. Using the extracted features, a candidate region can be decomposed and reconstructed, and the similarity between the original region and its reconstruction is measured to verify the candidate. In effect, the PCA method removes non-vehicle candidates to reduce false alarms. The proposed FCW system thus combines the merits of local and global features to reduce the false alarm rate while maintaining the detection rate.
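The PCA-based verification step above can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: the function names (`fit_pca`, `reconstruction_similarity`), the normalized-error similarity score, and the synthetic training data are all assumptions for demonstration; a real system would train on flattened grayscale vehicle patches.

```python
import numpy as np

def fit_pca(train_patches, n_components):
    """Learn a PCA basis (the 'global features') from flattened patches."""
    X = np.asarray(train_patches, dtype=float)       # shape: (N, D)
    mean = X.mean(axis=0)
    Xc = X - mean
    # Principal directions via SVD of the centered data; rows of Vt
    # are the eigenvectors of the covariance matrix.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return mean, Vt[:n_components]                   # (D,), (k, D)

def reconstruction_similarity(patch, mean, components):
    """Decompose a candidate in the PCA subspace, reconstruct it, and
    score the similarity; low scores indicate non-vehicle regions."""
    x = np.asarray(patch, dtype=float) - mean
    coeffs = components @ x                          # decompose
    recon = components.T @ coeffs                    # reconstruct
    err = np.linalg.norm(x - recon) / (np.linalg.norm(x) + 1e-9)
    return 1.0 - err                                 # 1.0 = perfect match

# Usage with synthetic data: "vehicle" patches lie in a 2-D subspace.
rng = np.random.default_rng(0)
basis = rng.normal(size=(2, 64))
train = rng.normal(size=(100, 2)) @ basis + 5.0     # vehicle-like patches
mean, comps = fit_pca(train, n_components=2)

inlier = rng.normal(size=2) @ basis + 5.0           # lies in the subspace
outlier = rng.normal(size=64) * 3.0                 # arbitrary region
sim_in = reconstruction_similarity(inlier, mean, comps)
sim_out = reconstruction_similarity(outlier, mean, comps)
```

An in-subspace candidate reconstructs almost perfectly (score near 1), while an arbitrary region loses most of its energy in the projection, which is exactly how non-vehicle candidates get rejected.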
The proposed FCW system has been tested and evaluated under various weather conditions. Its average accuracies in clear and bad weather are 96.2% and 79.1%, respectively. The accuracy on heavily rainy days is lower than in other conditions because edges and vehicle appearance are severely blurred. Blurred images are a major challenge for all related FCW systems, but the proposed system can still recognize vehicles on heavily rainy days as long as they remain observable.
The proposed FCW system has the following properties: (i) the edge extraction is adaptive to various lighting conditions, (ii) the local features are mutually processed to improve the reliability of vehicle detection, (iii) the hierarchical candidate-generation schemes enhance the adaptability to various weather conditions, and (iv) the PCA-based verification can strictly eliminate candidate regions lacking vehicle appearance.
Parking is an essential skill for most drivers. However, it is difficult for many drivers to park their vehicles in a small parking space, so parking guidance assistance systems have been developed to help them. In general, a steering sensor can provide the vehicle's moving direction to the driver; nevertheless, steering sensors are complicated to install and expensive. In this study, an image-based parking guidance (IPG) system is proposed to help drivers park their cars into a parking space. The proposed system relies only on an embedded hardware platform and a wide-angle camera to capture images for analysis, without a steering sensor. This is a low-cost technique; moreover, it is suitable for used cars and the after-market.
In the proposed IPG system, the input images are first transformed into top-view images by a homography transformation. Corner points are then extracted from two consecutive images and matched against each other. The matched feature-point pairs are further pruned by a least-squares error metric to remove unreliable or wrong correspondences. The remaining pairs are used to estimate the vehicle motion parameters, where an isometric transformation model based on the Ackermann steering geometry is proposed to describe the vehicle motion. Finally, the vehicle trajectory is estimated from the motion parameters, and the parking guidance lines are drawn according to the trajectory.
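The core of the motion-estimation step, fitting an isometric (rotation-plus-translation) transform to matched top-view points and reading off the circular-arc motion implied by Ackermann-style steering, can be sketched as below. This is a hedged illustration under assumptions: the Kabsch-style least-squares fit and the instantaneous-center formula `t = (I - R) c` are standard geometry, but the function names and the radius convention (distance from the rotation center to the camera origin) are illustrative, not taken from the dissertation.

```python
import numpy as np

def fit_isometry(p, q):
    """Least-squares rotation R and translation t such that q ~ R @ p + t,
    from matched 2-D point pairs (Kabsch-style fit)."""
    p, q = np.asarray(p, float), np.asarray(q, float)   # shape (N, 2)
    pc, qc = p - p.mean(axis=0), q - q.mean(axis=0)
    H = pc.T @ qc                                       # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))              # enforce a proper rotation
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = q.mean(axis=0) - R @ p.mean(axis=0)
    return R, t

def turning_radius(R, t):
    """Radius of the circular arc consistent with (R, t): rotating by
    theta about a center c gives t = (I - R) @ c, so c = (I - R)^-1 t."""
    theta = np.arctan2(R[1, 0], R[0, 0])
    if abs(theta) < 1e-9:
        return np.inf                                   # straight motion
    c = np.linalg.solve(np.eye(2) - R, t)               # instantaneous center
    return np.linalg.norm(c)

# Usage: synthetic motion rotating 0.1 rad about the point (5, 0).
rng = np.random.default_rng(1)
theta_true = 0.1
R_true = np.array([[np.cos(theta_true), -np.sin(theta_true)],
                   [np.sin(theta_true),  np.cos(theta_true)]])
center = np.array([5.0, 0.0])
p = rng.normal(size=(20, 2))                            # matched corner points
q = (p - center) @ R_true.T + center                    # rotated about center
R_est, t_est = fit_isometry(p, q)
radius = turning_radius(R_est, t_est)
```

With noise-free correspondences the fit recovers the rotation exactly and the radius equals the distance to the true rotation center; in practice the least-squares pruning described above keeps the estimate stable against mismatched pairs.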
The estimated parking guidance lines are compared with the actual trajectories for several specific steering-wheel angles. The average errors of the estimated vehicle trajectories at different turning angles range from about 2 to 8 cm in the images; the maximum errors range from about 6 to 30 cm. The key characteristics of a parking guidance system are stability and precision; the proposed system achieves accuracy similar to steering-sensor-based systems while being cheaper and more accessible. The image-based parking guidance system is thus worth developing into a commercial product.
|Appears in Collections:||[Graduate Institute of Computer Science and Information Engineering] Master's and Doctoral Theses|
All items in NCUIR are protected by copyright, with all rights reserved.