dc.description.abstract | Advanced driver assistance systems (ADAS) have become an important research topic in recent years; they use advanced technology to assist motorists, making driving safer and substantially reducing traffic accidents. Typical driver assistance systems include several components, such as a lane departure warning system, a pedestrian detection system, and a forward collision warning system. This paper analyzes the brightness changes caused by different weather and proposes a vehicle detection algorithm and a stable detection criterion that adapt to those changes; on this basis we developed a forward collision warning system that adapts to brightness and weather.
First, we integrate vehicle detection for daytime and nighttime, using a different algorithm for each. At the beginning we define a criterion for judging whether the scene is day or night, which determines the algorithm to use. To build this criterion, we fetch a fixed number of frames from different videos and sample pixels at fixed positions to represent the whole 320×240 image; we then compute the average intensity of the sampled pixels and the percentage of them that fall below a threshold. According to this criterion, the system judges whether it is day or night, as in the sketch below.
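To make the criterion concrete, the following is a minimal Python sketch of a day/night test of this kind; the sampling step, the dark-pixel threshold, and the decision cutoffs are illustrative assumptions rather than the values used in this work.

```python
import numpy as np

# Illustrative assumptions (not the thesis values): sample a sparse fixed grid,
# call a sampled pixel "dark" below DARK_THRESHOLD, and declare night when the
# average intensity is low or a large fraction of the samples is dark.
DARK_THRESHOLD = 50        # grey level below which a sampled pixel counts as dark
MEAN_CUTOFF = 80           # average intensity below this suggests night
DARK_RATIO_CUTOFF = 0.6    # fraction of dark samples above this suggests night

def is_night(gray_320x240, step=8):
    """Classify a 320x240 grayscale frame as day or night from sampled pixels."""
    samples = gray_320x240[::step, ::step].astype(np.float32)  # fixed positions
    mean_intensity = samples.mean()
    dark_ratio = (samples < DARK_THRESHOLD).mean()
    return mean_intensity < MEAN_CUTOFF or dark_ratio > DARK_RATIO_CUTOFF
```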
In the daytime forward collision warning system, we define the detection area based on the detected lane marks to avoid the influence of objects off the road, and we obtain vertical and horizontal edge response images with a second-difference mask. To adapt to different weather, we use an edge strength histogram and our binary threshold function to determine the threshold dynamically, and we generate a bi-level gradient image with this threshold (a rough sketch follows below). We then find runs of continuous pixels belonging to horizontal edges; the horizontal edges are divided into positive and negative edges, representing the bottom and the body of a car respectively. At the end points of a horizontal edge we check whether there are enough vertical edge pixels or a symmetric pair of vertical edges. If the horizontal edge is confirmed, we compute the ratio of its width to the lane width; if this ratio is acceptable, a candidate vehicle is generated. In bad weather, vehicles are detected by their taillights: in heavy rain, for example, the vehicle appears very blurred and edge information cannot be used for detection, but the driver usually turns on the taillights to warn the vehicles behind, so we extract the red taillight features to detect vehicles. Once a vehicle is detected, the daytime system estimates its distance and computes the time to collision (TTC) to judge whether the situation is dangerous and to alert the driver.
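The weather-adaptive binarization step can be sketched roughly as follows; the percentile-based threshold choice and the specific second-difference kernels are assumptions made for illustration and may differ from the binary threshold function actually used in this work.

```python
import cv2
import numpy as np

# Assumption for illustration: keep the strongest edge responses by taking a high
# percentile of the edge strength histogram, so the threshold tracks the overall
# contrast of the current weather and lighting condition.
EDGE_PERCENTILE = 90.0

def bilevel_gradient_images(gray):
    """Return binary vertical/horizontal edge maps with a frame-adaptive threshold."""
    gray = gray.astype(np.float32)
    # Second-difference masks approximate the vertical and horizontal edge responses.
    vert = np.abs(cv2.filter2D(gray, -1, np.array([[-1.0, 2.0, -1.0]])))
    horiz = np.abs(cv2.filter2D(gray, -1, np.array([[-1.0], [2.0], [-1.0]])))
    # Threshold chosen from the edge strength distribution of the current frame.
    thr = np.percentile(np.concatenate([vert.ravel(), horiz.ravel()]), EDGE_PERCENTILE)
    return (vert >= thr).astype(np.uint8), (horiz >= thr).astype(np.uint8)
```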
In the nighttime forward collision warning system, we detect the taillights of preceding vehicles. We then pair the lights using the horizontal distance, the vertical heights, the trajectory, and the correlation of a pair of lights, as sketched below. Finally, we estimate the time to collision of the verified light pair and provide a warning to the driver. Vehicle detection at night, like daytime detection in bad weather, relies on taillights: because the camera we use has automatic exposure control and the average brightness at night is low, the camera raises the brightness of the image, which makes the taillights overexposed, and the nighttime detector searches for these overexposed taillights.
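The light-pairing step can be sketched roughly as follows; the tolerances on row difference, horizontal span, and size ratio are illustrative assumptions, and the trajectory and correlation checks mentioned above are omitted for brevity.

```python
# Illustrative assumptions (not the thesis parameters): paired taillights should lie
# on nearly the same image row, be a plausible horizontal distance apart, and have
# similar sizes.
MAX_ROW_DIFF = 6              # maximum vertical offset between paired lights (px)
MIN_SPAN, MAX_SPAN = 20, 200  # plausible horizontal distance between taillights (px)
MAX_SIZE_RATIO = 1.5          # the two bright spots should have similar areas

def pair_taillights(lights):
    """Pair bright-spot candidates; each light is a dict with keys cx, cy, area."""
    pairs = []
    for i in range(len(lights)):
        for j in range(i + 1, len(lights)):
            a, b = lights[i], lights[j]
            span = abs(a["cx"] - b["cx"])
            row_diff = abs(a["cy"] - b["cy"])
            size_ratio = max(a["area"], b["area"]) / max(min(a["area"], b["area"]), 1)
            if (row_diff <= MAX_ROW_DIFF and MIN_SPAN <= span <= MAX_SPAN
                    and size_ratio <= MAX_SIZE_RATIO):
                pairs.append((a, b))
    return pairs
```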
Finally, the paper introduces our experimental equipment and development environment, reports the daytime vehicle detection results and detection rates under various weather conditions, and analyzes system performance. According to the data, the detection rate for vehicles ahead reaches 90% in normal weather conditions. In our experiments the system runs at approximately 50 frames per second while the camera captures 30 frames per second, so the system achieves real-time vehicle detection. | en_US |