Application of Interactive Video Sensing and Management for Pedestrian and Bicycle Safety Studies

Wu-chi Feng, Portland State University

Co-investigators:

Summary:

As video data collection and storage technologies become ubiquitous and inexpensive, transportation agencies struggle to process and extract "intelligence" or useful information from growing libraries of archived video data. In some cases useful video is lost because agencies cannot justify the expensive staff time required to process it. This video information overload is caused by the inability of most transportation agencies to write customized video processing algorithms to extract valuable safety or traffic data from large amounts of collected raw video. 

Manually watching long stretches of video to extract information is tedious and expensive. In addition to being inefficient in both time and cost, human-based extraction of video data is prone to errors. Although some sophisticated, specialized applications exist for transportation agencies, they are proprietary, too expensive to be widely deployed, or both. The goal of the project is to create an interactive video sensor processing system that combines computer vision techniques into a user-friendly interface, one that can be easily "programmed" by researchers to study some of the most frequent traffic safety issues, such as vehicle-pedestrian conflicts in the time-space plane or bicycle compliance with traffic signals. We will leverage the open-source OpenCV computer vision library, originally released by Intel, and build the user-friendly interface around it. The aim is to allow practitioners to more easily express the semantics of the information they wish to extract so that those semantics can be incorporated into the video processing algorithms.
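To make the intended workflow concrete, the following is a minimal sketch (not the project's actual code) of how OpenCV primitives could be wrapped behind a few user-supplied parameters, assuming the OpenCV 4.x Python bindings. The region of interest, the background-subtractor settings, and the blob-size cutoff are illustrative placeholders that a practitioner would provide through the interface rather than values from the project.

```python
import cv2

# Hypothetical pixel-coordinate region of interest covering the crosswalk;
# in the envisioned tool a practitioner would draw this on a video frame.
CROSSWALK_ROI = (200, 300, 640, 120)  # x, y, width, height (placeholder values)
MIN_BLOB_AREA = 500                   # illustrative size cutoff in pixels

def count_moving_objects(video_path):
    """Count moving blobs inside the crosswalk region, frame by frame."""
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=25)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    x, y, w, h = CROSSWALK_ROI
    counts = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        roi = frame[y:y + h, x:x + w]
        mask = subtractor.apply(roi)
        # Drop shadow pixels (MOG2 marks them 127) and small speckle noise.
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel, iterations=2)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        # Keep only blobs large enough to be a pedestrian or vehicle.
        blobs = [c for c in contours if cv2.contourArea(c) > MIN_BLOB_AREA]
        counts.append(len(blobs))
    cap.release()
    return counts
```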

As a demonstration and development site, we will use a complicated mid-block crosswalk on SW 4th Avenue, a pedestrian-heavy downtown street on the PSU campus. The street carries three lanes of heavy one-way motor vehicle traffic; speeds are moderate, as most vehicles have just exited I-405. We will use this location to demonstrate how easily and accurately our flexible, user-friendly tool can measure pedestrian wait times, crossing speeds, and near misses (e.g., a car abruptly stopping before the crosswalk or a vehicle passing a stopped vehicle). In addition, the system will output meaningful data about vehicle and pedestrian trajectories for further analysis. We will also show how the tool can be used to clip and store only the meaningful video, e.g., a few minutes out of an hour of recording, based on a few user input parameters such as a minimum vehicle-pedestrian distance threshold and a vehicle deceleration threshold.
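As one hedged illustration of the clipping step, the sketch below writes out only the frames surrounding flagged event frames (for example, frames where an upstream analysis found the vehicle-pedestrian gap below the chosen threshold, or a deceleration above it). The padding length, codec, and the way event frames are supplied are assumptions for the example, not specifics from the project.

```python
import cv2

def clip_events(video_path, out_path, event_frames, pad_seconds=3.0):
    """Copy only the frames around flagged events into a much shorter clip.

    event_frames: frame indices flagged by a prior analysis step
    (e.g., vehicle-pedestrian gap below the user's threshold).
    """
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"),
                             fps, (width, height))
    # Keep a few seconds of context on either side of each event frame.
    pad = int(pad_seconds * fps)
    keep = set()
    for f in event_frames:
        keep.update(range(max(0, f - pad), f + pad + 1))
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index in keep:
            writer.write(frame)
        index += 1
    cap.release()
    writer.release()
```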

Project Details

Project Type:
Research
Project Status:
Completed
End Date:
June 30, 2014
UTC Grant Cycle:
Tier 1 Round 1
UTC Funding:
$143,058
TRB RIP:
32179

Other Products

  • Addressing the Semantic Gap Between Video Sensors and Applications (PRESENTATION)