Understanding the dynamics inside crowds requires reliable empirical data, which enable improvements in safety and comfort for pedestrians and the design of models that reflect the real dynamics. Manual procedures for collecting these data are very time-consuming and usually do not provide sufficient accuracy in space and time.
For this reason we are developing the tool PeTrack (Pedestrian Tracking), which automatically extracts accurate pedestrian trajectories from video recordings. The combined trajectories of all pedestrians provide quantities such as velocity, flow and density at any time and position. With such a tool, extensive experimental series with large numbers of participants can be analyzed. Individual codes enable personalized trajectories carrying static information about each participant (e.g. age, gender).
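As an illustration of how such quantities follow from the trajectory data, the minimal Python sketch below derives walking speeds from per-frame head positions. The array layout and frame rate are assumptions chosen for illustration, not PeTrack's actual output format.

```python
import numpy as np

def speeds(positions, fps):
    """positions: (N, 2) array of head positions in metres, one row per frame.
    Returns the instantaneous speed (m/s) between consecutive frames."""
    displacements = np.diff(positions, axis=0)          # frame-to-frame displacement
    return np.linalg.norm(displacements, axis=1) * fps  # distance per frame -> m/s

if __name__ == "__main__":
    # Example: a pedestrian walking at 1.2 m/s along x, recorded at 25 fps.
    t = np.arange(0, 4, 1 / 25.0)
    trajectory = np.column_stack([1.2 * t, np.zeros_like(t)])
    print(speeds(trajectory, fps=25).mean())  # ~1.2 m/s
```

Flow and density follow in the same spirit by counting how many such trajectories cross a measurement line or occupy a measurement area per unit time.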
The program has to deal with wide-angle lenses and a high density of pedestrians. Lens distortion and the perspective view are taken into account. The procedure includes calibration, recognition, tracking and height detection.
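To picture the calibration step, the sketch below removes lens distortion from a frame using OpenCV's standard camera model. The intrinsic parameters and distortion coefficients are placeholder values, not those used by PeTrack, and the frame is a synthetic stand-in.

```python
import numpy as np
import cv2

# Intrinsic camera matrix (focal lengths and principal point, in pixels).
camera_matrix = np.array([[800.0,   0.0, 640.0],
                          [  0.0, 800.0, 360.0],
                          [  0.0,   0.0,   1.0]])
# Radial and tangential distortion coefficients (k1, k2, p1, p2, k3).
dist_coeffs = np.array([-0.30, 0.10, 0.0, 0.0, 0.0])

frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # stand-in for one video frame
undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
print(undistorted.shape)  # same size as the input frame
```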
Different kinds of markers (e.g. with height information, head direction, or an individual code) are implemented. With a stereo camera, more accurate height measurements and also markerless tracking are possible.
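The stereo height measurement can be pictured with the standard disparity relation Z = f * B / d. The following is a minimal sketch assuming an overhead camera looking straight down; the focal length, baseline, mounting height and disparity are illustrative numbers only, not PeTrack parameters.

```python
def height_from_disparity(disparity_px, focal_px, baseline_m, camera_height_m):
    """Depth of the head from the stereo relation Z = f * B / d,
    subtracted from the camera mounting height to give body height."""
    depth_m = focal_px * baseline_m / disparity_px
    return camera_height_m - depth_m

# Example: f = 800 px, baseline 0.12 m, camera mounted 7.5 m above the floor.
print(height_from_disparity(disparity_px=16.8, focal_px=800,
                            baseline_m=0.12, camera_height_m=7.5))  # ~1.79 m
```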
The source code as well as precompiled executables of PeTrack are available. The brief documentation cannot answer every question, so you may contact the author before setting up experiments and the automatic extraction with PeTrack: petrack@fz-juelich.de. Results of collected trajectories can be found here.