Using Synthetic Data for Person Tracking Under Adverse Weather Conditions

Published in: Image and Vision Computing

DOI: 10.1016/j.imavis.2021.104187

In the left half, sample frames from currently available real (top-left quarter: MOT, NUS-PRO, OTB-100, and TC128) and synthetic (bottom-left quarter: VIPER, PHAV, Synthia, and Virtual KITTI) visual object tracking datasets demonstrate the lack of adverse weather conditions. The right half presents sample frames from sequences spanning rainy, foggy, and snowy weather conditions from the PTAW172Real (top-right quarter) and PTAW217Synth (bottom-right quarter) datasets that we introduce in this work.
Paper

Abdulrahman Kerim, Ufuk Celikcan, Erkut Erdem, and Aykut Erdem. "Using Synthetic Data for Person Tracking Under Adverse Weather Conditions", Image and Vision Computing, 2021.
Preprint (with low-res images) | Published Version
Supplementary Material | Bibtex

PTAW217Synth and PTAW172Real Datasets: Download link


Abstract

Robust visual tracking plays a vital role in many areas such as autonomous cars, surveillance, and robotics. Recent trackers have been shown to achieve adequate results under normal tracking scenarios with clear weather, standard camera setups, and typical lighting conditions. Yet the performance of these trackers, whether correlation filter-based or learning-based, degrades under adverse weather conditions. The lack of videos with such weather conditions in the available visual object tracking datasets is the prime issue behind the low performance of learning-based tracking algorithms. In this work, we provide a new person tracking dataset of real-world sequences (PTAW172Real) captured under foggy, rainy, and snowy weather conditions to assess the performance of current trackers. We also introduce a novel person tracking dataset of synthetic sequences (PTAW217Synth), procedurally generated by our NOVA-Extended framework and spanning the same weather conditions in varying severity, to mitigate the problem of data scarcity. Our experimental results demonstrate that the performance of state-of-the-art deep trackers under adverse weather conditions can be boosted when the available real training sequences are complemented with our synthetically generated dataset during training.
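
The core idea of complementing real training sequences with synthetic ones can be illustrated in a few lines. Below is a minimal, hypothetical Python sketch of building a mixed training pool from real and synthetic sequence lists; the function name, sequence identifiers, and mixing ratio are our own assumptions for illustration, not details taken from the paper.

```python
import random

# Hypothetical sketch: mix real and synthetic sequences into one training pool.
# The synth_ratio knob and all names below are assumptions, not from the paper.

def build_training_pool(real_sequences, synth_sequences, synth_ratio=0.5):
    """Return a shuffled training pool of real sequences plus a sampled
    number of synthetic sequences (synth_ratio * len(real_sequences))."""
    n_synth = min(int(len(real_sequences) * synth_ratio), len(synth_sequences))
    pool = list(real_sequences) + random.sample(synth_sequences, n_synth)
    random.shuffle(pool)
    return pool

# Usage with placeholder sequence identifiers:
real = [f"PTAW172Real/seq_{i:03d}" for i in range(10)]
synth = [f"PTAW217Synth/seq_{i:03d}" for i in range(20)]
training_pool = build_training_pool(real, synth, synth_ratio=1.0)
print(len(training_pool), training_pool[:3])
```

In practice, the pool would feed a deep tracker's training loop in place of a real-only sequence list; the sketch only shows the data-mixing step.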


Acknowledgements

This work was supported in part by a TUBA GEBIP fellowship awarded to E. Erdem, and by the TUBITAK-1001 Program (Award No. 217E029).
