PrecipitaTrack: A Comprehensive Dataset for Advancing Object Tracking in Autonomous Vehicles
Abstract
Object tracking is a cornerstone of computer vision and is essential to the development of autonomous systems, particularly self-driving cars. Deep learning has accelerated progress in autonomous driving in recent years, but it relies heavily on high-quality datasets that encompass diverse real-world scenarios. Existing datasets often fall short in covering challenging environments, varying lighting conditions, and adverse weather phenomena. To address these limitations, this research introduces the PrecipitaTrack video database, a comprehensive resource designed to enhance the training of autonomous vehicle systems. PrecipitaTrack aims to bridge the gap in existing datasets by (1) highlighting the specific shortcomings of current datasets in capturing the complexities of real-world driving scenarios, (2) presenting an innovative approach to data collection and annotation that ensures a rich and diverse representation of challenging environments, and (3) specifically tailoring the database to include a wide range of road conditions, weather patterns (rain, clear skies, haze, lightning, hailstorms), and lighting variations (daytime to nighttime). Additionally, the database features meticulously annotated object trajectories, interactions, and behaviors, even under difficult conditions. By integrating sequences from publicly accessible sources, PrecipitaTrack fosters collaboration and further research in the field. This novel dataset is poised to contribute significantly to the development of safer, more reliable, and truly autonomous vehicles by establishing a new benchmark for object tracking in dynamic environments.