Visual Object Tracking in Adverse Weather conditions

Alghallabi, Wafa Hamad Mohamed
Department
Computer Vision
Type
Thesis
Date
2022
Language
English
Abstract
Visual object tracking is one of the essential tasks in computer vision, and several techniques have been presented over the years. Given only the initial target position, this challenging task requires predicting the state of the target object in each frame of an image sequence. Tracking benchmarks have been essential for impartially evaluating and comparing various trackers. Over time, various benchmarks have been proposed, often built for particular purposes, which restricts their variability and scope. In addition, adverse weather scenarios such as heavy rain and dense fog are insufficiently covered in these benchmarks. Thus, we propose an Adverse Weather dataset for visual object tracking in severe weather scenarios. Our dataset is composed of 38 challenging video sequences with 30k annotated frames that cover six different weather conditions and are categorized into 18 object categories. We then evaluate 17 state-of-the-art and recent trackers on our dataset and provide a complete analysis of their tracking performance across each weather scenario. We observe that the performance of state-of-the-art trackers degrades on our proposed dataset compared to existing benchmarks with clear frames. Thus, we propose an enhancement-based dual-tracker framework that can be combined with any state-of-the-art tracker to improve tracking on frames captured in adverse weather conditions. To our knowledge, this is the first framework for visual object tracking in adverse weather conditions. We achieve a +2.0 absolute gain on the Adverse Weather dataset using this framework.
Citation
W.H.M. Alghallabi, "Visual Object Tracking in Adverse Weather conditions", M.S. Thesis, Computer Vision, MBZUAI, Abu Dhabi, UAE, 2022.