Visual object tracking is a challenging problem in computer vision that deals with tracking an object of interest through a sequence of images, given only its location in the initial frame. Numerous approaches have been proposed in the literature, but many still suffer from drift, while others cannot track in real time. Recently, part-based approaches have shown great potential for tracking generic objects under severe occlusion. However, they require tuning a large number of parameters to maintain a sensible part configuration, and they rely on rather ad hoc solutions for localizing the target in a frame and for constructing the geometric constraints between parts. In contrast to these methods, this thesis addresses both short-term and long-term visual object tracking and proposes three correlation-filter based trackers. Our first tracker integrates a learning and detection scheme into the correlation-filter based tracking mechanism; in particular, the proposed learning and detection approach allows the tracker to keep following the object of interest in a long-term tracking setting. Our second tracker investigates the use of multiple visual cues within a correlation-filter based framework, fusing complementary features, including color, to address tracking challenges such as partial occlusion, motion blur, and illumination changes. Lastly, our third tracker employs a deformable part-based scheme that combines a global filter with multiple local filters in a collaborative manner to cope with partial occlusion as well as scale changes. Experiments on four large public benchmark datasets (OTB-50, TB-100, NUS-PRO, and VOT2015) demonstrate that our approaches, in particular the deformable part-based tracker with coupled global and local correlation filters (DPCF), give significantly better results than state-of-the-art trackers while running in real time.
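
Since all three trackers build on the same correlation-filter machinery, the sketch below illustrates that basic building block: a single-channel MOSSE-style filter learned and applied in the Fourier domain. It is only an illustrative example written with NumPy; the function names, cosine window, learning rate, and regularisation constant are assumptions for the sketch and do not reproduce the thesis's actual trackers, which additionally handle long-term re-detection, multi-cue fusion, scale estimation, and part-based constraints.

```python
# Minimal single-channel MOSSE-style correlation filter (illustrative sketch only).
import numpy as np

def gaussian_response(shape, sigma=2.0):
    """Desired correlation output: a Gaussian peak centred on the patch."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))

def preprocess(patch):
    """Log transform, normalise, and apply a cosine (Hann) window to reduce boundary effects."""
    p = np.log(patch.astype(np.float64) + 1.0)
    p = (p - p.mean()) / (p.std() + 1e-5)
    window = np.outer(np.hanning(p.shape[0]), np.hanning(p.shape[1]))
    return p * window

def train_filter(patch, g, lam=1e-2):
    """Closed-form filter in the Fourier domain: H* = (G . conj(F)) / (F . conj(F) + lambda)."""
    F = np.fft.fft2(preprocess(patch))
    G = np.fft.fft2(g)
    A = G * np.conj(F)            # numerator
    B = F * np.conj(F) + lam      # regularised denominator
    return A, B

def detect(patch, A, B):
    """Correlate a new patch with the filter; the peak offset from the patch centre
    gives the estimated target displacement."""
    Z = np.fft.fft2(preprocess(patch))
    response = np.real(np.fft.ifft2((A / B) * Z))
    py, px = np.unravel_index(np.argmax(response), response.shape)
    cy, cx = (response.shape[0] - 1) / 2.0, (response.shape[1] - 1) / 2.0
    return py - cy, px - cx

def update(patch, g, A, B, eta=0.125, lam=1e-2):
    """Running-average update of the filter with learning rate eta."""
    F = np.fft.fft2(preprocess(patch))
    G = np.fft.fft2(g)
    A = eta * (G * np.conj(F)) + (1 - eta) * A
    B = eta * (F * np.conj(F) + lam) + (1 - eta) * B
    return A, B
```

In this simplified view, a filter is trained on the initial target patch, each new frame is searched by a single Fourier-domain correlation, and the filter is updated with a running average; the trackers described above extend this core with re-detection, multiple feature channels, and coupled global and local filters.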