SCALE ROBUST ADAPTIVE FEATURE DENSITY APPROXIMATION FOR VISUAL OBJECT REPRESENTATION AND TRACKING

Authors: C. Liu 1; N. H. C. Yung 1 and R. G. Fang 2

Affiliations: 1 Laboratory for Intelligent Transportation Systems Research, Department of Electrical & Electronic Engineering, The University of Hong Kong, China; 2 Information Science & Engineering College, Zhejiang University, China

Keyword(s): Tracking, Feature Scale Selection, Density Approximation, Bayesian Adaptation, MAP, EM.

Related Ontology Subjects/Areas/Topics: Computer Vision, Visualization and Computer Graphics ; Image and Video Analysis ; Model-Based Object Tracking in Image Sequences ; Motion, Tracking and Stereo Vision ; Segment Cluster Tracking ; Statistical Approach

Abstract: Feature density approximation (FDA) based visual object appearance representation is emerging as an effective method for object tracking, but its challenges come from the object's complex motion (e.g. scaling, rotation) and the consequent variation of the object's appearance. Traditional adaptive FDA methods extract features at fixed scales, ignoring the object's scale variation, and update the FDA by sequential Maximum Likelihood estimation, which lacks robustness for sparse data. To address these challenges, this paper proposes a robust multi-scale adaptive FDA object representation method for tracking, together with a robust FDA updating method. The FDA achieves robustness by extracting features at the selected scale and estimating feature density with a new likelihood function defined by both the feature set and each feature's effectiveness probability. In FDA updating, robustness is achieved by updating the FDA in a Bayesian manner with a MAP-EM algorithm, using density prior knowledge extracted from the historical density. Complex object motion (e.g. scaling and rotation) is handled by correlating object appearance with its spatial alignment. Experimental results show that the method is efficient for complex motion and robust in adapting to object appearance variations caused by changes in scale, illumination, pose and viewing angle.
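To give a concrete picture of the Bayesian adaptation step described in the abstract, the sketch below shows a MAP-EM update of a diagonal-covariance Gaussian mixture, where the current mixture acts as the density prior extracted from the historical appearance and new feature vectors from the tracked region drive the update. This is a minimal illustration of the general technique (a relevance-factor style MAP update commonly used for Gaussian mixture adaptation), not the paper's exact formulation; the function name, the `relevance` parameter and the diagonal-covariance assumption are illustrative choices.

```python
import numpy as np

def map_em_update(features, weights, means, variances, relevance=16.0, n_iters=5):
    """MAP-EM adaptation of a diagonal-covariance Gaussian mixture.

    Illustrative sketch: the current mixture (weights, means, variances)
    plays the role of the prior extracted from the historical density;
    `features` (N x d) are new observations from the tracked object region.
    This is a generic relevance-factor MAP update, not the paper's exact prior.
    """
    K, d = means.shape
    w, mu, var = weights.copy(), means.copy(), variances.copy()

    for _ in range(n_iters):
        # E-step: responsibility of each mixture component for each feature.
        log_p = np.empty((features.shape[0], K))
        for k in range(K):
            diff = features - mu[k]
            log_p[:, k] = (np.log(w[k])
                           - 0.5 * np.sum(np.log(2.0 * np.pi * var[k]))
                           - 0.5 * np.sum(diff ** 2 / var[k], axis=1))
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)

        # Sufficient statistics of the new data per component.
        N_k = resp.sum(axis=0) + 1e-10
        Ex = resp.T @ features / N_k[:, None]
        Exx = resp.T @ (features ** 2) / N_k[:, None]

        # M-step: MAP combination of new statistics with the prior density.
        # alpha -> 1 for well-supported components, -> 0 for sparse ones.
        alpha = N_k / (N_k + relevance)
        w_new = alpha * (N_k / features.shape[0]) + (1.0 - alpha) * w
        w = w_new / w_new.sum()
        mu_new = alpha[:, None] * Ex + (1.0 - alpha[:, None]) * mu
        var = (alpha[:, None] * Exx
               + (1.0 - alpha[:, None]) * (var + mu ** 2)
               - mu_new ** 2)
        mu = mu_new

    return w, mu, var
```

The data-dependent coefficient is what gives MAP updating its robustness for sparse data: components with many supporting features are pulled toward the new statistics, while sparsely supported components stay close to the historical density instead of being overwritten by a Maximum Likelihood estimate.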

CC BY-NC-ND 4.0

Paper citation in several formats:
Liu, C.; Yung, N. H. C. and Fang, R. G. (2009). SCALE ROBUST ADAPTIVE FEATURE DENSITY APPROXIMATION FOR VISUAL OBJECT REPRESENTATION AND TRACKING. In Proceedings of the Fourth International Conference on Computer Vision Theory and Applications (VISIGRAPP 2009) - Volume 2: VISAPP; ISBN 978-989-8111-69-2; ISSN 2184-4321, SciTePress, pages 535-540. DOI: 10.5220/0001802805350540

@conference{visapp09,
author={C. Liu and N. H. C. Yung and R. G. Fang},
title={SCALE ROBUST ADAPTIVE FEATURE DENSITY APPROXIMATION FOR VISUAL OBJECT REPRESENTATION AND TRACKING},
booktitle={Proceedings of the Fourth International Conference on Computer Vision Theory and Applications (VISIGRAPP 2009) - Volume 2: VISAPP},
year={2009},
pages={535-540},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001802805350540},
isbn={978-989-8111-69-2},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the Fourth International Conference on Computer Vision Theory and Applications (VISIGRAPP 2009) - Volume 2: VISAPP
TI - SCALE ROBUST ADAPTIVE FEATURE DENSITY APPROXIMATION FOR VISUAL OBJECT REPRESENTATION AND TRACKING
SN - 978-989-8111-69-2
IS - 2184-4321
AU - Liu, C.
AU - Yung, N. H. C.
AU - Fang, R. G.
PY - 2009
SP - 535
EP - 540
DO - 10.5220/0001802805350540
PB - SciTePress