Robust and Efficient Estimation of Relative Pose for Cameras on Selfie Sticks

Abstract:

Taking selfies has become one of the major photographic trends of our time. In this study, we focus on the selfie stick, on which a camera is mounted to take selfies. We observe that a camera on a selfie stick typically travels along a particular type of trajectory around a sphere. Based on this observation, we propose a robust, efficient, and optimal method for estimating the relative camera pose between two images captured by a camera mounted on a selfie stick. We exploit the special geometric structure of camera motion constrained by a selfie stick and define this motion as spherical joint motion. Using a novel parametrization and calibration scheme, we show that the pose estimation problem reduces to a 3-degree-of-freedom (DoF) search problem instead of a generic 6-DoF problem. This facilitates the derivation of an efficient branch-and-bound optimization method that guarantees a globally optimal solution, even in the presence of outliers. Furthermore, as a simplified case of spherical joint motion, we introduce selfie motion, which has fewer DoF than spherical joint motion. We validate the performance and guaranteed optimality of our method on both synthetic and real-world data. Additionally, we demonstrate the applicability of the proposed method in two applications: refocusing and stylization.
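
To illustrate the 3-DoF reduction described above, here is a minimal sketch (not the authors' implementation) of a spherical-joint parametrization: if the stick-to-camera mounting rotation M and the camera-center offset p from the joint are calibrated in advance, the relative pose between two frames becomes a function of a single 3-DoF rotation S of the stick about the joint. The names relative_pose_spherical_joint, M, and p are illustrative assumptions, not the paper's notation, and the paper's actual parametrization and calibration scheme may differ.

import numpy as np
from scipy.spatial.transform import Rotation

def skew(v):
    # Cross-product (skew-symmetric) matrix of a 3-vector.
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def relative_pose_spherical_joint(axis_angle, M, p):
    # axis_angle: 3-vector, the unknown relative rotation S of the stick
    #             about its spherical joint (the 3-DoF search space).
    # M:          3x3 stick-to-camera mounting rotation (assumed calibrated).
    # p:          camera-center offset from the joint, in the stick frame
    #             (assumed calibrated).
    # With the joint at the world origin, camera centers are C_i = Q_i p and
    # world-to-camera rotations are M^T Q_i^T, so the relative pose depends
    # only on S = Q_2^T Q_1.
    S = Rotation.from_rotvec(axis_angle).as_matrix()
    R = M.T @ S @ M                  # relative camera rotation
    t = M.T @ (S - np.eye(3)) @ p    # translation forced by the sphere constraint
    E = skew(t) @ R                  # essential matrix: x2^T E x1 = 0
    return R, t, E

Under this sketch, a branch-and-bound search like the one described in the abstract would bound epipolar residuals of feature correspondences over boxes in the 3-D axis-angle space of S, rather than over the generic 6-DoF pose space, which is what makes a guaranteed global optimum tractable.
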
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence (Volume: 44, Issue: 9, 01 September 2022)
Page(s): 5460 - 5471
Date of Publication: 31 May 2021

PubMed ID: 34057889
Publisher: IEEE
