Solving the PnL problem using the hidden variable method: an accurate and efficient solution

Abstract

This paper addresses the problem of estimating camera pose from 3D lines and their 2D projections, known as the perspective-n-line (PnL) problem. Although many successful solutions have been presented, it remains challenging to optimize computational complexity and accuracy at the same time. In our work, we parameterize the rotation using the Cayley–Gibbs–Rodriguez (CGR) parameterization and formulate the PnL problem as the problem of solving a polynomial system. Instead of the Gröbner basis method, which may encounter numerical problems, we adopt an efficient and numerically stable technique, the hidden variable method, to solve the polynomial system, and then polish the solution via the Gauss–Newton method. The performance of our method is evaluated on simulations and real images, and the results demonstrate that our method offers accuracy and precision comparable to or better than existing state-of-the-art methods, at a significantly lower computational cost.
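To fix notation, the sketch below shows the CGR (Gibbs vector) rotation parameterization referred to in the abstract. It is a minimal NumPy illustration under our own naming conventions, not the authors' implementation. Because every entry of the resulting rotation matrix is a quadratic polynomial in the CGR vector s divided by the common factor 1 + s^T s, multiplying the line-projection constraints through by that factor turns them into a polynomial system in the three unknown rotation parameters.

```python
import numpy as np

def skew(v):
    """Skew-symmetric cross-product matrix [v]_x, so that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def cgr_to_rotation(s):
    """Rotation matrix from a CGR (Gibbs) vector s = tan(theta/2) * axis."""
    s = np.asarray(s, dtype=float)
    return (((1.0 - s @ s) * np.eye(3) + 2.0 * np.outer(s, s) + 2.0 * skew(s))
            / (1.0 + s @ s))

# Quick check: s = [tan(pi/8), 0, 0] corresponds to a 45-degree rotation about the x-axis.
R = cgr_to_rotation([np.tan(np.pi / 8), 0.0, 0.0])
assert np.allclose(R @ R.T, np.eye(3)) and np.isclose(np.linalg.det(R), 1.0)
```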

Notes

  1. The C++ version will be published upon completion.

  2. https://sites.google.com/view/ping-wang-homepage.

  3. All source codes can be downloaded from https://sites.google.com/view/ping-wang-homepage.

  4. http://www.robots.ox.ac.uk/~vgg/data/.

Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 62001198).

Author information

Corresponding author

Correspondence to Ping Wang.

Ethics declarations

Conflict of interest

We declare that we have no financial or personal relationships with other people or organizations that could inappropriately influence our work, and no professional or other personal interest of any nature or kind in any product, service, and/or company that could be construed as influencing the position presented in, or the review of, this manuscript.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A: Coefficients of Eqs. (17)–(19)

$$\begin{aligned} e_{11}= & {} -h_{31}b-h_{32}, e_{12}=(h_{11}-h_{33})b+h_{12}-h_{34} \\ e_{13}= & {} -h_{35}b^2-h_{36}b-h_{37},e_{14}=h_{13}b+h_{14} \\ e_{15}= & {} h_{15}b^2+h_{16}b+h_{17} \\ e_{21}= & {} -h_{21}b-h_{22}, e_{22}=(h_{31}-h_{23})b+h_{32}-h_{24} \\ e_{23}= & {} -h_{25}b^2-h_{26}b-h_{27}, e_{24}=h_{33}b+h_{34} \\ e_{25}= & {} h_{35}b^2+h_{36}b+h_{37} \\ e_{31}= & {} (h_{31}^2-h_{11}h_{21})b^2+(2h_{31}h_{32}-h_{11}h_{22}-h_{12}h_{21})b\\&\quad + h_{32}^2-h_{12}h_{22}\\ e_{32}= & {} (2h_{31}h_{33}-h_{13}h_{21}-h_{11}h_{23})b^2+(2h_{32}h_{33}+2h_{31}h_{34}\\&\quad -h_{14}h_{21}-h_{13}h_{22}-h_{12}h_{23}-h_{11}h_{24})b+2h_{32}h_{34}\\&\quad -h_{14}h_{22}-h_{12}h_{24}\\ e_{33}= & {} (2h_{31}h_{35}-h_{15}h_{21}-h_{11}h_{25})b^3+(2h_{31}h_{36}-h_{12}h_{25}\\&\quad -h_{15}h_{22}-h_{16}h_{21}-h_{11}h_{26}+2h_{32}h_{35})b^2+(2h_{31}h_{37}\\&\quad -h_{12}h_{26}-h_{16}h_{22}-h_{17}h_{21}-h_{11}h_{27}+2h_{32}h_{36})b\\&\quad -h_{12}h_{27}-h_{17}h_{22}+2h_{32}h_{37}\\ e_{34}= & {} (h_{33}^2-h_{13}h_{23})b^2+(2h_{33}h_{34}-h_{14}h_{23}-h_{13}h_{24})b \\&\quad +h_{34}^2-h_{14}h_{24} \\ e_{35}= & {} (2h_{33}h_{35}-h_{15}h_{23}-h_{13}h_{25})b^3+(2h_{33}h_{36}-h_{14}h_{25}\\&\quad -h_{15}h_{24} - h_{16}h_{23} - h_{13}h_{26} +2h_{34}h_{35})b^2+ (2h_{33}h_{37}\\&\quad -h_{14}h_{26} - h_{16}h_{24} - h_{17}h_{23} - h_{13}h_{27}+ 2h_{34}h_{36})b\\&\quad -h_{14}h_{27} - h_{17}h_{24} + 2h_{34}h_{37}\\ e_{36}= & {} (2h_{35}h_{36} -h_{16}h_{25} - h_{15}h_{26})b^3+ (h_{36}^2 - h_{15}h_{27}\\&\quad -h_{16}h_{26}-h_{17}h_{25}+2h_{35}h_{37})b^2+(h_{35}^2 - h_{15}h_{25})b^4 \\&\quad + (2h_{36}h_{37} - h_{17}h_{26} -h_{16}h_{27})b + h_{37}^2 - h_{17}h_{27} \end{aligned}$$
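For illustration, these coefficients can be evaluated mechanically. The sketch below transcribes the first- and second-row coefficients e_{1j} and e_{2j} from the expressions above; it assumes (our convention, not the paper's code) that the h_{ij} are stored in a 3 x 7 NumPy array h with h[i-1, j-1] = h_{ij} and that b is the scalar hidden variable. The third-row coefficients e_{3j} follow from the longer expressions above in exactly the same way.

```python
import numpy as np

def coeffs_e12(h, b):
    """Evaluate e_{1j} and e_{2j} of Appendix A.

    h : (3, 7) array with h[i-1, j-1] = h_{ij};  b : scalar hidden variable.
    Returns the tuples (e11, ..., e15) and (e21, ..., e25).
    """
    H = lambda i, j: h[i - 1, j - 1]   # 1-based access matching the paper's subscripts

    e11 = -H(3, 1) * b - H(3, 2)
    e12 = (H(1, 1) - H(3, 3)) * b + H(1, 2) - H(3, 4)
    e13 = -H(3, 5) * b**2 - H(3, 6) * b - H(3, 7)
    e14 = H(1, 3) * b + H(1, 4)
    e15 = H(1, 5) * b**2 + H(1, 6) * b + H(1, 7)

    e21 = -H(2, 1) * b - H(2, 2)
    e22 = (H(3, 1) - H(2, 3)) * b + H(3, 2) - H(2, 4)
    e23 = -H(2, 5) * b**2 - H(2, 6) * b - H(2, 7)
    e24 = H(3, 3) * b + H(3, 4)
    e25 = H(3, 5) * b**2 + H(3, 6) * b + H(3, 7)

    return (e11, e12, e13, e14, e15), (e21, e22, e23, e24, e25)

# Example call with arbitrary placeholder values (for shape illustration only):
h = np.arange(1.0, 22.0).reshape(3, 7)
e1, e2 = coeffs_e12(h, b=0.5)
```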

Appendix B: Coefficients of Eqs. (20)–(22)

$$\begin{aligned} k_{11}= & {} e_{13}+e_{11}h_{12} + e_{14}h_{22} + e_{12}h_{32} + be_{11}h_{11} +be_{14}h_{21}\\&\quad + be_{12}h_{31}\\ k_{12}= & {} e_{15} + e_{11}h_{14} + e_{14}h_{24} + e_{12}h_{34} + be_{11}h_{13} + be_{14}h_{23}\\&\quad + be_{12}h_{33}\\ k_{13}= & {} e_{11}h_{17} + e_{14}h_{27} + e_{12}h_{37} + be_{11}h_{16} + be_{14}h_{26} \\&\quad +be_{12}h_{36} + b^2e_{11}h_{15} + b^2e_{14}h_{25} + b^2e_{12}h_{35}\\ k_{21}= & {} e_{23} + e_{21}h_{12} + e_{24}h_{22} + e_{22}h_{32} + be_{21}h_{11} + be_{24}h_{21} \\&\quad + be_{22}h_{31}\\ k_{22}= & {} e_{25} + e_{21}h_{14} + e_{24}h_{24} + e_{22}h_{34} + be_{21}h_{13} + be_{24}h_{23}\\&\quad + be_{22}h_{33} \\ k_{23}= & {} e_{21}h_{17} + e_{24}h_{27} + e_{22}h_{37} + be_{21}h_{16} + be_{24}h_{26} \\&\quad +be_{22}h_{36} + b^2e_{21}h_{15} + b^2e_{24}h_{25} + b^2e_{22}h_{35}\\ k_{31}= & {} e_{33} + e_{31}h_{12} + e_{34}h_{22} + e_{32}h_{32} + be_{31}h_{11} + be_{34}h_{21}\\&\quad + be_{32}h_{31}\\ k_{32}= & {} e_{35} + e_{31}h_{14} + e_{34}h_{24} +e_{32}h_{34} + be_{31}h_{13} + be_{34}h_{23}\\&\quad + be_{32}h_{33}\\ k_{33}= & {} e_{36} + e_{31}h_{17} + e_{34}h_{27} + e_{32}h_{37} + be_{31}h_{16} + be_{34}h_{26}\\&\quad + be_{32}h_{36} + b^2e_{31}h_{15} + b^2e_{34}h_{25} + b^2e_{32}h_{35} \end{aligned}$$
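Under the same assumptions (h as a 3 x 7 array, b the scalar hidden variable, and the e_{ij} of Appendix A available, here as a dictionary keyed by (i, j)), the 3 x 3 block of k_{ij} can be evaluated with the sketch below. The three rows share one structure, except that k_{13} and k_{23} have no leading e term while k_{33} starts with e_{36}.

```python
def coeffs_k(e, h, b):
    """Evaluate the 3x3 coefficient block k_{ij} of Appendix B.

    e : mapping with e[(i, j)] = e_{ij} from Appendix A
    h : (3, 7) array with h[i-1, j-1] = h_{ij};  b : scalar hidden variable.
    """
    E = lambda i, j: e[(i, j)]
    H = lambda i, j: h[i - 1, j - 1]

    def row(i, lead3):
        # k_{i1}, k_{i2}, k_{i3} share the same structure across rows i = 1, 2, 3;
        # lead3 is the leading e term of the third entry (zero for rows 1 and 2).
        k1 = E(i, 3) + E(i, 1) * H(1, 2) + E(i, 4) * H(2, 2) + E(i, 2) * H(3, 2) \
             + b * (E(i, 1) * H(1, 1) + E(i, 4) * H(2, 1) + E(i, 2) * H(3, 1))
        k2 = E(i, 5) + E(i, 1) * H(1, 4) + E(i, 4) * H(2, 4) + E(i, 2) * H(3, 4) \
             + b * (E(i, 1) * H(1, 3) + E(i, 4) * H(2, 3) + E(i, 2) * H(3, 3))
        k3 = lead3 + E(i, 1) * H(1, 7) + E(i, 4) * H(2, 7) + E(i, 2) * H(3, 7) \
             + b * (E(i, 1) * H(1, 6) + E(i, 4) * H(2, 6) + E(i, 2) * H(3, 6)) \
             + b**2 * (E(i, 1) * H(1, 5) + E(i, 4) * H(2, 5) + E(i, 2) * H(3, 5))
        return k1, k2, k3

    k11, k12, k13 = row(1, 0.0)
    k21, k22, k23 = row(2, 0.0)
    k31, k32, k33 = row(3, E(3, 6))
    return [[k11, k12, k13], [k21, k22, k23], [k31, k32, k33]]
```

If, as the 3 x 3 layout of the k_{ij} suggests, Eqs. (20)–(22) form a system that is linear in the remaining monomials with these b-dependent coefficients, the hidden variable step would set the determinant of [k_{ij}] to zero and solve the resulting univariate polynomial in b (for example with numpy.roots); since Eqs. (20)–(22) themselves are not reproduced here, this reading is only an assumption.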

About this article

Cite this article

Wang, P., Chou, Y., An, A. et al. Solving the PnL problem using the hidden variable method: an accurate and efficient solution. Vis Comput 38, 95–106 (2022). https://doi.org/10.1007/s00371-020-02004-2
