Vision-based relative pose determination of cooperative spacecraft in neutral buoyancy environment

  • Original Paper
  • Published in Machine Vision and Applications

Abstract

Neutral buoyancy systems simulate the microgravity environment by using the buoyancy of water to offset the gravity of test bodies. Functional verification of space robots in a neutral buoyancy system is therefore of great importance for ground tests. The relative pose determination of a spacecraft plays an essential role in the on-orbit operation of space robots. To meet the requirements of ground verification of on-orbit operations for space robots, this paper develops a vision-based system for determining the relative pose of a cooperative spacecraft in a neutral buoyancy environment. Cooperative markers and an underwater binocular vision system are designed for the pose determination, and a cooperative spacecraft model is built. A detection and recognition method based on topological characteristics is proposed for the cooperative marker. An underwater imaging model of the binocular camera is established, and its refraction parameters are calibrated. The marker points are measured with an underwater binocular 3D measurement algorithm. Furthermore, the pose of the cooperative spacecraft is determined using axisymmetric planar feature points. In addition, stable and reliable pose and velocity estimates are obtained after the data are further processed with a Kalman filter. Finally, experiments are carried out, and the results show that the proposed system achieves stable, reliable, and high-precision relative pose determination for cooperative spacecraft in a neutral buoyancy environment.

Acknowledgements

The authors would like to thank the National Natural Science Foundation of China (Grant No. 51775419) and the Program for Chang Jiang Scholars and Innovative Research Team in University (Grant No. IRT_15R54) for supporting this work.

Author information

Corresponding authors

Correspondence to Kedian Wang or Zhanxia Zhu.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

1.1 Derivation of the Kalman filter formulas

The state equation and observation equation of the linear system are defined by Eqs. (12) and (13), respectively.

$$ x_{k} = A \cdot x_{k - 1} + w_{k} $$
(12)
$$ z_{k} = H \cdot x_{k} + v_{k} $$
(13)

where \( x_{k} \) is the state vector, \( z_{k} \) is the observation vector, \( A \) is the state transition matrix, \( H \) is the observation matrix, \( w_{k} \) is the system noise vector, and \( v_{k} \) is the observation noise vector. \( w_{k} \) and \( v_{k} \) are assumed to be mutually uncorrelated, zero-mean Gaussian white noise vectors with symmetric, positive-definite covariance matrices, which satisfy Eqs. (14), (15), and (16).

$$ E\left[ {w_{k} w_{i}^{T} } \right] = \left\{ {\begin{array}{*{20}l} {0,} \hfill & {i \ne k} \hfill \\ {Q_{k} ,} \hfill & {i = k} \hfill \\ \end{array} } \right. $$
(14)
$$ E\left[ {v_{k} v_{i}^{T} } \right] = \left\{ {\begin{array}{*{20}l} {0,} \hfill & {i \ne k} \hfill \\ {R_{k} ,} \hfill & {i = k} \hfill \\ \end{array} } \right. $$
(15)
$$ E\left[ {w_{k} v_{i}^{T} } \right] = 0 $$
(16)
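To make the model concrete, the sketch below sets up one possible instance of Eqs. (12)–(15) in Python/NumPy. It assumes a single-axis constant-velocity state (position and velocity) observed through its position only; the paper does not specify the state dimensions or noise levels here, so all values are purely illustrative.

```python
import numpy as np

dt = 0.1  # sampling interval in seconds (illustrative value)

# State x = [position, velocity]^T for one axis (assumed constant-velocity model).
A = np.array([[1.0, dt],
              [0.0, 1.0]])   # state transition matrix A, Eq. (12)

# Assume only the position component is observed.
H = np.array([[1.0, 0.0]])   # observation matrix H, Eq. (13)

Q = np.diag([1e-4, 1e-3])    # process noise covariance Q_k, Eq. (14)
R = np.array([[1e-2]])       # observation noise covariance R_k, Eq. (15)
```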

When the Kalman filter is applied, the object motion state at time \( k \) is first predicted by Eq. (17), giving the prior estimate \( x_{k}^{'} \).

$$ x_{k}^{'} = A \cdot x_{k - 1} $$
(17)
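As a minimal sketch, assuming the illustrative model above, the prediction of Eq. (17) is a single matrix-vector product applied to the previous state estimate:

```python
def predict_state(A, x_prev):
    """Prior estimate x'_k = A * x_{k-1}, Eq. (17)."""
    return A @ x_prev
```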

The prior estimation error is:

$$ e_{k}^{'} = x_{k} - x_{k}^{'} $$
(18)

The covariance matrix of the prior estimation error is:

$$ P_{k}^{'} = E\left[ {e_{k}^{'} \cdot e_{k}^{'T} } \right] = E\left[ {\left( {x_{k} - x_{k}^{'} } \right)\left( {x_{k} - x_{k}^{'} } \right)^{T} } \right] $$
(19)

The prior estimate \( x_{k}^{'} \) is then corrected using the observation \( z_{k} \), as written in Eq. (20):

$$ \overline{x}_{k} = x_{k}^{'} + K_{k} \left( {z_{k} - H \cdot x_{k}^{'} } \right) $$
(20)

where \( \overline{x}_{k} \) is the corrected value of the prior estimate \( x_{k}^{'} \), called the posterior estimate, and \( K_{k} \) is the gain matrix of the Kalman filter. Equations (21) and (22) give the posterior estimation error and its covariance matrix, respectively.

$$ e_{k} = x_{k} - \overline{x}_{k} $$
(21)
$$ P_{k} = E\left[ {e_{k} \cdot e_{k}^{T} } \right] = E\left[ {\left( {x_{k} - \overline{x}_{k} } \right)\left( {x_{k} - \overline{x}_{k} } \right)^{T} } \right] $$
(22)

The objective is to find the \( K_{k} \) that minimizes the error between the posterior estimate and the actual value (i.e., the posterior estimation error). Substituting Eq. (13) into Eq. (20), and then substituting the result into Eq. (22), we get:

$$\begin{aligned} P_{k} &= E\left\{ \left[ {\left( {x_{k} - x_{k}^{'} } \right) - K_{k} \left( {H \cdot x_{k} + v_{k} - H \cdot x_{k}^{'} } \right)} \right]\right. \\ &\left.\quad\times\left[ {\left( {x_{k} - x_{k}^{'} } \right) - K_{k} \left( {H \cdot x_{k} + v_{k} - H \cdot x_{k}^{'} } \right)} \right]^{T} \right\} \end{aligned} $$
(23)

where \( \left( {x_{k} - x_{k}^{'} } \right) \) is the prior estimation error, which is uncorrelated with the observation noise \( v_{k} \), so:

$$ P_{k} = \left( {I - K_{k} H} \right)P_{k}^{'} \left( {I - K_{k} H} \right)^{T} + K_{k} R_{k} K_{k}^{T} $$
(24)

Thus, the \( K_{k} \) that minimizes this error can be written as Eq. (25).

$$ K_{k} = P_{k}^{'} H^{T} \left( {H \cdot P_{k}^{'} H^{T} + R_{k} } \right)^{ - 1} $$
(25)

By substituting Eq. (25) into Eq. (24), we obtain:

$$ P_{k} = \left( {I - K_{k} H} \right)P_{k}^{'} $$
(26)
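Continuing the Python/NumPy sketch above (and again assuming the illustrative model rather than the paper's actual configuration), the correction step implied by Eqs. (25), (20), and (26) can be written as:

```python
def kalman_update(x_prior, P_prior, z, H, R):
    """Correct the prior estimate with the observation z."""
    S = H @ P_prior @ H.T + R                  # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)       # Kalman gain, Eq. (25)
    x_post = x_prior + K @ (z - H @ x_prior)   # posterior estimate, Eq. (20)
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior  # posterior covariance, Eq. (26)
    return x_post, P_post
```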

Since \( w_{k - 1} \) describes the process noise accumulated from time \( k - 1 \) to \( k \), \( x_{k}^{'} \) is predicted from the previous posterior estimate by Eq. (27).

$$ x_{k}^{'} = A \cdot \overline{x}_{k - 1} $$
(27)

Also, because \( w_{k - 1} \) and \( e_{k - 1} \) are uncorrelated, the covariance matrix of the prior estimation error corresponding to \( x_{k}^{'} \) is written as Eq. (28). With \( P_{k}^{'} \), the gain \( K_{k} \) can be computed, and \( \overline{x}_{k} \) can then be estimated.

$$ P_{k}^{'} = E\left[ {\left( {A \cdot e_{k - 1} + w_{k - 1} } \right)\left( {A \cdot e_{k - 1} + w_{k - 1} } \right)^{T} } \right] = A \cdot P_{k - 1} \cdot A^{T} + Q_{k - 1} $$
(28)
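Putting the prediction of Eqs. (17) and (28) together with the correction above gives the complete recursion. The loop below is a hedged sketch that smooths synthetic, illustrative position measurements (not the paper's underwater vision data), reusing the matrices and functions defined in the earlier snippets:

```python
def kalman_step(x_post, P_post, z, A, H, Q, R):
    """One full iteration: predict (Eqs. 17, 28), then correct (Eqs. 25, 20, 26)."""
    x_prior = A @ x_post              # Eq. (17)
    P_prior = A @ P_post @ A.T + Q    # Eq. (28)
    return kalman_update(x_prior, P_prior, z, H, R)

# Illustrative usage on synthetic noisy position data.
rng = np.random.default_rng(0)
true_pos = 0.05 * np.arange(100) * dt                   # target drifting at 0.05 m/s
measurements = true_pos + rng.normal(0.0, 0.1, size=100)

x_est, P_est = np.array([0.0, 0.0]), np.eye(2)          # initial guesses
for z in measurements:
    x_est, P_est = kalman_step(x_est, P_est, np.array([z]), A, H, Q, R)
print(x_est)                                            # filtered position and velocity
```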

About this article

Cite this article

Jia, G., Min, C., Wang, K. et al. Vision-based relative pose determination of cooperative spacecraft in neutral buoyancy environment. Machine Vision and Applications 32, 19 (2021). https://doi.org/10.1007/s00138-020-01137-7

Keywords

Navigation