A new pose estimation method for non-cooperative spacecraft based on point cloud

Zhiming Chen (College of Astronautics, Nanjing University of Aeronautics and Astronautics, Nanjing, China)
Lei Li (College of Astronautics, Nanjing University of Aeronautics and Astronautics, Nanjing, China)
Yunhua Wu (College of Astronautics, Nanjing University of Aeronautics and Astronautics, Nanjing, China)
Bing Hua (College of Astronautics, Nanjing University of Aeronautics and Astronautics, Nanjing, China)
Kang Niu (Shanghai Institute of Mechanical and Electrical Engineering, Shanghai, China)

International Journal of Intelligent Computing and Cybernetics

ISSN: 1756-378X

Article publication date: 4 October 2018

Issue publication date: 22 February 2019

Abstract

Purpose

On-orbit servicing is one of the key technologies for space manipulation activities such as spacecraft life extension, capture of failed spacecraft and on-orbit debris removal. Failed satellites, space debris and hostile spacecraft are almost all non-cooperative targets. Relatively accurate pose estimation is critical to such space operations, but it is also a recognized technical difficulty because no prior information about a non-cooperative target is available. With the rapid development of laser radar, laser scanning equipment is increasingly used for the measurement of non-cooperative targets. It is therefore necessary to develop a new pose estimation method for non-cooperative targets based on 3D point clouds. The paper aims to discuss these issues.

Design/methodology/approach

In this paper, a method based on the inherent characteristics of a spacecraft is proposed for estimating the pose (position and attitude) of a spatial non-cooperative target. First, the acquired point cloud is preprocessed to reduce noise and improve data quality. Second, according to the features of a satellite, a recognition system for non-cooperative measurement is designed; components that are common in satellite configurations are chosen as the recognized objects. Finally, based on the identified objects, the ICP algorithm is used to calculate the pose between two point cloud frames captured at different times, completing the pose estimation.
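To make the three steps concrete, the following is a minimal sketch of the preprocessing and frame-to-frame ICP stages, assuming the Open3D library (the paper does not name an implementation). The file names and parameter values are illustrative assumptions, and the satellite-specific component recognition of step 2 is omitted here.

```python
# Minimal sketch of the described pipeline using Open3D -- an assumed
# library choice, not the paper's implementation. Component recognition
# (step 2) is skipped; the full preprocessed clouds are registered with
# point-to-point ICP.
import numpy as np
import open3d as o3d

def preprocess(pcd, voxel_size=0.02):
    """Step 1: denoise the raw scan and downsample it to a uniform density."""
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    return pcd.voxel_down_sample(voxel_size)

def estimate_relative_pose(source, target, max_dist=0.05):
    """Step 3: ICP between two frames -> 4x4 homogeneous transform (R, t)."""
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation

# Usage: two consecutive laser-radar frames (hypothetical file names).
frame_t0 = preprocess(o3d.io.read_point_cloud("scan_t0.pcd"))
frame_t1 = preprocess(o3d.io.read_point_cloud("scan_t1.pcd"))
T = estimate_relative_pose(frame_t0, frame_t1)
print("Relative pose between frames:\n", T)
```

The voxel size and ICP correspondence distance would in practice be tuned to the laser radar's resolution and the target's scale.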

Findings

By reducing the number of matching points, the new method enhances matching speed and improves the accuracy of pose estimation compared with traditional methods. The recognition of components on a non-cooperative spacecraft directly contributes to space docking, on-orbit capture and relative navigation. A sketch of this reduced matching is shown below.
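As a hypothetical continuation of the sketch above, restricting ICP to the points of one recognized component illustrates how fewer matching points reduce per-iteration cost. A bounding-box crop stands in for the paper's recognition system, and the bounds are illustrative assumptions.

```python
# Continues the earlier sketch (reuses frame_t0, frame_t1 and
# estimate_relative_pose). A box crop approximates component recognition;
# the bounds are assumed values, not taken from the paper.
bbox = o3d.geometry.AxisAlignedBoundingBox(min_bound=(-0.5, -0.5, 0.0),
                                           max_bound=(0.5, 0.5, 1.0))
component_t0 = frame_t0.crop(bbox)
component_t1 = frame_t1.crop(bbox)
# Fewer points per correspondence search -> faster ICP iterations.
T_component = estimate_relative_pose(component_t0, component_t1)
```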

Research limitations/implications

Limited by the measurement range of the laser radar, this paper considers pose estimation for non-cooperative spacecraft only at close range.

Practical implications

The pose estimation method for non-cooperative spacecraft in this paper is mainly applicable to close-proximity space operations, such as the final rendezvous phase or the ultra-close approach phase of target capture. The system can recognize the components to be captured and provide the relative pose of the non-cooperative spacecraft. Compared with traditional single-component recognition and overall matching methods, the proposed method is more robust when the laser radar scan is incomplete or components are occluded.

Originality/value

This paper introduces a new pose estimation method for non-cooperative spacecraft based on point clouds. The experimental results show that the proposed method can effectively identify the features of non-cooperative targets and track their position and attitude. The method is robust to noise and greatly improves the speed of pose estimation while maintaining accuracy.

Citation

Chen, Z., Li, L., Wu, Y., Hua, B. and Niu, K. (2019), "A new pose estimation method for non-cooperative spacecraft based on point cloud", International Journal of Intelligent Computing and Cybernetics, Vol. 12 No. 1, pp. 23-41. https://doi.org/10.1108/IJICC-03-2018-0036

Publisher: Emerald Publishing Limited

Copyright © 2018, Emerald Publishing Limited