
Terrain Relative Navigation

In this study, we propose a method for autonomously guiding a spacecraft to a flat spot where landing is possible. By introducing this unprecedented level of autonomy, we aim to achieve a breakthrough in landing on very distant celestial bodies.


Small-body exploration has received worldwide attention in recent years. In these missions, high-accuracy optical navigation is essential for landing and rendezvous. Terrain Relative Navigation (TRN), which estimates deviations by comparing nominal terrain information with observed terrain information, is therefore often used. In small-body exploration in particular, sufficient observations of the target body to build a shape model are possible after arrival, so the shape model of the target body is used to generate the nominal terrain information.
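As a toy illustration of the comparison at the heart of TRN (a minimal sketch, not the authors' implementation), zero-mean normalized cross-correlation is one common score for locating a nominal terrain patch inside a captured image; the synthetic terrain array and patch offset below are made up for the demo:

```python
import numpy as np

def ncc_match(image, template):
    """Locate `template` in `image` by zero-mean normalized cross-correlation.

    Returns the (row, col) of the best-matching top-left corner. This is a
    brute-force reference implementation; real TRN pipelines use far faster
    matchers.
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * t_norm
            if denom == 0:
                continue  # flat window: correlation undefined, skip it
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Synthetic "terrain" image and a patch cut out at a known offset.
rng = np.random.default_rng(1)
terrain = rng.random((60, 60))
patch = terrain[20:30, 35:45]
print(ncc_match(terrain, patch))  # -> (20, 35)
```

The deviation between the matched position and the position predicted from the nominal state is exactly the quantity TRN feeds back to navigation.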
We have recently been studying an autonomous optical navigation method based on TRN. First, a reference image is generated by rendering the shape model from the nominal position; for each pixel of the reference image, the corresponding three-dimensional position on the shape model is stored in addition to the luminance. Second, multiple small images extracted from the reference image are located in a captured image by template matching, which establishes correspondences between three-dimensional positions on the shape model and the matched positions in the captured image. Finally, the actual position of the spacecraft is determined by estimating the perspective projection that maps the three-dimensional shape onto the two-dimensional image plane. In this way, the three-dimensional position of the spacecraft can be estimated directly and with high accuracy by utilizing the shape model. The estimation accuracy and computation time of the proposed method were evaluated against other methods; as a result, an estimation accuracy within several pixels of image resolution is achieved in real time. We believe the proposed method will be a key technology for higher-accuracy landings on small bodies in the future.
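The final step, recovering the spacecraft position from the pixel/3-D correspondences, can be sketched with the Direct Linear Transform, one standard way to estimate a perspective projection. The intrinsics, camera pose, and landmark coordinates below are illustrative assumptions, not values from this study:

```python
import numpy as np

def estimate_camera_position(points_3d, points_2d):
    """Recover the camera (spacecraft) position from 2-D/3-D correspondences.

    Solves for the 3x4 projection matrix P with the Direct Linear Transform,
    then extracts the camera centre C as the null vector of P (P @ [C; 1] = 0).
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)      # least-squares null vector of A gives P
    P = Vt[-1].reshape(3, 4)
    _, _, Vt = np.linalg.svd(P)      # null vector of P is the camera centre
    C = Vt[-1]
    return C[:3] / C[3]              # de-homogenize

# Synthetic check: project known landmarks with a known pose, then recover
# the camera position (all numbers are made up for the demo).
rng = np.random.default_rng(0)
K = np.array([[800.0,   0.0, 320.0],   # assumed pinhole intrinsics
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
C_true = np.array([10.0, -5.0, -50.0])           # "spacecraft" position
P_true = K @ np.hstack([np.eye(3), -C_true[:, None]])
pts3d = rng.uniform(-20.0, 20.0, size=(12, 3))   # landmarks on the surface
proj = (P_true @ np.hstack([pts3d, np.ones((12, 1))]).T).T
pts2d = proj[:, :2] / proj[:, 2:]
print(estimate_camera_position(pts3d, pts2d))    # close to [10, -5, -50]
```

With noise-free correspondences the DLT recovers the projection exactly up to scale; in practice a robust solver (e.g. RANSAC around a PnP estimate) would be used to reject template-matching outliers.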