
Robust, fast and accurate vision-based localization of a cooperative target used for space robotic arm

Wen, Zhuoman and Wang, Yanjie and Luo, Jun and Kuijper, Arjan and Di, Nan and Jin, Minghe (2017):
Robust, fast and accurate vision-based localization of a cooperative target used for space robotic arm.
In: Acta Astronautica, 136, pp. 101-114. ISSN 0094-5765,
DOI: 10.1016/j.actaastro.2017.03.008,
[Article]

Abstract

When a space robotic arm deploys a payload, the pose between the cooperative target fixed on the payload and the hand-eye camera installed on the arm is usually calculated in real time. A robust, high-precision visual localization method for such a cooperative target is proposed. A target combining a circle, a line and dots as markers is designed to guarantee high detection rates. Given an image, single-pixel-width smooth edges are drawn by a novel linking method. Circles are then quickly extracted using isophote curvature. Around each circle, a square boundary in a pre-calculated proportion to the circle radius is set. Within this boundary, the target is identified if a certain number of lines exists. Based on the circle, the lines, and the target foreground and background intensities, the markers are localized. Finally, the target pose is calculated by the Point-3-Perspective algorithm. The algorithm processes 8 frames per second with the target distance ranging from 0.3 m to 1.5 m. It generated high-precision poses for more than 97.5% of over 100,000 images regardless of camera background, target pose, illumination and motion blur. At 0.3 m, the rotation and translation errors were less than 0.015° and 0.2 mm, respectively. The proposed algorithm is well suited to real-time visual measurement requiring high precision in aerospace.
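For orientation, the pipeline sketched in the abstract (edge detection, circle extraction, line check inside a square boundary, marker localization, pose from 2D-3D correspondences) can be roughed out with off-the-shelf OpenCV primitives. This is a minimal sketch only, not the authors' method: Canny, HoughCircles, HoughLinesP and solvePnP stand in for the paper's novel edge linking, isophote-curvature circle extraction, intensity-based marker localization and Point-3-Perspective solver, and all thresholds, the 3x search boundary and the feature-to-model correspondence are illustrative assumptions.

import cv2
import numpy as np

def locate_target(gray, marker_points_3d, camera_matrix, dist_coeffs):
    """Return (rvec, tvec) of the cooperative target, or None if not found.

    marker_points_3d: 4x3 array of target marker coordinates in the target
    frame, ordered to correspond to the four image points assembled in step 5.
    camera_matrix / dist_coeffs: calibrated intrinsics of the hand-eye camera.
    """
    # 1. Edge map (stand-in for the paper's single-pixel-width edge linking).
    edges = cv2.Canny(gray, 50, 150)

    # 2. Candidate circles (stand-in for isophote-curvature circle extraction).
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=40,
                               param1=150, param2=40, minRadius=10, maxRadius=200)
    if circles is None:
        return None

    for x, y, r in np.round(circles[0]).astype(int):
        # 3. Square search boundary around the circle, proportional to its
        #    radius (the proportion is pre-calculated in the paper; 3x assumed).
        half = 3 * r
        x0, y0 = max(x - half, 0), max(y - half, 0)
        roi = edges[y0:y + half, x0:x + half]

        # 4. The target is identified only if enough lines lie in the boundary.
        lines = cv2.HoughLinesP(roi, 1, np.pi / 180, threshold=40,
                                minLineLength=r, maxLineGap=5)
        if lines is None or len(lines) < 2:
            continue

        # 5. Assemble four 2D marker points: circle centre, both endpoints of
        #    the longest line and one endpoint of the second longest (the paper
        #    instead localizes circle/line/dot markers from foreground and
        #    background intensities; matching them to the 3D model is
        #    simplified away here).
        segs = sorted(lines[:, 0, :],
                      key=lambda s: (s[2] - s[0]) ** 2 + (s[3] - s[1]) ** 2,
                      reverse=True)
        ax, ay, bx, by = segs[0]
        cx, cy, _, _ = segs[1]
        image_points = np.array([[x, y],
                                 [ax + x0, ay + y0],
                                 [bx + x0, by + y0],
                                 [cx + x0, cy + y0]], dtype=np.float32)

        # 6. Pose from the 3D-2D correspondences; OpenCV's P3P solver is used
        #    here in place of the paper's Point-3-Perspective step.
        ok, rvec, tvec = cv2.solvePnP(marker_points_3d.astype(np.float32),
                                      image_points, camera_matrix, dist_coeffs,
                                      flags=cv2.SOLVEPNP_P3P)
        if ok:
            return rvec, tvec
    return None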

Item Type: Article
Published: 2017
Creators: Wen, Zhuoman and Wang, Yanjie and Luo, Jun and Kuijper, Arjan and Di, Nan and Jin, Minghe
Title: Robust, fast and accurate vision-based localization of a cooperative target used for space robotic arm
Language: English
Journal or Publication Title: Acta Astronautica
Journal volume: 136
Uncontrolled Keywords: Edge detection, Marker localization, Robotics applications, Measurements
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Mathematical and Applied Visual Computing
Date Deposited: 04 May 2020 12:51
DOI: 10.1016/j.actaastro.2017.03.008
Official URL: https://doi.org/10.1016/j.actaastro.2017.03.008