Vision-Based Navigation

Community events
19 June 2023
from 13:45 to 17:30
  • EN
  • Accessible by videoconference
  • Public
  • Olivier DUBOIS-MATRA, Engineer

Recent innovations have made possible the use of autonomous on-board vision-based systems for space missions. Cameras are relatively cheap, light and low-power sensors that can provide a large amount of data for estimating the navigation state of a spacecraft. The drawback, of course, is that this information has to be extracted through complex algorithms, often running on high-performance hardware and requiring a large number of images, either synthetic or obtained in the laboratory or outdoors, for training and testing. This afternoon seminar will present several applications of vision-based navigation for orbital or landing missions.

Program:

  • 13:45 - 14:00 : Welcome and introduction
  • 14:00 - 14:30 : Relative Pose Estimation Using Monocular Vision for Spacecraft Proximity Operations - Maruthi Akella (University of Texas at Austin)

The problem of estimating relative pose for uncooperative space objects has garnered great interest, especially within applications such as on-orbit assembly, debris retrieval, and satellite servicing. In this talk, we summarize our recent advances in an end-to-end pose estimation and filtering pipeline using monocular camera images for space systems applications. The algorithm pipeline consists of three major components: 1) a set of neural networks to perform keypoint regression, 2) a pose estimation component, implementing both nonlinear least-squares and perspective-n-point solvers, and 3) a full-pose tracking component, implementing a multiplicative extended Kalman filter. While this overall algorithm stack is designed to be a general-purpose solution, its development was motivated and driven by the size, weight, power, and cost (SWaP-C) requirements for 3U form-factor CubeSats such as the NASA Seeker program. A combination of real and simulated results is presented to evaluate the various neural network components, and simulated time-series data are reported to evaluate the real-time performance of the full pipeline using flight-like hardware components.
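The abstract does not disclose implementation details, but the middle stage of such a pipeline can be illustrated with a short sketch. Assuming the keypoint-regression networks (stage 1) have already produced 2D detections of known 3D model points, a perspective-n-point solver recovers the relative pose; the sketch below simulates the detections and uses OpenCV's solvePnP. All model points, intrinsics and noise levels are hypothetical.

```python
import numpy as np
import cv2

# Hypothetical 3D keypoints of the target spacecraft (metres, target body frame).
model_points = np.array([
    [ 0.5,  0.5, 0.0],
    [-0.5,  0.5, 0.0],
    [-0.5, -0.5, 0.0],
    [ 0.5, -0.5, 0.0],
    [ 0.0,  0.0, 0.8],
    [ 0.3, -0.2, 0.4],
], dtype=np.float64)

# Assumed pinhole intrinsics (focal lengths and principal point in pixels).
K = np.array([[1000.0,    0.0, 512.0],
              [   0.0, 1000.0, 512.0],
              [   0.0,    0.0,   1.0]])
dist = np.zeros(5)  # no lens distortion in this sketch

# Stand-in for stage 1 (keypoint regression): project the model through a
# known ground-truth pose and perturb with ~0.5 px detection noise.
rvec_true = np.array([0.1, -0.2, 0.05])   # axis-angle rotation
tvec_true = np.array([0.2, -0.1, 5.0])    # translation, metres
image_points, _ = cv2.projectPoints(model_points, rvec_true, tvec_true, K, dist)
image_points = image_points.reshape(-1, 2)
image_points += np.random.normal(0.0, 0.5, image_points.shape)

# Stage 2: a perspective-n-point solve recovers the relative pose from the
# 2D-3D correspondences.
ok, rvec, tvec = cv2.solvePnP(model_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
print("converged:", ok)
print("estimated translation [m]:", tvec.ravel())
print("true translation      [m]:", tvec_true)
# Stage 3 (not shown): the per-frame poses would be passed as measurements
# to a multiplicative extended Kalman filter for full-pose tracking.
```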

  • 14:30 - 15:00 : Monocular Vision-Based Pose Estimation of Uncooperative Spacecraft - Lorenzo Pasqualetto Cassinis (ClearSpace)

The estimation of the relative pose of an uncooperative target with respect to a servicer spacecraft is a critical navigation task during proximity operations such as fly-around, inspection and close approach in Active Debris Removal (ADR) and On-Orbit Servicing (OOS) missions. This research analyzes the challenges of a monocular vision-based relative pose estimation system, in which a single monocular camera is used as the navigation sensor due to its reduced mass, power consumption and system complexity compared to systems based on active sensors or stereo cameras. Building on the characteristics and limitations of existing methods, this research investigates the applicability of a Convolutional Neural Network (CNN)-based relative pose estimation method and its system interfaces with a navigation filter, with special focus given to the on-ground validation of the proposed navigation pipeline with realistic laboratory images of target spacecraft. Furthermore, the performance analysis of the adopted CNN-based system is extended by comparing the pose estimation accuracy and inference time on a GPU and on a low-power Myriad X processor.
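As a rough illustration of the kind of inference-time comparison described above, the snippet below times single-frame inference of a toy keypoint CNN on whatever devices PyTorch exposes locally (CPU, and CUDA if present). It is only a stand-in for the actual benchmark: the real network is not public, and deployment on a Myriad X goes through a dedicated vendor toolchain rather than PyTorch.

```python
import time
import torch
import torch.nn as nn

# Toy stand-in for a keypoint-regression CNN (the real architecture and
# weights are not given in the abstract).
class TinyKeypointNet(nn.Module):
    def __init__(self, n_keypoints: int = 11):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2 * n_keypoints)  # (u, v) per keypoint

    def forward(self, x):
        return self.head(self.backbone(x).flatten(1)).view(-1, 2)

def mean_latency_ms(device: str, n_runs: int = 50) -> float:
    """Average single-image inference time on the given device."""
    model = TinyKeypointNet().to(device).eval()
    img = torch.randn(1, 1, 512, 512, device=device)  # one grayscale frame
    with torch.no_grad():
        for _ in range(5):            # warm-up iterations
            model(img)
        if device == "cuda":
            torch.cuda.synchronize()  # flush queued GPU work before timing
        t0 = time.perf_counter()
        for _ in range(n_runs):
            model(img)
        if device == "cuda":
            torch.cuda.synchronize()
    return 1000.0 * (time.perf_counter() - t0) / n_runs

for dev in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []):
    print(f"{dev}: {mean_latency_ms(dev):.2f} ms/frame")
```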

  • 15:00 - 15:30 : MSR-ERO Rendezvous Navigation Sensors and Image Processing - Keyvan Kanani (Airbus)

The ESA/NASA Mars Sample Return (MSR) campaign aims to return samples of Martian material to Earth. The samples will be collected by the NASA-provided Perseverance rover and assembled in the Orbiting Sample (OS) container, which will then be injected into Mars orbit by the NASA-provided Mars Ascent Vehicle. The ESA-provided Earth Return Orbiter (ERO) will then autonomously detect and rendezvous with the OS in low Mars orbit, capture it, seal it, and safely bring it back to Earth. Airbus Defence and Space, under ESA contract, is designing and developing the ERO spacecraft and, in particular, its vision-based GNC system to be used during rendezvous (RDV).
The GNC system includes two types of vision sensors, a Narrow Angle Camera (NAC) and a Light Detection and Ranging device (LiDAR), each integrating its own image processing to provide the ERO navigation function with measurements of the OS relative position.

From the navigation standpoint, the MSR-ERO rendezvous is a non-cooperative rendezvous scenario with a target (the OS) characterized by an uncontrolled, fast tumbling motion. The rendezvous has been divided into a far-range phase (from ~50 km to ~500 m) and a close-range phase (from ~500 m to capture). In the far range, the vision system is only required to provide the OS relative line of sight (LoS), using the NAC, provided by Sodern, as the main sensor. The NAC has a 4.5 deg field of view (FoV) and a 1 Mpx detector (Faintstar2), and integrates a centre-of-brightness algorithm robust to proton impacts on the detector. In the close range, the main sensor is a scanning LiDAR provided by Jena-Optronik, with a FoV adaptable from 0.5° to 40° and a maximum scan frequency of 2 Hz, which integrates a 3D barycentring processing to derive the OS position from the 3D point cloud measured by the LiDAR.
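Neither the NAC's radiation-robust centre-of-brightness algorithm nor the LiDAR's barycentring processing is specified in detail here; the sketch below shows minimal versions of both ideas, with a median filter standing in for the proton-impact rejection. All image sizes, thresholds and point-cloud parameters are made up for illustration.

```python
import numpy as np
import cv2

def centre_of_brightness(img: np.ndarray, threshold: int = 40) -> tuple:
    """Intensity-weighted centroid with simple proton-hit suppression.

    A 3x3 median filter removes isolated hot pixels: a crude stand-in for
    the radiation-robust logic of the flight algorithm.
    """
    clean = cv2.medianBlur(img, 3)          # isolated spikes vanish here
    mask = clean > threshold                # keep only the bright target
    v, u = np.nonzero(mask)                 # row (v) / column (u) indices
    w = clean[mask].astype(np.float64)      # intensity weights
    return (u @ w / w.sum(), v @ w / w.sum())

def lidar_barycentre(points: np.ndarray) -> np.ndarray:
    """3D barycentre of a LiDAR point cloud (N x 3, sensor frame)."""
    return points.mean(axis=0)

# Synthetic far-range frame: dark sky, a small bright blob, one proton hit.
img = np.zeros((256, 256), dtype=np.uint8)
cv2.circle(img, (180, 100), 4, 200, -1)     # the OS, ~4 px radius
img[30, 50] = 255                           # single-pixel proton impact
print("CoB (u, v):", centre_of_brightness(img))

# Synthetic close-range point cloud scattered around the OS position.
cloud = np.array([4.9, 0.1, 0.0]) + 0.05 * np.random.randn(500, 3)
print("barycentre [m]:", lidar_barycentre(cloud))
```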

This paper focuses on the vision sensors of the MSR-ERO spacecraft. The first section describes the MSR mission, including its requirements, and recalls the vision-sensor trade-off performed in the early phases of the MSR-ERO project. The second section presents the vision-sensor architecture within the ERO system and the sensors' operational concept (on board/on ground), and the third section describes the selected sensors and their image processing in more detail. The conclusions provide an overview of the sensors' development plans, together with preliminary performance test results.

  • 15:30 - 15:45 : Coffee break
  • 15:45 - 16:30 : CNES Vision-Based Navigation technologies: from rovers to asteroid proximity operations - Aurélia Bourgeaux & Irene Valenzuela (CNES)

CNES has been involved in the development of vision-based navigation (VBN) techniques since the mid-90s, initially mostly in relation to planetary exploration.

This presentation will give an overview of past and on-going VBN activities at CNES, including the different validation means. It will then focus on two specific domains: the autonomous navigation solution integrated in the ExoMars rover, and a benchmark of different techniques to compute the centroid line of sight in an asteroid proximity operations scenario.
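As an illustration of what such a centroid line-of-sight benchmark might compare, the sketch below runs two simple centroiding techniques (binary and intensity-weighted) on a synthetic, partially illuminated asteroid image and converts each pixel centroid into a camera-frame line-of-sight unit vector through an assumed pinhole model. Everything here (intrinsics, image, thresholds) is hypothetical.

```python
import numpy as np

# Assumed pinhole intrinsics: focal lengths and principal point in pixels.
FX = FY = 1200.0
CX = CY = 512.0

def binary_centroid(img, thr):
    """Geometric centroid of all pixels above the threshold."""
    v, u = np.nonzero(img > thr)
    return u.mean(), v.mean()

def weighted_centroid(img, thr):
    """Intensity-weighted centroid of pixels above the threshold."""
    v, u = np.nonzero(img > thr)
    w = img[v, u].astype(float)
    return (u * w).sum() / w.sum(), (v * w).sum() / w.sum()

def centroid_to_los(u, v):
    """Pixel centroid -> unit line-of-sight vector in the camera frame."""
    ray = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])
    return ray / np.linalg.norm(ray)

# Synthetic asteroid: elliptical disc, brighter at centre, left side shadowed.
img = np.zeros((1024, 1024))
vv, uu = np.mgrid[0:1024, 0:1024]
r = np.hypot((uu - 600) / 80.0, (vv - 400) / 60.0)
disc = r < 1.0
img[disc] = 180.0 * (1.0 - 0.5 * r[disc])
img *= np.clip((uu - 520) / 160.0, 0.0, 1.0)   # terminator: left limb dark

for name, fn in [("binary", binary_centroid), ("weighted", weighted_centroid)]:
    u, v = fn(img, thr=20.0)
    print(f"{name:8s} centroid=({u:7.2f}, {v:7.2f})  LoS={centroid_to_los(u, v)}")
```

With the true disc centre at (600, 400), both estimators are pulled toward the lit limb by different amounts; a benchmark like the one presented would quantify such biases against ground truth.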

  • 16:30 - 17:00 : ESA Technology Developments in Vision-based Navigation - Olivier Dubois-Matra (ESA)

The European Space Agency (ESA) has been developing onboard Guidance, Navigation and Control (GNC) technologies to support space activities in Earth orbit and beyond. One of the areas of development has been the use of vision-based systems to improve the performance and the autonomy of the navigation function. This presentation will focus on two classes of missions enabled by vision-based navigation (VBN): rendezvous (in Earth or planetary orbit, with collaborative or non-collaborative targets) and precision descent and landing on planetary surfaces. It will present an overview of both recently completed and on-going ESA technology activities raising the Technology Readiness Level (TRL) of VBN systems for these missions.
