Deep-learning-based optical navigation
Additionally, some actions a spacecraft must carry out have to be decided within mere seconds, so spacecraft need to become more self-sufficient. To become more self-aware, a spacecraft must first know its state vectors in space. Autonomy provides situational awareness by allowing a spacecraft to make decisions locally in reaction to its surroundings. The same technology used in self-driving cars to avoid collisions can inspire autonomous guidance, navigation, and control (GN&C) in spaceflight, and NASA has tested autonomous functions on several missions since the late 1990s. Spacecraft can gather a great deal of information about their surroundings using various sensors, and advances in computational technology have greatly reduced the constraint of limited processing power on interplanetary missions. The Cassini-Huygens mission's optical navigation (OpNav) system, for example, processed images at a resolution comparable to that of a smartphone camera. The observable information collected about a spacecraft's surroundings can therefore support state estimation.
In space, fully autonomous navigation systems must be robust and reliable: they need to reason about and learn from the situations and environments they encounter, and current state-of-the-art technology is nowhere near this level. Imagine, for example, a spacecraft orbiting the Earth. It can use the wide variety of observable features on the Earth's surface to determine its state vectors. By generating and continually updating a catalog, or map, of previously unknown or unspecified surface features, the spacecraft can track its state vectors relative to the surface using machine learning and computer vision algorithms. Current research in this area uses deep learning, specifically neural networks, to create a robust, general, autonomous on-board terrain-relative navigation approach.
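To make the idea of estimating state vectors relative to cataloged surface features more concrete, the sketch below shows a minimal, purely illustrative version of landmark-based position estimation: a 2-D vehicle position is recovered from noisy range measurements to known surface features by Gauss-Newton least squares. This is not the deep-learning approach discussed above; the landmark coordinates, noise level, and solver setup are all invented for this example.

```python
import numpy as np

# Toy stand-in for terrain-relative navigation: the feature catalog
# (landmark positions) and the measurement noise are assumed values.
rng = np.random.default_rng(0)

landmarks = np.array([[0.0, 0.0],
                      [100.0, 0.0],
                      [0.0, 100.0],
                      [80.0, 90.0]])   # cataloged surface features (2-D)
true_pos = np.array([40.0, 25.0])      # vehicle position to recover

# Simulated noisy range measurements to each cataloged feature.
ranges = np.linalg.norm(landmarks - true_pos, axis=1) \
    + rng.normal(0.0, 0.1, len(landmarks))

# Gauss-Newton iteration on the range residuals.
pos = np.array([50.0, 50.0])           # initial guess
for _ in range(10):
    diffs = pos - landmarks            # (N, 2) offsets to each landmark
    pred = np.linalg.norm(diffs, axis=1)
    J = diffs / pred[:, None]          # Jacobian of range w.r.t. position
    r = ranges - pred                  # measurement residuals
    pos += np.linalg.solve(J.T @ J, J.T @ r)

print(pos)  # estimate should land near true_pos
```

A real terrain-relative navigation system would replace the hand-specified catalog and range measurements with features detected and matched in imagery, and would estimate the full state vector rather than a planar position, but the underlying least-squares structure is the same.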