The attitude of a spacecraft is the 3DOF orientation (roll, pitch, yaw) of its body frame with respect to an inertial frame. Estimating the attitude is a fundamental capability in space flight. Star trackers are the state of the art in spacecraft attitude estimation, particularly for supporting high-precision orientation control. A star tracker is primarily an optical device that estimates the attitude of a spacecraft by recognising and tracking star patterns. Most current star trackers use conventional optical sensors; we propose using event sensors instead. There are potentially two benefits of using event sensors for star tracking: lower power consumption and higher operating speeds.
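To make the notion of attitude concrete, the sketch below builds the body-to-inertial rotation matrix from roll, pitch and yaw angles using the common ZYX (yaw-pitch-roll) Euler convention. This is an illustration only; the convention and function names are our own choices, not a specification of any particular star tracker.

```python
import math

def rot_x(a):
    """Rotation about the x-axis (roll) by angle a in radians."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    """Rotation about the y-axis (pitch) by angle a in radians."""
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    """Rotation about the z-axis (yaw) by angle a in radians."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

def attitude_matrix(roll, pitch, yaw):
    """Body-to-inertial rotation under the ZYX convention: Rz(yaw) Ry(pitch) Rx(roll)."""
    return matmul(rot_z(yaw), matmul(rot_y(pitch), rot_x(roll)))
```

For example, `attitude_matrix(0, 0, math.pi / 2)` rotates the body x-axis onto the inertial y-axis.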
(Left) A star field within the field of view (FOV) of a telescope. (Middle) Events recorded using an iniVation Davis 240C event camera as the FOV moves away from the star field in the left panel (for clarity, the event polarities are not displayed here). (Right) The event data collapsed onto the image frame: the star patterns from the left panel are clearly observed here (the stars are blurred due to the motion of the camera).
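The "collapse onto the image frame" shown in the right panel can be sketched as a simple per-pixel event count. The event format below, a (timestamp, x, y, polarity) tuple, is an assumption for illustration; the sensor resolution matches the Davis 240C (240x180).

```python
WIDTH, HEIGHT = 240, 180  # Davis 240C sensor resolution

def accumulate_events(events, width=WIDTH, height=HEIGHT):
    """Collapse an event stream onto an image frame by counting events
    per pixel, ignoring polarity (as in the figure)."""
    frame = [[0] * width for _ in range(height)]
    for t, x, y, polarity in events:
        if 0 <= x < width and 0 <= y < height:
            frame[y][x] += 1
    return frame

# A handful of synthetic events clustered around two hypothetical "stars".
events = [(0.001, 50, 40, 1), (0.002, 50, 41, -1), (0.003, 51, 40, 1),
          (0.004, 120, 90, 1), (0.005, 120, 90, -1)]
frame = accumulate_events(events)
```

Bright spots in `frame` then correspond to stars smeared along the camera's motion.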
There are two potential benefits of using event cameras for star tracking:
Due to the sparsity of the scene (relatively few bright spots against a dark background), the number of events generated tends to be small relative to the number of pixel positions. Hence, an event camera may consume less energy than a conventional sensor.
Event sensors have high temporal resolution (e.g., µs resolution), which could enable higher-speed star tracking. This may be useful for ultra-fine attitude control.
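The sparsity argument above can be illustrated with a back-of-the-envelope comparison. The star count and blob size below are hypothetical, chosen only to show the order of magnitude involved.

```python
# Toy comparison (hypothetical numbers): per-readout data volume of a
# full-frame sensor vs an event sensor observing a sparse star field.
width, height = 240, 180        # Davis 240C resolution
num_stars = 20                  # assumed number of visible stars in the FOV
pixels_per_star = 9             # assumed 3x3 pixel blob per star

frame_pixels = width * height               # a frame sensor reads every pixel
event_pixels = num_stars * pixels_per_star  # only star pixels generate events

sparsity = event_pixels / frame_pixels
print(f"{event_pixels} event pixels vs {frame_pixels} frame pixels "
      f"({sparsity:.2%} of the array)")
```

Even with generous assumptions, well under 1% of the pixel array is active, which is the basis for the potential power saving.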
The Sentient Satellites Lab has been pioneering event-based star tracking, having published the first paper on the topic at the CVPR 2019 Workshop on Event-based Vision. Since then, we have produced follow-up work, including the development of an ultra-fine attitude control CubeSat payload that employs event-based star tracking and piezoelectric micromotion stabilisation.
For more details, see: High Frequency, High Accuracy Pointing onboard Nanosats using Neuromorphic Event Sensing and Piezoelectric Actuation
Event-based star tracking dataset (first published in Chin et al., CVPR Workshops 2019). Credits for the data go to Samya Bagchi. Please cite the following paper if you use the data in any publication: