Project and Master Subjects 2024-2025
High-accuracy attitude determination of Earth observation satellites
Project description
The HYPSO programme provides, and will continue to provide, data products to researchers who need Earth observation data. NTNU is working on extending this programme with Global Navigation Satellite Systems Reflectometry (GNSS-R), a GNSS-signal-based Earth observation technique that acts as a bi-static radar using Earth-illuminating GNSS signals.
This project addresses the challenge of determining the satellite attitude accurately in real time, which is required for accurate Earth observation. High-accuracy attitude determination is important for several reasons:
- Achieving high-accuracy georeferencing of images captured in low Earth orbit (LEO). For a LEO satellite at 2000 km altitude, a pitch angle error of 0.1 degrees results in a longitudinal pixel georeferencing error of ~3.5 km on the ground, and a 0.1-degree roll error results in a similar transverse error (see the sketch after this list). Any yaw error will corrupt a photo mosaic built from images from multiple satellite passes.
- The performance of advanced manoeuvres, such as slews, depends on the attitude. Increased resolution of observations on ground or at sea can be achieved by collecting overlapping, or partially overlapping, observations while the satellite is moving towards, over and away from the point of interest.
- Performing a study in advance of a mission to determine what accuracy can be achieved with a given type of satellite, a given set of sensors and a given orbit. Is it necessary to increase the size of the satellite to fit more sensors and achieve the desired accuracy?
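To make the numbers in the first bullet concrete, here is a minimal back-of-the-envelope sketch of how an attitude error maps to a ground georeferencing error for a nadir-pointing satellite. It uses a flat-Earth, small-angle relation; the 500 km altitude in the second call is an illustrative assumption, not a HYPSO orbit parameter.

```python
import math

def ground_error_km(altitude_km: float, attitude_error_deg: float) -> float:
    """Approximate ground georeferencing error for a nadir-pointing satellite
    using the flat-Earth relation: error ~ altitude * tan(attitude error)."""
    return altitude_km * math.tan(math.radians(attitude_error_deg))

# Reproduces the figure quoted above: a 0.1-degree pitch error at 2000 km
# altitude maps to roughly 3.5 km on the ground.
print(f"{ground_error_km(2000.0, 0.1):.2f} km")  # ~3.49 km

# At an assumed 500 km altitude (for illustration only) the same error
# still corresponds to almost a kilometre of georeferencing error.
print(f"{ground_error_km(500.0, 0.1):.2f} km")   # ~0.87 km
```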
For attitude estimation, one can use several sensors to determine the attitude. These can be an IMU, a magnetometer, a star tracker, a sun sensor, GNSS and potentially accelerometers.
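As a small illustration of how just two of these vector sensors, for example a magnetometer and a sun sensor, can already determine the attitude deterministically, the classic TRIAD method is sketched below. The vectors in the usage example are made up for the test and are not actual HYPSO sensor data.

```python
import numpy as np

def triad(b1, b2, r1, r2):
    """TRIAD: deterministic attitude from two vector observations.

    b1, b2: unit vectors measured in the body frame (e.g. magnetometer
            and sun-sensor directions).
    r1, r2: the same directions in the reference frame (e.g. a geomagnetic
            field model and the sun ephemeris).
    Returns the attitude matrix A such that b = A @ r. The first (most
    trusted) vector is matched exactly; residual error is pushed onto b2.
    """
    def frame(v1, v2):
        t1 = v1 / np.linalg.norm(v1)
        t2 = np.cross(v1, v2)
        t2 /= np.linalg.norm(t2)
        t3 = np.cross(t1, t2)
        return np.column_stack((t1, t2, t3))

    return frame(b1, b2) @ frame(r1, r2).T

# Illustrative use with made-up vectors:
r_mag, r_sun = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
A_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])          # 90-degree yaw, for the test
b_mag, b_sun = A_true @ r_mag, A_true @ r_sun  # noiseless "measurements"
print(np.allclose(triad(b_mag, b_sun, r_mag, r_sun), A_true))  # True
```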
Impact of this project
Space technology plays a critical role in achieving 40% to 50% of the UN Sustainable Development Goals. The NTNU SmallSat Lab's Earth observation satellites, such as the HYPSO satellites, launched in 2022 and with a successor planned, utilize hyperspectral imagers to capture detailed information beyond the visible spectrum. This data allows us to detect and monitor water bodies like oceans, fjords, and lakes, including vital yet potentially harmful algae. The HYPSO satellites also contribute to climate change studies by imaging the Arctic region. Data from the HYPSO satellites play a role in achieving Climate Action, preserving Life Below Water, and ensuring access to Clean Water and Sanitation. The objective of this project is to enable the satellite to accurately determine where it has been pointing, which would allow the HYPSO satellites to detect and localize algal blooms even further away from the coastline.
Tasks and Expected Outcomes
Typical tasks in the project can be:
• Performing a short literature review on filtering algorithms useful for attitude determination. Examples are the error-state/multiplicative extended Kalman filter [1-2] and factor-graph-based optimization (FGO) techniques [3] (a minimal sketch of the former is given after this list).
• Comparing different attitude estimators and highlighting their strengths and weaknesses.
• Studying the effect of different sensor combinations.
• Studying the effect of different numbers of sensors of the same type.
• Studying the sensitivity to sensor errors and faults.
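To give a flavour of the first two tasks, the sketch below outlines a minimal multiplicative/error-state Kalman filter for attitude in the spirit of [1-2]. It is a simplified illustration under assumed conventions (body-frame attitude error, no gyro-bias state, generic noise parameters), not the filter expected from the project.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def skew(v):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

class MiniMEKF:
    """Minimal multiplicative/error-state EKF for attitude only.

    The nominal state is the body-to-reference rotation; the error state is a
    small body-frame rotation vector with 3x3 covariance P. Gyro bias and
    other states are deliberately left out to keep the sketch short.
    """

    def __init__(self, q0, P0, gyro_var, meas_var):
        self.att = Rotation.from_quat(q0)   # body -> reference (scalar-last)
        self.P = P0                         # attitude-error covariance
        self.Q = gyro_var * np.eye(3)       # gyro noise (per second, simplified)
        self.R_meas = meas_var * np.eye(3)  # vector-measurement noise

    def propagate(self, omega, dt):
        """Integrate the gyro rate omega [rad/s] over dt and grow the covariance."""
        self.att = self.att * Rotation.from_rotvec(omega * dt)
        F = Rotation.from_rotvec(-omega * dt).as_matrix()  # error-state transition
        self.P = F @ self.P @ F.T + self.Q * dt

    def update_vector(self, z_body, r_ref):
        """Fuse a unit-vector measurement observed in the body frame (e.g. sun
        or magnetic-field direction) with its reference-frame model r_ref."""
        z_hat = self.att.inv().apply(r_ref)            # predicted body-frame vector
        H = skew(z_hat)                                # measurement Jacobian
        S = H @ self.P @ H.T + self.R_meas
        K = self.P @ H.T @ np.linalg.inv(S)
        dtheta = K @ (z_body - z_hat)                  # error-state correction
        self.att = self.att * Rotation.from_rotvec(dtheta)  # multiplicative reset
        self.P = (np.eye(3) - K @ H) @ self.P

# Toy usage with made-up numbers (not tuned for any real hardware):
ekf = MiniMEKF(q0=[0.0, 0.0, 0.0, 1.0], P0=0.01 * np.eye(3),
               gyro_var=1e-6, meas_var=1e-4)
ekf.propagate(omega=np.array([0.0, 0.0, 0.01]), dt=1.0)
ekf.update_vector(z_body=np.array([1.0, 0.0, 0.0]),
                  r_ref=np.array([1.0, 0.0, 0.0]))
print(ekf.att.as_quat(), np.diag(ekf.P))
```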
Who we are looking for
We are seeking a highly motivated final-year student in Cybernetics and Robotics with an interest in sensor fusion. Experience from subjects such as TTK4250 (Sensor Fusion) and TTK4190 (Guidance, Navigation and Control of Vehicles) will be beneficial for the student in this project. The project will be adapted to the student's background and goals.
How we work
The student will be part of the NTNU SmallSat Lab, a lab which typically hosts 10-20 master's students per semester. At the NTNU SmallSat Lab we encourage collaboration and try to get our group to help each other. To facilitate this, we arrange common lunches and workshops where the students and supervisors can learn from each other. In some projects we also implement a development process.
Supervisors
Torleiv H. Bryne (main supervisor, NTNU), Bjørn Andreas Kristiansen (NTNU).
For more information about the project or to show your interest, contact Torleiv H. Bryne and Bjørn A. Kristiansen.
References
[1] Markley, F. L. (2003). Attitude Error Representations for Kalman Filtering. Journal of Guidance, Control, and Dynamics, 26(2), pp. 311-317. https://doi.org/10.2514/2.5048
[2] Carpenter and D'Souza (2018). Navigation Filter Best Practices. NASA/TP-2018-219822. https://tinyurl.com/4az7ycs9
[3] Dellaert, F. and Kaess, M. (2017). Factor Graphs for Robot Perception. Foundations and Trends in Robotics, 6(1-2), pp. 1-139. https://doi.org/10.1561/2300000043