Which works better, explainable AI or black-box AI? (F25/S26)
Project Description
Comparison of two competing machine learning methods for quantitative regression model development: the convolutional neural networks of today's AI culture and Idletechs' EMSC & sparse OTFP/PLSR, with respect to
a) overall similarities and differences,
b) robustness against measurement noise in the chosen sensor channels, and
c) interpretability and numerical stability.
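As a rough illustration of what such a comparison could look like, the sketch below fits an interpretable PLSR model and a small black-box 1D CNN to the same synthetic noisy spectra and compares their test errors. Everything in it is an assumption for illustration only: the synthetic peak-shaped spectra, the noise level, the CNN architecture, and the choice of four PLS components. It uses plain scikit-learn PLSRegression and PyTorch, and does not reproduce Idletechs' EMSC or sparse OTFP preprocessing.

```python
"""Minimal sketch (not the project's actual pipeline): interpretable PLSR vs.
a black-box 1D CNN on synthetic noisy spectra. All data and settings here are
illustrative assumptions."""
import numpy as np
import torch
import torch.nn as nn
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_channels = 500, 128

# Synthetic "spectra": two Gaussian peaks whose relative areas encode the analyte level.
x_axis = np.linspace(0, 1, n_channels)
conc = rng.uniform(0, 1, n_samples)
spectra = (conc[:, None] * np.exp(-((x_axis - 0.3) ** 2) / 0.002)
           + (1 - conc)[:, None] * np.exp(-((x_axis - 0.7) ** 2) / 0.002))
spectra += rng.normal(0, 0.05, spectra.shape)  # assumed measurement noise

X_tr, X_te, y_tr, y_te = train_test_split(spectra, conc, random_state=0)

# Interpretable model: PLSR with a few latent variables; loadings can be inspected.
pls = PLSRegression(n_components=4).fit(X_tr, y_tr)
rmse_pls = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))

# Black-box model: a small 1D CNN trained on the same data.
cnn = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=7, padding=3), nn.ReLU(),
    nn.AdaptiveAvgPool1d(16), nn.Flatten(),
    nn.Linear(8 * 16, 1),
)
opt = torch.optim.Adam(cnn.parameters(), lr=1e-3)
Xt = torch.tensor(X_tr, dtype=torch.float32).unsqueeze(1)
yt = torch.tensor(y_tr, dtype=torch.float32).unsqueeze(1)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(cnn(Xt), yt)
    loss.backward()
    opt.step()
with torch.no_grad():
    pred = cnn(torch.tensor(X_te, dtype=torch.float32).unsqueeze(1)).squeeze(1).numpy()
rmse_cnn = np.sqrt(np.mean((pred - y_te) ** 2))

print(f"PLSR RMSE: {rmse_pls:.4f}  |  CNN RMSE: {rmse_cnn:.4f}")
# Interpretability handle: the PLS loadings show which channels drive the prediction.
print("First PLS loading vector (truncated):", pls.x_loadings_[:5, 0])
```

Rerunning the script with an increasing noise standard deviation gives a crude probe of criterion b), and the PLSR loading vectors give a direct handle on criterion c), whereas the CNN would require a separate explanation method.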
Supervisor(s)
This project would be advised by Frank Westad and co-advised by Joe Garrett.