Can competitors cooperate? Yes!

Challenges for Digital Twins 

2023 has seen heat records in Svalbard and extreme rainfall and floods in Norway (and other places). To prevent runaway climate change, there is a pressing need to reduce greenhouse gas emissions from energy production. 

Achieving this requires enhanced modeling of production plants, so they can be run optimally and serviced just in time, extending their lifetime without unnecessary part replacements or downtime. This becomes crucial as renewable energy increases the dynamics of the energy system, and it will only be possible if we combine AI with existing physical knowledge in so-called hybrid AI.

Digital Twins and the challenge of simulations

A digital twin is a digital model of a physical system which, for a specified purpose such as monitoring or predicting what-if scenarios, serves as an indistinguishable digital counterpart of the physical system. For predictive tasks, digital twins often rely on heavy numerical modelling or simulations. Simulations are based on first-principles models whose parameters do not necessarily correspond to measurable properties of the system; for example, the energy in each part of the system has to be interpreted through temperature and pressure at specific locations. 

The challenge lies in accurately determining the model parameters that correspond to the actual system state. The current workflow entails trial and error with reasonable guesstimates until the simulation output matches the observed data. This matching effort scales exponentially with the number of input parameters, which typically lies in the tens to hundreds. Furthermore, the system may change over time, so the matching must be carried out continuously. 
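To see why this scaling rules out exhaustive matching, consider a hypothetical grid search that tries only ten candidate values per parameter. The number of simulator runs is then ten to the power of the number of parameters:

```python
# Hypothetical illustration: exhaustive grid search over simulator inputs.
# With g candidate values per parameter and d parameters, a full grid
# requires g**d simulator runs -- exponential in the dimension.

def grid_search_cost(num_params: int, values_per_param: int = 10) -> int:
    """Number of simulator evaluations for a full grid search."""
    return values_per_param ** num_params

for d in (2, 5, 10):
    print(f"{d} parameters: {grid_search_cost(d)} simulations")
```

Already at ten parameters, a full grid would require ten billion simulations; with expensive first-principles simulators, this is far beyond what is feasible.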


The use case

The use case considered the first compressor stage on the Valhall platform, with data provided by Cognite and a simulator provided by Kongsberg through their K-Spice framework. The compressor train refines the oil by removing water and gas. It is a complicated process, sensitive to pressures, temperatures, composition and flow rates, to internal parameters of the specific equipment, and to time-varying degradation factors. Despite knowledge of the precise engineering layout, Kongsberg spends significant effort setting up simulations that reflect the plant.

 

Valhall platform
Valhall. Photo: https://www.norskpetroleum.no/en/facts/field/valhall/


Simulation-based inference with a learned likelihood distribution

Simulation-based inference is a family of methods in which simulations are used to estimate unobservable parameters of a system. Each simulation is evaluated against observed data, and its ‘correctness’ is quantified as a ‘likelihood’: the probability of the observed data given the simulation input (playing a role similar to the loss function in a machine learning model). The challenge is then to find the input parameters that maximize the likelihood without having the likelihood function available. Instead, we use machine learning techniques to estimate the unknown likelihood function from individual evaluations of the simulator, and use this learned model to perform parameter estimation. With only a few simulations, the estimate of the likelihood function will be crude, but each new simulation improves it. This approach drastically reduces the number of simulator evaluations, and therefore the time required, compared to brute-force sampling of the simulator. Moreover, the probabilistic models provide confidence intervals for the parameters, which is highly relevant in safety-critical applications.
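The loop can be sketched in a few lines. This is a minimal toy illustration, not the project's actual implementation: a cheap quadratic function stands in for the expensive simulator (in the real setup, K-Spice), a Gaussian observation model defines the likelihood, and a small hand-rolled Gaussian-process regression serves as the learned surrogate that decides where to run the simulator next:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    # cheap stand-in for an expensive first-principles simulation
    return theta ** 2

observed = simulator(1.5)  # synthetic "measurement"; true parameter is 1.5

def log_likelihood(theta, sigma=1.0):
    # Gaussian log-likelihood of the observation given the simulator output
    return -0.5 * ((simulator(theta) - observed) / sigma) ** 2

def rbf(a, b, length=0.5):
    # squared-exponential kernel between two 1-D point sets
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_mean(train_x, train_y, test_x, jitter=1e-6):
    # Gaussian-process posterior mean (the learned likelihood surrogate)
    K = rbf(train_x, train_x) + jitter * np.eye(len(train_x))
    weights = np.linalg.solve(K, train_y)
    return rbf(test_x, train_x) @ weights

# start from a handful of simulator evaluations ...
thetas = rng.uniform(0.0, 3.0, size=8)
lls = np.array([log_likelihood(t) for t in thetas])

# ... then iteratively refit the surrogate and run the simulator where it
# predicts the highest likelihood; each new run sharpens the surrogate
grid = np.linspace(0.0, 3.0, 301)
for _ in range(10):
    surrogate = gp_mean(thetas, lls, grid)
    candidate = grid[np.argmax(surrogate)]
    thetas = np.append(thetas, candidate)
    lls = np.append(lls, log_likelihood(candidate))

best = thetas[np.argmax(lls)]
print(f"best simulator input after {len(thetas)} simulations: {best:.2f}")
```

The project uses richer Gaussian-process models that also provide uncertainty estimates, but the pattern is the same: the surrogate, not the simulator, absorbs most of the search effort.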

Significant reduction of simulations in a toy example

A typical benchmark in the process industry is the continuously stirred tank reactor (CSTR): a tank containing multiple chemical substances that participate in an exothermic chemical reaction. For demonstration, one partner simulated a data set and gave it to SINTEF without the input parameters. A classic least-squares optimization approach required 20,000 simulations to identify the inputs. Simulation-based inference with a Gaussian process model of the likelihood required only 300 simulations, a significant reduction in computational effort.

Real life application

The setup is currently being implemented by Kongsberg with data and pre-processing provided by Cognite. Cognite and Kongsberg compete daily in the Norwegian oil and gas market, but in NorwAI they have found a neutral arena where they can focus on challenging problems with SINTEF Digital as moderator and research partner. 

Together, everyone has contributed to the development of the simulation-based inference model: Cognite with their colossal amounts of data and Kongsberg with good frameworks for simulation. Meeting places where industry gets to voice its challenges and build relations around shared problems will become even more important with new regulations such as the EU AI Act, which will require industry to document the safety of the algorithms used in critical infrastructure. 

Upon successful implementation, the next steps involve applying the same setup to K-Spice-based electric grid simulations in collaboration with Statnett, and to Bladed wind turbine simulations provided by DNV with data from Aneo. Better tuning of the simulators can enable better condition monitoring and maintenance predictions, reducing industry's waste of resources.
 

Wind turbines
Photo: Colourbox

PUBLISHED: 2024-02-27

By: 

Signe Riemer-Sørensen, editor (SINTEF)

Alexander Stasik (SINTEF)

Evind Roson Eide (Kongsberg)

Simone Casolo (Cognite)

Andris Piebalgs (Cognite)