Troublesome or trustworthy—how to ensure digital twins truly deliver value
Still in the early stages of its evolution, digital twin development is presently focused on a series of stand-alone features that inform decision-making during the asset design phase, and that describe or deliver diagnostics for parts of the asset in the operational phase. As the market and technology mature, digital twins are expected to grow significantly in sophistication and scope, covering the entire asset and, eventually, performing prescriptive or even autonomous functions.
While awareness of these virtual replicas has existed for several years under many guises, from model-based optimization to structural re-analysis systems, accepted definitions of exactly what a digital twin is, and is not, vary widely. The general consensus, however, is that it makes system information available through integrated models to predict performance, with the purpose of providing decision support.
Rather than think of a digital twin as a monolith, DNV GL, a technical advisor to the oil and gas industry, believes this “mirror image” should be considered as a collection of elements or components, of various levels of complexity, each with its own distinct role and function, Fig. 1.
“It’s important, when you look at the data value chains, to see where data are actually born and where they are actually used. The journey from ‘raw materials’ to where you serve the data for consumption can sometimes be very long and pass through several stakeholders,” explains Per Myrseth, head of data services, data management and analytics at DNV GL. “On that journey, you merge and play with the data to prepare it for final analytics or visualization, and a lot can happen to the information along the way. Monitoring the condition of the data, so that it is fit for use all the way to where it is needed, can therefore be challenging. As this technology evolves, it is vital to combine the criticality and the use cases of the digital twin to fully understand its quality and all the components in it.”
Building trust and efficiency in the digital era. Companies designing and manufacturing hardware across the oil and gas value chain must prove the safety, quality and integrity of components, equipment and assets through recognized quality assurance principles. However, no standard process exists to provide the same mechanism of trust and value for the digital representation of a physical asset and its behavior.
As oil and gas operators demand proof that digital twins can be trusted and deliver value over time, DNV GL, in partnership with TechnipFMC, has developed a methodology for qualifying the quality and integrity of the technology. Built on DNV GL’s recommended practice (RP) for technology qualification, DNVGL-RP-A203, it aims to bring a level playing field to the sector’s varying technical definitions and expectations of digital twins. It will set a benchmark for oil and gas operators, supply chain partners and regulators to establish trust in digital twin-generated data for performance and safety decision-making in projects and operations.
The methodology has been piloted with operators, EPC contractors, digital system providers and vendors. The upcoming RP will encompass definitions of the digital twin and clarity on data quality and algorithm performance, as well as requirements for the interaction between the digital twin and the IT platform it operates on.
“Only a year ago, there were a lot of buzzwords and a sort of vague expectation. But now, this (digital twin technology) has matured quite fast,” believes Erlend Fjøsna, TechnipFMC’s head of innovation and digital partnering. “The digitalization debate is now more centered on what value is created and how we can leverage our domain knowledge to deliver that value, not only to maintain it over time but to ensure interoperability, and a trusted digital thread, between various parties. Digital twins hold the promise of significant, even huge, cost savings. It is an ongoing journey, and we already see the benefits brought by these new ways of working.”
Together with several other major players, including Equinor and ConocoPhillips, TechnipFMC and DNV GL are also involved in another JIP to establish the required framework and foundation for using semantic data in business, including the upgrade of international standards.
“We are transforming into a more data-centric way of working,” added Fjøsna. “While continuously improving, we also need to look toward the horizon and make sure we do not miss the greater leaps to come in this area. What I have in mind is more automated engineering processes, and this will be made possible, I hope, by moving from plain text and tables to leveraging the power of semantic data.”
As part of its digital transformation, TechnipFMC has developed Subsea Studio (Fig. 2), a state-of-the-art digital solution that transforms conventional studies into ultra-fast digital field development. The ambition is to connect seamlessly with the iEPCI (integrated engineering, procurement, construction and installation) and iLoF (integrated life of field) phases, unlocking the full potential of an integrated digital thread.
Realizing true value from virtual reality. The major challenge when implementing new digital technologies is the same as when novel hardware technologies were introduced two decades ago: how can you trust that something works, when it hasn’t been used before? Building trust in the quality and integrity of digital twins is key to extracting maximum value and, subsequently, to the technology’s adoption. Oil and gas companies are increasingly utilizing the technology to bring asset information from multiple sources together in a single, secure place, connecting 3D models with real-time field data during the operational phase.
For example, in October 2019, Kongsberg Digital, a subsidiary of KONGSBERG, signed a $10.5-million contract scope to digitalize the Nyhamna facility, a gas processing and export hub for Ormen Lange and other fields connected to the Polarled pipeline. The Kognifai Dynamic Digital Twin will be updated continuously with integrated information reflecting the status of the facility in real time. As technical service provider at Nyhamna, A/S Norske Shell will be equipped with the ability to simulate scenarios and uncover new options for optimization of its real-life counterpart.
Petoro, established to create the highest possible value and income for the state’s interest in petroleum activities, has substantial holdings across 213 production licenses in 34 fields on the Norwegian Continental Shelf, including ownership in the Nyhamna gas export facility.
While encouraged by the advance of digital twins, Roy Ruså, chief digital officer at Petoro, is concerned that the sector lacks openness to alternative solutions, and to the technology’s possibilities and pitfalls, leading to distrust in its capabilities. “Key to digital twins are an emphasis on, and a good understanding of, what problem to solve; openness to new solutions; a step-wise approach; and management systems for implementing and maintaining change processes.”
He is particularly encouraged by the role of the supply chain. “They (the supply chain) are addressing and even solving problems that customers don’t know they have. The suppliers are much more valuable than we apparently appreciate, because we tend to tell the supplier the solution, or what they should do, and bypass the important step of inviting them to an open understanding of the problem,” he said. “Digital twins currently focus on some of the very small and isolated problems in a facility or a work process. This is a good and logical start. Developing models for these provides the building blocks to integrate models covering wider work processes, like well planning or field planning. Building digital twins of complete facilities is a complex undertaking and requires respect.”
Developing a twin is challenging, but success does not necessarily mean extracting value from it. “We need stronger emphasis on change management. While the quality of the digital twin may be acceptable, there is often too little focus on the change process and on preparing the people who are going to use it in practice.”
Developing solutions for the future. “We have never been in a position where we can create more and better data than today. Likewise, we have never been able to combine and analyze information in more advanced ways than we can at present,” believes Myrseth. With fluctuating oil prices and the impact of Covid-19 on travel, the delivery of a mirror image of an asset from the safety of shore must be trusted, valuable and fit for purpose. The digital twin can be developed for varying purposes.
Probabilistic digital twin. Unveiled at SPE Offshore Europe 2019, the probabilistic digital twin (PDT) concept is intended to close the gap between digital twins and risk analysis, which is still largely conducted manually before assets enter service. The main elements that distinguish a PDT from traditional digital twins are:
- Probabilistic degradation and failure models, reflecting uncertainty and variability of conditions and processes that affect performance and lead to failures.
- Logic and relational models, relating performance variables to failures and loss events.
- Surrogate models, approximating heavier simulation models, allowing fast queries and enabling propagation of uncertainty and model coupling.
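The first of these elements can be illustrated with a minimal sketch. Assuming a hypothetical degradation model in which a defect grows at a normally distributed annual rate (all parameter values below are illustrative, not taken from any DNV GL model), a Monte Carlo loop propagates that uncertainty into a probability of exceeding a critical defect size:

```python
import random

def final_defect_size(a0_mm, rate_mean, rate_sd, years):
    """One Monte Carlo draw: grow an initial defect at a sampled annual rate."""
    rate = max(random.gauss(rate_mean, rate_sd), 0.0)  # mm/year, non-negative
    return a0_mm + rate * years

def probability_of_failure(a0_mm, a_crit_mm, rate_mean, rate_sd, years, n=100_000):
    """Estimate P(defect size exceeds the critical size) after `years`."""
    failures = sum(
        final_defect_size(a0_mm, rate_mean, rate_sd, years) >= a_crit_mm
        for _ in range(n)
    )
    return failures / n

random.seed(42)
pof = probability_of_failure(a0_mm=2.0, a_crit_mm=10.0,
                             rate_mean=0.4, rate_sd=0.2, years=15)
print(f"Probability of exceeding critical defect size in 15 years: {pof:.3f}")
```

In a full PDT, the inner growth model would typically be a surrogate for a heavier physics simulation, and the estimated probability would be re-evaluated as inspection data update the rate distribution.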
As risk is dynamic, varying in time with operational conditions and the state of the asset, it is not captured by current risk models, which are seldom updated and lack real-time and predictive capabilities. With a single unscheduled downtime event costing in excess of $2,000 per day, better, up-to-date risk information may significantly reduce unplanned or unnecessary downtime.
The PDT is designed to bring risk analysis into “live” use. By adding a layer of probabilistic risk modelling to existing digital twins, the concept aims to capture uncertainty, the effect of new knowledge and actual conditions on operational performance and safety. It will therefore allow operators to adjust operations or take preventive actions to maintain an acceptable risk level at all times, thereby reducing expensive downtime.
It delivers more than just predictive maintenance. By including reliability and degradation models to forecast the remaining lifetime of mechanical components, it can also be used to display the overall impact on safety.
Hybrid digital twin. DNV GL and FPSO specialist Bluewater is undertaking a pilot project to use hybrid digital twin technology to predict and analyze fatigue in the hull of an FPSO in the North Sea. The project aims to validate and quantify the benefits of creating a virtual replica of the FPSO to optimize the structural safety of the vessel and enhance risk-based inspection (RBI), a decision-making methodology for optimizing inspection regimes. The pilot underpins Bluewater’s mission to take a proactive, responsible approach to safety and environmental care in its operations.
DNV GL’s unique combination of domain experience, inspection capabilities, and digital analytics and modelling enables the monitoring of the asset’s hull structure during operation without dependence on costly routine inspection regimes. Termed “Nerves of Steel,” the underlying concept permits the use of various data sets (external environmental data or local sensor data), combined with digital models of the asset, to develop a hybrid replica model of the vessel’s structure. This can be used in real time to monitor the asset’s condition, identify and monitor high-risk locations, and plan targeted and cost-efficient maintenance and inspection activities.
Hybrid twin technology uses a combination of numerical design models and data from actively recorded strain gauge sensors on board the FPSO. These sensors allow for a full understanding of the accumulative loading and current state of the FPSO structure. The technology blends computer-simulated modelling with real-time data, which is then streamed to the operator via DNV GL’s Veracity data platform or an existing data transfer solution.
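As a minimal sketch of how measured and simulated information can be combined, the following assumes a generic S-N fatigue curve with Miner’s-rule damage summation over rainflow-counted stress cycles, plus a simple scaling step that calibrates design-model stresses against strain-gauge measurements. The curve parameters, field names and sensor values are illustrative assumptions, not Bluewater’s or DNV GL’s actual models:

```python
import math

# Illustrative S-N curve parameters: log10(N) = LOG_A - M * log10(S)
LOG_A = 12.164
M = 3.0

def cycles_to_failure(stress_range_mpa):
    """Allowable cycles at a given stress range, from the S-N curve."""
    return 10 ** (LOG_A - M * math.log10(stress_range_mpa))

def miner_damage(stress_bins):
    """Accumulate fatigue damage: sum of n_i / N_i over observed stress bins."""
    return sum(n / cycles_to_failure(s) for s, n in stress_bins)

def calibrated_stress(design_stress, measured, predicted):
    """Hybrid step: scale a design-model stress by the measured/predicted ratio."""
    return design_stress * (measured / predicted)

# One year of (stress range in MPa, cycle count) bins from rainflow counting
history = [(40.0, 2_000_000), (80.0, 50_000), (120.0, 1_000)]
damage = miner_damage(history)
print(f"Annual fatigue damage: {damage:.4f} (failure criterion: damage = 1.0)")

# Example: the design model predicted 100 MPa where gauges measured 90 MPa
print("Calibrated stress:", calibrated_stress(100.0, measured=90.0, predicted=100.0))
```

The value of the hybrid approach is that accumulated damage is driven by what the gauges actually record, rather than by conservative design-stage assumptions alone.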
DNV GL’s visual dashboard presents data to Bluewater on stresses in the hull’s structure, alongside information that can be used to identify areas at relatively higher risk of cracks or deformation. The information, which is continuously recorded, can be accessed and analyzed to inform decision-making and implement inspection based on risk priority.
The trial will expand on traditional FPSO integrity management strategies, which rely on software-based assumptions made at the design stage, as well as current inspection records, to enhance RBI decision-making. The pilot is expected to provide new insight and smarter ways of managing risks and costs related to structural integrity management.
This is DNV GL’s third pilot project evaluating the performance of hybrid digital twin technology. With global support from the advisor’s experts in Singapore, the UK and Norway, the first involved defining a repair procedure for an FPSO flare tower. Another trial, which is still ongoing, is being performed on a fixed offshore platform.
“We [Bluewater] decided to extend our digital twin program to include our FPSO Aoka Mizu,” explains Peter van Sloten, department head, technology management, with Bluewater. “Our ambition for the structures largely matched the novel digitalization services of DNV GL. We are, therefore, pleased to team up with DNV GL to develop a tool to monitor the structural integrity of this most versatile FPSO, designed and proven to operate in harsh environments with high uptime and a maintained, strict regulatory and safety regime. This will enhance safety and enable an optimized inspection regime.”
Reliability and trust through use of best practices. As in traditional asset management, there is a need for condition monitoring of the data science, as well as of the digital twin itself. According to Myrseth: “Companies are used to doing asset management and condition monitoring on physical assets; using this metaphor, we can employ the same practices for condition monitoring of the data, the data science and the digital twin.”
Digital twins are a rapidly developing technology, widely expected to become a significant contributor to the future management of major industrial sites. The digital twin market is estimated to grow from $3.8 billion in 2019 to $35.8 billion by 2025.
DNV GL’s Technology Outlook 2030, a research report identifying transformative technologies in key industries, highlights a digital value chain run by machines and algorithms as a prevailing trend for the oil and gas industry in the decade ahead. The research predicts that cloud computing, advanced simulation, virtual system testing, virtual/augmented reality and machine learning will progressively merge into full digital twins, combining data analytics with real-time and near-real-time data on installations, subsurface geology and reservoirs.
The use of twins, and, importantly, trust in their accuracy, can be increased significantly by ensuring that data and information reflect the most up-to-date condition of the physical asset. To ensure that the performance of digital twins matches expectations, the organizations involved require a structured, systematic approach, Fig. 3. To support the industry in achieving its goal of high-quality, trustworthy digital twins, DNV GL has published two frameworks as important building blocks. In 2017, the data quality assessment framework, DNVGL-RP-0497, was published to perform quality assurance across three areas:
- An organization’s capabilities to create and maintain high-quality data
- Measuring the quality of data
- Assessing the risk of using the data.
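As an illustration of the second area, measuring the quality of data, the sketch below computes two commonly used data quality dimensions, completeness and timeliness, for a batch of sensor records. The metrics, field names and thresholds are simplified assumptions for illustration, not the actual requirements of DNVGL-RP-0497:

```python
from datetime import datetime, timedelta, timezone

def completeness(records, required_fields):
    """Share of records with non-null values for every required field."""
    if not records:
        return 0.0
    ok = sum(all(r.get(f) is not None for f in required_fields) for r in records)
    return ok / len(records)

def timeliness(records, max_age, now=None):
    """Share of records whose timestamp is no older than max_age."""
    if not records:
        return 0.0
    now = now or datetime.now(timezone.utc)
    ok = sum((now - r["timestamp"]) <= max_age for r in records)
    return ok / len(records)

# Hypothetical sensor snapshot: one fresh reading, one stale reading with a gap
sensors = [
    {"tag": "PT-101", "value": 182.4,
     "timestamp": datetime.now(timezone.utc)},
    {"tag": "PT-102", "value": None,
     "timestamp": datetime.now(timezone.utc) - timedelta(hours=5)},
]
print("completeness:", completeness(sensors, ["tag", "value", "timestamp"]))
print("timeliness:  ", timeliness(sensors, timedelta(hours=1)))
```

Scores like these can be tracked over time, which is the "condition monitoring of the data" idea Myrseth describes later in the article.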
Three years later, DNVGL-RP-0510 was published. This methodology assures the process of making, testing, deploying, maintaining and monitoring data science solutions based on data-driven methodologies, such as machine learning and artificial intelligence.
The upcoming DNV GL RP on qualification and assurance of digital twins, to be published in October 2020, is built on the same principles as the RPs for data quality and data-driven models. It will address both the function and the operation of digital replicas.
Solving the digital trust challenge will be key to the technology’s adoption, to accelerating its use at greater scale, and to its acceptance as an accurate, valuable and trusted tool. The use of industry best practices is an efficient way of creating trust and thereby unleashing the true value of digital twin technology.