
Energistics - using RESQML to move subsurface data

Thursday, July 2, 2020

The RESQML data exchange standard makes it possible to move subsurface data from one software application to another with no loss of data fidelity, enabling geoscientists to adopt much more complex workflows spanning multiple software applications made by different companies, says Ross Philo of Energistics.

The RESQML data standard makes it possible to move subsurface data models from one application to another, with no loss of data quality, and maintaining records of everything which has happened to the data along the way.

This makes it much easier for geoscientists to adopt more complex workflows, making use of a combination of software applications, each of which might be focused on one specific task, rather than having to do everything on one universal software application, said Ross Philo, CEO of oil and gas data standards organisation Energistics.

He was speaking at the Digital Energy Journal KL forum in October, 'How to Digitalise Exploration and operations'.

In the past, the development of earth models was a highly linear process, first creating a structural framework of the subsurface in a cellular grid, then adding in reservoir properties. Geoscientists could do all of this on a single software package, designed to work in this linear way.

But now, people want to do more and more steps, and not necessarily linear steps. For example, they might want to put in pre-stack seismic interpretation, add geomechanics interpretations (studies of rock stresses and how that affects seismic properties), or do some analysis on a subset of the data.

Geoscientists may wish to add in alternate grids or bring in other types of data analytics or machine learning, or chrono-stratigraphy (adding geological time to the identified rock layers).

The reservoir production itself is becoming far more complex, as companies look at methods like enhanced oil recovery (EOR), injecting water and gas.

The applications have also become far more comprehensive. Companies might want to move the entire model into another application to add 'additional depths to the analysis' and then move it back.

There are a number of smaller companies developing solutions for specific tasks, which companies would like to incorporate into their workflows, but can't because the challenge of moving data around gets too great.

Normally, the only way to do this is to export very large files from one software application to another. You would need a way to link each software application with every other application, and it gets very difficult to manage the relationships between all of the files and maintain data integrity.

'You end up with a cat's cradle of complexity that becomes incredibly hard to develop, manage and maintain,' he said.

There is a limit to how complex you can allow a process to be - with every additional application adding more complexity, he said.
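The combinatorics behind that "cat's cradle" can be illustrated with a toy calculation (this is an illustration, not from the talk): with point-to-point file exports, every pair of applications needs a converter in each direction, whereas a shared standard format such as RESQML needs only one reader and one writer per application.

```python
# Compare the number of converters needed for point-to-point file
# exchange versus a shared standard format.

def point_to_point_links(n_apps: int) -> int:
    """Each pair of applications needs a converter in each direction."""
    return n_apps * (n_apps - 1)

def hub_links(n_apps: int) -> int:
    """Each application only needs one reader and one writer
    for the common standard."""
    return 2 * n_apps

for n in (3, 6, 10):
    print(f"{n} apps: {point_to_point_links(n)} direct links "
          f"vs {hub_links(n)} standard-format links")
```

With the six packages used in the demonstration described below, that is 30 direct converters versus 12, and the gap widens quadratically as applications are added.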

Sometimes work is re-done because someone is not confident it has been done properly by someone else before.

'No single vendor can cover the complexity of all the different workflows companies want to use today. You're looking for best of breed applications that you can mix and match in order to provide the sort of flexibility that is required,' he said.

'You need to have a way of plug-and-play for combinations of solutions, so you can move data from one application to another as you need to.'

There are also more and more data types. For example, companies are making more use of distributed acoustic sensing (DAS) systems, with fibre optic cables in wells, which can generate up to 10 terabytes in just a day.

Keeping track of data objects

The technical challenge with RESQML is to be able to move earth models from one application to another, so they can be unpacked by the receiving application without losing any fidelity.

Subsurface data can come in a range of different co-ordinate reference systems, means of measuring depth, and units. If there is any mismatch, the result is a big mess.

The subsurface data model must keep track of every object, in the right geographical location, including horizons (where you think the rock layers change). Individual objects need to have unique ID numbers.
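The bookkeeping described here, a unique ID on every object plus consistent coordinate reference systems and units, can be sketched in a few lines. This is an illustration only, not the actual RESQML schema; the class and field names are invented.

```python
# Illustrative sketch (not the real RESQML schema): every subsurface
# object carries a globally unique identifier and records which
# coordinate reference system (CRS) and depth unit its values use,
# so a receiving application can detect mismatches instead of
# silently misplacing data.
import uuid
from dataclasses import dataclass, field

@dataclass
class SubsurfaceObject:
    kind: str        # e.g. "horizon", "fault", "grid"
    crs: str         # coordinate reference system, e.g. an EPSG code
    depth_unit: str  # e.g. "m" or "ft"
    uid: str = field(default_factory=lambda: str(uuid.uuid4()))

def check_consistency(objects):
    """Refuse to combine objects whose CRS or depth units disagree."""
    crs_set = {o.crs for o in objects}
    unit_set = {o.depth_unit for o in objects}
    if len(crs_set) > 1 or len(unit_set) > 1:
        raise ValueError(f"mismatched CRS {crs_set} or units {unit_set}")
    return True
```

The point of the unique ID is that two horizons with the same name in different applications remain distinguishable, and the same object can be recognised after a round trip.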

You also need to keep track of what has happened to data in the past. Mr Philo uses the analogy of keeping records of a piece of art, how you prove who made it, and what has happened to it since then.
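The art-provenance analogy can be sketched as an append-only history that travels with each data object. This is a hypothetical illustration, not an Energistics API; the class and method names are invented.

```python
# Hypothetical lineage record: every time an application touches a
# data object, an entry is appended, so the full history travels
# with the object rather than being lost between applications.
from datetime import datetime, timezone

class LineageRecord:
    def __init__(self, object_uid: str, created_by: str):
        self.object_uid = object_uid
        # History starts with a creation entry.
        self.history = [(datetime.now(timezone.utc), created_by, "created")]

    def record(self, application: str, action: str):
        """Append one entry describing what an application did."""
        self.history.append((datetime.now(timezone.utc), application, action))

    def audit_trail(self):
        """Human-readable history, oldest first."""
        return [f"{ts.isoformat()} {app}: {action}"
                for ts, app, action in self.history]
```

In the same way that a painting's provenance documents who made it and every owner since, the record above answers "who created this horizon, and what has been done to it" at any point in the workflow.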

It gets very difficult to keep track of different components and their relationships, similar to how it can be hard to keep track of data relationships in a spreadsheet after someone else has worked on it for a while.

The data files can get very large, which also makes them trickier to move.


Demonstration

A demonstration was made at the October 2018 SEG event in Anaheim, with data for a field jointly operated by BP and Shell, moving data between six software packages: Roxar and Paradigm (both now part of Emerson), Petrel, specialist fracture porosity software (OpenFlow) from IFP, reservoir simulation from CMG, and final visualization in Dynamic Graphics.

The workflow was designed so that the reservoir model could be enriched by specialist software tools.
In the project, a subset of data was processed in OpenFlow and then passed to Petrel. This means that people don't need to work on the whole model; they can enhance a subset of the information and re-integrate it into the original model.

The demonstration also showed how earth model data could be moved with complete fidelity between applications running in different cloud environments as part of the overall workflow. The whole demonstration took 45 minutes and was demonstrated live at an exhibition stand. 'They did this step by step and were able to update a set of the information and move it across to another cloud and bring it back,' he said.

The output is a single file, rather than multiple outputs from different applications.

It makes it possible to have more people collaborating on the work, broaden the types of analysis that can be done, and reduce the risk in exploration decisions, he said.

Also, as the data was moved, it was possible to keep track of every element in the earth model, including the history and lineage of data.

Data archiving

The standards are designed for data transfer, moving data from one application to another, but can also be used for data archiving.

By putting data in open data standards, an oil company has a higher likelihood of being able to use the data decades into the future, than if the data is stored in a proprietary format.

Oil companies often find they cannot work with data which was created in software one version behind the version they are currently using, he said. Schlumberger has said it will only support two versions back. 'You've spent billions of dollars on this data - you want to get to it again.'

Provide data to regulators

RESQML can be used to provide data to regulators. As an example, the UK's Oil and Gas Authority has said it does not want its own copies of all the earth model data, but wants to be able to ask an operator for data about the history of a field at any time.

This might include data about the original decision to develop the field, and data from a number of different companies who owned it along the way.

If the data is not stored in a standard format, it probably means that it can only be used with the software application it was originally created in.

'I would love to see regulators requiring data in the standard format,' he said. 'They are already asking for PRODML for production reporting, I think we'll see them requiring RESQML for earth models.'

OSDU

The Open Subsurface Data Universe (OSDU) project is 'really exciting - it seems that every conference we go to, the conversation is all about OSDU,' he said.

OSDU is designed as a standard data platform for subsurface and wells data, including seismic data in a 2020 release. It aims to keep data decoupled from the application. There is a series of APIs which will allow software tools to connect to the underlying data in a standard manner. Energistics was one of the first non-operators to join in November 2018, since it made sense that a standard data platform would include support for Energistics data exchange standards.

It has many of the largest operators signed up to it, and represents a highly-collaborative effort, involving a large number of companies within the OSDU community, to deliver the solution. 'I think it has tremendous momentum, and shows what can happen when a group of operators get serious. The proof will be in how it then expands to attract other operators and service companies, as well as being able to cover other data types.' As of Dec 2019, there are 26 operators involved, and a total of almost 120 members.

It is not the first time that attempts have been made to develop an industry standard integrated data model - other projects include POSC and OpenSpirit.

But this one is different because it makes use of the cloud, he said. 'The previous platforms proposed would have required an operator or service company to implement that platform within their own environment. With OSDU - the intention is that each of the major cloud providers will offer it as a service which an oil company can then contract to.'

About Energistics

Energistics sees itself more as a 'custodian' of standards than an organisation which develops standards. 'The standards are defined by subject matter experts within the different companies that are members of Energistics and are freely available to the industry,' he said.

There are 115 members, including oil companies, service companies and software companies who collaborate to define the standards and to support Energistics' activities on behalf of the industry.

The three major standards are WITSML, covering drilling and well construction, PRODML, covering production (production volumes and the production string), and RESQML, covering earth models.


