
Sarawak Shell - condensing subsurface data onto a grid

Thursday, June 11, 2020

Sarawak Shell has a project to integrate subsurface data from multiple surveys and interpretations onto a single grid covering North Borneo, showing only the best data. Senior technical data management consultant Teck Hing Wong explained.


The aim is to put everything together in a single grid, which would provide the exploration team with 'the final trusted exploration data set at their fingertips, with all data synchronized,' said Teck Hing Wong, senior technical data management consultant with Sarawak Shell.

It should help reduce the amount of time users spend searching for data, ensuring that all data is in a central corporate database, rather than on people's hard drives or within corporate silos.

The integration project also involves stitching together data from multiple seismic surveys, which is a complex task. Seismic is not always shot on the same grid pattern or at the same angle. There can also be inconsistencies in seismic sampling rates and spacing, which can leave missing data points, he said.
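As a rough illustration of the resampling problem, the sketch below puts two traces recorded at different sample intervals onto one common time grid, leaving explicit gaps where a survey has no coverage. This is not Shell's actual workflow; the traces, sample intervals, and record lengths are invented for the example.

```python
import numpy as np

# Two hypothetical seismic traces recorded with different sample
# intervals (4 ms vs 2 ms) and different record lengths.
t_a = np.arange(0.0, 1.0, 0.004)          # survey A: 4 ms sampling, 1.0 s record
trace_a = np.sin(2 * np.pi * 25 * t_a)    # stand-in amplitude data

t_b = np.arange(0.0, 1.5, 0.002)          # survey B: 2 ms sampling, 1.5 s record
trace_b = np.sin(2 * np.pi * 25 * t_b)

# Resample both onto one common 2 ms grid covering the full window.
t_common = np.arange(0.0, 1.5, 0.002)
a_common = np.interp(t_common, t_a, trace_a, left=np.nan, right=np.nan)
b_common = np.interp(t_common, t_b, trace_b, left=np.nan, right=np.nan)

# Where survey A has no samples (beyond 1.0 s) the common grid holds NaN,
# making the missing data points explicit rather than silently filled.
print(np.isnan(a_common).sum() > 0)
```

Marking the gaps rather than interpolating across them keeps the stitched grid honest about where each survey actually has coverage.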

The project will also bring together interpretation work, which can be done at a wide range of scales, from large-scale 'play' interpretations to project-level interpretations.

Data quality

To produce this, you need controls over the quality of data entering the system, so that data is not pushed out to anyone unless it passes a certain standard.

Understanding data quality requires a level of technical expertise which data management or workflow support staff do not usually have. So, the initial data owners need to take some responsibility for data quality, he said.

You need to set the data quality bar at the right level. If it is too high, you reject lots of data, in which case you don't have the data you need in the database. If the bar is too low, you accept too much data, and end up with too many versions, he said.
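The quality-bar trade-off can be illustrated with a toy gate. The records, wells, and completeness scores below are hypothetical, not from the project:

```python
# Hypothetical quality gate: each incoming record carries a completeness
# score in [0, 1]; only records at or above the bar enter the database.
records = [
    {"well": "W-1", "score": 0.95},
    {"well": "W-2", "score": 0.62},
    {"well": "W-3", "score": 0.81},
    {"well": "W-4", "score": 0.40},
]

def accepted(records, bar):
    """Return the wells whose records pass the quality bar."""
    return [r["well"] for r in records if r["score"] >= bar]

# A high bar starves the database; a low bar floods it with versions.
print(accepted(records, 0.9))   # ['W-1']
print(accepted(records, 0.5))   # ['W-1', 'W-2', 'W-3']
```

The same gate run at two thresholds shows both failure modes the article describes: too little trusted data on one side, too many competing versions on the other.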

The quality of the data, and whether the data is in the right place, can be far more important issues than the digital technology you use, he said.

The project team decided to constrain the number of data types it would look at to 10. Important data types included well top data, log curves, check shot data, seismic data, seismic velocity models, and horizon data.

The seismic velocity models themselves needed to be integrated. This is very difficult because they are often built for specific tasks, in different ways, with different audit trails.

Business driven

The data management and workflow support staff wanted to start the project by asking the business owners and system users what they wanted.

It is a temptation for data management staff to 'try to fix everything' without any focus, he said. Data managers can have a mindset of 'my job is to make sure I manage the data, get data complete in the database'. But this makes a project too big.

Instead, it is better to ask users for very specific information about what they need fixed: which specific area they are interested in, which wells they want data managed for, and when they need it.


In terms of technology, the company has been looking to build up basic digitalisation capability, encouraging people to do basic coding and scripting, and building a digitalisation culture in the company.

It is aiming to recruit graduates who have this sort of knowledge - a 'hybrid of geoscience knowledge and computing.'

It is looking at improving visualization, with tools like Spotfire and Power BI. 'We want the business to have better visualization,' he said.

There is no single database which can store all the data - you need different databases for different data types. This makes it hard to build a platform that brings it all together, so it is easier for people to find.

The company has software tools which can highlight anomalies in a large area of seismic, perhaps showing them up in different colours. It can bring out different features such as channels.

There is machine learning software which can take a full stack seismic cube, filter out the noise automatically, and highlight likely faults using pattern recognition. This could take an interpreter two weeks to do. 'The quality of a tool is not as great as how a human would pick it, but for a regional prospect this is great,' he said.
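As an illustration only, not the vendor software described above, the same two-step idea of filtering noise and then highlighting discontinuities can be sketched on a synthetic 2D section, assuming NumPy and SciPy are available. The section, the fault throw, and the noise level are all invented:

```python
import numpy as np
from scipy import ndimage

# Synthetic 200 x 200 seismic section: horizontal sine-wave reflectors
# with a vertical fault introduced by shifting the right half downward.
rng = np.random.default_rng(0)
section = np.tile(np.sin(np.linspace(0, 8 * np.pi, 200)), (200, 1)).T
section[:, 100:] = np.roll(section[:, 100:], 15, axis=0)  # synthetic fault throw
noisy = section + 0.3 * rng.standard_normal(section.shape)

# Step 1: filter out the noise automatically.
denoised = ndimage.gaussian_filter(noisy, sigma=2)

# Step 2: highlight lateral discontinuities (fault-like edges)
# with a simple gradient filter standing in for pattern recognition.
edges = np.abs(ndimage.sobel(denoised, axis=1))

# The fault column (around trace 100) gives the strongest edge response.
print(int(np.argmax(edges.mean(axis=0))))
```

A production tool would use trained pattern recognition rather than a gradient filter, but the pipeline shape, denoise then detect then review, is the same one the interpreter would otherwise spend weeks doing by hand.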


