
Target - a cloud data repository for Oman

Tuesday, January 30, 2018

Target Oilfield Services is building a cloud-based data repository for the Oman government - and finding that putting data on the cloud can make it easier to manage. Would oil companies get the same benefits from cloud-based data hosting?

Target Oilfield Services, a company based in Oman but with digital services provided from the UK, has been building a cloud-based data repository for the Oman government, to manage its oil and gas data.

Oman awarded Target the contract to manage its national data repository (government owned oil and gas data), after the previous operator had run it for 5 years.

Oman wanted the oil companies supplying data to carry out data management tasks themselves, taking responsibility for quality control of the data they own. It also wanted the data to be cloud-hosted, so that oil companies would be able to load data themselves.

Before Target became involved, there were many staff members employed to manage the data repository, including doing data clean-up and loading, covering seismic, wells and GIS data.

The Oman government also wanted a data store to be based on open standards, so it would be easier to move the data to a different service provider in future, should it want to.

The Oman government also liked the idea of paying for the service on an ongoing basis, rather than making a capital investment.

They wanted the data, including 10 petabytes of seismic, to be hosted within their own country.

The Oman government wanted to make it possible to interact with the data visually, not just read it as tables and lists. This includes being able to visualise seismic data without downloading it.

The government also wanted a system it can use to manage its ongoing bid rounds, providing data to companies considering bidding for license blocks. (You can see this online at

So altogether this means a transformation of the commercial aspects, user experience and operating model, says Jamie Cruise, president of digital services at Target.

Target chose to put the data into the PPDM standard data model format, hosting it in a public Tier 3 (99.98 per cent availability) data centre in Oman.
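As an illustration of what a PPDM-style store looks like in practice, here is a minimal sketch of one table. The real PPDM model defines hundreds of tables; the column names below (UWI, well name, spud date, surface coordinates) follow PPDM naming conventions but are a simplified assumption, not the official schema.

```python
import sqlite3

# Illustrative subset of a PPDM-style WELL table.
# UWI = unique well identifier, the standard PPDM primary key for wells.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE well (
        uwi               TEXT PRIMARY KEY,
        well_name         TEXT,
        spud_date         TEXT,
        surface_latitude  REAL,
        surface_longitude REAL
    )
""")
conn.execute(
    "INSERT INTO well VALUES (?, ?, ?, ?, ?)",
    ("OM-0001", "Example Well 1", "2017-03-15", 21.47, 57.56),
)
row = conn.execute(
    "SELECT well_name FROM well WHERE uwi = ?", ("OM-0001",)
).fetchone()
print(row[0])
```

Because the model is an open, published standard, another service provider could take over the same tables without a proprietary migration - which is exactly the portability the Oman government asked for.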

Migrating the data to Target's software system took 12 months.

Today, data can be accessed simply by entering a URL into a web browser; no additional software is required.

To make the data easier to visualise, nearly all of the data is accessed via a map or other graphic. 'We have a passion for making sure we don't present bland tables of data to people,' he said. 'Our users require more than that.'

Once the cloud data server was set up, Oman found many more document types it wanted to store on it, including data connected to bid rounds, work programs, budgets, regulatory information, collaboration work with operators, and feedback on regulations.

In building the system, it was important for Target to see itself as a supporter of critical organisational decision making, not just running a database, Mr Cruise said. For example, it needs to understand what information a regulator needs to ensure that companies are in compliance with oil and gas regulations.

This means focusing on workflows and data integration, not just on data repositories, he said.

Target developed its software using Amazon Web Services, but because most production data needs to be stored within the country where the production happens, the software needs to be transferred to a local cloud provider before it can go into operation.

Jamie Cruise has been writing web based information management software since the 1990s, for a company which was sold to Halliburton. He went on to found a company called FUSE Information Management in 2005, which was acquired in 2014 by Target Oilfield Services, a company based in Oman.

Offering software to the public

Next year, Target plans to take its system to the market as a 'public cloud offering', an online software service, like Microsoft's Office 365. Companies will be able to buy their own 'instance' of the software online.

The company sees a potential market from oil companies which don't already have an infrastructure to manage their corporate data.

As part of the system, it is possible to build 3D 'scenes' of subsurface data, using technology developed to draw 3D city maps.

The view is good enough for data quality control, but not for doing interpretation directly on it. It is possible to work with seismic post stack data in this way, doing some basic quality control.
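The kind of basic quality control described here can be done from file headers alone. The sketch below reads two fields from a SEG-Y binary file header - the common post-stack seismic exchange format - and sanity-checks them; it assumes SEG-Y rev 1 byte positions (sample interval at bytes 3217-3218, samples per trace at bytes 3221-3222, big-endian) and is a hedged illustration, not Target's actual QC code.

```python
import struct

def segy_qc(header_bytes: bytes) -> dict:
    """Basic QC on a SEG-Y file's 400-byte binary header.

    The binary header follows the 3200-byte textual header, so the
    offsets below are absolute positions in the file (0-based).
    """
    sample_interval = struct.unpack(">h", header_bytes[3216:3218])[0]
    samples_per_trace = struct.unpack(">h", header_bytes[3220:3222])[0]
    return {
        "sample_interval_us": sample_interval,
        "samples_per_trace": samples_per_trace,
        # Sampling coarser than 8 ms or non-positive values are suspect.
        "plausible": 0 < sample_interval <= 8000 and samples_per_trace > 0,
    }

# Synthetic header for demonstration: textual + binary header region.
hdr = bytearray(3600)
hdr[3216:3218] = struct.pack(">h", 2000)   # 2 ms sampling
hdr[3220:3222] = struct.pack(">h", 1500)   # 1500 samples per trace
result = segy_qc(bytes(hdr))
print(result)
```

Checks like these can run server-side on upload, so obviously malformed files are flagged before anyone tries to visualise them.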

Using the system, oil companies will be able to search for and access information via map interfaces. This can either be done via ESRI GIS software (if you have an ESRI license), or through an open source GIS system (which is free). The system can consume mapping data from any provider that supports the WMS (Web Map Service) and WFS (Web Feature Service) standards.
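This interoperability works because WMS requests are just parameterised URLs. The sketch below builds a standard OGC WMS 1.3.0 GetMap request; the endpoint and layer name are hypothetical, but the query parameters are the ones any WMS-compliant server (ESRI, GeoServer, MapServer) accepts.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    bbox is (min_lat, min_lon, max_lat, max_lon) in EPSG:4326,
    which in WMS 1.3.0 uses latitude-first axis order.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer, roughly covering Oman.
url = wms_getmap_url("https://example.org/wms", "wells",
                     (16.0, 52.0, 26.5, 60.0))
print(url)
```

Because the request is a plain URL, the same map layer can be dropped into ESRI desktop software, an open source GIS, or a web page with no custom integration work.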

It is very important to make the system easy to integrate with other systems. Mr Cruise sees Twitter as a company which has created a particularly easy-to-use Application Programming Interface (API). The APIs for some oilfield software are very complex.

Target is considering using GraphQL, a query language for APIs that lets a client request exactly the data it needs, and no more, which keeps data transfer low. It was developed internally by Facebook to populate its pages from multiple data sources of any structure, and was released for public use in 2015.
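To show what that looks like in practice, here is a hedged sketch of a GraphQL query against an imagined repository API. The field names (well, uwi, name, spudDate) are assumptions, not a documented schema; the point is that the client names exactly the fields it wants, so no surplus data crosses the wire.

```python
import json

# A GraphQL request is a JSON document containing the query text
# and any variables. The client asks for two fields of one well
# and receives only those two fields back.
query = """
query WellSummary($uwi: String!) {
  well(uwi: $uwi) {
    name
    spudDate
  }
}
"""

payload = json.dumps({"query": query, "variables": {"uwi": "OM-0001"}})
# In a real deployment this payload would be POSTed to the server's
# GraphQL endpoint; here we only show its shape.
print(payload)
```

Contrast this with a typical REST design, where the server decides the response shape and the client often receives (and discards) fields it never asked for.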

Easier when centrally managed

The work showed that data management can be easier when data is stored on the cloud, with all of the data in one place and centrally managed, said Mr Cruise.

This is not the first time data has been centrally managed - it was also centrally managed in the pre-digital era, in libraries, and it was centrally managed when it was stored on mainframe computers. Both systems had strict rules about how the data should be managed, to maintain quality and ensure accessibility.

Decades from now, the era of storing files on individual PCs, and having multiple versions of files circulating around the company, might look like the anomaly, Mr Cruise said. Workstations made life easier for workers, but meant a big increase in the challenge for data management governance, he said.

Now, as oil and gas data increasingly moves to cloud servers, 'we're hopefully entering a new era of data management with more well managed, centralised environment,' he said. 'We can go back to having one version of each piece of data. That will make our business processes much more effective.'

Future role of data managers

One question is what the role of the data manager will be from now on. Some people assume that data managers will no longer be needed, as the 'middle man' is removed by technology - files can be loaded and downloaded by the people who create them or need them.

Some people think data managers will need to become data scientists, developing skills for statistical analysis of data.

Mr Cruise's personal view is that there will always be a big role in looking after data and ensuring data quality and standards (governance).

'Our responsibility as data managers is ensuring the right data is delivered to the right person,' he said.

The oil and gas industry keeps all of its data from the past - 'we carry our legacy with us,' he said. None of the old data can be ignored.

Matthias Hartung, President Digital Transformation at Target Oilfield Services and a former VP of technical data with Shell, noted that a common fault of data managers is to try to get everything perfect, in the belief that perfect data is what the business mostly needs.

The professional trait most missing is curiosity, he said. A curious data manager will work out what internal customers need, whether they are being provided with it, whether it is possible to provide it, and why this adds significant value. 'What is the most important data requirement for the most important business/process/customer that you serve?' he challenged.

Another essential ability for data managers is communication skills, explaining how data assets can be leveraged to enable well-informed business decisions, using language the customer understands. Data and Digital can transform the Upstream from being laggards to leaders in energy solutions, they are mutually supportive and need each other to succeed, Mr Hartung said.
