
The future of seismic data by EMC

Thursday, April 30, 2015

David Holmes, chief industry executive with EMC's Global Oil & Gas Program, explained what the future of seismic data management will look like.

At the recent Society of Exploration Geophysicists (SEG) event in Denver in October 2014, a company called Agile Geoscience ran a hackathon with 30 people in a room, asking them to write a 'supercool geoscience application'.

David Holmes, one of the judges, selected as the winner a crowdsourcing 'hot or not' tool for other people's seismic interpretation.

You log on to an online tool with your Google account, interpret some seismic data, and then rate other people's choices.

'This is the future, I'm convinced,' Mr Holmes said, speaking at 'Doing more with Seismic Data,' the Digital Energy Journal Aberdeen conference on November 27.

There are many more exciting things the industry could do, if the seismic data systems were on the cloud.

One US company put all of its seismic data onto disk, then hired six students from the Colorado School of Mines, gave them access to the entire seismic library, and told them 'go and find some stuff.'

It is a 'mystery to me why we keep standalone workstations going as long as we have,' Mr Holmes said.

Geophysicists still work on personal workstations, where they spend 20 minutes loading up all their data every morning. If they could work directly on a cloud system it would be a lot faster.

Managing old seismic

Yet most oil and gas companies still store their seismic data on tape, and have very little idea what they have, whether they are storing multiple copies of the same data, or whether they actually have what they thought they had.

'Companies should care more about this stuff. They have spent millions on acquiring it and the cost of managing it is an unmeasurable fraction of that,' he said.

'Some companies do have a regulatory obligation to keep their data in perpetuity, and it doesn't mean a rusty 9-track no-one can read.'

'A [typical] oil company is working with five seismic data storage companies, each with different cataloguing systems, all incompatible, and three million media items, including two million 9-track tapes.

'There are a finite number of read heads for 9-track tapes,' he said. 'They are not being manufactured any more.'

Some companies 'are paying $10m a year in license fees for data they're not using but can't prove that they're not using,' he said.

The trouble is, managing data is hard work, and it is always easier not to do it. 'Companies ask: shall we spend lots of time and money doing something hard with intangible business value, or do nothing?' he said.

Some physical data storage companies are taking advantage of oil companies' willingness to take the cheap and secure option over the short term. They offer a service where they store your data free of charge, but charge you big fees when you want to retrieve it, he said.

Some cloud data services are trying to get away with the same business model. 'The cost of retrieving the data can be gigantic.'

The problem is that usage rates of seismic data are typically very low, with only small amounts of data retrieved over a time frame measured in decades, he said.

But slowly, attitudes are changing, as companies realise the risk of not properly maintaining data assets, he said, and cloud solutions offer a cheaper alternative.

Move away from tape

Mr Holmes' recommendation is to move away from tape.

You can't just copy seismic data from tape to disk, because the data will get corrupted. It needs to be transferred to a different format.

Together with the data, you need to keep a scanned copy of the tape label and logs of the re-mastering process (gathering the data from the tape).

'You need everything in your possession that will allow you to recreate that nasty crumbly 9 track tape,' he said.
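The provenance record described above could be kept as a simple structured file alongside the re-mastered data. A minimal sketch in Python, with illustrative field names (none of these are a standard schema):

```python
import json

# Hypothetical provenance record kept alongside a re-mastered seismic file.
# Field names are illustrative assumptions, not an industry standard.
remaster_record = {
    "original_medium": "9-track tape",
    "tape_label_scan": "scans/tape_0042_label.png",  # scanned copy of the physical label
    "remastering_log": [
        "read pass 1: 3 parity errors, retried",
        "read pass 2: clean",
    ],
    "output_format": "SEG-Y",
    "checksum_sha256": None,  # computed at re-mastering time for integrity checks
}

print(json.dumps(remaster_record, indent=2))
```

Keeping the label scan and the read logs together with the data preserves everything needed to reconstruct the provenance of the original tape.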

Data management processes

In the future there will also be much stricter data management processes, he said.

Many oil and gas companies already say that geoscientists may not load up data themselves; they must give it to a data manager, who loads and validates it, he said.

There are software tools to make this process easier, for example where new data is loaded to a folder, then a data manager receives an e-mail alert. The data manager can then check the data formatting and co-ordinates, make any necessary transformations and check the headers.
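The workflow described, a drop folder that triggers an alert to the data manager, can be sketched with a simple polling loop. This is a minimal illustration, assuming a hypothetical folder name and a stand-in for the e-mail alert:

```python
from pathlib import Path

# Hypothetical drop folder where geoscientists place new data for validation.
DROP_FOLDER = Path("incoming_seismic")
seen = set()

def notify_data_manager(path: Path) -> str:
    # Stand-in for an e-mail alert (a real system might use smtplib or a
    # messaging service); here we just return the alert text.
    return f"New data awaiting validation: {path.name}"

def poll_once() -> list:
    """Scan the drop folder once and alert on any files not yet seen."""
    alerts = []
    for f in sorted(DROP_FOLDER.glob("*.sgy")):
        if f not in seen:
            seen.add(f)
            alerts.append(notify_data_manager(f))
    return alerts
```

In practice the data manager would then check formatting, co-ordinates and headers before the file is released to interpretation systems.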

Working with big data

A side-effect of the growth of big data systems is that many companies now have multiple systems for storing data, including their normal archiving systems, high performance computing (HPC) environments and Hadoop environments.

They might have the same data file in all of these systems. If they back up the data in each environment multiple times, they can end up with many copies of the data. 'One company worked out they would have 17 copies of all of their data, if everything had gone well,' he said.

As data volumes get bigger, keeping 17 copies of everything will get very expensive. 'If we have any chance of surviving the next few years, it's going to be crucial that we have a single instance of our data,' he said. 'Or companies will make a fortune selling you vast amounts of storage you don't need.'
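One common way to work towards a single instance of the data is to identify duplicate copies across environments by content hash. A small sketch with made-up file paths and contents:

```python
import hashlib

# Illustrative inventory: the same seismic file held in three environments.
# Paths and contents are invented for the example.
files = {
    "archive/line_12.sgy": b"seismic bytes",
    "hpc/scratch/line_12.sgy": b"seismic bytes",   # same content, different system
    "hadoop/line_12.sgy": b"seismic bytes",
}

# Group paths by content hash; groups larger than one are duplicate copies.
by_hash = {}
for path, data in files.items():
    digest = hashlib.sha256(data).hexdigest()
    by_hash.setdefault(digest, []).append(path)

duplicates = {h: paths for h, paths in by_hash.items() if len(paths) > 1}
print(duplicates)
```

Here all three paths collapse to one content hash, so two of the copies could be replaced by references to a single instance.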

A new term has been invented, 'next generation data fabric', which describes the enterprise architecture for storing and managing information, he said.

Companies will also use 'object storage' which means that the analytics tools can understand the different data storage systems you are using.

The idea of 'master data management' will be redundant, because companies will be able to search all of their data at once.

Geophysicists will be able to ask complex queries, like 'show me all the files I have navigation for, which I don't know about.' Or in more specific terms, 'show me all of the navigation files which have a survey name which isn't in my survey master.'

'You can run that simple query against your entire landscape,' he said.

View David Holmes' talk on video at
http://www.digitalenergyjournal.com/video/1224.aspx


