
How to get seismic off tape

Friday, June 17, 2016

Managing seismic data would take a big step forward if companies stored it on disk drives rather than tape, moved it electronically rather than by physically transporting disks and tapes, or, better still, did not move it at all, Alan Smith explained.

Companies still manage data in much the same way as they did 20 years ago, and subsurface staff might spend more than half of their time looking for data or reformatting it, said Alan Smith, data management consultant with Luchelan Ltd.

Dr Smith, a former principal consultant, has worked with Shell and the Brunei Prime Minister's Office, and has held a number of interim management roles, including CIO for E&P with oil company OMV. He was speaking at the Finding Petroleum forum in London on April 18, 'Transforming Subsurface Insights'.

In 1991, an article in the Oil and Gas Journal reported that geophysicists and geologists were spending 60 per cent of their time looking for data (or getting it ready to work with), 18 per cent on 'useful work', 5 per cent on 'meetings and presentations', and the rest on training and coffee breaks.

Since then, not much has changed. This might be because our systems to manage data also haven't changed very much.

Consider that in the 1990s, data was delivered on big tapes, and much of it was stored in boxes or on tapes on shelves. People searched for data by typing codes into a text-based screen. Data could be transported on 'Exabyte' tapes, which held 60 to 150 GB but had a life of only six months.

In 2015, data is still stored in boxes and tapes in racks. A few higher-density tape formats are available, and data is sometimes delivered on disk. Companies still search for data using text-based systems, sometimes with a GIS (geographical information system) front end to help find it. Many companies still rely on people who know where to go to find data.

There are some parts of the world where data can be directly accessed over the internet, for example with Norway's 'Diskos' system. 'It's the exception rather than the rule,' he said.

Data challenges

Managing data is getting more and more complicated.

As an example, consider seismic company PGS, which made 60 per cent of its 2015 revenue from selling 'multiclient' data, where the same seismic data is sold to different clients.

Historically, the data started its life on a seismic vessel, which had multiple streamers recording seismic data onto tape. The tape was sent to a processing centre, which put the data onto a data interpretation system.

There are interim products in the seismic interpretation process, which get stored on tape somewhere. 'There's lots of tape handling, lots of scope for errors,' he said.

The final interpreted seismic data ends up getting stored on a shelf somewhere, probably both by PGS and by the oil company customer.

PGS had used the same data management system since the 1990s, provided by an outside company. 'Getting data out of that system was quite difficult. It didn't store data in industry standard formats.'

'If you want to re-process you have to get the tapes back from store. There are lots of points where things could go wrong.'

It was really a 'trace handling' system for storing and handling seismic recordings, not a data processing system, he said.

PGS is moving to a more advanced data handling system, based around disk drives and electronic data transfer.

The latest seismic vessels tow streamers on the order of 8 km long, recording large amounts of data, which is stored on disk drives on the vessel. 'The data is generated so fast, you can't actually put it onto tape fast enough,' he said.
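
As a rough illustration of the volumes involved, here is a back-of-envelope estimate in Python. Every figure (streamer count, channel spacing, sample rate) is an illustrative assumption, not a PGS specification.

```python
# Back-of-envelope estimate of the raw trace data rate on a modern vessel.
# All the figures below are illustrative assumptions.
n_streamers = 12                 # assumed streamer count
channels_per_streamer = 640      # assumed: 8 km streamer, 12.5 m group interval
sample_rate_hz = 500             # assumed 2 ms sampling
bytes_per_sample = 4             # 32-bit floating-point samples

rate_mb_s = (n_streamers * channels_per_streamer
             * sample_rate_hz * bytes_per_sample) / 1e6
print(f"~{rate_mb_s:.0f} MB/s while recording")                 # ~15 MB/s
print(f"~{rate_mb_s * 86400 / 1e6:.1f} TB/day if continuous")   # ~1.3 TB/day
```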

The data may be spooled onto a tape to send it to the processing centre, or it might be transferred by disk. 'In theory it doesn't need to touch tape,' he said.

Once the seismic processing is finished, the data is stored in a storage system, on a disk drive, without any tape being used.

The data can be delivered to oil and gas companies electronically, if they want. There are software tools which can transfer data 2 to 5 times faster than File Transfer Protocol (FTP), he said. However, many oil and gas companies do not have the capability to receive large files electronically.
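
Tools of this kind typically get their speed-up by moving many pieces of a file in parallel rather than as a single serial stream. Below is a minimal sketch of that idea, assuming a hypothetical HTTP server that supports byte-range requests; the URL and chunk size are made up.

```python
# Minimal sketch of parallel transfer: fetch byte ranges of one large file
# over several concurrent HTTP streams. URL and chunk size are hypothetical.
import concurrent.futures
import requests

URL = "https://example.com/surveys/line_001.segy"   # hypothetical file
CHUNK = 64 * 1024 * 1024                            # 64 MB per range request

def fetch(offset: int) -> bytes:
    headers = {"Range": f"bytes={offset}-{offset + CHUNK - 1}"}
    return requests.get(URL, headers=headers, timeout=300).content

size = int(requests.head(URL, timeout=30).headers["Content-Length"])
with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    with open("line_001.segy", "wb") as out:
        for part in pool.map(fetch, range(0, size, CHUNK)):  # parts in order
            out.write(part)
```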

Data is only put onto standard SEG-Y format tapes for long-term storage; otherwise it is kept indefinitely on disk. All the tapes are registered in a system with a 'check-out' process, so people can see what is available and where each tape is. When tapes are checked out, back-ups are made. The offices have robotic systems which can take tapes from the shelves and send the data online.
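
In outline, a tape register with a check-out process can be quite simple. The sketch below is an illustrative guess at such a system, not PGS's actual software; all field names are made up.

```python
# Illustrative sketch of a tape check-out register: every tape has a
# location, and checking one out records who has it and when.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TapeRecord:
    tape_id: str
    location: str                       # e.g. a rack/shelf barcode
    checked_out_to: str | None = None
    history: list[str] = field(default_factory=list)

class TapeRegister:
    def __init__(self) -> None:
        self._tapes: dict[str, TapeRecord] = {}

    def register(self, tape_id: str, location: str) -> None:
        self._tapes[tape_id] = TapeRecord(tape_id, location)

    def check_out(self, tape_id: str, user: str) -> TapeRecord:
        rec = self._tapes[tape_id]
        rec.checked_out_to = user
        rec.history.append(f"{datetime.now():%Y-%m-%d} checked out to {user}")
        return rec    # the real workflow makes a back-up copy at this point
```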

Quality control (QC) is part of the process: making sure data is correct as it comes in, and making sure it is correct as it leaves the processing centre. This is a big change from the past. 'Often in the early days, the only time data got quality controlled is when it got delivered to the client. The client had the problem of correcting mistakes, to allow headers to be loaded properly.'
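
Incoming QC of this kind might look like the sketch below, which uses the open-source segyio library (not necessarily what PGS uses) to check that coordinate headers are populated and that the sample interval matches the delivery note. The file name and expected interval are assumptions.

```python
# Sketch of incoming SEG-Y QC with the open-source segyio library:
# flag traces with unpopulated coordinates and check the sample interval.
import segyio

EXPECTED_DT_US = 2000    # assumed: delivery note promises 2 ms sampling

with segyio.open("line_001.segy", ignore_geometry=True) as f:
    dt_us = segyio.tools.dt(f)           # sample interval in microseconds
    if dt_us != EXPECTED_DT_US:
        print(f"sample interval {dt_us} us, expected {EXPECTED_DT_US} us")
    for i, header in enumerate(f.header):
        x = header[segyio.TraceField.SourceX]
        y = header[segyio.TraceField.SourceY]
        if x == 0 and y == 0:            # unpopulated coordinates
            print(f"trace {i}: missing source coordinates")
```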

PGS works together with Ovation Data to manage the network, which connects PGS offices in Houston and London and Ovation Data offices in Houston and London with 10 Gbps data communication links. Altogether, 120 terabytes a day can be moved around, he said.
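
Those figures are easy to sanity-check:

```python
# What can a single 10 Gbps link move in a day at full utilisation?
link_gbps = 10
tb_per_day = link_gbps / 8 * 86400 / 1e3     # Gbit/s -> GB/s -> TB/day
print(f"~{tb_per_day:.0f} TB/day per link")  # ~108 TB/day, so 120 TB/day
                                             # across several links is plausible
```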

Data can be backed up in multiple locations. 'The chances of loss are quite small,' he said.

Clients today

Today, PGS' oil company customers are not only interested in 'post-stack' seismic (where all the seismic wavefields which have passed through a single point in the subsurface are summed together). Companies are asking for 'pre-stack' data as well (the seismic data before this summing has been done).
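
For readers unfamiliar with the terms, stacking is essentially an averaging step that suppresses noise. The toy numpy example below illustrates the idea; real stacking is applied after normal-moveout correction, which is omitted here.

```python
# Toy illustration of stacking: average the traces of a gather that sample
# the same subsurface point, so random noise shrinks by ~1/sqrt(n_traces).
import numpy as np

rng = np.random.default_rng(0)
n_traces, n_samples = 48, 1000
signal = np.sin(np.linspace(0, 20, n_samples))                # common reflection
gather = signal + rng.normal(0, 1.0, (n_traces, n_samples))   # "pre-stack"

stacked = gather.mean(axis=0)                                 # "post-stack" trace
print(f"residual noise: {np.std(stacked - signal):.2f} vs 1.00 per trace")
```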

Typically, clients will ask for seismic data covering a slightly different geographical area, and at varying stages of processing, and PGS needs to be able to provide it: for example, pre-stack data for a specific polygon. This means that the data has to be cut to the correct co-ordinates.
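
At its core, cutting data to a polygon is a point-in-polygon test on the coordinates in each trace header. Here is a minimal sketch using the open-source shapely library, with made-up coordinates:

```python
# Keep only traces whose midpoint coordinates fall inside the client's
# requested polygon. Coordinates here are made up for illustration.
from shapely.geometry import Point, Polygon

requested_area = Polygon([(431000, 6781000), (435000, 6781000),
                          (435000, 6785000), (431000, 6785000)])

# e.g. (trace_index, x, y) tuples read from the trace headers
traces = [(0, 432500.0, 6782000.0), (1, 440000.0, 6790000.0)]
kept = [i for i, x, y in traces if requested_area.contains(Point(x, y))]
print(kept)    # [0] - only the first trace is inside the polygon
```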

Historically, this meant a lot of manual handling and intervention. Now, it is virtually all automatic.

A few years ago, it could take anywhere from days to months for a data request to be delivered, depending on its complexity, with plenty of room for error and a lot of staff time involved. Now, post-stack data can be delivered, or spooled to disk, within a few minutes of ordering. 'It has far fewer errors, it is significantly cheaper. Customers are very pleased with that type of service,' he said.

Leave data where it is

In the future, it would be better if large data files were moved around less. When data is moved between systems and converted into different formats, information is often lost.

It should be possible for computer software to interpret data which is stored somewhere else. 'Large volumes of data can be 'live' on the internet, they don't have to be sitting on your laptop or workstation,' he said.

Data may need to be converted to a different format for a different computer software package to work with it, but the reformatting can be done in place (as the information is required).
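
One way to picture reformatting 'in place, as the information is required' is a reader that converts each trace only when an application asks for it, instead of converting the whole survey up front. Below is a sketch using segyio again, with plain float32 arrays standing in for whatever format the target package expects.

```python
# Lazy, on-demand conversion: traces are read and converted one at a time,
# only when the consuming application actually asks for them.
import numpy as np
import segyio

def traces_on_demand(path: str):
    """Yield traces as float32 arrays, converting lazily."""
    with segyio.open(path, ignore_geometry=True) as f:
        for trace in f.trace:        # each trace is read only when consumed
            yield np.asarray(trace, dtype=np.float32)

reader = traces_on_demand("line_001.segy")
first = next(reader)                 # nothing else has been converted yet
```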

'I'm not saying it's simple and straightforward, but it's one option,' he said.

This would mean that geoscientists could start interpreting data immediately after the survey, avoiding a problem we see today, where it can take three to six months to process a new seismic survey.

In '4D seismic' surveys, which show how the subsurface changes over time as a reservoir is depleted, a new survey is done every few months. If it takes three to six months to process the seismic data, you are acquiring the next survey before anyone has been able to look at the previous one.

A challenge with working on data stored on the other side of the world is latency (the time taken to send an instruction and receive a reply). If you are interpreting data on one side of the Atlantic with the data stored on the other, 'it takes time for my mouse to move.'
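
That latency floor is set by physics, as a quick calculation shows; the distance figure is a rough assumption.

```python
# Minimum round-trip time across the Atlantic, ignoring routing and
# processing delays. The fibre-path distance is a rough assumption.
distance_km = 7500           # assumed London-Houston fibre path
speed_km_per_s = 200_000     # light in optical fibre, ~2/3 of c
rtt_ms = 2 * distance_km / speed_km_per_s * 1000
print(f"round-trip floor: ~{rtt_ms:.0f} ms")   # ~75 ms before anything else
```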

But 'people are looking for ways to get over that, it's not such a big issue,' he said.

Automated interpretation

In future, seismic interpretation will probably become more automated, because there are fewer and fewer people in the industry who are capable of doing it manually, he said.

'Analytics will play a far larger part in what we do and say, not just the scientific side of building models.'

'It will come, there will be luddites who don't like the idea.'

However, we will still need a few people with geological understanding. You need domain expertise to tell which of the correlations the computer discovers might be helpful, and which are just chance, he said. 'You've got to ground everything in reality at the end of the day, otherwise you end up with statistics.'
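
The point about chance correlations is easy to demonstrate: screen enough random 'attributes' against a random target and some will correlate convincingly despite carrying no information. A toy numpy example:

```python
# Screen 1,000 random "attributes" against a random target: some correlate
# well purely by chance, which is why domain expertise is still needed.
import numpy as np

rng = np.random.default_rng(1)
target = rng.normal(size=100)               # e.g. a well-log property
attributes = rng.normal(size=(1000, 100))   # unrelated random "attributes"

corrs = [abs(np.corrcoef(a, target)[0, 1]) for a in attributes]
print(f"best chance correlation: {max(corrs):.2f}")  # often above 0.3
```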


