
How analytics works

Thursday, January 29, 2015

Oil and gas analytical techniques can help you make better decisions about where to place wells and which interventions to make, and help you avoid drilling problems

'To me, analytics is being able to get a lot of insight from a large amount of data, knowing big from thinking big,' said Duncan Irving, oil and gas practice lead with Teradata, speaking at the Digital Energy Journal Sept 23rd conference, 'Using Analytics to Improve Production'.

You use large data sets together with the insight you already have, and come up with an answer which is as statistically robust as possible.

You can do this both over the long term, spending a lot of time studying data, and over the short term, analysing data as it is being generated, to support current operations.

People are starting to talk about a 'data lake', where you have all of your data in one place, and then work on it all together, he said. 'That can go well for you.'

The oil and gas industry often compartmentalises its data flows, so geophysicists get one stream of data, and production data goes in another direction. A data lake means that everyone is looking at the same core data, like people in banks do.

If all the data is stored in separate silos, it becomes very hard to do analysis which involves taking data from different places: you have to keep asking your colleagues for it, which slows the process down, and you might find they are also doing the same analytical work themselves.

People also tend to trust data more if they are getting it from the same core data store as their colleagues, rather than asking their colleagues to provide the data to them on demand.

Data integration

Integrating all of the data appropriately is 'an art and craft in itself', he said.

Your source data could include supply chain logistics, ERP systems and production data. 'We would want to cram all of this into one large platform,' he said.

The integration system needs to understand the relationships between all of the data types, so it is possible to work out the context behind the data.

Once the data is all integrated, you can run analytics tools on it, such as the ones developed by SAS.

The data can then be provisioned through the tools that people are used to working with, which have the science 'baked into' them, such as subsurface modelling software, he said.
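As a rough illustration of what that integration means in practice, the sketch below joins daily production data with ERP work orders on shared keys, so that every discipline works from the same integrated view. The well identifiers, column names and figures are made up for illustration; this is not Teradata's or SAS's actual schema.

```python
# Minimal sketch of cross-domain data integration (hypothetical column
# names and data; not Teradata's or SAS's actual schema).
import pandas as pd

# Daily production data keyed by well and date
production = pd.DataFrame({
    "well_id": ["W-01", "W-01", "W-02"],
    "date": pd.to_datetime(["2014-06-01", "2014-06-02", "2014-06-01"]),
    "oil_bbl": [1200, 1150, 900],
})

# Maintenance work orders from an ERP system, keyed the same way
work_orders = pd.DataFrame({
    "well_id": ["W-01", "W-02"],
    "date": pd.to_datetime(["2014-06-02", "2014-06-01"]),
    "work_order": ["pump inspection", "choke change"],
})

# Joining on shared keys gives everyone the same core view, so a production
# dip can be read in the context of what actually happened on the well.
integrated = production.merge(work_orders, on=["well_id", "date"], how="left")
print(integrated)
```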

CGG project

Teradata was involved in a project to analyse UK wells data with data management company CGG, to see if it would be possible to work out ways to make drilling more efficient, avoiding problems like stuck pipe and tripping. 'We threw together a few thousand well logs and looked for correlations across them.'

CGG has access to all well log data for the UK Continental Shelf, as an authorised UK data agent.

The analysis also included the associated written (text) logs which drillers make, looking for words like 'stuck' and noting the time each entry was written, so it could be matched against the data logs.

The work is not complicated; one well engineer could process all of the data for one well in an afternoon, Dr Irving said. But with an analytics system you can process data for thousands of wells in one afternoon.
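A minimal sketch of that kind of text-log screening might look like the following, with the keywords, timestamps and column names assumed for illustration; this is not CGG's actual data or workflow.

```python
# Minimal sketch of matching drillers' text remarks against numeric logs
# by time (file layout, keywords and column names are illustrative only).
import pandas as pd

# Numeric drilling log: one row per timestamp
drilling_log = pd.DataFrame({
    "time": pd.to_datetime(["2014-03-01 02:00", "2014-03-01 03:00",
                            "2014-03-01 04:00"]),
    "depth_m": [2100.0, 2150.0, 2155.0],
    "torque_knm": [12.0, 19.5, 27.0],
}).sort_values("time")

# Free-text remarks written by the driller
remarks = pd.DataFrame({
    "time": pd.to_datetime(["2014-03-01 03:55"]),
    "text": ["Pipe stuck while reaming, working free"],
}).sort_values("time")

# Flag remarks containing problem keywords
keywords = ["stuck", "overpull", "pack off"]
remarks["problem"] = remarks["text"].str.lower().str.contains("|".join(keywords))

# Attach the nearest preceding numeric readings to each problem remark,
# so the event can be correlated with depth, torque, lithology and so on
events = pd.merge_asof(remarks[remarks["problem"]], drilling_log,
                       on="time", direction="backward")
print(events[["time", "depth_m", "torque_knm", "text"]])
```

Run over thousands of wells, the same join produces a table of flagged events that can then be searched for correlations.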

The software spotted many unexpected relationships between data, he said. For example, you could see relationships that could be governed by lithology (i.e. a problem which often happens when drilling through a certain type of rock), or relationships that could be linked to properties of seismic data (i.e. rock which causes problems often shows up with a certain signature in seismic data).

Most of this would not have been obvious to an experienced user, working without an analytics system, he said.

Once you know these relationships you can use them in your strategic decision making, for example avoiding drilling in a certain sort of rock, or providing better information to drillers while they are drilling.

This approach could be more useful to drillers in the unconventional space, who are looking for small improvements in efficiency, and might appreciate knowing how one well compares with others.

Oil major project

Teradata has been running a research project with an oil major for three years to try to improve well intervention decision making, assessing which well intervention provides the best return on investment.

This oil major has installed permanent fibre optic reservoir monitoring systems on a number of its fields, taking a seismic survey every 6 months. By comparing the surveys, you can see how the reservoir is changing.

The oil major 'wanted to be able to use all the data possible, and make the decision support as low latency [fast] as possible,' he said.

Usually each set of seismic data can take 2 years to process, but this is too long if your data will be out of date in 6 months.

The oil major wanted to keep the data as granular as possible, so they are doing analytics on the full pre-stack seismic data, not just the post-stack. 'There's a lot of information in the prestack that can help you understand the reservoir properties,' he said.

The aspiration is to process seismic data so fast that it can be analysed immediately it is gathered on a ship, so the ship can be told to reshoot a line of seismic if there is something wrong with it.

Already, Teradata can make a quick comparison of the latest seismic data with the previous survey, within 24 hours of the shot data being reconciled with the trace recordings.

This can show up differences in the seismic recording itself; for example, there might have been different wave conditions, or another vessel nearby, while the latest survey was being shot.
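One simple way such a survey-to-survey check might be made (a rough sketch on synthetic traces, not the oil major's or Teradata's actual workflow) is to compare per-trace amplitude levels between the baseline and monitor surveys and flag traces that differ far more than is typical.

```python
# Rough sketch of comparing two seismic surveys trace by trace
# (synthetic data and a simple RMS-amplitude check; illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_traces, n_samples = 500, 1000

baseline = rng.normal(0, 1, (n_traces, n_samples))                # previous survey
monitor = baseline + rng.normal(0, 0.05, (n_traces, n_samples))   # latest survey
monitor[120:140] += rng.normal(0, 1.0, (20, n_samples))           # e.g. noisier sea state

# Per-trace RMS amplitude for each survey
rms_base = np.sqrt((baseline ** 2).mean(axis=1))
rms_mon = np.sqrt((monitor ** 2).mean(axis=1))

# Flag traces whose amplitude level has changed far more than is typical,
# which may indicate acquisition problems rather than reservoir change
ratio = rms_mon / rms_base
suspect = np.where(np.abs(ratio - 1.0) > 0.2)[0]
print(f"{len(suspect)} suspect traces, e.g. {suspect[:5]}")
```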

If you are planning a high value intervention based on the seismic data for a certain part of the field, it would be very useful to know if the seismic data from that part of the field is 'a bit dodgy', Dr Irving said.

For the reservoirs, you can compare how you would expect the reservoir to change (based on your reservoir simulation) with what actually happened.

With an understanding of how the data connects together, you can do deeper analysis. For example, on one field, the downhole pressure predictions turned out to be close to what actually happened, but the predicted water saturation was not so good. 'It pointed to a lot of things that were not really obvious,' Dr Irving said.
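In code, that kind of history-match check can be as simple as computing residuals between predicted and observed quantities. The sketch below uses synthetic numbers and assumed column names, purely to illustrate the idea.

```python
# Minimal sketch of comparing reservoir-simulation predictions with
# observations (synthetic numbers; column names are illustrative).
import pandas as pd

history = pd.DataFrame({
    "date": pd.to_datetime(["2014-01-01", "2014-07-01", "2015-01-01"]),
    "pressure_sim_bar": [310.0, 302.0, 295.0],
    "pressure_obs_bar": [309.0, 300.5, 294.0],
    "sw_sim": [0.25, 0.28, 0.31],   # predicted water saturation
    "sw_obs": [0.25, 0.33, 0.40],   # observed water saturation
})

# Residuals: small for pressure, growing for water saturation - the kind of
# mismatch that points at where the reservoir model needs attention
history["pressure_error"] = history["pressure_obs_bar"] - history["pressure_sim_bar"]
history["sw_error"] = history["sw_obs"] - history["sw_sim"]
print(history[["date", "pressure_error", "sw_error"]])
```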

The data led to a lot of debates, including between engineers and geoscientists, which also led to improved intercompany relationships, he said.

Time series matching

Another technique is to compare time series data (how something changed over time) with patterns stored in the computer's memory, to see when something like this has happened before.

This is a similar data processing task to that performed by the Shazam mobile phone app, which can tell you what music you are listening to by comparing a short sound recording with the tracks in its memory to find the closest match.

In the oil and gas industry, you could compare a recording of vibration data with vibration recordings in the computer's memory, to try to diagnose what the problem might be.

Or an analytics system might tell you that your pumps and compressors are making a slightly different noise to the one they made last month, and whether or not you need to worry about it.
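One simple way to do this kind of matching (a sketch on synthetic signals, not the method described at the conference) is to compare a new recording against a library of labelled reference signatures and pick the closest one.

```python
# Sketch of matching a new vibration recording against a library of
# labelled reference signatures (synthetic signals; illustrative only).
import numpy as np

def zscore(x):
    """Normalise a signal so matching is insensitive to overall level."""
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 2000)

# Library of known vibration signatures, e.g. from past failures
library = {
    "healthy":      np.sin(2 * np.pi * 50 * t),
    "bearing_wear": np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 173 * t),
    "misalignment": np.sin(2 * np.pi * 50 * t) + 0.7 * np.sin(2 * np.pi * 100 * t),
}

# New recording from a pump, with some measurement noise
recording = library["bearing_wear"] + rng.normal(0, 0.2, t.size)

# Find the reference with the smallest normalised distance - the same idea,
# in miniature, as Shazam matching a sound clip against stored tracks
scores = {name: np.linalg.norm(zscore(recording) - zscore(sig))
          for name, sig in library.items()}
best = min(scores, key=scores.get)
print("closest match:", best)
```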

With further processing you might be able to work out what is causing what, or what might be about to happen shortly, putting together a 'likelihood pathway'. 'To me, the exciting part of this is being able to marry all of this together,' he said.

Getting it implemented

There is very little understanding in the upstream oil and gas industry about how to introduce this sort of project, and the mix of analytics expertise, database software, computer hardware and domain expertise it needs.

It is quite a departure from how the industry worked in the past, where the aim was to create a single deterministic model, with different departments passing data between each other. 'We're moving more into a probabilistic approach and a more collaborative approach,' he said.

The more downstream you go in the oil and gas industry, the more mature people's thinking is about analytics, Dr Irving said. 'Upstream is way more conservative. If it's not a big shiny piece of steel which costs $100m, then the investment is always questioned, especially if it's IT.'

One approach to understanding data is to gather together a multidisciplinary group to work on a data set for a day and see what they can figure out. It can include subject matter experts, business experts and data scientists, with the role of the data scientist being to act as a go-between between the subject matter experts and the data. This could be called a 'hackathon'. Teradata has tried it at two different oil companies.


