
Analytics for reservoir management and field development

Thursday, January 15, 2015

Analytical techniques help make better decisions on well placement, interventions and drilling execution

'To me, analytics is being able to get robust insight from large amounts of data, knowing big from thinking big,' said Dr Duncan Irving, oil and gas practice lead with Teradata, speaking at the Digital Energy Journal September 23 conference, 'Using Analytics to Improve Production'.

As a general approach, you should exploit large data sets to provide context; when these are linked logically and numerically with existing insights, you can arrive at an answer that is as statistically robust as possible.

You can do this over the long term, spending a lot of time studying data, and over the short term, analysing data as it is generated to support current operations.

Many other industries have embraced the concept of a 'data lake', where all of your data is held in one place, without being kept in any specific application format. This allows it to be integrated to answer 'joined-up questions', he said. 'That can go well for you.'

The oil and gas industry often compartmentalises its data flows, so geophysicists get one stream of data, and production data goes in another direction to hydrocarbon accounting. A data lake means that everyone is looking at the same core data, like people in banks do.

If all the data is stored in separate silos, it becomes very hard to do analysis that draws on data from different places: you have to keep asking your colleagues for it, which slows the process down, and you may find they are duplicating the same analytical work themselves.

People also tend to trust data more if they are getting it from the same core data store as their colleagues, rather than asking their colleagues to provide the data to them on demand.

Data integration

Integrating all of the data appropriately is 'both an art and a craft', he said.

Your source data could include supply chain logistics, ERP systems, and production data - 'It is possible to create a single view across all of these domains for cross-functional analysis'.

Integration at this scale requires a definition of relationships between all of the data types, so it is possible to work out the context behind the data. Once the data is all integrated, you can run analytics tools on it, such as the ones developed by SAS, or your own specialised algorithms. This allows insight 'discovery' to be put into an operational setting for day-to-day use.
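As a rough illustration of what such a single cross-functional view could look like (all table names, column names and figures below are hypothetical, not from the talk), a short Python sketch might join production, ERP and logistics records on a shared well identifier:

    # Minimal sketch of cross-domain integration, assuming each source
    # system can export records with a shared well identifier.
    # All table names, column names and figures are hypothetical.
    import pandas as pd

    production = pd.DataFrame({
        "well_id": ["W1", "W2"],
        "oil_bbl_day": [850.0, 1200.0],
    })
    erp_costs = pd.DataFrame({
        "well_id": ["W1", "W2"],
        "opex_usd_day": [9000.0, 15000.0],
    })
    logistics = pd.DataFrame({
        "well_id": ["W1", "W2"],
        "supply_trips_month": [4, 7],
    })

    # Join the three domains into a single cross-functional view.
    view = production.merge(erp_costs, on="well_id").merge(logistics, on="well_id")

    # With costs and production side by side, a crude profitability
    # proxy can be computed per well (assumed oil price of $80/bbl).
    view["margin_usd_day"] = view["oil_bbl_day"] * 80.0 - view["opex_usd_day"]
    print(view)

Once the domains sit side by side like this, the integrated cost/value view described below falls out of simple arithmetic rather than a cross-departmental data request.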

The data can then be provisioned through tools that people are used to working with, which have the science 'baked in', such as subsurface modelling software, he said. Cost and value are equally baked in, so an integrated view of total costs, or better still profitability, can be built over well and equipment life cycles.

CGG project

Teradata was involved in a project to analyse UK wells data with data management company CGG, to see if it would be possible to work out ways to make drilling more efficient, avoiding problems such as stuck pipe and unnecessary tripping. 'We threw together a few thousand well logs and looked for correlations across them.'

CGG has access to all well log data for the UK Continental Shelf, as an authorised UK data agent.

The analysis also included the associated written (text) logs which drillers make, looking for words like 'stuck' and noting the time each entry was written, so it could be matched against the data logs.
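A minimal sketch of that text-matching step, assuming a simple timestamped remarks format and an illustrative keyword list (neither reflects the actual CGG pipeline), might look like this:

    # Sketch of scanning drillers' free-text remarks for trouble words
    # such as "stuck", keeping the timestamp of each hit so the event
    # can be matched against the numerical logs. The remarks format
    # and the keyword list are assumptions for illustration.
    import re
    from datetime import datetime

    TROUBLE_WORDS = re.compile(
        r"\b(stuck|tight hole|overpull|lost circulation)\b", re.IGNORECASE
    )

    remarks = [
        ("2014-06-01 03:15", "Drilling ahead, no issues"),
        ("2014-06-01 07:40", "Pipe stuck at 2,340 m, working free"),
        ("2014-06-01 09:05", "Overpull observed while tripping out"),
    ]

    for timestamp, text in remarks:
        match = TROUBLE_WORDS.search(text)
        if match:
            when = datetime.strptime(timestamp, "%Y-%m-%d %H:%M")
            # The (time, keyword) pair is what gets joined against the data logs.
            print(f"{when.isoformat()}  event: {match.group(1).lower()}")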

The work is not complicated: one well engineer could process all of the data for one well in an afternoon, Dr Irving said. But with an analytics system, you can process data for thousands of wells in the same afternoon.

The analytical approach lets the data speak for itself, and it drew out many unexpected relationships, he said. For example, you could see relationships governed by lithology (i.e. something that often happens when drilling through a certain type of rock), or relationships linked to properties of seismic data (i.e. rock that causes problems often shows up with a certain signature in seismic data).
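As a hedged illustration of surfacing such a relationship, the sketch below computes stuck-pipe events per kilometre drilled, grouped by lithology, across a set of wells; the wells, rock types and counts are synthetic placeholders, not results from the project:

    # Sketch of letting the data speak for itself: stuck-pipe events
    # per kilometre drilled, grouped by lithology, across many wells.
    # Wells, lithologies and counts are synthetic placeholders.
    import pandas as pd

    wells = pd.DataFrame({
        "well": ["W1", "W2", "W3", "W4", "W5", "W6"],
        "lithology": ["shale", "shale", "sandstone", "shale", "sandstone", "salt"],
        "stuck_events": [3, 2, 0, 4, 1, 5],
        "metres_drilled": [2500, 3100, 2800, 2600, 3000, 1900],
    })

    # Aggregate by rock type; an unexpected concentration of events in
    # one lithology is the kind of relationship the analysis surfaced.
    by_rock = wells.groupby("lithology")[["stuck_events", "metres_drilled"]].sum()
    by_rock["events_per_km"] = 1000 * by_rock["stuck_events"] / by_rock["metres_drilled"]
    print(by_rock.sort_values("events_per_km", ascending=False))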

Most of this would not have been obvious to an experienced user, working without an analytics system, he said.

But once these relationships are identified and quantified, you can use them in your strategic decision making, for example avoiding drilling in a certain sort of rock, or providing better information to drillers while they are drilling.

This approach could be more useful to drillers in the unconventional space, who are looking for small improvements in efficiency, and might appreciate knowing how one well compares with others, reducing suboptimal performance and cost/time over-runs due to a poor understanding of drilling conditions.

Time series matching

Another technique is to compare time series data (how something changed over time) with patterns stored in the computer's memory, to see when something similar happened before.

This is a similar data processing task to the Shazam mobile phone app, which identifies the music you are listening to by comparing a recording of it with the recordings in its memory to find the closest match.

In the oil and gas industry, you could compare a recording of vibration data with vibration recordings in the computer's memory, to try to diagnose what the problem might be.
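A minimal sketch of that matching idea, using Euclidean distance between normalised traces and a synthetic two-entry signature library (nothing here reflects a real diagnostic system), could look like this:

    # Sketch of Shazam-style matching: compare a new vibration
    # recording against a small library of labelled signatures and
    # report the closest match. All signals here are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8 * np.pi, 200)

    # Hypothetical library of stored vibration patterns.
    library = {
        "healthy pump": np.sin(t),
        "bearing wear": np.sin(t) + 0.5 * np.sin(5 * t),
    }

    # New recording: a noisy version of the bearing-wear signature.
    recording = library["bearing wear"] + 0.1 * rng.standard_normal(t.size)

    def normalised(x):
        # Remove offset and scale so only the shape is compared.
        return (x - x.mean()) / x.std()

    # Euclidean distance between normalised traces; smaller is closer.
    scores = {
        label: float(np.linalg.norm(normalised(recording) - normalised(pattern)))
        for label, pattern in library.items()
    }
    print("closest match:", min(scores, key=scores.get))

A production system would likely use more robust similarity measures, such as dynamic time warping or spectral features, but the nearest-stored-pattern idea is the same.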

Or an analytics system might tell you that your pumps and compressors are making a slightly different noise to the one they made last month, and whether or not you need to worry about it.

With further processing you might be able to work out what is causing what, or what might be about to happen shortly, putting together a 'likelihood pathway'. 'To me, the exciting part of this is being able to marry all of this together,' he said.

Getting it implemented

There is very little understanding in the upstream oil and gas industry about how to introduce this sort of project, and the mix of analytics expertise, database software, computer hardware and domain expertise it needs.

It is quite a departure from how the industry worked in the past, when the aim was to create a single deterministic model, with different departments passing data between each other. 'We're moving more into a probabilistic approach and a more collaborative approach,' he said.

The more downstream you go in the oil and gas industry, the more mature people's thinking is about analytics, Dr Irving said. 'Upstream is way more conservative. If it's not a big shiny piece of steel which costs $100m, then the investment is always questioned, especially if it's IT.'

One approach to understanding data is to gather a multidisciplinary group to work on a data set for a day and see what they can figure out. It can include subject matter experts, business experts and data scientists, with the data scientist acting as a go-between between the subject matter experts and the data. This could be called a 'hackathon'. Teradata has tried it at two different oil companies.


