
A 'top down' method to improve recovery by BP

Thursday, September 18, 2014

Trying to improve production recovery by studying flow rates and pressures could be considered a 'bottom up' method. You could also try a 'top down' method, analysing larger amounts of data to identify trends, write BP technical specialists R Bailey, Z Lu, S Shirzadi and E Ziegel.

The human mind, presented with time-series data, will generally look for common features, label these as events and seek correlations against other timelines.

This approach can be fallible, as questionable data selections might be made in attempting to find supporting evidence for a perceived pattern.

The real challenge is to establish an automated, comprehensive, dispassionate, independent and statistically valid process for pattern matching.

If all of these objectives can be met, then many evaluations could be carried out using an event-based analysis.

Instrumented systems and the analysis of their time series data can provide a range of events and channels through which those events can interact.

Some influences will be automatically transmitted by closed loop process control schemes, others will be prescribed by written operating practices, still more will be the result of 'custom and practice' on the part of operators, supervisors and planners.

Despite the presence of these systematic, non-reservoir derived reasons for patterns of change, the historical daily production data is actually the most direct measurement of the hydrocarbon fluids being extracted from the reservoir.

The business case to pursue increases in efficiency of extraction and total recovery, whilst simultaneously being able to accurately assess the effectiveness of measures taken to support, sustain and ultimately maximise that production, is clear.

'Top Down' analysis of well data
Understanding the flow from each well as a result of an agreed allocation process, based on the assessment of flow data from the well's tests, allows history matching. Typically one or more reservoir simulation models are tuned to the observed flow and pressure data.

These processes comprise a 'bottom-up' approach to understanding the performance of the reservoir. They enable a development plan to be monitored and updated as necessary.

However, BP has been working on a separate but entirely complementary 'top-down' approach where operational data is analysed without any preconceptions about reservoir structure.

This approach allows the rapid formulation of a workable model, thus avoiding the need for an involved history-matching procedure. The 'top-down' approach uses the Capacitance Resistivity Model combined with an analysis of the operational time-series data based on 'Events' in that data.

Event-based approach
For an asset that is being studied, one must first choose the producer and injector wells to include in the study.

These wells must be sufficiently well instrumented to provide a time series data stream containing events which are distinguishable from the unavoidable sources of measurement noise.

The wells will be material to the oil recovery plan and will include some key injectors as well as production wells whose output might be enhanced by the injectors.

Next, one will choose a time period for analysis.

Recent data is usually the most instructive for assessing what is likely to happen next, but better-quality periods of data may exist.

Quality may be measured in terms of working instrumentation, constant or stable numbers of injection and production wells in service, and the absence of changes in operating regime, such as artificial lift or the breakthrough of water or gas into production wells.
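As a hedged illustration of this kind of screening (the data layout, thresholds and function names here are assumptions, not BP's tooling), a candidate time window might be checked as follows:

```python
import pandas as pd

def screen_window(df: pd.DataFrame, start, end,
                  max_missing_frac=0.05) -> bool:
    """Rough data-quality screen for one candidate analysis window.

    df is assumed to hold one column of daily rates per well,
    indexed by date; the thresholds are illustrative only.
    """
    window = df.loc[start:end]
    if window.empty:
        return False
    # Reject windows with too much missing instrumentation data.
    if window.isna().mean().max() > max_missing_frac:
        return False
    # Require a stable set of wells in service: no well should
    # start or stop reporting partway through the window.
    active = window.notna()
    stable = active.all() | (~active).all()
    return bool(stable.all())
```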

Then, one must choose the well attribute which will be used as the variable to indicate the occurrence of an event.

Typically, the choice will be an allocated (or possibly measured) flow, an aspect of production such as water cut or gas oil ratio, or a direct intensive measurement such as a pressure or a temperature.

Similarly, injector wells will have a (typically corresponding) attribute, such as injection rate or pressure.

The event-based analysis begins with the marking of events for each producer and injector well.

Parameters are used to control the relative occurrence of events. Some iteration is typically required until a set of events has been achieved that makes physical sense and passes visual assessment.
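As an illustration of such parameterised marking (a minimal sketch, not the EDA implementation itself; the smoothing window and threshold are assumptions), events might be flagged where the smoothed daily rate steps by more than a chosen fraction of the local level:

```python
import pandas as pd

def mark_events(series: pd.Series, smooth_days=7,
                rel_threshold=0.10) -> pd.Series:
    """Flag events where the smoothed daily rate steps by more than
    rel_threshold (as a fraction of the local level). Both parameters
    control how many events are marked and are tuned by iteration,
    as described in the text; the values here are illustrative."""
    smoothed = series.rolling(smooth_days, center=True).median()
    step = smoothed.diff()
    level = smoothed.abs().clip(lower=1e-9)  # avoid divide-by-zero
    return (step.abs() / level) > rel_threshold
```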

Visualisations made using the association software assist with event evaluation.

Once a complete set of events has been obtained, the process associates these with each other.

This is done by considering an appropriate range of time delays that reflect the physical separation of the wells and the intervening reservoir properties.

The result is a ranking of the connections considered possible between injectors and producers together with their estimated time delays, summarized as an optimal score for each connection.
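A minimal sketch of that association step is shown below; the scoring rule, a simple hit rate over candidate lags, is an assumption rather than the published EDA algorithm:

```python
import numpy as np

def associate(inj_events: np.ndarray, prod_events: np.ndarray,
              max_lag_days: int = 60, tol_days: int = 3):
    """Score an injector->producer connection over candidate delays.

    inj_events / prod_events are arrays of event dates (in days).
    For each lag, count injector events answered by a producer
    event within tol_days; keep the lag with the best score.
    """
    best_lag, best_score = None, 0.0
    for lag in range(max_lag_days + 1):
        hits = sum(
            np.any(np.abs(prod_events - (t + lag)) <= tol_days)
            for t in inj_events
        )
        score = hits / max(len(inj_events), 1)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag, best_score
```

Repeating this over every injector-producer pair and sorting by score gives a ranking of the kind described above.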

The engineer can utilise the validated scores and the resulting insights to modify other predictive reservoir models to better address the underlying business drivers around reservoir management, such as water flood pattern optimisation.

The ability to quantify production support by injectors is a key enabler for water-flood optimisation.

Practical experience
An initial BP top-down project had considered which types of modelling techniques might be able to make sense of reservoir production surveillance data without recourse to 'bottom-up', physics-based modelling.

Four vendors, each using different methodologies, evaluated the initial three years' operational data from a complex, multi-faulted reservoir.

Complicating the analysis were changes in the number and type of wells as a transition was made from the initial pure depletion phase to an early water flood, and gas lift began to be used on some of the producer wells.

The 'event-based' analysis was deemed to be the best technology: EDA (the Event Detection and Association workflow) was simple to implement, sufficiently flexible, and easily refocused on different wells, well pairings, types of events and time periods.

It was also possible to analyse different regimes and epochs or simply constrain the period of interest to subsets of wells of interest.

The next deployment of EDA was for an offshore reservoir supported by a single gas injection well.

However, this well was operated in a 'campaign' mode, where gas was sold to market during periods of high spot price and injected into a gas cap for the rest of the year.

This annual pattern of cyclic behaviour had continued for several years.

The reservoir management team wanted to review the data to decide whether to continue or alter this operational paradigm.

Event analysis was appropriate, since the operational history contained very large, unambiguous, input disturbance events, and a change in operating strategy was recommended. This change had a net positive impact on the asset's revenue stream.

The first top-down water-flood implementation carried out by BP was on a mature, complex reservoir with limited incremental water available and limited flexibility to distribute it to the in-field injection pads for optimisation of production support.

By loading the reservoir operating history into the application and periodically updating it, potential variations in the water-flood plan could be appraised.

The result was that any trial variation in the water-flood pattern could be either cut short or extended on the basis of on-going periodic analyses.

Event Detection and Association workflow can complement other types of analysis such as Capacitance Resistivity Model (CRM) technology as a key validation step in the delivery of Top-Down Water-flood (TDWF) assessment.

Working from the same operational data, but using it in distinctly different ways, the CRM model derives its well performance parameters using non-linear estimation.

Then EDA, focusing on the events revealed in the transient perturbations of the wells, independently checks and validates that analysis.

In TDWF deployments, the asset surveillance engineers have used the embedded, integrated EDA toolkit to validate their CRM results and support their selection of the most appropriate and statistically valid injector and producer connections.
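For orientation, capacitance-resistance models of this family typically express a producer's rate as a gain-weighted, exponentially lagged response to injection. The single-time-constant sketch below is a generic textbook form under simplifying assumptions (constant bottom-hole pressure, uniform time steps), not BP's TDWF implementation:

```python
import numpy as np

def crm_producer_rate(q0: float, inj: np.ndarray, gains: np.ndarray,
                      tau: float, dt: float = 1.0) -> np.ndarray:
    """Generic single-tank CRM:
    q[k] = q[k-1]*exp(-dt/tau) + (1 - exp(-dt/tau)) * sum_i f_i * I_i[k]

    inj has shape (n_steps, n_injectors); the gains f_i describe the
    fraction of each injector's water supporting this producer.
    Bottom-hole-pressure terms are omitted for brevity."""
    decay = np.exp(-dt / tau)
    q = np.empty(inj.shape[0])
    prev = q0
    for k in range(inj.shape[0]):
        prev = prev * decay + (1.0 - decay) * (inj[k] @ gains)
        q[k] = prev
    return q
```

The gains and time constants are what the non-linear estimation referred to above would fit from the operational data; EDA's event scores then provide an independent check on the fitted connections.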

Possible enhancements
There are challenges to the process arising from well selection, the chosen time period and event marking, but most of these reflect the fact that the analysis is relatively simple.

However, by extending the scope of the basic analysis to an attribution process for production events, typical support statistics have increased from 30 per cent to more than 80 per cent.

We also have evidence from early trials of this approach that the overall support statistics can be made to approach the theoretical limit of 100 per cent.

Genuinely complex events will involve further criteria, such as time spent in a state, what the previous state was (as with Markov models), and which state transitions are actually valid.

Such durations and switching behaviour, asserted from physical laws or operational rules, lead to an exponential explosion of possible system states and so provide the backdrop for the development of a genuine 'Complex Event Processing' application.
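As a toy illustration of such switching rules (the states and transition table here are invented for the example), a validity check on an observed sequence of operating states might look like this:

```python
# Hypothetical operating states and the transitions allowed between
# them; real rules would come from physics or operating procedures.
VALID = {
    "shut_in":   {"ramp_up"},
    "ramp_up":   {"steady", "shut_in"},
    "steady":    {"ramp_down", "gas_lift"},
    "gas_lift":  {"steady", "ramp_down"},
    "ramp_down": {"shut_in"},
}

def sequence_is_valid(states):
    """Check that each observed state transition is permitted."""
    return all(b in VALID.get(a, set())
               for a, b in zip(states, states[1:]))
```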

A specific class of applications that is highly suited to a top-down, data-driven, event-based analysis is the set of problems associated with product quality control, where the relationships between inputs and outputs are repeatable but complex.

If the sources of disturbances are recorded, then the application of event-based analysis could support parametric representations.

Stochastic systems could also be analysed. Random changes can be labelled as events but require a slightly different workflow to compensate for the variable time delays. If causality is genuinely present, then the simple approach of utilising precedence, as is commonly done in econometric modelling, may be sufficient to adapt the workflow to the stochastic problem.
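A minimal sketch of such a precedence test, far simpler than a formal econometric lead-lag test, might compare how often input events precede output events against the reverse:

```python
def precedence_ratio(input_times, output_times, window=30):
    """Fraction of directed event pairings in which an input event
    precedes an output event within `window` days, versus the
    reverse; values well above 0.5 weakly suggest the inputs lead
    the outputs. Illustrative only."""
    led = sum(any(0 < o - i <= window for i in input_times)
              for o in output_times)
    lagged = sum(any(0 < i - o <= window for o in output_times)
                 for i in input_times)
    total = led + lagged
    return led / total if total else 0.5
```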

Summary / next steps
This event-based technique has delivered value to several BP assets by improving the top-down representation of injection and its support to producing wells.

The technique has been automated in certain key respects. It is rapid to implement, objective and capable of extension, and it is suitable for generalisation to a range of operating data sets.

Any system of inputs and outputs that demonstrates repeatable behaviour will be amenable to event-based analysis, and systems such as TDWF that carry out data analysis will benefit from the use of historical data to reveal additional process insights.

The next step for the development of this technology is to align it with the emergent field of 'Complex Event Processing'. This is an active field of computing with 'big data' that seems likely to dominate practical oil and gas processing applications in the very near future.



