Advanced Pressure Point Selection & Neural Networks
Last week we had the pleasure of attending the Operations Geology Conference “Bridging the Gaps” at Burlington House, Piccadilly, London. This was a majestic setting for two days of enlightening presentations combined with open debate about the past, present and future of the discipline. Although one&zero focuses primarily on the quality assurance of wireline and LWD, we were privileged to gain some insight into the current issues within the sector more generally. Our primary aim was to understand where we might add value in the future.
Amongst the high calibre of presentations, one in particular caught our attention due to its close alignment with our wireline heritage – “Formation Pressure Test Efficiency Enhancement Through Dynamic Cutoffs” (Bataller, F.J., Rojo, D., ElJaafari, K. and Beda, G., 2016). This workflow, developed by the exploration team at Repsol, aims to streamline pressure point selection during the execution of a pressure survey, using a number of formation parameters built into a reference database. The objective is simple: improve the probability that a pressure test will yield a valid result by improving point selection, thereby reducing rig time and cost (an important consideration in anybody’s book in the prevailing market conditions).
So how are points selected? At the field level this is done in much the same way as the common practice of analysing density, neutron, NMR, resistivity and gamma-ray data and computing a “chance of success” from those particular parameters. (This is probably not too different from what most operators are already doing within a field to optimise their pressure acquisition programmes.) This recipe for improving the probability of success is then applied across new wells drilled in the field to assist in point selection.
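To make this concrete, here is a minimal sketch of what cutoff-based pre-screening of candidate depths might look like. The curve mnemonics, cutoff values and scoring rule are our own illustrative assumptions, not the cutoffs used in the Repsol workflow:

```python
# Hypothetical sketch of cutoff-based pre-screening for pressure points.
# Column names and cutoff values are illustrative assumptions only.
import pandas as pd

def screen_pressure_points(logs: pd.DataFrame) -> pd.DataFrame:
    """Flag depths where simple log cutoffs suggest a testable interval.

    Expects columns: DEPT (m), GR (gAPI), PHIT (v/v), RES (ohm.m).
    """
    candidates = logs[
        (logs["GR"] < 60.0)       # clean enough to be reservoir
        & (logs["PHIT"] > 0.12)   # sufficient porosity for mobility
        & (logs["RES"] > 1.0)     # exclude obvious shale/washout zones
    ].copy()
    # A crude "chance of success" score: rescale porosity into [0, 1].
    candidates["p_success"] = (candidates["PHIT"] - 0.12) / (0.35 - 0.12)
    candidates["p_success"] = candidates["p_success"].clip(0.0, 1.0)
    return candidates.sort_values("p_success", ascending=False)
```

In a real field workflow the cutoffs would of course be calibrated against offset well results rather than fixed by hand.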
What happens when no offset well data is available? This is where we feel Repsol’s approach has the potential to evolve into a truly unique method for point selection. The team has filtered its extensive and growing global database of pressure survey and petrophysical data to create a dataset that attempts to quantify the relationship between a successful pressure test and formation parameters such as porosity and permeability.
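One simple way such a relationship could be learned from an archive of historical tests is a probabilistic classifier. The sketch below fits a logistic regression to porosity and log-permeability; the data points and feature choices are invented purely for illustration and are not Repsol’s:

```python
# Illustrative only: fit a success/failure relationship from an archive of
# past pre-tests. Features and data values are assumptions, not real results.
import numpy as np
from sklearn.linear_model import LogisticRegression

# One row per historical pre-test: [porosity (v/v), log10(permeability, mD)]
X = np.array([[0.08, -1.0], [0.15, 0.5], [0.22, 1.8], [0.12, 0.1],
              [0.28, 2.4], [0.10, -0.5], [0.19, 1.2], [0.25, 2.0]])
y = np.array([0, 1, 1, 0, 1, 0, 1, 1])  # 1 = valid test, 0 = tight/lost seal

model = LogisticRegression().fit(X, y)

# Probability of a valid test for a new candidate point.
p_valid = model.predict_proba([[0.18, 1.0]])[0, 1]
print(f"P(valid test) = {p_valid:.2f}")
```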
What we found particularly interesting are the possibilities raised when you consider a large dataset such as the one contained in the Repsol archives. Perhaps more exciting, and this is a call for increased collaboration and the use of large datasets for computational learning, is the possibility presented if the dataset were to be built from multiple sources (i.e. multiple operators). “Collaboration” and “Big Data” are buzzwords in the industry at present, but here we can see some real potential.
Repsol were careful to point out that this approach is very much in its infancy within their organisation, although it has already proved its worth through significant savings in time, cost and programme execution on recent wells.
Our opinion
The choice of where to select a pressure point is not always driven by calculation and sometimes it is driven by the formation evaluation objectives – “we really need a pressure here – so let’s just try it anyway”. This approach is understandable, since data is often the primary reason the well is being drilled in the first place. However, applying a computational approach to estimating the number of pressure tests that may be achieved in the well, using a certain probe, based on available log data, would certainly assist in the well planning stage.
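As a sketch of what that planning estimate could look like: if each planned station is treated as an independent trial with a model-derived probability of success, the expected number of valid tests is simply the sum of those probabilities. The figures below are placeholders, not outputs of any real model:

```python
# Sketch: expected valid tests from a planned survey, treating each station
# as an independent Bernoulli trial. Probabilities here are placeholders
# that would come from a success model such as the ones sketched above.
p_valid = [0.85, 0.70, 0.90, 0.40, 0.60, 0.75]  # per planned station

expected_valid = sum(p_valid)
print(f"Planned stations: {len(p_valid)}")
print(f"Expected valid tests: {expected_valid:.1f}")

# Rough planning figure: attempts needed to expect n_target valid tests
# at the survey's mean success rate.
n_target = 8
mean_p = expected_valid / len(p_valid)
print(f"Attempts to expect {n_target} valid tests: {n_target / mean_p:.0f}")
```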
A question remains as to why we should have to resort to a statistical approach to selecting the best place to take a pressure test in the first place. We have a solid understanding of the pressure tool physics, and a petrophysical understanding of the environment within which the tool operates. However, to our knowledge no algorithm currently exists that can accurately predict tool behaviour across different environments, and so the team at Repsol should be commended for exploring this area.
Repsol’s approach, by their own admission, is in its infancy, but we believe the power of this concept should not be underestimated, not only for pressure testing but also for sampling and other areas of log data acquisition. One concern we have with the current approach is that it fails to take into account key variables which would be critical to a robust method outside the relatively controlled confines of a single field (i.e. globally). More specifically, by assuming a perfect borehole geometry, the model currently ignores variables such as the probe type used in the test, mud type/content, drawdown rates, overbalance etc. (feel free to name some more in the comment section below).

Building a computational model that incorporates multiple variables will certainly require a more advanced mathematical approach. Machine learning, specifically neural networks, may provide one solution to the problem of modelling multiple input variables. Adopting a network approach to modelling what is a complex system could also provide the opportunity to simulate the outcome of a change in a variable, such as the probe area, to see how this changes the probability of a valid test. All of the tool parameters could potentially be varied simultaneously through an algorithm, which could then provide the optimal settings for obtaining a valid test, along with the probability that those settings would deliver one.
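As a purely conceptual illustration of the network-based what-if idea, the sketch below trains a small neural network on synthetic data and then sweeps probe area, holding the formation fixed, to see how the predicted probability of a valid test responds. Every feature, unit and training label here is an assumption of ours for illustration only:

```python
# Conceptual sketch only: a small neural network mapping multiple tool and
# formation variables to P(valid test), then a what-if sweep on probe area.
# Features, training data and units are invented for illustration.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Columns: porosity (v/v), log10 perm (mD), probe area (cm^2),
# overbalance (psi), drawdown rate (cc/s)
X = rng.uniform([0.05, -1.0, 1.0, 100.0, 0.1],
                [0.30,  3.0, 15.0, 1500.0, 2.0], size=(500, 5))
# Synthetic labels: success made more likely by permeability and probe area,
# less likely by overbalance, plus noise.
y = (X[:, 1] + 0.1 * X[:, 2] - 0.001 * X[:, 3]
     + rng.normal(0.0, 0.5, 500) > 1.0).astype(int)

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X, y)

# What-if: hold the formation fixed, sweep probe area, watch P(valid) respond.
base = np.array([0.15, 0.5, 5.0, 800.0, 0.5])
for area in (2.0, 5.0, 10.0, 15.0):
    point = base.copy()
    point[2] = area
    p = model.predict_proba(point.reshape(1, -1))[0, 1]
    print(f"probe area {area:5.1f} cm^2 -> P(valid test) = {p:.2f}")
```

In a real application the training data would come from a curated archive of historical pre-tests, and an optimiser could search over all the tool parameters at once rather than sweeping one variable at a time.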
At one&zero we are currently working with a number of experts in this field to explore the application of machine learning algorithms, and to understand how they can be applied in various areas of the oilfield. We hope to present some of our findings and conceptual frameworks in the not too distant future.