Being in a rush to complete a well can result in losing it completely. Take the time to analyze the data and design a proper completion.

After a high-profile exploratory well tested at a rate 80% lower than its predicted potential, one Gulf of Mexico operator examined its well analysis procedures and adopted a process that makes better use of the reservoir data it collects before designing a completion. It found that the "quick look" data analyses that have become popular under industry time and cost pressures do not always pay off. Taking the time to thoroughly analyze the openhole log and testing data captured during the initial drilling process makes good business sense in many cases. Particularly when working in areas with tight, complex lithologies, doing so will improve well productivity and long-term viability.

Schlumberger's PowerStim reservoir characterization and production optimization process was used to reevaluate an exploratory prospect in low-permeability sands after an initial well was unsuccessful. Because of the thorough well analysis and planning work done after losing the initial well, work on the prospect has continued, and the operator is applying the same process in fields with similar characteristics onshore.

Completing the initial well

An offshore Gulf of Mexico exploratory well with reservoir targets at 15,000 ft (4,572 m) or deeper was drilled in late 2001 with oil-based mud. The data set gathered during the drilling process included mud, logging-while-drilling and wireline logs, as well as rotary sidewall cores and wireline tests. Following the typical mandate to make decisions quickly - within 24 hours of completing the logging - the triple-combo logging data were processed using a simple sand and shale model to generate a quick petrophysical analysis for use in completion design (Figure 1). A water resistivity (Rw) value of 0.015 at 300°F (149°C), taken from an offset block, was used in this processing stage.

Meanwhile, in-house geologists and production engineers undertook more thorough analyses of the data acquired. However, because these analyses would take a week or more, the completion design selected was based largely on the 24-hour quick-look results.

The team identified three tight sand pay intervals and chose a conventional fracturing design based on best-guess knowledge of similar pay sands onshore. The treatment was pumped as soon as possible.

Unfortunately, the fracture treatment screened out prematurely. Then, during production, significant drawdown pressure (3,000 psi to 4,000 psi) was placed on the pay zones. Test data indicated a gas production rate of 3.7 MMcf/d; however, exceeding the critical drawdown pressure quickly resulted in severe sanding. The well was shut in. Nodal analysis indicated significant near-wellbore damage had restricted initial production to the 3.7 MMcf/d flow rate (Figure 2).

Examining the completion process

To understand what caused the poor results in this initial well, the operator teamed up with Schlumberger's Houston PowerStim team. Postmortem results on this well would have a significant impact on whether this exploration play would continue.

The project's goal was to identify what went wrong with the selected completion and what the optimal completion design would have been. Close cooperation between the operator's personnel and PowerStim team members ensured that critical information was shared and that the issues and the operator's needs were addressed.

The initial step was to gather existing data and absorb the operator's project knowledge. As a result, several gaps were identified between actions taken and the knowledge and expertise available but not applied.

For example, the operator had been experiencing a discrepancy between its core data results and the grain densities and porosities later modeled using elemental capture spectroscopy (ECS) data. A close examination with the aid of Core Lab showed that pressure to act quickly had caused the core samples to be cleaned improperly. Fluid left in the cores caused the grain densities to measure too light and the porosities to measure too low. When the cores in question were cleaned properly, subsequent core analyses and ECS grain densities showed good agreement.

Another gap occurred because an average water resistivity value was used during the quick-look analyses. Although these analyses identified the three pays, had the oil-based dipmeter data been taken into consideration, a major fault with an east-west strike separating the lower pay zone from the upper two would have been noticed. Acknowledging the fault led to use of the modular dynamic tester (MDT) fluid samples to select appropriate water resistivity values for each of the pay zones, instead of using a local average for all three. Using appropriate resistivity values for each zone showed that the lower zone contained water and should not have been completed. Simply electing not to complete the lower zone would have led to dramatically different results in the initial well.
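
To see why the Rw choice matters so much, recall the Archie relation that log analysts use to turn resistivity into water saturation. The sketch below is illustrative only; the porosity, resistivity and Archie parameters (a, m, n) are generic assumptions, not values from this study.

```python
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie water saturation.

    rw  : formation water resistivity (ohm-m)
    rt  : true formation resistivity from the deep resistivity log (ohm-m)
    phi : porosity (fraction)
    a, m, n : tortuosity factor, cementation and saturation exponents
              (generic sandstone defaults; calibrated to core in practice)
    """
    return min(1.0, ((a * rw) / (phi ** m * rt)) ** (1.0 / n))

# With n = 2, Sw scales as the square root of Rw, so raising Rw from the
# 0.015 ohm-m area average to the 0.06 ohm-m appropriate for the lower
# zone doubles the computed water saturation at the same log readings.
# Porosity and Rt here are hypothetical:
print(archie_sw(rw=0.015, rt=20.0, phi=0.12))  # ~0.23, looks like pay
print(archie_sw(rw=0.06,  rt=20.0, phi=0.12))  # ~0.46, much wetter
```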

Looking more closely at how the MDT data were incorporated, the postmortem team plotted the two sets of pressures that had been recorded - the values in the mud column and those of the formation. The mud column values showed a gradient of 0.91 psi/ft, which corresponds to a mud weight of 17.9 lb/gal. The formation pressures showed a gradient of 0.494 psi/ft in the lower sand, corroborating that the formation fluid is a liquid rather than gas. Further analysis of the fluid samples showed that Rw values of 0.06 at 291.5°F (144°C) for the lower zone and 0.039 at 288.7°F (142.6°C) for the upper zone would have been more appropriate than the 0.015 at 300°F (148.9°C) area average.
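
The arithmetic behind these gradient interpretations is straightforward. The sketch below uses the standard 0.052 psi/ft-per-lb/gal hydrostatic conversion; note that 0.91 psi/ft back-calculates to roughly 17.5 lb/gal with this factor, so the 17.9 lb/gal figure quoted above presumably reflects rounding in the reported gradient.

```python
# Sketch of the pressure-gradient arithmetic. The 0.052 psi/ft per
# lb/gal factor is the standard oilfield hydrostatic conversion.

PSI_PER_FT_PER_PPG = 0.052

def gradient_to_equiv_density(gradient_psi_ft):
    """Convert a pressure gradient (psi/ft) to the equivalent
    fluid density in lb/gal."""
    return gradient_psi_ft / PSI_PER_FT_PER_PPG

# Mud column: ~17.5 lb/gal equivalent.
print(gradient_to_equiv_density(0.91))

# Lower sand: ~9.5 lb/gal equivalent, squarely in the brine range.
# A gas gradient (~0.05-0.15 psi/ft) would be far lighter, so the
# 0.494 psi/ft formation gradient confirms a liquid-filled zone.
print(gradient_to_equiv_density(0.494))
```
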
With more accurate core X-ray diffraction data and appropriate Rw values by zone, the team added elemental capture spectroscopy and nuclear magnetic resonance data acquired with a Combinable Magnetic Resonance (CMR) tool to the mix to generate a more realistic model of formation components. Use of the CMR data began with the core samples, which were used to generate specific T2 cutoffs in the lab. The team found a 16-msec cutoff to be accurate for all three cores, adjusting this to 25 msec after taking the use of oil-based mud and downhole temperatures into account.
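
Applying a T2 cutoff is itself simple arithmetic: everything in the T2 distribution below the cutoff is counted as bound fluid, everything above it as free fluid. A minimal sketch follows; the distribution shown is hypothetical, for illustration only.

```python
import numpy as np

def partition_porosity(t2_ms, amplitudes, cutoff_ms=25.0):
    """Split an NMR T2 distribution into bound- and free-fluid porosity.

    t2_ms      : T2 bin centers (milliseconds)
    amplitudes : incremental porosity in each bin (v/v)
    cutoff_ms  : T2 cutoff; signal below it is capillary/clay-bound
    """
    bound = amplitudes[t2_ms < cutoff_ms].sum()   # bound-fluid volume (BVI)
    free = amplitudes[t2_ms >= cutoff_ms].sum()   # free-fluid volume (FFI)
    return bound, free

# Hypothetical T2 distribution, 0.1 ms to 10,000 ms
t2 = np.logspace(-1, 4, 50)
amp = np.exp(-(np.log10(t2) - 1.5) ** 2) * 0.01
bvi, ffi = partition_porosity(t2, amp, cutoff_ms=25.0)
print(f"BVI={bvi:.3f}, FFI={ffi:.3f}, total={bvi + ffi:.3f}")
```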

Accurately calibrated, the CMR logs' total fluid-filled porosity measurements across the three zones of interest compared well with those shown by the density porosity and ECS curves, with the shale intervals reading higher because of their heavier grain density (Figure 3). Using the Coates-Timur equation, the free and bound fluid porosity data were converted into a permeability indicator. The resulting permeability values compared well with core measurements where available. Because the core data were not continuous, the continuous permeability profile added real value, highlighting that the shallowest zone had much lower permeabilities than the middle zone. With this information in hand, a completion strategy could be developed to stimulate both zones more effectively.
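
The Coates-Timur (Timur-Coates) relation is commonly written k = (100*phi/C)^4 * (FFI/BVI)^2, with k in millidarcies, porosities as fractions and C an empirical constant near 10 for sandstones. A minimal sketch, using a generic C that would in practice be calibrated to the core data:

```python
def timur_coates_perm(phi, ffi, bvi, c=10.0):
    """Timur-Coates permeability estimate in millidarcies.

    phi : total NMR porosity (fraction)
    ffi : free-fluid porosity (fraction)
    bvi : bound-fluid porosity (fraction)
    c   : empirical constant, commonly ~10 for sandstones; should be
          calibrated against core permeability where available
    """
    if bvi <= 0:
        raise ValueError("BVI must be positive")
    return (100.0 * phi / c) ** 4 * (ffi / bvi) ** 2

# Illustrative: 12 p.u. total porosity, evenly split free/bound fluid
print(timur_coates_perm(phi=0.12, ffi=0.06, bvi=0.06))  # ~2.1 mD
```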

Also available to the project team was Dipole Shear Imager output, which can be used to generate minimum and maximum safe mud weights. In hindsight, close analysis showed that the project team had used muds roughly 1 lb/gal too heavy through the upper, gas-filled zones, which pushed gas away from the wellbore.

To help fine-tune the fracture treatment, rock mechanical properties and borehole stability indicators were also computed from the dipole shear data on hand. Accurate values for downhole stresses, Poisson's ratio and Young's modulus are critical input for frac design. Job size and frac geometry prediction are only as good as the quality of the estimated rock properties. When data on hand can lead to locally calibrated measurements, the resulting interpretations and decisions have a higher probability of accuracy.
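
The dynamic moduli come from standard elasticity relations applied to the density log and the compressional and shear velocities a dipole sonic tool delivers. A minimal sketch, with illustrative input values that are not from this study:

```python
def dynamic_moduli(rho_kg_m3, vp_m_s, vs_m_s):
    """Dynamic elastic properties from bulk density and sonic velocities.

    Standard elasticity relations; inputs would come from the density
    log and the compressional/shear slownesses of a dipole sonic tool.
    Returns (poisson_ratio, youngs_modulus_Pa).
    """
    vp2, vs2 = vp_m_s ** 2, vs_m_s ** 2
    nu = (vp2 - 2.0 * vs2) / (2.0 * (vp2 - vs2))  # dynamic Poisson's ratio
    g = rho_kg_m3 * vs2                           # dynamic shear modulus, Pa
    e = 2.0 * g * (1.0 + nu)                      # dynamic Young's modulus, Pa
    return nu, e

# Hypothetical values for a tight gas sand
nu, e = dynamic_moduli(rho_kg_m3=2550.0, vp_m_s=4200.0, vs_m_s=2500.0)
print(f"Poisson's ratio ~{nu:.2f}, Young's modulus ~{e / 1e9:.1f} GPa")
```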

Recommendations and observations

Given the objective of reevaluating the process that led the operator's project team to lose its first well in a Gulf of Mexico exploratory prospect, the PowerStim team came up with several recommendations.

Near-wellbore damage led to the modest production rate of 3.7 MMcf/d tested in the initial well before it sanded in. To improve that rate and eliminate sanding, the operator could bypass the lower, water-filled zone and maximize vertical coverage of the upper two, gas-bearing zones. A screenless completion strategy could be employed to solve the sanding issues and prevent formation damage. This would include designing for tip screenout using a low-guar fluid with an aggressive, encapsulated breaker design. Even with no stimulation benefit - assuming only a skin of zero - the upper two zones have an estimated production potential of 8.5 MMcf/d. With the screenless design, the predicted production rate from the upper two zones could realistically reach 20 MMcf/d to 22 MMcf/d, roughly six times that tested in the lost well.
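
The gap between the 3.7 MMcf/d tested and the 8.5 MMcf/d zero-skin estimate is consistent with steady-state radial Darcy flow, where rate scales as 1/(ln(re/rw) + s). A minimal sketch, assuming a generic drainage geometry not given in the article:

```python
import math

def rate_ratio(re_ft, rw_ft, skin):
    """Ratio of damaged-well rate to zero-skin rate for steady-state
    radial Darcy flow: q_s / q_0 = ln(re/rw) / (ln(re/rw) + s)."""
    lnr = math.log(re_ft / rw_ft)
    return lnr / (lnr + skin)

# Assumed drainage radius and wellbore radius, for illustration only.
re, rw = 1000.0, 0.35
# With this geometry a skin of ~10 cuts a zero-skin 8.5 MMcf/d
# potential by more than half, in line with the 3.7 MMcf/d tested.
for s in (0.0, 5.0, 10.0, 20.0):
    print(f"skin={s:4.1f}  q/q0={rate_ratio(re, rw, s):.2f}")
```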

Clearly, the temptation to trust the most basic data set, or a quick-look wellsite analysis of logging and testing data, has its drawbacks. Often, wells underperform because the reservoir models used to manage their lifespan are neither accurate nor reliable.

In this case the decision-maker followed company procedure and moved quickly with a best guess on several critical reservoir characteristics in an exploratory well - rock mechanical properties and water resistivity values, among others. These best guesses, along with inadequate core cleaning procedures, led to a reservoir model and decisions that adversely impacted the well's outcome and could have turned the operator away from a viable prospect.

Bottom line: a new best practice recommended to this operator directs its engineering staff to consider going beyond a quick-look approach to completing and managing wells. When data for a particular reservoir are limited, as with a new prospect, or when working in potentially complex geologic areas, taking the time to make full use of the downhole data acquired before completing a well makes good business sense. In other circumstances, revisiting the reservoir model of an established yet underperforming well can lead to production optimization as well.

As a result of the reevaluation, the operator elected to apply the comprehensive reservoir characterization and production optimization process provided by Schlumberger's PowerStim teams in several of its onshore fields.