Reservoir characterization is not a new concept. Ever since operators have been able to collect data about their wells and fields, they’ve strived for ways to use those data to understand their reservoirs better. Improved data gathering tools and software systems that integrate the data are aiding this process.
But something funny happened in about 2006—the shale gale. Suddenly, operators were so busy leasing acreage and drilling wells that they didn’t have the time or luxury to learn more about the rock they were drilling through. And if one well was a dud, no big deal—at $100 oil, they could afford to make a few mistakes.
It’s a different story at $40 oil. Anyone who still thinks that shale reservoirs are homogeneous has obviously never drilled a shale well. It’s becoming a lot more important to understand why Well A is hugely outperforming Well B even though their completions are identical. That’s where reservoir characterization can help.
“Operators are looking into how they can be a little more predictive and more proactive in how they complete their wells,” said Sudhendu Kashikar, vice president, completions evaluation for MicroSeismic Inc. “Rather than blindly fracturing all 30 stages, they’re looking to be more selective. Maybe they only need to frack 25 because geologically the other five aren’t going to do anything. They are starting to look at that and actually doing the right things.”
The Value Case
Characterizing a reservoir is not a simple process. A Society of Exploration Geophysicists Wiki breaks the process down into the following steps:
• A shared earth model;
• Basic interpretation;
• Premodeling organization;
• Data preparation and formatting;
• Exploratory data analysis;
• 3-D structural modeling;
• 3-D sedimentary modeling;
• 3-D petrophysical modeling;
• Upscaled 3-D dynamic modeling;
• Flow simulation; and
• Model assumption iteration and updating.
And it’s not just about science. A description of a reservoir characterization course offered by PetroSkills includes understanding business drivers and selection criteria as well as decision trees and the value of information. But companies that embrace the concept have seen significant success.
David Williams, senior reservoir engineer and manager of special projects for Texakoma, said his company uses this type of approach on many of its reservoirs, most of which are in the Granite Wash play in the Anadarko Basin and along the Texas Gulf Coast. “We started out in the Granite Wash,” Williams said. “The main problem there is that it has very complicated lithologies, so a lot of analysis is needed to understand the rock properties. We’ve also run into some very fractured production in rocks that weren’t being produced in our area. These are complicated to evaluate reservoir-wise as to the hydrocarbon volumes, how much is fracture-related and how much we’re producing from the matrix.”
The Gulf Coast wells are more conventional in nature but contain a couple types of reservoirs that are hard to produce. “These include overpressured retrograde gas condensate and strong water-drive oil reservoirs that are very stratified layers of sandstone, so we have to make sure we get good recovery from each of the sand lenses,” he said.
Texakoma uses a variety of methods to characterize its reservoirs. One of its initial tools is the nuclear magnetic resonance (NMR) log, run as part of a full logging suite to help analyze the openhole logs. By calibrating the NMR against the rest of the logging suite, the company avoids having to run NMR on every well.
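The calibration idea described above can be sketched in a few lines: fit a relationship between NMR-derived porosity and a log that is run on every well (here, density porosity), then apply that fit in wells without NMR. This is a minimal illustration with hypothetical depth-matched samples, not Texakoma's actual workflow, and a simple linear fit stands in for whatever calibration the company actually uses.

```python
import numpy as np

# Hypothetical depth-matched samples from a well where both the NMR tool
# and the standard logging suite were run (values are illustrative).
density_porosity = np.array([0.06, 0.09, 0.12, 0.15, 0.18])  # from density log
nmr_porosity     = np.array([0.05, 0.08, 0.11, 0.13, 0.16])  # from NMR log

# Least-squares fit: nmr_porosity ~ slope * density_porosity + intercept.
slope, intercept = np.polyfit(density_porosity, nmr_porosity, 1)

def estimate_nmr_porosity(density_phi):
    """Estimate NMR-like effective porosity in wells where NMR was not run."""
    return slope * density_phi + intercept
```

Once such a calibration is established on a few wells, offset wells can be evaluated from the cheaper standard suite alone, which is the cost saving the approach is after.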
Additional data collection includes rotary cores to get rock properties, geochemistry to get a better understanding of the hydrocarbons in the reservoir, 3-D seismic and attribute analysis, and mud logs.
“Mud logs are very important out here, paying attention to the shows,” he said. “Large mud losses are important because they give us information about these fractured reservoirs, and our best production comes from those.”
Williams said that his company’s willingness to pay for this type of data collection has led to some pretty impressive wells. One well recompleted into the Kansas City Formation above the Granite Wash has flowed 390,000 bbl of oil in four years from a vertical wellbore. Another well was recompleted into the Des Moines Formation after drillers lost 800 bbl of mud into the formation. That well flowed more than 1,000 bbl/d of oil without requiring fracturing.
Companies that specialize in reservoir characterization each put their own spin on the process. MicroSeismic, for instance, offers a product called PermIndex that uses microseismic results to predict the performance of later wells. The company inputs microseismic events into a discrete fracture network model, creating a deterministic model from the microseismic data.
This model is then separated into propped vs. unpropped fractures, and a geocellular grid is superimposed to build a geocellular model. Fracture intensity is defined in terms of the number, orientation and width of the fractures within each cell.
Based on the fracture intensity, a permeability tensor can be calculated for each cell. “I’m going from events to a fracture network to a calculation of permeability for that cell,” Kashikar said. “And I do that for all of the cells within this rock volume.”
This helps operators rank the stages of their next wells to target the areas of highest permeability.
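The event-to-ranking chain described above can be sketched with toy numbers. The data, the one-dimensional grid, and the cubic-law aperture proxy below are all illustrative assumptions; the actual PermIndex product builds a discrete fracture network and a full permeability tensor per cell, which this does not attempt.

```python
import numpy as np

# Hypothetical propped fractures interpreted from microseismic events:
# each record is (grid_cell_index, fracture_aperture_mm).
fractures = [
    (0, 1.2), (0, 0.8), (1, 0.5),
    (2, 2.0), (2, 1.5), (2, 1.0), (3, 0.4),
]

n_cells = 4  # toy 1-D geocellular grid along the lateral, one stage per cell
aperture = np.zeros(n_cells)  # cumulative fracture aperture per cell
for cell, width in fractures:
    aperture[cell] += width

# Illustrative proxy: fracture permeability scales roughly with aperture
# cubed (the cubic law), so cumulative aperture**3 stands in for a
# per-cell permeability index.
perm_proxy = aperture ** 3

# Rank candidate frack stages from highest to lowest permeability proxy.
stage_ranking = np.argsort(perm_proxy)[::-1]
```

With these numbers the ranking favors the cell with the most cumulative fracture aperture, which is the decision the quoted operators are making when they drop unproductive stages.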
Kashikar said that there have been two instances where clients have run production logs that were compared to the PermIndex results. “We compared those with the PermIndex stage by stage, and the match was very good,” he said. “It wasn’t 100%, but it was good. We have done that on three or four wells now where the clients shared the data with us, and we see very consistently a correlation between the relative magnitudes of the PermIndex data and the relative production from these stages.”
The tool also has been used to history-match multiple wells at the same time, using additional data sources for the other wellbore properties (porosity, saturation, fluid PVT data, etc.), but using the PermIndex data for the permeability measurement. “We have shown that we can history-match multiple wells with all of these phases (porosity, saturation, fluid PVT data, etc.), simultaneously, using that permeability as the reservoir description,” he said.
Currently the company is in the process of correlating PermIndex data with seismic attributes to see if the information can be used farther away from the wellbore and still remain a predictive tool.
Sigma Cubed Inc. (Sigma3) offers a variety of characterization options to its clients. “Our clients engage with us to either help with a problem or improve a process,” said Tom Bratton, manager of geoengineering for Sigma3. “The problems are often diverse but routinely fall in the engineering domain. In every case, the solution or improvement is greatly facilitated with the integration of reservoir characterization.”
Not surprisingly, many of the company’s clients are involved in unconventional plays, and Bratton added that the goal in unconventionals is to model the physics of what is taking place during stimulation to be more predictive about production. “One of the things industry does is to history-match production to get a sanity check on the reservoir model, and the numbers that we put into the simulator five years ago aren’t the same as those we put in today. To some extent we’re changing the way that we model these unconventional reservoirs in terms of permeability.”
The company approaches a new problem by doing a data review and literature search as well as meeting with the clients to understand the technical goal, method to be used and constraints on the project. Often Sigma3 conducts a geoscience and engineering analysis to establish a calibration database of the best data and understanding of the challenge.
Once this is completed, an earth model is created consistent with the data, and a family of simulations is run that brackets costs, risks and likely benefits. Finally, experienced field engineers help achieve the desired results by implementing the plan in the field.
Bratton outlined several recent collaborations:
• One customer wanted to understand its current production results and develop an improved stimulation design that would increase production while reducing costs;
• Another wanted to build a rock physics model to help guide new drilling locations in an EOR project;
• A third needed help building a geomechanics model to guide drilling and completion of a horizontal well; and
• A fourth needed analysis of a difficult dataset to optimize a completion design. This was accomplished by building a petrophysical and geomechanical earth model using cased-hole logs.
One of its products, Shale Capacity, is a seismic-driven model that incorporates key reservoir properties such as natural fractures, brittleness, total organic carbon, oil saturation and porosity. In a Bakken case study the model was able to accurately predict 90-day initial production (IP).
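The general shape of such a model — predicting early production from a handful of reservoir attributes — can be sketched as a multivariate least-squares fit. The attribute values, well counts and coefficients below are invented for illustration; Sigma3's actual Shale Capacity model is proprietary and certainly more sophisticated than a linear regression.

```python
import numpy as np

# Hypothetical training wells: per-well attributes
# (brittleness fraction, TOC wt%, porosity fraction) and observed 90-day IP (bbl/d).
X = np.array([
    [0.45, 3.0, 0.06],
    [0.55, 4.5, 0.08],
    [0.60, 5.0, 0.10],
    [0.40, 2.5, 0.05],
])
ip_90 = np.array([350.0, 520.0, 600.0, 290.0])

# Fit ip ~ X @ coef + intercept via least squares.
A = np.hstack([X, np.ones((len(X), 1))])  # append intercept column
coef, *_ = np.linalg.lstsq(A, ip_90, rcond=None)

def predict_ip(brittleness, toc, porosity):
    """Predict 90-day IP (bbl/d) for a prospective well location."""
    return float(np.dot([brittleness, toc, porosity, 1.0], coef))
```

In practice the attributes would come from seismic inversion volumes rather than a table, letting the prediction be mapped across undrilled acreage.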
Based on its beginnings as a petrophysical company, Nutech takes a different approach to reservoir characterization, according to Jorge Viamontes, vice president of reservoir intelligence. To Viamontes, it all comes down to understanding the rock. “You don’t understand the rock for the sake of understanding the rock,” he said. “We’re not artists; we’re engineers, and we like to make money.”
He added that the company starts its characterization process from the actual rock measurements. Nutech has a core laboratory in-house as well as a reservoir texture petrophysical analysis division called NuLook. Logs are calibrated with the core measurements. “Our interpretation is calibrated and representative of the reservoir,” he said. “We take this information and build geological models in NuView. We do have seismic attributes and trend certain attributes accordingly, but overall it has to make geological sense, and it has to make petrophysical sense.”
These models can give operators information in areas where there is very little well control. “We model the geomechanical properties of the reservoir, and this is calibrated and backed by the largest industry collection of sonic interpretation logs,” he said. “Any public well that we have in our database and archives becomes a comparison point. That way we use all of the data that are available.”
Finally, the data and the models are put into a dynamic model using NuVision. “In NuVision we look at more intricate problems, for example, the optimum well direction, well spacing and fracture properties,” he said. “If we determine there is bypassed oil, for example, then what are the most efficient ways to exploit it? It’s really the involvement of all known engineering data to optimize the reservoir.”
This allows the company to advise its clients on how to develop the asset. Nutech provides forecasts of what the development plan ought to produce. And this can be tested.
“The reason clients keep calling us back is because we’re right many times more than we’re wrong,” Viamontes said.
Bratton said that while it’s difficult for service companies to make money in this downturn, he’s enthusiastic about the future. “Oil companies will use this time to prepare for the next ramp-up in activity,” he said, adding that three ingredients are required to progress.
The first ingredient, he said, is an integrated team. “There are too many unconstrained variables to solve the most difficult engineering challenges facing us with single-scale or single-disciplinary thinking,” he said. “Geoscience and engineering types often think differently, but both are required.”
The second ingredient is scientific and engineering integrity. “Multivariate problems are best solved with a combination of multiscale and multidisciplinary data that have been prepared by diligent domain experts that come together in a team to assess how best to integrate these diverse observations,” he said.
Finally, there is imagination. “Significant improvements often follow from out-of-the-box thinking,” he said. “This is a natural consequence of diversity and scientific integrity.”
Contact the author at firstname.lastname@example.org.
Read the other June E&P cover story: Technology improves gas recovery offshore Italy