On paper, bringing a well into production is a four-step exercise: find, drill, complete and repeat. If it were only that simple, right? But once the well is in production, keeping it online is a delicate exercise of balancing not too much with not too little.

Recent advances in production technologies have helped operators find that balance in a variety of areas.

The phenomenal growth in the number of unconventional wells drilled has resulted in many wells that are not producing from all of their stages and in low ultimate recovery rates. Operators are looking to well planning as a key first step in elevating production rates.

Advancements in high-speed computing have enabled greater insight into the modeling of reservoir behaviors during production.

By examining waterflood principles at the atomic scale, researchers have developed new applications that could significantly improve recovery offshore.

There’s no one way to solve a problem, and the next few pages offer a sampling of the technologies and techniques that are opening and keeping open the oil fields of today for the generations of tomorrow.

Well planning for unconventional plays heads for mainstream

As companies move from HBP to development drilling, well planning has become more important for boosting EURs and extending well life.

By Scott Weeden, Senior Editor, Drilling

In the past few years in plays like the Eagle Ford, the industry has continually pushed the envelope in terms of drilling efficiency. Even though the rig count has gone down, the number of wells drilled has gone up. The focus in the industry has been on getting wells drilled quickly and holding leases by production.

Although the drilling side of the equation has benefited, the production side has been lagging in terms of effectiveness. A high percentage of wells are not producing from all of the stages, and ultimate recovery is between 5% and 15%. Production varies greatly from well to well, and producers are becoming more interested in using well planning to reduce that variation and improve effectiveness.

“There is a big focus on improving production going forward. A lot of that shift is coming from operators who are switching from the exploration phase where they were proving the economics of the plays to the development phase where their focus is on improving production and efficiency,” said Aaron Burton, product line manager, multistage completions systems, Baker Hughes.

“There is more and more emphasis on well planning. The industry is moving away from ‘cookie cutter’ versions of field development. What we’re seeing now is the mentality shifting more toward long-term goals of getting good production for many years. With the shift in mentality, you do have growing emphasis on data—microseismic, production logging, production monitoring, etc.,” he continued.

Mark Parker, technology manager, Midcontinent Area, Halliburton, emphasized, “As we’ve gone to more horizontal wells than vertical wells, well planning is really important. In some areas, operators are drilling 5,000-ft to 8,000-ft [1,524-m to 2,438-m] laterals. Staying in zones and understanding targets that you’re trying to access are all very important.

“I think we’re still trying to influence the industry in seeing how important it is to visualize and understand where we’re going with drilling wells. When an operator drills a well that is 4,000 ft [1,219 m] in a lateral direction, it wants to treat all 4,000 ft. I totally understand that. If we’re not accessing all 4,000 ft because we are not in good-quality rock or we’re not doing the proper stimulation treatment, that has a huge influence on production,” he said.

For quite some time, the industry has been reluctant to perform production logging, for example. Now, though, the industry is beginning to recognize the value of all well data, including production logging, in well planning.

Cumulative learning process

The key to well planning is incorporating and using the information that the industry currently has available. Data could include microseismic, MWD, LWD and production monitoring. For well planning, there are a lot of aspects to consider.

“First off, where is your top hole?” Burton asked. “Where are you going to actually drill the vertical portion of your well? In which zone are you going to place your lateral? There are a lot of variables there—everything from hydrocarbon content to ‘frackability’ to how many stages you want to put into that lateral.

“Obviously, it will take well planning to determine the optimal completion and stimulation design. In tight oil plays like the Bakken, you also have to consider post-fracture needs. If you want to run an electric submersible pump, you need to be sure it is compatible with the rest of your completion,” he said.

Well planning is a cumulative learning process. Information is taken from previous wells to improve the plan for the next well. There is a variety of ways to integrate data into well planning.

“Microseismic, for example, would be done during the frack job so you can get an idea how your fractures are growing. That can lead you to know how effective your fracture is. With real-time microseismic, there are opportunities to adjust on the job,” he continued. “I believe most operators are using it to evaluate one well, take the data and apply it to future wells. The same could be said for production logging and monitoring.”

Identifying the production volume from each stage allows operators to determine the effectiveness of the hydraulic stimulation. “If they discover that the frack plan was not effective for certain stages, they can apply the data gathered from the most productive stages to optimize future operations,” Burton added.

Even when drilling from a pad, there can be variations among wells that can be addressed by well planning. For example, if wells are close together, the wellbores align with the stress regimes and the rock properties are comparable, the well plans can be similar. In this application some operators choose to focus on the efficiency aspects and design a frack plan that contacts the entire reservoir in that lateral. As long as the wells are efficient and don’t miss any hydrocarbons, they can continue to use a cookie-cutter approach, he said.

But if the wells head into different sections of the formation, different assumptions—largely based on rock properties—will need to be made. Although there are similarities between plays—multistage completions, hydraulic fracturing, etc.—that can be transferred from play to play, the formations are different enough that operators can’t cut and paste their frack designs. They have to modify the frack according to the formation, he explained.

Subsurface visualization

Decades ago when the industry was drilling only vertical wells, a lot of those wells were drilled on structures that could be observed from the surface. Now, with unconventional resources and basins with shale-type reservoirs, the industry is looking at different characteristics.

“When we incorporate all the geology and geophysics along with all the engineering data, we can use that to generate targets we can incorporate in the well-planning process,” Parker said. “It is not just looking at the location from the surface and saying, ‘I’ve got surface and section outlines for the lease.’ We are taking a look at it in 3-D to see how the formations and reservoir targets may vary in a lateral direction.

“That is something we can do with subsurface visualization, which is something Halliburton is developing right now. We call it our CYPHER process,” he said. “The industry has all this information, but how do we incorporate and utilize it? That’s part of what CYPHER does. It allows all this data to reside in one place where we can look at it, visualize it and literally see where we are going.”

Operators can look for petrophysical characteristics—water saturation, total organic content (TOC), etc.—that can be used for a target. “With CYPHER, we can visualize it. It is not just numbers on a paper or values in a spreadsheet; we can actually see 3-D representations of what that looks like,” he said.

For example, there are geological hazards such as faults that need to be identified. “In some areas, there is a lot of faulting. What happens if we put a wellbore across a fault? Would we have a severe lost circulation problem that would limit us from going further? Could that fault lead to a high-water area that would cause problems with production?” he asked.

“It will show structural information about the reservoir so that the well plan could be guided along the structure even if there were no geological hazards. By structure, I mean things like the dip of the formation and how that may change,” Parker added.

Well planning goes hand-in-hand with maximum production. “There is a typical disconnect in the industry. On the drilling side, we want to get the well drilled in the most efficient and timely manner to save money. But that can have implications for how the well can produce. For well planning, if we could look at other characteristics or parameters in the reservoir, we could use those as targets,” he continued.

Perhaps the operators want to drill high in the interval to take advantage of better porosity distribution or TOC. Or they could target the middle of the reservoir to access as much of it as possible from top to bottom. Or they might land the well low in the formation to take advantage of gravity drainage for liquids production.

“There are a lot of different aspects involved in bringing together a well plan that are important in consideration of future production,” he said.

Well planning requires collaboration

Both Baker Hughes and Halliburton emphasized the importance of collaboration with the operators in well planning as well as collaboration between engineers, geologists, geophysicists and petrophysicists.

“The keys are communication and seeing it from start to finish,” Burton said. “You can’t just focus on one aspect of the drilling, completion or production because these all must correlate. The drilling will directly affect your wellbore completion. For example, if you want to run a certain type of completion system but you have the wrong size casing or the wrong size hole was drilled for that completion, you obviously would have to change your completion plan.”

Drilling a rough wellbore makes it difficult to land the completion at the intended depth. Also, the wellbore completion directly affects the frack design. The wellbore completion controls how much flow area there is in each stage and determines the pressure ratings downhole. It also helps control fluid displacement, which indirectly can control the frack growth, he continued. It is important that the completion be designed with the frack in mind to ensure that the intended frack job can be executed.

To maximize the production effectiveness, the service company has to collaborate with the operator in all aspects of the drilling, completion and production process.

“From Halliburton’s point of view as a service company, we know how to do a lot of things from the engineering aspect of drilling and completing the well. But from the operator’s side, that property is their asset. They understand it and know things we don’t. Collaboration is really bringing the two knowledge bases together to come up with the best solution,” Parker said.

In one project Halliburton worked with Devon Energy on the latter’s asset in the Barnett Shale in an area that was more liquids-rich. “They had a situation where there was a lot of variability in production from one well to the next. Continued development of this area was at risk of abandonment due to poor economics and unpredictable production results.

“We applied the CYPHER process and incorporated data to improve reservoir understanding, including wide-azimuth 3-D seismic. A collaborative team of Halliburton and Devon technical representatives was formed, and it developed recommendations and a new well plan. The team adjusted the drilling target to a lower landing point within the reservoir than originally planned,” he continued.

Several changes also were made in how the fracturing treatments were performed. The combination of changes implemented by the collaborative team resulted in reduced variability from one well to the next and a significant overall uplift in EURs. The project has been documented in various publications, Parker said.

Simulating a hidden reality

Reservoir simulation has made great strides in predicting a reservoir’s inscrutable behavior.

By Rhonda Duey, Executive Editor

While seismic data processing often leaps to mind when discussing high-performance computing demands in the oil and gas industry, reservoir simulation also is a very compute-intensive process in the upstream. Based on surface and subsurface measurements and more than a little intuition, these models are expected to simulate the highly complex process of coaxing hydrocarbons out of a reservoir. It’s not a simple undertaking.

But computing advances and better mathematical approaches are speeding up this process and giving reservoir engineers a greater sense of confidence in these models.

Starting with the model

A reservoir simulation model starts with the geologic or earth model, a representation of the subsurface created by myriad measurements including seismic, well logs, core data, etc. These models are used to determine the best places to land the wells.

“A reservoir simulation model can be viewed as the assembly of data from a number of different elements,” said Bob Gochnour, manager, advanced reservoir simulation and deployment/advanced reservoir performance prediction/reservoir management for BP America. Gochnour said that the shape and size of the “container” both vertically and laterally are derived from geological and geophysical measurements and interpretation, including the existence and extent of faults. Geologists also identify the formations and work with petrophysicists to identify facies.

The value of these measurements cannot be overstated. “These measurements have a direct impact on the ability of the reservoir simulation model to predict reservoir behavior,” said Agha Hassan Akram, reservoir engineering adviser for geosciences and petroleum engineering at Schlumberger. Estimates such as porosity and fluid saturations are directly proportional to estimated hydrocarbons in place, he said. Permeability estimates also are important since they have the biggest impact on the optimal well type and spacing, including the ability of reservoir fluids to reach these wells. And seismic inversion helps the simulator to model the “blank” spaces between wells.

“Substantial inaccuracy in these measurements can result in a simulation model that cannot accurately predict future reservoir performance,” he said.

Joe Lynch, director of reservoir management at Landmark, a Halliburton company, recalled a simulation problem in the North Sea in which the simulator predicted a late and steady water breakthrough, but the actual breakthrough happened much more quickly and suddenly. It turned out that the reservoir was channelized, and the water was following the oil through the channel pathways. The inability to image those pathways through seismic led to an inaccurate simulation model, he said.

But even with the best geologic models, significant tweaks are needed. First of all, the model needs to be upscaled. “The blocks in the geologic model are typically too small to be of practical use in reservoir simulation,” said Steven Crockett, senior product manager for Nexus at Landmark. “Upscaling increases the gridblock size while retaining as much as possible the flow behavior of the finer scale blocks, reducing the number of gridblocks.”
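The idea behind upscaling can be illustrated with directional permeability averaging. The sketch below is a minimal Python example, assuming harmonic averaging along the flow direction (appropriate for flow in series); the function name and permeability values are illustrative, not taken from any commercial upscaler.

```python
import numpy as np

def upscale_perm_1d(k_fine, factor):
    """Coarsen a 1-D row of fine-grid permeabilities (md).

    Each group of `factor` fine cells along the flow direction is merged
    into one coarse cell using the harmonic mean, which preserves the
    series flow resistance of the finer-scale blocks.
    """
    k_fine = np.asarray(k_fine, dtype=float).reshape(-1, factor)
    return factor / np.sum(1.0 / k_fine, axis=1)

# 8 fine cells -> 2 coarse cells; the 10-md streaks dominate the first group
fine = [100.0, 100.0, 10.0, 10.0, 50.0, 50.0, 50.0, 50.0]
coarse = upscale_perm_1d(fine, 4)  # first coarse cell ~18 md, second 50 md
```

Note how the harmonic mean lets the low-permeability layers control the coarse value, which is exactly the flow behavior the quote says upscaling must retain; arithmetic averaging would be used instead for flow parallel to the layers.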

Once upscaling is complete, the permeability and porosity data from the geologic model need to be integrated with reservoir data and pressure, volume and temperature data, either measured from cores or based on correlations, to describe the interaction between the fluid phases and the rock, he said. Well trajectories are converted into sets of perforations to describe the wells.

The model is then history-matched, tuning its permeability and well characteristics until it reproduces the observed behavior of the field.

“Once these steps are completed, you have a reservoir simulation model that can be used to predict the behavior of the field under different operating scenarios,” Crockett said. “Many reservoir engineers use workflows that begin with a set of realizations of the geologic model to capture uncertainty.”

Recent advances

Reservoir simulation has taken advantage of numerous technology advances and, as a result, has made tremendous strides over the past few years. Gochnour characterized older methods as solving “only the reservoir problem.”

“The first coupled surface/subsurface models were loosely connected numerically,” he said. “More recently, today’s next-generation reservoir simulators solve the complete production system, reservoir through to the surface facilities, in one fully coupled system.”

This means these models can solve multiple reservoir systems; account for complex well trajectories; model hydraulic fractures as well as naturally fractured shale systems; solve for fluid flow in a wide range of fluid types; and allow multiple recovery process modeling options, including polymer flooding with both conventional and recent temperature-sensitive polymers, foam injection, gas injection floods for miscible recovery, thermal recovery techniques and low-salinity waterfloods, he added.

Robert Frost, development manager at Roxar Software Solutions, a business unit of Emerson Process Management, likened the old vs. new systems to a Model T vs. a Ferrari. “They may fundamentally have the same technology behind them; the difference, however, (and there is a big one!) is the detail, resulting in huge increases in power and applicability,” he said.

Many of these improvements have been enabled by the dramatic increases in compute power over the past few years. Gochnour noted that Moore’s Law has slowed in recent years, but reservoir simulation is still taking advantage of multiple cores per central processing unit, continuing to drive the throughput speed of the applications. “As such, there is a greater uptake in parallel computing, even down to the notebook computer,” he said.

Frost added that the availability of parallel computing technologies is triggering advances in linear solver technology, which is needed to solve the coupled equations that represent the flow physics in the reservoir. “Here we have come a long way,” he said, “but still have progress to make with the challenge to find solvers that can be efficiently parallelized as well as being fast and robust enough to solve many different types of problems.”

Crockett said that modeling all the relevant physical behaviors of the reservoir rock and fluids in the simulation results in a large nonlinear problem with millions of unknown quantities and the same number of equations governing them. “You convert this nonlinear problem into a set of linear problems, and you have to solve each of these linear problems in the same way,” he said. “Each one of these linear solutions is used to nudge the best estimate of the solution to the nonlinear problem along until it gets to be sufficiently accurate.”
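Crockett’s description matches the classic Newton scheme: each linear solve nudges the current estimate of the nonlinear solution. The toy sketch below uses a two-unknown system standing in for the millions of reservoir mass-balance equations; the example equations are purely illustrative.

```python
import numpy as np

def newton_solve(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Newton's method: repeatedly solve the linearized system
    J(x) dx = -F(x) and use dx to nudge x toward the nonlinear root."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        F = residual(x)
        if np.linalg.norm(F) < tol:
            break
        dx = np.linalg.solve(jacobian(x), -F)  # one linear solve per iteration
        x = x + dx
    return x

# Toy 2x2 nonlinear system: x^2 + y^2 = 4, x*y = 1
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] * v[1] - 1.0])
J = lambda v: np.array([[2 * v[0], 2 * v[1]], [v[1], v[0]]])
sol = newton_solve(F, J, [2.0, 0.0])
```

In a simulator the Jacobian is a huge sparse matrix and the linear solve is handed to the parallel solvers discussed above, but the nudge-until-accurate loop is the same.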

These advances in simulation technology allow a greater use of uncertainty analysis. “A tightly matched model is usually a poorer predictor of future reservoir performance than an ensemble of more loosely matched models,” Akram said. Tools are available that use a simulator to create multiple forecasts within an acceptable error band of history-matching, generating a fully probabilistic set of outcomes. These can focus on any parameter of interest.
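The ensemble idea can be sketched in a few lines: sample candidate models, keep those whose simulated history falls inside an acceptable error band, and read percentiles off their forecasts. Everything here, the exponential-decline stand-in for a simulator, the multiplier range and the error band, is hypothetical and chosen only to show the workflow.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(perm_mult, years):
    """Hypothetical stand-in for a simulator run: an exponential-decline
    production profile whose rate and decline depend on a permeability
    multiplier (illustrative physics only)."""
    t = np.arange(years)
    return 1000.0 * perm_mult * np.exp(-0.15 * t / perm_mult)

observed = simulate(1.0, 5)  # 5 years of "history"

# Sample candidate models; keep those within a history-match error band.
ensemble = []
for _ in range(500):
    m = rng.uniform(0.5, 1.5)
    mismatch = np.sqrt(np.mean((simulate(m, 5) - observed) ** 2))
    if mismatch < 100.0:                      # acceptable error band
        ensemble.append(simulate(m, 20)[-1])  # year-20 rate forecast

p10, p50, p90 = np.percentile(ensemble, [10, 50, 90])
```

The spread between p10 and p90 is the probabilistic forecast; a single tightly matched model would report only one number and hide that uncertainty.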

Reservoir simulators have long relied on gridding technology, but in recent years this technology also has dramatically improved. Cameron McBurney, reservoir engineering manager in the reservoir consulting division at Baker Hughes, said that advances in 3-D gridding mean that complex reservoir faulting can be modeled much better than before “as the limitations of gridblock shape/ordering have all but vanished.”

“Now that hydraulic fracturing is so prevalent in the completion of wells, especially in the unconventional plays, reservoir gridding has made some significant advancements to enable proper modeling of flow in and around hydraulic fractures,” he said. “The obstacle with hydraulic fracture modeling was inserting a very thin region of extremely high permeability into a vast region of extremely low permeability without creating solver problems due to large pressure changes between gridblocks of large size variation and not sacrificing transient flow conditions. The new gridding options have overcome this and allow us to provide better flow rate and decline rate predictions.”
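The gridding challenge McBurney describes, resolving a hairline fracture inside a huge matrix block, is commonly handled with cell sizes that grow geometrically away from the fracture face. A sketch with illustrative dimensions follows; the function and numbers are assumptions for demonstration, not any vendor's gridding option.

```python
import numpy as np

def log_grid(x_frac, w_frac, x_max, n):
    """Grow cell widths logarithmically away from a thin fracture face.

    The first cell matches the fracture half-width w_frac; successive
    cell edges grow geometrically out to the matrix boundary x_max, so
    the extreme permeability (and pressure) contrast at the fracture is
    resolved without flooding the model with tiny cells everywhere.
    """
    edges = x_frac + np.logspace(np.log10(w_frac), np.log10(x_max - x_frac), n)
    return np.concatenate(([x_frac], edges))

# A 0.01-ft half-width fracture at x = 0 inside a 500-ft matrix block:
edges = log_grid(0.0, 0.01, 500.0, 12)
widths = np.diff(edges)  # strictly increasing away from the fracture
```

Only a dozen cells span four orders of magnitude in length scale, which is what preserves transient flow near the fracture without the large cell-to-cell size jumps that destabilize the solver.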

All of these advances have provided the ability to foster better integration among different disciplines. New workflow platforms help keep the subsurface model “live and current,” said Akram.

Added McBurney, “Software packages have become start-to-finish workspaces that take you from the input of raw data to the final production forecasts predicted by a history-matched dynamic reservoir model. Everything from the raw geological data such as logs, core and seismic combined with reservoir engineering data like well production data, pressures and relative permeability are analyzed and processed under the same roof, allowing geologists, petrophysicists and engineers to work together on the same project file.”

This in turn is affecting the way engineers approach their models. “One of the biggest advances isn’t the technology itself,” Lynch said. “It’s in the thought processes behind using it. People are really beginning to understand that simulation is a tool you use to answer a question. If you don’t know what question you’re asking, you can make all kinds of simulation runs and get all kinds of data, but it’s not of much use in making decisions.”

Keeping giants active

EOR/IOR technologies help one operator’s giant fields mature gracefully.

By Jennifer Presley, Senior Editor, Offshore

A common investment goal is to squeeze as much value as possible from the investment. For operators of the world’s multimillion-dollar oil fields, the squeezing is more in the form of injection when primary recovery begins to decline. Optimizing recovery in its maturing fields, BP primarily uses water or gas flooding. According to the company, the average industry recovery factor is only 35%, leaving behind some 65% of the oil known to be in the field. From Alaska to Azerbaijan and several places in between, the company has deployed or is in the planning stages of deploying its EOR/IOR technologies to help improve recovery.

“Our philosophy around technology is that we’re not going to do everything. We’re going to pick out a few areas and try to be leaders in those,” said Andrew Brayshaw, vice president of emerging and integrated technology, upstream sector for BP. “The three big areas we feel we’re leaders—where we spend most of our investment dollars—are imaging, digital technologies and EOR.”

The current rate of global oil production is around 90 MMbbl/d, of which about 3 MMbbl/d is due to EOR, according to a BP-issued release. Of that 3 MMbbl/d, only about 1 MMbbl/d comes from EOR of conventional oil, with the remainder made up by thermal EOR of heavy oil. BP-operated conventional oil EOR projects produce in excess of 100 Mbbl/d, representing more than 10% of the world’s conventional EOR production rate, the release stated. The company has more than 70 years of EOR experience and operates the world’s largest hydrocarbon miscible gas EOR project in Prudhoe Bay Field, Alaska.

In the latest issue of its “Energy Outlook 2035,” the company’s economists projected that global energy consumption is set to rise by 41% by 2035, with 95% of that growth coming from rapidly growing emerging economies. As energy demands increase around the world, so do the challenges to meet the demand with adequate supplies. The discovery of new oil reserves and the application of EOR methods in the maturing giant fields first tapped decades ago are increasingly helping to meet that demand.

“We have a lot of giant fields in our portfolio,” Brayshaw said. “EOR is really important to us as we see significant oil remaining in those existing fields. It is a great prize.”

Water works

Artificial lift and infill drilling are among the methods used by industry to seek higher recovery. Another is to inject and flood the reservoir with water or a gas like nitrogen or CO2 to keep the wells flowing.

“EOR is typically looked at in the industry as the last thing that can be done in a field before ‘turning out the lights’ on it,” said Raymond Choo, EOR deployment manager for BP. “First is primary depletion, then typically a waterflood. When a high water cut is produced, the question becomes ‘What can be done next?’ That is when chemicals, gas or other techniques are applied in the reservoir.”

For BP, “waterflooding is very important,” according to Brayshaw, adding that the company is one of the largest waterflooders in the industry.

“One of the things we’re doing differently from the past—where the belief was that waterflooding is just a physical process, where recovery is a function of the amount of water that could be cycled through the reservoir—is really understanding the chemistry of the process, and that’s what’s given us an opening,” Brayshaw noted.

Its ready availability, relatively low cost and effectiveness at increasing oil recovery have made saline water a popular choice. One example of better recovery through better chemistry is Bright Water chemical and application technology, a BP invention that was codeveloped with Nalco and Chevron. Bright Water is a submicron particulate chemical that is injected downhole with flood water. It is designed to activate at a predetermined “in-depth” location within the reservoir. Upon activation, the Bright Water particles begin to expand to many times their original volume, blocking pore throats and directing injection water into untapped, oil-rich zones. This deep reservoir profile modification causes additional oil to be swept toward the producing wells.

Another example is the company’s LoSal EOR technology. The company observed that “reducing the salinity of the water could have a positive impact on pore-scale displacement and ultimately recovery,” according to a company release.

“Waterflooding is a commonly applied technique,” Choo said. “And if you can do something to the water without needing to add a lot of expensive chemicals, then it becomes very beneficial. It’s an area where we’ve done considerable work on developing our low-salinity technology. It’s a form of waterflood where we’ve reduced the injection water salinity to help release more oil.”

He added that LoSal EOR also helps prevent scaling and production of hydrogen sulfide or “souring” of a well. “Hydrogen sulfide creates problems when it gets to the surface production facilities, and it also causes corrosion issues. LoSal EOR takes care of that as we’re cleaning up the water by removing the sulfate and other ions before it goes in the well.”

After more than a decade of R&D in the company’s lab and testing in its Endicott, Alaska, field, the first use of LoSal EOR in a full-scale sanctioned deployment is set to occur in the North Sea’s Clair Ridge oil field.

“What we’re trying to do at Clair Ridge and with future big projects is to start thinking about and recognizing the need for EOR from day one and in some cases actually starting it on day one like we’re going to do at Clair Ridge,” said Choo. “In other cases, we may want to allow for the space and design with EOR in mind; otherwise, it will be quite difficult to put in later on in offshore settings.”

Next generation

It is the transition from onshore to offshore that can be most problematic for operators looking to deploy EOR technologies in their maturing fields. For its newer fields, EOR is now a routine part of BP’s field development planning and evaluation process.

“For offshore, it is important to consider EOR in advance, whereas for onshore, there’s land available where additional equipment can be installed a bit easier,” Choo said. “Once your structure is in place offshore, if you haven’t allotted for the space and weight of the additional equipment, it becomes very difficult to retrofit it for installation of EOR systems. That is a big challenge in the industry: understanding how you do brownfield production offshore with EOR. There are very few offshore EOR projects in the world; two of these are operated by BP, namely, hydrocarbon miscible gas EOR in the Magnus and Ula fields in the North Sea. Some 30% to 40% of current oil production from Magnus Field and nearly all of the oil production from Ula Field today is from gas EOR.”

BP’s EOR technology is underpinned by a global team, world-class laboratory facilities and digital reservoir characterization.

What is the next generation of EOR technologies that will help the current generation of giant oil fields mature gracefully? Neither Brayshaw nor Choo would provide much detail about the company’s current EOR R&D efforts other than to say they are working on “next-generation” digital rocks, designer water, designer gas and designer voidage EOR technologies.

JIP studies separation technology

Southwest Research Institute (SwRI) has announced the launch of a multimillion-dollar joint industry project (JIP) to better understand oil and gas separation technology. The objective of the Separation Technology Research (STAR) Program is to combine industry knowledge and resources to advance research that could lead to better equipment and test protocols.

SwRI is leading the three-year program, which is open to operating companies, contractors and equipment manufacturers. International participation is welcome. Membership for the full term ranges from $75,000 to $450,000, depending on the type of company.

“Separating fluid mixtures into streams of oil, natural gas and water efficiently and cost-effectively using lighter weight equipment that requires less space is very important to the industry. The STAR Program will involve this three-phase separation process as well as gas/liquid separation and liquid/liquid separation,” said Chris Buckingham, a program director in SwRI’s Fluids and Machinery Engineering Department and manager of the STAR Program.

The advantage of the program is its ability to pool resources and industry experts to allow a more cost-effective approach to solving problems, especially in a collaborative environment. “This approach means both company-proprietary and nonproprietary equipment can be tested, with results shared among the members,” Buckingham said, adding that the research will be conducted using SwRI’s existing gas/liquid flow loops.

Members of the program will guide research initiatives by developing a project scope, identifying technologies to be tested, providing input on standard test approaches, witnessing testing and commenting on results.

Goals of the program are to develop standardized testing methods, collect data to improve equipment performance and develop analytical models for various types of separation equipment.