Thirty-five years ago, geophysicists used colored pencils, 10-point dividers, and hand-drawn maps and contours, and computers were just being introduced. People communicated predominantly by telephone or telex, and the Internet, fax machines, CDs and multiterabyte hard drives didn’t exist.

Since then, the pace of digital change has accelerated significantly. What will the next 35 years bring? And how will we ensure that the oil and gas industry keeps pace with these developments?

Fast-growing industry

Driven by growing global demand from many industries, computers have developed rapidly, and the digital industry has continued to evolve at an ever-increasing pace. We can now display data and information in real time on electronic 3-D seismic displays to interpret geological models and use geosteering techniques to direct wells as they are drilled. Drillers also have embraced the digital information era, using data streams analyzed in real time to avoid stuck pipe and to optimize the position of the wellbore. Increasingly, operating engineers are relying on vibration and equipment monitoring to optimize maintenance schedules, track chemical treatments and optimize water handling.
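
To make that concrete, the short Python sketch below shows one simplified way such a real-time stream check could work: readings are compared against a rolling baseline, and sharp excursions are flagged for the driller. The window size, threshold and simulated hookload feed are illustrative assumptions, not an actual field implementation.

```python
from collections import deque

def stuck_pipe_monitor(readings, window=60, sigma_limit=3.0):
    """Flag readings that deviate sharply from a rolling baseline.

    `readings` yields (timestamp, hookload) pairs; the window size and
    3-sigma limit are illustrative values, not field-calibrated ones.
    """
    recent = deque(maxlen=window)
    for ts, hookload in readings:
        if len(recent) == window:
            mean = sum(recent) / window
            std = (sum((x - mean) ** 2 for x in recent) / window) ** 0.5
            # A sudden excursion beyond the rolling band may indicate
            # increasing drag, an early warning sign of stuck pipe.
            if std > 0 and abs(hookload - mean) > sigma_limit * std:
                yield ts, hookload, mean
        recent.append(hookload)

# Simulated surface-data feed with a single anomalous spike at t=90.
feed = [(t, 250 + t % 5 + (40 if t == 90 else 0)) for t in range(120)]
for ts, value, baseline in stuck_pipe_monitor(feed):
    print(f"t={ts}s: hookload {value} vs baseline {baseline:.1f}")
```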

Considering the future

It is extremely difficult to imagine what the next five or 10 years will bring. But the plants, rigs and production platforms being built today will probably still be operating in 25 to 35 years.

How are we future-proofing the design of our facilities? How are we planning to upgrade them several times over the next few decades to take advantage of the inevitable advances in digital technologies that will come along?

The industry is still using the same methods to extract oil and gas as it has done for more than 100 years, and it is fascinating that with intelligent energy we are trying to combine one of the fastest-changing technologies with perhaps one of the slowest-changing extraction techniques.

One of the big questions facing the upstream oil and gas industry is how we will attract and retain the next generation of engineers when there will be so many exciting opportunities in other industries to leverage predictive and autonomous digital analytic and modeling techniques.

These are challenges the industry will soon face. However, we are still catching up, experimenting in many areas with how to apply currently available digital hardware and analytic capacity to the issues we face upstream today. We are not yet forward-thinking enough to set ourselves up for true success.

By anticipating the future rather than just focusing on the “what’s available now” or the “what’s the flavor-of-the-month technology,” the industry can better embrace the short-term opportunities while stretching its aspirations.

Anticipation

First, how should the industry anticipate what is coming along on the digital front that might be relevant to it? It can help to look back and identify trends to help us think more directionally about the future. The last 35 years have highlighted a number of trends that many think will continue. Some of these are:

  • Redundancy: Devices and technologies have been invented, put to good use, and then surpassed and replaced by the next generation. This cycle has shortened over the last century and certainly over the last 35 years. The pace of redundancy will continue to accelerate;
  • Miniaturization: The space race was the catalyst for the miniaturization of electronics, driving broader uptake of electronics across many industries for multiple purposes. Now miniature computers can be fitted nearly anywhere, from the car to the refrigerator. Finding space on facilities and drilling rigs for future devices won’t be a problem;
  • Convergence: Ten years ago many people carried cameras, mobile phones and laptop computers. Today’s devices multitask, and one device can do nearly everything. Software originally designed for specific purposes has converged toward integrated interpretation and optimization systems that are beginning to meet the needs of several disciplines;
  • Standardization: This is becoming increasingly important as the need to exchange data continues to grow. The proliferation of standards groups is evidence of the emerging need to manage data in a more structured and standardized way;
  • Visualization: The increasing subtlety and complexity of information mean that powerful visualization, which removes ambiguity and facilitates understanding, will be crucial. This has happened in the aerospace industry, where much more information is now available but the cockpit remains relatively small. Information has to be presented in integrated and sequential ways, with only relevant information visible when required but with users confident that the rest can be viewed at any time;
  • Amalgamation: Data centers and “cloud services” have consolidated where information is stored and from where services are provided. People don’t know where much of their personal digital data is stored and probably don’t care as long as it is secure and they are able to access it quickly; and
  • Capacity: Demand for and supply of bandwidth also have been growing at an increasing rate. Copper cables have been surpassed by fiber-optic and wireless connectivity. This growth is likely to continue to the point that fast connectivity becomes ubiquitous. Similarly, compute capacity has seen exponential growth, and cycle times will continue to shorten. In the near future, the expectation that complex models and workflows will deliver results in near-zero time will become reality.

As the upstream industry transitions from conventional, generally manual ways of working to running the business based on predictive, actionable information that is updated in real time, integrating and accessing information will become the new tools of the trade. The challenge will be to make information ubiquitously available, understandable and unambiguous.

Coping strategies

Some coping strategies can minimize the impact of future change and enhance the industry’s ability to adapt and apply new technologies.

  • Standardization: As outlined earlier, this trend has been strengthening over the past few decades. The use of data exchange standards such as those that Energistics and others are developing will become more and more important. Standard ways of storing, transmitting and integrating data will need to become mandatory to fully unlock the future potential of intelligent information. Data standards will be the key to mass adoption and the ability to integrate actionable information (a minimal sketch of standards-based data exchange follows this list);
  • Centralization: One means of reducing the footprint of change could be to centralize activities, infrastructure and services, although this could be regarded as a high-risk strategy since it concentrates points of failure. Centralization can also help the industry grapple with the demographic change and skill shortages it faces. In theory, centralized activities should require fewer people and be more efficient, provided the intelligent information is available to support the model;
  • Agnosticism: In our information technology choices, it would be prudent to remain agnostic to devices, servers and, to some extent, architecture designs. More enduring choices are needed in areas such as data-exchange standards, which might outlive several generations of IT hardware, and in designs that minimize exposure to any single technology. Existing technology will not be in the same form in 10 years’ time;
  • Minimization: This requires judiciously selecting which data needs to be collected, from where and for what purposes, and then designing, with a minimalist bias, where the analytics and presentation occur. The aim is to take a holistic view of where the supply of and demand for intelligent information sit in a given asset system and to segregate and minimize the analytic centers. This approach should lead to more evenly distributed digital systems with no one center dominating. Everything is interconnected, and this design should help with resilience and backup;
  • Automation: This approach aims to reduce the need for engineers to do everything manually. Many manual checking processes could be automated to provide additional assurance (the second sketch after this list shows a simple example). The more the human element is reduced, the less process adjustment and training effort will be needed when the inevitable system upgrade comes along; and
  • Simplification: Simplifying business processes will help reduce the footprint required for intelligent systems, thereby minimizing the upgrade/redundancy burden. Simplifying system design so that it is easier to adopt and upgrade at scale is a further benefit.
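
As a minimal illustration of why exchange standards matter, the Python sketch below parses a deliberately simplified, WITSML-style XML fragment using only the standard library. The element names and values here are hypothetical and far leaner than the real Energistics schemas, but the point carries: once the format is agreed, any consumer can reconstruct the same labeled data without vendor-specific parsing code.

```python
import xml.etree.ElementTree as ET

# A greatly simplified, WITSML-style fragment; real Energistics schemas
# define namespaces, units of measure and many more mandatory elements.
DOC = """
<log well="A-12">
  <curve mnemonic="DEPTH" unit="m"/>
  <curve mnemonic="ROP" unit="m/h"/>
  <data>1500.0,24.3</data>
  <data>1500.5,22.9</data>
</log>
"""

root = ET.fromstring(DOC)
curves = [(c.get("mnemonic"), c.get("unit")) for c in root.findall("curve")]
rows = [[float(v) for v in d.text.split(",")] for d in root.findall("data")]

# Because the format is standardized, any consumer can rebuild the same
# labeled table without bespoke parsing code for each vendor.
print("well:", root.get("well"))
for row in rows:
    print(", ".join(f"{m}={v} {u}" for (m, u), v in zip(curves, row)))
```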

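In the same spirit, automation of manual checks can start small. The sketch below shows a hypothetical automated assurance check that validates incoming process readings against engineering limits, the kind of verification an engineer might otherwise perform by eye; the tag names and limits are invented for illustration.

```python
# Hypothetical engineering limits; real values would come from the
# asset's design documentation, not from this sketch.
LIMITS = {
    "separator_pressure_bar": (2.0, 18.5),
    "export_temp_degC": (5.0, 60.0),
}

def check_reading(tag, value):
    """Return None if the reading is in range, else a readable issue."""
    if tag not in LIMITS:
        return f"{tag}: no limit defined, flag for manual review"
    low, high = LIMITS[tag]
    if not low <= value <= high:
        return f"{tag}: {value} outside [{low}, {high}]"
    return None

readings = {"separator_pressure_bar": 19.2, "export_temp_degC": 41.0}
issues = [msg for tag, v in readings.items() if (msg := check_reading(tag, v))]
print("\n".join(issues) if issues else "all checks passed")
```
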
Business and technology leaders often ask themselves, “Are we taking advantage of the latest technology to continually improve our business?” and adjust their investment plans accordingly. However, a far more important set of questions might be, “Are we adequately planning for and anticipating the pace of change we need to embrace to be successful in the longer term? Are we moving fast enough to simplify what we do, automate where we can and minimize the upgrade footprint of our intelligent systems?” The facilities being built today will probably still be operating several decades from now, but the technology being invested in to run them will be obsolete, sometimes before construction is complete or plateau production is reached. Applying coping strategies to the design of intelligent information systems, organizational constructs and process changes could help in the near future.

It is impossible to predict how data will be stored, transmitted, manipulated or presented over the next 35 years. However, we can determine the work purposes for which we want to use data, as well as ways to minimize our requirements for it. Doing so offers the best chance of taking advantage of these future intelligent information technologies.