For at least its first couple of thousand years, astronomy was essentially an observational science. Using only simple tools and the naked eye, early astronomers tried to gain a better understanding of the stars and planets. Some of their conclusions were accurate and some were not; even Copernicus’ heliocentric model, a major step forward, wrongly insisted that the sun was the center of the universe.

It wasn’t until the invention of the telescope in the 17th century that astronomy began its transition to a theoretical, quantitative science, one that uses computational and analytical models to describe objects and phenomena.

The oil and gas industry has undergone a similar, albeit more condensed, transformation. The tools used to produce hydrocarbons have come a long way since the first commercial wells were drilled in the mid-19th century, and today the industry sits on the brink of its own quantitative revolution. Many of the tools of the modern oil man and woman would be nearly unrecognizable to those early drillers.

Aerial drones, holographic lenses, robots and devices that dramatically alter the view of the universe are no longer the tools of tomorrow. They are in use today, and their applications are rapidly expanding across a wide array of industry operations. Tech-savvy business practices and digital flexibility are no longer limited to Silicon Valley. One trend to emerge from the energy recession has been a push by companies to become more innovative, efficient and nimble in their operations. A variety of reports suggest oil companies are investing substantial money in the digital space as they learn that such devices can help cut costs and enhance worker safety.

The “Global IoT in Oil and Gas Market—Analysis & Forecast, 2017-2026” report by BIS Research stated that the global Internet of Things (IoT) in oil and gas market is expected to reach $30.57 billion by 2026, growing at a compound annual growth rate (CAGR) of 24.65% over the forecast period.
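As a rough sanity check on those figures, CAGR arithmetic lets you back out the starting market size they imply. The sketch below is illustrative only; the base-year value is derived from the report’s end point and growth rate, not quoted from it.

```python
def implied_base(final_value, cagr, years):
    """Back out the starting value implied by an end value and a CAGR."""
    return final_value / (1 + cagr) ** years

# $30.57B in 2026 after 9 years of 24.65% annual growth from 2017
base_2017 = implied_base(30.57, 0.2465, 9)
print(f"Implied 2017 market size: ${base_2017:.2f}B")  # roughly $4.2B
```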

According to the study “Confidence and Control: The Outlook for the Oil and Gas Industry in 2018” by DNV GL, more than one-third of senior oil and gas professionals said they expect to increase spending on R&D and innovation this year—the highest level DNV GL has tracked in four years.

“We will see more R&D going into digital, artificial intelligence [AI] and automation, which is really about costs, and about other ways of doing our business,” Maria Moræus Hanssen, CEO of Germany-based E&P company DEA, stated in the report. “R&D is now less likely to be focused on ultradeep water, the Arctic or other extreme environments. It will be more about rationalizing the business—making the industry more profitable, more productive and modernized.”

In the past two years, only 15% (2016) and 14% (2017) of oil and gas companies were planning increases in technology R&D, which DNV GL suggests signals an imminent turnaround after three years of cuts and freezes.

In addition, the McKinsey & Co. report “The Next Frontier for Digital Technologies in Oil and Gas” stated that digital technologies have the ability to create additional profits from existing capacity.

“The effective use of digital technologies in the oil and gas sector could reduce capital expenditures by up to 20%,” the report stated. “It could cut operating costs in upstream by 3% to 5% and by about half that in downstream.”

Operators and service companies alike are seeing the future and seeing the value of these digital innovations. In 2017 Halliburton and Repsol announced partnerships with Microsoft to initially develop cloud-based computing systems before expanding to mixed reality applications and robotics to deliver integrated systems across the entire energy value chain.

Chandra Yeleshwarapu, senior director of R&D and head of global services at Halliburton Landmark, said augmented reality (AR) and virtual reality (VR) systems have been integrated at Halliburton through its DecisionSpace enterprise platform, which was launched about five years ago.

“We’ve always had a high volume of APIs [application programming interfaces] to connect with other immersive technologies,” he said. “But over the past two to three years, mixed reality has become more prevalent, so we started building DecisionSpace with the same APIs used for connecting to a 3-D-based system, thus creating the capability to connect to AR.”

Meanwhile, companies like Devon and BP have been at the forefront of digital technologies, such as the use of drones for data gathering and monitoring operations.

But as BP Technology Director Dave Truch explained, all of these types of technologies have emerged from a single building block—the quest for more data ingested by new approaches to digital data analytics.

“For various reasons, we collect data primarily by humans,” Truch said. “If you look at the algorithms out there, and look at the amount of data humans can collect, there is a disconnect. There’s no way I can put enough humans in the field to actually run these algorithms with any kind of surety about the results. By nature of trying to run these new algorithms, we had to consider a whole new way of capturing data. That led us into autonomous machines.”

Drones and robotics

In a recent report issued by IHS Markit, the use of aerial drones was identified as one of several “transformative technologies” likely to emerge in the industry this year. Another report by research firm Mordor Intelligence stated that the market for drones in the oil and gas industry is projected to reach $4 billion by 2020, with a CAGR of 37% during the forecast period. The Mordor Intelligence report also stated that drones have the ability to collect “as much data available in the last 30 years within 45 minutes” and are “poised to become the next major disruption to influence the oil and gas industry.”

In fact, according to a report by Technavio, oil and gas is the leading end-user industry in the robotics market, with a 58.5% share of the market.

Trumbull Unmanned provides drone operations to companies in the oil and gas sector such as BP and was named Exxon Mobil’s diverse supplier of the year. The company has performed more than 100 live flare inspections both onshore and offshore, which CEO Dyan Gibbens said in an emailed response cost about one-tenth the price and take one-tenth the time of traditional flare inspections. She also said the savings vary from client to client, but if a typical inspection were to take a week, for example, a drone inspection can be done in less than a day and often in less than an hour.

“Drone services provide several unique benefits to the oil and gas industry,” Gibbens said. “First, they allow companies to greatly reduce risk by allowing individuals to perform important work without ever having to put themselves in harm’s way. Second, in order to start effectively applying productivity-increasing algorithms to work, the data need to be collected in a structured format.”

Gibbens said drones offer the ability to collect large amounts of data in those needed formats. Operational improvements often can be seen in three primary areas—efficiency, safety and quality, she said.

“For example, many operators have integrated drones into their offshore inspection activities,” she added. “They have done this because it has greatly reduced the costs of inspections with no reduction in production, allows dangerous work to be performed with no risk to people and has the ability to collect high-resolution data that was not previously possible.”

Truch said compared to traditional inspections on offshore platforms that required rope teams, drone inspections can reduce crew sizes by one-third. He also said such an inspection can be performed in about half the number of days with “significantly more” data acquired.

Intel and Cyberhawk, an aerial drone inspection company, recently partnered on a flare stack inspection in Saint Fergus, Scotland, using the Falcon 8+ drone system. According to Intel, such an inspection conducted by a drone can save $1 million to $5 million per day in potential production losses.

“Traditional inspections of oil and gas assets of this scale require either full or partial facility shutdowns,” Intel reported in a case study of the operation. “This could take days to weeks to bring the plant offline and accessible for inspection workers.”

According to the study, the Falcon 8+ deployed for the mission captured 1,100 images in 10 flights, which translated to 12 GB of data over the span of one to two days. A similar inspection would typically take a three-man team three days to complete, the study reported.

Truch said that AI algorithms run by BP offered the company new insights into how it is using devices such as drones and how those devices are influencing their perception of their operations. This led BP to go back to its risers in the Thunder Horse Field in the Gulf of Mexico, in which oil was first produced in 2008.

“At the Thunder Horse trial, our focus was on full data collection, the value of the data, the ability to actually run [drones] on our platform in a safe manner and, from there, to turn around and say the data we collected are very meaningful and very useful,” Truch said. “Now we’re starting to look at how we can take some of these new approaches and look at the data. Meanwhile, the data collected are still being analyzed in the traditional way, because we’re still getting huge value in the data we’re collecting.”

Truch said during the course of the drone inspections of the Thunder Horse risers, BP realized the mission began to evolve from merely data collection to a method that provides insight across a variety of the company’s operations.

“What we discovered is that a lot of the structural engineers, coating engineers and maintenance individuals were also interested in the data collected from the same mission,” he said. “That’s allowing us to change the nature of how we collect these data. In the past it was very specific to a single-use case. So, we put people on ropes to collect data about the risers. If we then wanted to look at some structural elements on the platform, we’d put people on ropes a different time to look at the structural elements. If we wanted to do some coating inspection, yet another group of people would go out on ropes to look at the coating. All of that information was collected on a single mission [with drones and crawlers].”

Much like unmanned aerial vehicles, robots are accessing physical spaces and data that were previously inaccessible or too dangerous for humans to reach. In the early 2010s Total recognized that no autonomous surface robot existed in the oil and gas industry to meet the needs of E&P activities. In December 2013 Total, in partnership with the French National Research Agency, launched an international competition to design and build an autonomous robot for oil and gas sites. The ARGOS Challenge included five teams from Austria-Germany, Spain-Portugal, France, Japan and Switzerland. Each team was given about $740,000 and three years to design its surface robot prototype.

According to Total, the ARGOS surface robot was to have three main missions: carry out inspections currently performed by humans, detect anomalous situations and intervene in an emergency. More specific tasks included performing inspections during the day or night; locating, reading and recording inspection points; taking measurements and analyzing readings; and detecting anomalies ranging from malfunctions to dangerous situations such as gas leaks, suspicious heat sources or excess pressure, Total reported.

According to Total, the five robot prototypes were tested in a former gas dehydration facility in southwestern France in conditions representative of other company facilities. The final iteration of the competition was held in March 2017, and the prototype from the Austrian-German team was selected as the winner and was chosen by Total to start operating on one of its facilities beginning in 2020, the company reported.

In December Total successfully trialed an aerial drone system, its Multiphysics Exploration Technology Integrated System (METIS), for geophysical imaging. The goal of the METIS project, according to the company, is to obtain quality geophysical data in complex topographical locations, minimize environmental and safety risks, and improve turnaround time and costs.

The drone system uses Downfall Air Receiver Technology (DART) to “carpet” the ground in the exploration area with DART wireless geophysical sensors, Total reported. The drone fleet can deploy up to 400 DART receivers per square kilometer, with seismic traces recorded and sent in real time to a processing center, the company reported.
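Those deployment numbers imply a fairly dense sensor grid. A quick, purely illustrative calculation, assuming an even square layout (which Total does not specify):

```python
import math

RECEIVERS_PER_KM2 = 400  # reported DART deployment density
area_m2 = 1_000_000      # one square kilometer in square meters

# An even square grid of 400 receivers over 1 km2 works out to
# one receiver every 50 m in each direction.
spacing_m = math.sqrt(area_m2 / RECEIVERS_PER_KM2)
print(f"Nominal receiver spacing: {spacing_m:.0f} m")  # 50 m
```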

Devon Energy CIO Ben Williams said the use of robotics offers the potential to advance a company’s safety objectives by automating potentially high-risk tasks.

“Anytime you have the proposition of potentially removing someone from a high-energy work environment through robotics, that’s definitely something we’re interested in doing,” Williams said. “Where we can partner with service providers in which we can get people out of high-energy, hazardous environments like the rig floor or like a dangerous location in the field, that’s definitely worth doing. You’ve got to have a value proposition for any investment you make. Keeping people safe is quite a tremendous value for us.”

AR/VR

Technological innovations in the oil and gas industry are breaking down the limitations of space and time. Data gathering is achieved far more rapidly and in quantities that once seemed impossible. AR and VR innovations allow industry workers to be virtually in two places at once and can drastically alter the visual perception of their workspaces.

For example, operators and service companies are finding AR is an ideal tool for training simulations and troubleshooting mechanical problems in the field.

Honeywell recently released a cloud-based simulation tool that uses a combination of AR and VR to train plant personnel on critical industrial work activities.

“With as much as 50% of industrial plant personnel due to retire within the next five years, the Honeywell Connected Plant Skills Immersive Competency is designed to bring new industrial workers up to speed quickly by enhancing training and delivering it in new and contemporary ways,” the company stated in a press release.

The training tool combines mixed reality with data analytics and Honeywell’s experience in worker competency management to create an interactive environment for on-the-job training. The program uses Microsoft’s HoloLens and Windows Mixed Reality headsets to simulate various scenarios.

“Megatrends, such as the aging workforce, are putting increased pressure on industrial companies and their training programs,” Youssef Mestari, program director for Honeywell Connected Plant, said in the release. “There is a need for more creative and effective training delivered through contemporary methods such as immersive competency, ultimately empowering industrial workers to directly improve plant performance, uptime, reliability and safety.”

Return to Scene, a company that specializes in visualization and data technologies, has partnered with tech startup Mozenix to develop a mobile AR application, R2S AR. The application supports the digitalization and automation of oil and gas operations. Return to Scene also works with companies like BP and ConocoPhillips on visual asset management.

“Offshore oil and gas assets are complex, adaptive structures with a constant flow of actions being undertaken by international teams,” Martin McRae, Return to Scene’s head of product development and support, stated in a press release. “The systems, which enable these actions, are underpinned by asset registers, which are represented by physical tags attached to equipment. The location of these tags and the ability to visualize data in a certain way is crucially important.”

However, for oil and gas companies to fully leverage the benefits of AR and VR, enough data—and enough of the right kind of data—must be in place. Devon’s Williams said a key initial step to implementing these types of technologies is to collect 3-D images of the needed work locations, rather than just traditional diagrams.

“The emergence of drones and machine learning technologies is allowing us to—at a very low cost—capture and manage 3-D imagery of our field locations such that we can integrate that dataset into what is today a fairly developed set of automation tools for monitoring and identifying,” he said. “So if I can put someone in that space and highlight with AR the equipment that has the problem, then I can speed up someone’s activity out in the field, instead of them having to go out and start from scratch.”

Companies are beginning to leverage the capabilities of digital twins of their assets. According to Baker Hughes, a GE company (BHGE), a digital twin is a digital representation of physical parts, assets, processes or systems. Digital twins continuously collect data from sensors on the assets and apply analytics and self-learning AI to gain insights about their performance and operation, BHGE stated.
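BHGE’s description boils down to a simple loop: mirror the asset’s state from sensor data, then run analytics against an expected baseline. The sketch below is a minimal, hypothetical illustration of that pattern, not any vendor’s implementation; the asset name, baseline and tolerance are invented.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class DigitalTwin:
    """A toy digital representation of a physical asset."""
    asset_id: str
    baseline_temp_c: float              # expected operating temperature
    readings: list = field(default_factory=list)

    def ingest(self, temp_c: float) -> None:
        """Continuously collect data from sensors on the asset."""
        self.readings.append(temp_c)

    def anomaly(self, tolerance_c: float = 5.0) -> bool:
        """Simple analytic: has the recent average drifted past tolerance?"""
        if not self.readings:
            return False
        return abs(mean(self.readings[-10:]) - self.baseline_temp_c) > tolerance_c

twin = DigitalTwin("wellhead-7", baseline_temp_c=80.0)
for t in (79.5, 80.2, 91.0, 92.5, 93.1):
    twin.ingest(t)
print(twin.anomaly())  # True: recent readings have drifted above baseline
```

In a real deployment the analytics would be far richer, but the structure (a live data feed updating a model that is continuously compared against expected behavior) is the core of the digital-twin idea.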

As part of its partnership with Microsoft, Halliburton has implemented AR and VR capabilities for training and field operations and has incorporated these innovations into its DecisionSpace enterprise platform. Halliburton’s Yeleshwarapu said the effect is the ability of the worker to interact with a digital twin of a reservoir or of a wellhead, for example.

“We have the platform and the unique ability to create an oil and gas digital twin,” Yeleshwarapu said. “Microsoft has the ability to provide the capability around AR and VR tools. You put that together with DecisionSpace Well Construction or DecisionSpace Production and you end up with an immersive way to interact with and understand the industry’s only true oil and gas digital twin.”

Schlumberger has implemented VR training systems for onboarding new employees who have no experience in the field and, more specifically, for its cementing downhole tool systems.

Steve Uren, head of simulation at Schlumberger, said, “The training for the cementing downhole tool systems evolved from what was primarily a traditional classroom environment into a VR simulation of the actual working environment of the system.

“We redesigned the entire program with some prework requirements. Once the employees arrive at the learning center, they engage in daily activities in the VR environment,” he said. “We basically removed all of the traditional classroom activities.”

Uren said the purpose of utilizing a VR environment rather than traditional classroom methods was to increase the fluency of specialists, make them more comfortable working in the field and enhance trainees’ ability to better process operational steps.

“In December 2017 Schlumberger rolled out a VR onboarding training program for new entrants to the industry, such as engineers and technicians who typically have not had field experience,” Uren said. The VR training environment simulates a variety of environments, such as a land rig, offshore jackup rig and offshore semisubmersible unit.

“The trainees can explore the general arrangement of the rig,” he said. “New employees are given an accelerated introduction to the different rig types they will experience; they have the opportunity to walk around and understand where the equipment is and what the equipment does.”

Uren added, “Feedback from the training has been very positive with employees embracing the change of learning environment and tools.”

The upgradable workforce

The leading case for many technological innovations in the oil and gas industry has been improved safety and efficiency. Another component of the suite of advances that includes drones, robotics and AR/VR is wearable technology—physical devices worn by industry workers that augment their environments, monitor their physical functions and even track their movements.

BP has applied a variety of wearable technologies in its operations, particularly tagging devices that track employees to improve safety and optimize worker performance. BP Technology Principal Blaine Tookey said such devices have been met with an overwhelmingly positive response where they have been implemented.

“We’re moving into scale deployments in some facilities where [managers] are saying ‘This can really make a difference to our operations,’” Tookey said. “This is really impactful for their emergency and safety performance. But also they can see the value added in day-to-day operations around understanding how people move and how we can support them better.”

He said companies typically rely on a worker to communicate physical or environmental problems they may be experiencing while on the job, which is a challenge wearable tracking devices potentially solve.

“We’re leaving it up to them to report that they have an issue,” Tookey said. “People might be getting fatigued or they might be getting dehydrated, but now we’ve got the capabilities through wearables to monitor those key parameters and understand in advance whether their performance might be dropping off and call them up and tell them to take a break, tell them to get some water and even intervene in more problematic issues.”

As wearable devices become more commonplace and accepted in the industry, they could become a part of a worker’s usual personal protective equipment that all field or facility workers take out with them, Tookey said.

“This opens up brand new opportunities—particularly in biometrics and streaming video, neither of which people commonly use in a facility or as a wearable at the moment,” he said. “We’re also looking at exoskeletons, which are basically structures that you strap to yourself, allowing for better endurance, safer lifting, safer holding and longer carrying.”

Although Tookey said wearable devices are still in the early days of being applied widely in the industry, there may soon come a time when wearables are in high demand by industry workers and may even become essential tools.

“In the longer term, wearables will evolve to more sophisticated monitoring and visual/cognitive aids essential for the work role and be seen as normal upgrades,” Tookey said. “People will wonder how they ever worked without them in five years’ time. And then [wearables] will develop to the point where people will perform significantly better with them. What you may see in 10 years’ time is people demanding, ‘I want to be upskilled, I want to have more capability, and I can’t compete without a wearable to help me understand the world and do my job.’”