Oil and gas companies and their suppliers are under mounting pressure to innovate, keep costs down, and improve the speed, safety, and agility of their exploration and oilfield R&D efforts. But the increasing sophistication of, and demands on, exploration and drilling techniques present both benefits and new challenges. Technologies for reservoir simulation and management; 3-D seismic imaging; and advanced process, production, and completion analytics are enabling companies to explore leads that might not have been possible to drill, or even evaluate, in the past. At the same time, these tools (along with other information streams related to legal issues or leases) are unleashing a tsunami of data that organizations are struggling to integrate, make sense of, and manage.

The crux of the problem is that many organizations continue to rely on previous-generation informatics technologies to handle the complexity and volume of today's generation of R&D data. Attempts to share data or deploy unified processes across the entire E&P operation result in a "patchwork quilt" that can quickly become an integration and maintenance nightmare. Improving E&P efficiency and profitability demands that oil and gas companies find a better way.

Islands of information

Answering the question of whether to drill requires a vast amount of data – everything from 3-D seismic images to reservoir simulations, well designs, charts, graphs, and detailed analytics and reports. In addition, numerous stakeholders need to weigh in, including geologists, geophysicists, chemists, materials scientists, engineers, and executives. Yet all too often, the data required for efficient and effective E&P decision-making is marooned on "islands of information." Critical samples are hidden away in a file drawer in the geology department. Chemical analysis reports are saved on a proprietary system that only the chemists can access. Offshore engineering expertise is tied up in a project four time zones away.

Typically, geoscientists and development practitioners have turned to manual approaches to bring together disparate data sources – spending hours searching through files, reformatting data, and cutting and pasting reports together – or have enlisted IT resources to hand-code customized, point-to-point connections that move information between systems and applications (building the patchwork quilt). But these ad hoc, poorly structured attempts at information management are time-consuming, error-prone, and expensive, especially as organizations seek to shrink their time-to-oil cycles. A simple, automated solution is needed for capturing, managing, processing, and sharing R&D data so that information critical to E&P can be leveraged quickly and profitably. This requires an end-to-end, enterprise-level approach to scientific informatics.

Bringing the 'bigger picture' to E&P

Accelrys' Enterprise Lab Management Suite was created to help chemical R&D organizations move beyond informatics solutions that trap information in silos, create barriers to collaboration, and add unnecessary effort and expense to innovation. Designed to transform a collection of point solutions into an integrated information environment, components of the suite are built on Accelrys' Pipeline Pilot platform, a services-based open architecture that supports the plug-and-play integration of diverse data sources and processes, and has the ability to capture and manipulate complex scientific information.

For E&P companies, this means individual stakeholders can more easily access, analyze, report on, and share data across departments and disciplines, and that IT resources can be freed from the burden of manually supporting the varied requirements of multiple information consumers. Most important, it allows a more integrated picture of all knowledge assets related to E&P, improving decision-making and research speed.

An enterprise solution

A large global oil and gas company deployed Accelrys' Enterprise Knowledge Base (EKB), a professional services solution built on the Enterprise Lab Management Suite, after facing challenges managing its inventory of samples from field assets.

In one particularly vexing case, the company lost, or could not find, a series of samples. A big part of the problem was that, like many other companies in the oil and gas space, the organization still had pockets of information management mired in the Dark Ages even as it deployed state-of-the-art technology elsewhere.

Test drill samples were all handled manually – thin sections glued to card stock, stored in filing cabinets, and mailed halfway around the world when someone needed one. Not wanting to repeat the costly data handling error that resulted in the lost samples, the company sought a more systematic way to capture valuable research, combine its data with other information sources for analysis and testing, and share data with global E&P project participants without putting it at risk of loss.

With the Accelrys solution in place, the company captures sophisticated image data in digital form and makes it immediately available to the entire research enterprise for viewing and analysis. Once in the enterprise system, this data also can be searched easily, automatically integrated with other information for analysis, used to create detailed reports and visualizations, and dropped into automated process workflows that help the company streamline and speed its research efforts. Specific benefits have included:

  • The ability to locate needed field samples in a matter of minutes rather than days or weeks;

  • The ability to track samples, including who is using them and who has requested them; and

  • The avoidance of expensive re-work and lost time that would have resulted from missing or inadequate data.

A further problem has been that the different stakeholders come to the table with different perspectives, different tools and systems, different information hierarchies, and even different ontologies. The Accelrys solution respects these differences by allowing the generation of domain- and concept-specific metadata while making it all accessible, searchable, and actionable as a cohesive whole.
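The idea of domain-specific metadata that remains searchable as a cohesive whole can be illustrated with a minimal sketch. This is hypothetical Python, not the actual Pipeline Pilot or EKB interface; all names (`Asset`, `Catalog`, the metadata keys) are illustrative assumptions:

```python
# Hypothetical illustration only: each discipline attaches its own metadata
# vocabulary to an asset, yet a single catalog query spans all domains.
from dataclasses import dataclass, field

@dataclass
class Asset:
    asset_id: str
    domain: str                                    # e.g. "geology", "chemistry"
    metadata: dict = field(default_factory=dict)   # domain-specific key/value pairs

class Catalog:
    def __init__(self):
        self.assets = []

    def add(self, asset):
        self.assets.append(asset)

    def search(self, **criteria):
        # Return every asset whose metadata matches all given key/value pairs,
        # regardless of which domain produced the asset.
        return [a for a in self.assets
                if all(a.metadata.get(k) == v for k, v in criteria.items())]

catalog = Catalog()
catalog.add(Asset("S-001", "geology",
                  {"well": "W-12", "type": "thin_section"}))
catalog.add(Asset("C-044", "chemistry",
                  {"well": "W-12", "assay": "sulfur_content"}))

# One query reaches assets tagged by two different disciplines:
hits = catalog.search(well="W-12")
```

The point of the sketch is the design choice: each domain keeps its own vocabulary in `metadata`, while the shared catalog makes the collection queryable as one body of knowledge rather than as separate silos.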

Better, faster, more innovative

When it comes to scientific informatics, a flexible, services-based enterprise approach makes it possible for companies to use data generated by sophisticated imaging and modeling technologies, legacy systems, domain-specific databases, and more, while overcoming the integration challenges these myriad systems present. This enables time-consuming and error-prone manual tasks like image retrieval, formatting, processing, and reporting to be automated, which frees IT resources and speeds research efforts. Project participants can share information and work together more effectively. And from a competitive standpoint, E&P companies can drive faster, better, and more innovative research discoveries.