The buzz over e-commerce - the start-ups, the mergers and acquisitions, a few notable failures along the way - has probably soured some people on talk of new technology where computers, information systems and the Internet are concerned. Yet anyone in a decision-making capacity within an oil and gas company would be well advised not to turn his or her back on computer technology too soon. At some distance from the e-buzz hype, a quiet but major revolution is under way: computer and Internet technology is poised to change the way oil company employees work in a way not seen since PCs entered the workplace in the 1980s.
Several new developments are either already being deployed or will be commercially feasible within a short time. These can generally be grouped into discussions of visualization technology, Internet collaboration and application service providers (ASPs).
Visualization
Visualization is not particularly new to the oil industry, but acceptance has taken a while to grow, probably because visualization requires people to look at their data in a new way. It would seem geoscientists would welcome the chance to view a three-dimensional rendering of the three-dimensional subsurface. But old habits die hard.
Deborah King Sacrey, a consulting geologist in Houston, Texas, uses visualization software with her clients, many of whom are older geologists used to working on paper. "These guys have a hard time understanding a three-dimensional concept," she said. "They've been drawing maps all their lives, and they can't see it in depth as well."
Another issue that continues to plague the higher end of visualization, including the immersive environments many companies are building, is that compromises often must be made in the equipment to lower costs or make the facility more ergonomic. Tracy Stark and Mary Cole, formerly with Arco, spent several years developing an immersive environment for that company before Arco was bought by BP and its visualization center was donated to the University of Colorado at Boulder as the BP Center for Visualization. Using Arco's first system, researchers found that even though the immersive software these systems require was still in its infancy, the systems delivered great value, and even running standard applications in them produced positive results.
"We quickly discovered that while completely immersive displays were useful, the cheapest and best bang for the buck is getting a lot of folks in and seeing the same thing," Stark said.
Arco went further with its second system, working with MechDyne Corp. to create an environment that could be opened up for a large group or closed into a box-like system in which scientists could be totally immersed in their data. The system was successful, and Stark and Cole recalled several instances in which it saved the company money by prompting interpreters to reconsider well locations or by reducing cycle time.
In one instance, Arco was working with partners to design a fairly complex development scheme. "There's always a communication issue when multiple disciplines collaborate to interpret complex data, but large-scale visualization can improve communication," Cole said. "In this instance we were able to integrate and discuss complex data to quickly agree upon a solution. There were surface hazards on the ocean bottom with gas chimneys and mud volcanoes, and we had 3D seismic but no nearby wells. They were trying to hit multiple targets. It was a complex situation, but the ability to quickly reach a unanimous decision regarding well location made the use of large-scale visualization valuable."
Magic Earth, a privately held company spun off from Texaco, has taken the immersive visualization concept a step further with GeoProbe, a software product developed specifically for these environments. GeoProbe can process enormous datasets in much less time than conventional interpretation software. Its technical highlights include volume-based and multiattribute auto picking; interactive display and interpretation of data in any arbitrary orientation; volume rendering; interactive calculation, display and interpretation of volume attributes; well log and cultural data display; calculation and display of horizon-based attributes; and simultaneous viewing of multiple volumes.
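GeoProbe's actual algorithms are proprietary, but the general idea behind volume-based auto picking can be sketched in a few lines: threshold a 3D amplitude volume, then keep only the connected body of voxels containing a user-supplied seed point. The sketch below is purely illustrative and does not reflect any vendor's implementation.

```python
# Illustrative sketch only - not GeoProbe's actual algorithm.
# Volume-based auto picking ("geobody" extraction): threshold a 3-D
# amplitude volume, then keep the connected body of voxels that
# contains a user-supplied seed point.
import numpy as np
from scipy import ndimage

def pick_geobody(volume, seed, threshold):
    """Return a boolean mask of voxels connected to `seed` whose
    amplitude meets `threshold`. `volume` is a 3-D numpy array
    (inline, crossline, sample); `seed` is an (i, j, k) tuple."""
    mask = volume >= threshold          # voxels bright enough to pick
    labels, _ = ndimage.label(mask)     # number each connected body
    seed_label = labels[seed]           # the body containing the seed
    if seed_label == 0:                 # seed fell below threshold
        return np.zeros_like(mask)
    return labels == seed_label

# Tiny synthetic volume: one bright "channel" plus an isolated bright voxel.
vol = np.zeros((4, 4, 4))
vol[1, 1, 0:3] = 5.0                    # connected bright body (3 voxels)
vol[3, 3, 3] = 5.0                      # bright but disconnected
body = pick_geobody(vol, seed=(1, 1, 0), threshold=4.0)
print(int(body.sum()))                  # 3 - the isolated voxel is excluded
```

Real seismic volumes are orders of magnitude larger, which is why interactive performance on such operations was GeoProbe's selling point.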
But bigger isn't always better. Louis Liro, a senior geologist with Veritas Exploration Services, said that depending on where an interpreter is on a project, workstation visualization is a perfectly viable part of the process. "In the early part of the last decade, visualization was viewed as a novelty that a few people used in very expensive visualization centers using IMAX-like technology," he said. "But it's evolved to the point now where visualization is not defined by the size of the screen. It's defined by the problem you're addressing."
This is not to say Veritas eschews the concept of a visualization center. The company has four such centers around the world, and Liro said the centers are invaluable in certain parts of the workflow. For instance, when a dataset comes in to be interpreted, the team views it in the visualization center with all of the project interpreters and interested clients present. While viewing the data in this setting the team will devise a strategy for tackling the dataset.
"This pays incredible dividends in terms of efficiency," Liro said. "If we start at 8 a.m., by lunchtime we know where we're going with that project."
The team then splits into smaller work groups in workrooms or as individuals using desktop tools. Once this phase is finished, the team will reassemble in the visualization center to QC the data and possibly merge the individual interpretations. Then clients are invited in to view some of the data's points of interest.
"Whether it's individual workstations, group workrooms or visualization centers, it's really dictated by where we are in the project and the degree of interaction we need to have with each other and our clients," he said.
What Sacrey does is even more basic. She uses Kingdom Suite software from Seismic Micro-Technology, which integrates different types of available data and allows the interpreter to see the resultant earth model in three dimensions. Kingdom Suite is not the only software that offers this technology, but it manages to offer it at a cost that doesn't frighten the average accountant.
"One of the differences is scale," said Paul Jones, national accounts manager for Seismic Micro-Technology. "If you look at (some of the other systems on the market), the scale is huge. You're talking about something that's realistically going to cost you (US) $500,000 or more by the time you add the infrastructure. We're presenting a product that's more realistic.
"It puts visualization at everybody's fingertips, brings it down to a working level where you don't have to go to a special room or a special box or a special place. This is something the average consultant or interpreter can afford to have on their desktop."
Sacrey, in fact, has it on her laptop. "On my computer I can go through a dataset, pick my faults and have nice little fault planes in a third less time than I would in 2D data," she said. "That's one of the advantages of using a visualization tool for interpretation."
Collaboration
But most interpreters would probably agree that visualization is only half the battle; collaboration is also crucial. Veritas is close, Liro said, to having its visualization centers remotely connected so interpreters can work on the same dataset from different locations in real time. Major oil companies like Shell also are moving toward this goal.
But this is just one of the ways the Internet is impacting the ability of geoscientists to work together from far-flung locations.
At Landmark the vision is about layers of integration. "We believe that we've done great things with the bottom layer, data integration," said Murray Roth, vice president of exploration, development and information management at Landmark.
"We're moving to a higher level now with workflow integration, and we've been satisfied with how we're able to move a workflow through basin analysis, prospect generation, making development decisions, etc. What we're really targeting now is the highest level of integration, the operational type of integration. If I use that focus, that value chain is not resident within traditional oil companies. It screams for virtual asset teams because it needs the expertise of service companies."
There are numerous ways these virtual asset teams can do their work. Roth said Landmark is installing a workroom at its Denver office to facilitate this type of workflow. Team members have laptops with wireless network cards, and they can congregate in the workroom, flip open their computers and begin working. A stereo projector provides the ability to do 3D visualization.
GeoQuest supplies another model with its DecisionPoint product. Using portal technology, DecisionPoint allows members of an asset team to customize a page that brings up information they need to do their jobs more effectively. A geophysicist, for instance, might want access to seismic lines and a reservoir model. The production engineer might want the production reports and maps of the field highlighting wells of interest. The asset manager might want sales figures and production decline information.
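The portal idea behind a product like DecisionPoint can be sketched as a role-to-widget mapping: each team member subscribes to the information panels relevant to his or her job, and the portal assembles a personalized page from them. The names below are invented for illustration and are not GeoQuest's actual API.

```python
# Hypothetical sketch of role-based portal customization.
# Widget names and roles are invented for illustration only.
WIDGETS = {
    "seismic_lines": "Latest interpreted seismic lines",
    "reservoir_model": "Current reservoir model snapshot",
    "production_reports": "Daily production reports",
    "field_maps": "Field maps highlighting wells of interest",
    "sales_figures": "Monthly sales figures",
    "decline_curves": "Production decline information",
}

SUBSCRIPTIONS = {
    "geophysicist": ["seismic_lines", "reservoir_model"],
    "production_engineer": ["production_reports", "field_maps"],
    "asset_manager": ["sales_figures", "decline_curves"],
}

def build_page(role):
    """Return the widget descriptions that make up one role's page."""
    return [WIDGETS[w] for w in SUBSCRIPTIONS[role]]

print(build_page("geophysicist"))
# ['Latest interpreted seismic lines', 'Current reservoir model snapshot']
```

The design point is that everyone draws from one shared data store but sees a view tailored to his or her discipline, which is what makes the cross-discipline troubleshooting described below practical.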
When problems occur, the different team members can work together efficiently to find a solution. Ron Mobed, vice president and general manager of data management for GeoQuest, described a typical scenario. "Production is falling," he said. "The production engineer knows it because he's been seeing his production reports, and the asset manager knows it because his sales are down. They ask the geophysicist if he can see anything in the reservoir model that explains the sudden decline.
"You can introduce a variety of collaborative tools - a chat room, video conferencing, some kind of data reporting tool. The conversations are captured, and thus the whole decision is captured."
GX Technology (GXT) has a slightly different approach to collaboration. The company offers advanced seismic imaging services, with a special focus on prestack depth imaging. The depth imaging process blends interpretation and seismic processing, requiring unusually close collaboration between the client and GXT's depth imaging experts. In August 2000, GXT launched BLink, a service that uses Internet technology to allow this collaboration without physical presence being a requirement.
The idea was hatched while the company was working on a project with Anadarko. Though both companies are based in Houston, even the 30-minute drive between their offices could make it inconvenient to meet and review interim velocity models and seismic images. With that frustration in mind, GXT developed BLink as a Web-based service to provide Anadarko and its other clients with the ability to remotely collaborate with GXT from their desktops. While BLink can be accessed through the Internet, some clients have established dedicated T1 lines to GXT, providing them with more predictable, high-bandwidth connections to GXT's imagers and supercomputing resources.
"BLink makes it possible for clients, their partners and their GXT imaging teams to jointly review and interpret velocity models and seismic images, all without leaving their offices," said Kevin Donihoo, senior vice president of marketing and product manager for BLink. "Unlike other forms of seismic processing, depth imaging is a very iterative process, alternating several times between rounds of velocity/depth modeling and depth migration. Close collaboration between client interpreters and our imaging team after each iteration has a direct payback in image quality, so we're motivated to do everything we can to ensure that collaboration is as effective and timely as possible."
ASPs
In addition to accessing people over the Internet, geoscientists benefit from being able to access data and software applications. ASPs are in their infancy in the oil industry, and no one is quite sure which model will be the most successful. But being able to use any application at any time to get the job done will certainly impact the geoscience workflow.
Geonet Services (www.geonetservices.com) is one such ASP. Geonet's goal is to operate a "pay-as-you-go" type of service, enabling users to essentially rent software applications on an hourly basis rather than paying for licenses and trying to maintain the internal infrastructure necessary to keep the applications working properly.
"We're moving the way people access technical software from a capital expenditure basis to an operational basis," said Jim Honefenger, vice president of marketing, vendor clients, for Geonet. "Most of the software we're offering takes clients 12 to 18 months to acquire by the time they get it in their capital budget, run it through their purchasing procedure, etc. If we can shorten that time frame, someone can solve the problem much faster using the software that they need or that they want and paying for it through an operational spend. There's an increased efficiency just in that process."
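The capital-versus-operational tradeoff Honefenger describes boils down to a break-even calculation between owning a license and renting by the hour. All figures below are invented for illustration; they are not Geonet's actual rates.

```python
# Hypothetical capex-vs-opex break-even sketch. Every number here is
# made up for illustration; actual license and rental prices vary widely.
LICENSE_COST = 100_000       # up-front license (capital expenditure)
ANNUAL_MAINTENANCE = 18_000  # yearly support/IT overhead on an owned seat
HOURLY_RATE = 60             # ASP rental rate per hour of use

def yearly_cost_owned(years):
    """Average annual cost of owning, amortized over `years`."""
    return LICENSE_COST / years + ANNUAL_MAINTENANCE

def yearly_cost_rented(hours_per_year):
    """Annual cost of renting for a given level of use."""
    return hours_per_year * HOURLY_RATE

# A consultant using the application 300 hours a year pays far less
# renting than owning a seat amortized over three years:
print(yearly_cost_rented(300))   # 18000
print(round(yearly_cost_owned(3), 2))
```

Under these assumed numbers, renting wins for light or intermittent use, while heavy daily use eventually favors owning - which is why the ASP pitch targets consultants and smaller shops rather than large in-house teams.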
Add to that the facilitation of collaborative work environments, centralizing asset team data storage, access to a wider variety of software, reduced need for IT staff and maintenance, and the ability to buy commercial data over the Internet, and ASPs begin to sound like a viable solution. "What the Internet provides is speed and convenience," Honefenger said. "And it's inexpensive."
ASPs will face hurdles, however, until some issues can be worked out. "This is not just a matter of loading software onto a piece of hardware, plugging it into the Internet and saying, 'Have at it,'" Honefenger said. "It's a complicated process to get the applications up and running, to provide authorization to the people that access it and provision all the software, to track how the software is used, and to make sure that the people using it have authorization to use it."
Landmark and GeoQuest also have their versions of ASPs. Landmark has teamed with SAIC to offer Grand Basin. The site will offer competitors' applications as well as Landmark applications, and the idea is to allow oil companies to outsource the headaches associated with software licenses while being able to access the applications that will enable them to make quick, accurate decisions.
GeoQuest has launched LiveQuest, offering users remote access to GeoQuest and Merak applications without downloading the software. The company uses its data management centers as the data and application repositories for this service. While the site hosts only GeoQuest software, Mobed said the company has no objections to hosting competitors' software and in fact outsources the management of applications for some of its users, whether that software is GeoQuest software or a competitor's software.
Many ASP sites are built around the concept of a powerful central ASP location serving a large number of physical locations. GeoQuest has chosen a different route. "When we talk to people about bandwidth restrictions and sensitivities from national governments, we tend to take the view that we're better off leveraging our physical presence in many of these countries," Mobed said. "We have data management centers in 11 different countries already established, so in terms of startup costs and other costs that we'd have to pass on to the customer, we've already established a business around managing the information. Quite frankly, if you physically separate the information and the application, you're probably not going to get some of the advantages of the ASP in the first place."
As long as there are bright minds and high-speed computers, there will be new ways to work in the oil industry. For some of these ideas, the market will be the ultimate judge, but geoscientists will be the ultimate beneficiaries as these new technologies allow them to do more work in less time and get better answers in the process.