Introduction

Collaborative Visualization Environments (CVEs) have proven highly effective over the past 10 years, bringing decision teams together around their data to both save money and enhance results. This has been especially prevalent in geophysical exploration and interactive drill-path planning work processes. These environments were first introduced simply as larger dual monitors on a UNIX workstation that allowed a few people to collaborate, and have progressed to today’s very large, impressive theaters with bright stereo projection, state-of-the-art surround sound systems and, in some cases, more than 100 comfortable theater-style seats. The success cases documented from the use of these environments are many and compelling, justifying their broad proliferation in the energy industry and the creation and growth of a number of companies that construct them and provide specialty software for them.

A catalyst for the popularity of the CVE in energy has been the rapid growth of 3D seismic volumes. Offshore data over salt or complexly folded geometries ushered in long-offset surveys so that robust pre-stack depth imaging could be performed. A litany of attributes is calculated and fed into inversion programs to extract rock and fluid property clues. Modern interpretation applications can calculate many of these attributes "on the fly", increasing the spectrum of data and information applied to the analysis and adding clarity, resolution and fidelity to the decision. As a result, project models have expanded by orders of magnitude, the analysis has become multi-disciplinary and the need for a new and different type of environment in which to view it has evolved. Along with this, the value of the investment portfolio has increased.

The remarkable increase in productivity enabled by the combination of powerful volume-visualization software and high-performance graphics engines paves the way for extending their use to real-time analysis of data sourced from distributed sensor networks. The use of volume visualization has increased the interpreter’s bandwidth 100,000-fold in 10 years. Not only does this allow massive amounts of data to be analyzed in a fraction of the time it previously took, it also brings these tools into the Real-Time Reservoir Monitoring (SmartField) domain, because the data can be processed as fast as it arrives. New data streams enabled by this real-time capability will monitor the changing reservoir, in some cases giving the first real views of its time-variant behavior, so that operational changes can be made as they are needed to retain optimum operational performance.

Collaborative Visualization Environments – Business Benefits Yesterday and Today

The advent of computerized seismic interpretation in the mid-1980s made it clear that the approach to interacting with seismic data had irreversibly changed for the better. The early adopters of this new work process paved the difficult road to making these tools user-friendly enough for everyday use, sometimes earning deep scars in exchange for their unwavering evangelism. These early tools were clunky, had poor data import/export routines, created small views of previously wall-sized displays, and the resultant "automated" maps all had to be redrawn to look geologic. But they delivered value by increasing the interpreter’s bandwidth, accelerating interpretations and generating new, better-constrained prospects in record time. This created measurable value in both the exploration portfolio and exploration success, and illustrated the potential yet to be realized.

The mid-1990s ushered in the next revolution in visualization: the interactive volume display. This dramatically increased both the amount of data that could be seen and the rate at which it could be reviewed. Early versions allowed visualization but not interpretation, so the interpretation work remained on 2D sections, which were imported into the volume displays for QC. The benefits of seeing the data and the interpretations in a volume setting were immediately obvious, highlighting jumped legs and disconnected fault segments that were serious errors in the interpretation. Seeing, finding and repairing interpretation errors not evident from 2D examination both reduced the likelihood of drilling wells into low-probability prospective areas and preserved play concepts that might have been abandoned had the well been drilled and found to be dry.

Fig. 01

Around this time, high-resolution projectors were becoming available and many conference rooms were being equipped with them. Setting up a collaborative session meant putting your workstation on a handcart and wheeling it down to the conference room, where it was connected to the projector. This proved highly effective in gathering larger groups of people around the data and discussing the various scenarios the data might imply.

Today’s big wins are in modern visualization rooms outfitted with state-of-the-art projection systems that are bright enough to leave the room lights on and are back-projected for easier interaction with the data. Today’s rooms are built either as the theaters we know so well, as asset rooms to manage reservoirs or, increasingly, as Remote Operations Centers (ROCs) that consolidate all the relevant information from an oilfield operation into a common space and bring the operators together with the field managers around their data to reduce cost and increase production.

The application category that has generated the most immediate and measurable value has been the interactive geosteering of drilling wells. These applications, which combine large volumes of seismic data with immersive tools to plan and monitor complex well paths, have cut up to hundreds of millions of dollars from drilling programs and added hundreds of millions of dollars’ worth of additional oil to the tank. The biggest cost savings come from two factors. First, reservoir planning in this environment has been proven to greatly reduce the number of wells necessary to recover the oil. Second, the incorporation of precision directional and formation evaluation data from measurement-while-drilling systems tells the earth scientists how to modify their geological model based on new drilling information. The revised model is then sent to the drilling engineers, who can recalculate the drilling plan in real time. By adjusting the plan while drilling, targets that would have been missed due to subtle variations in the seismic velocity model are now hit the first time, eliminating the plug-back and redrill procedures of the past.
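To put a rough scale on that sensitivity, the following purely illustrative calculation (the travel time, velocities and error are invented, not taken from any field) shows how a small velocity-model error becomes a vertical miss at a planned target:

    # Illustrative only: how a small error in the average velocity used for
    # time-to-depth conversion shifts a planned target in depth.
    two_way_time_s = 2.0          # two-way travel time to the target horizon (s)
    planned_velocity_ms = 3000.0  # average velocity assumed in the original plan (m/s)
    actual_velocity_ms = 3060.0   # the "true" velocity, 2% faster than assumed

    planned_depth_m = planned_velocity_ms * two_way_time_s / 2.0
    actual_depth_m = actual_velocity_ms * two_way_time_s / 2.0

    print(f"Planned target depth: {planned_depth_m:.0f} m")
    print(f"Actual horizon depth: {actual_depth_m:.0f} m")
    print(f"Vertical miss without an update: {actual_depth_m - planned_depth_m:.0f} m")

In this hypothetical case a 2% velocity error over a two-second target produces a 60 m miss in depth, which is why updating the velocity model and the well plan while drilling matters.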

The increase in recoverable oil derives from the wells being more precisely placed in the reservoir. As the well is being drilled and new information is incorporated into the plan, well paths can be tuned to penetrate optimum locations, recovering maximum reserves from the well.

Major companies around the world have found that assets using these tools deliver up to $400 million more value than those that do not.

Hardware and Software Infrastructure

The hardware and software information technology core needed to deliver these extended capabilities is mostly available and proven, as showcased in the large installed base of Reality Centers and the growing number of Remote Operations Centers.

The hardware for the new trend, Remote Operations Centers, has at its core powerful visualization servers that handle the large volumetric data and the interactive applications for seismic interpretation, earth modeling, interactive well planning and geosteering. But the visualization hardware and the applications only display pictures. The pictures are derived from modern computational servers connected via high-speed dedicated networks – Storage Area Networks – using shared file systems, so that files are not replicated or transferred across the networks for analysis or visualization. This balanced system provides rapid and ready access to the data, displayed either on large-format screens or on the desktop.
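A minimal sketch of the access-in-place principle behind that arrangement (the file path, volume dimensions and data type below are hypothetical) is a process on a visualization or analysis server memory-mapping a seismic volume that lives on the shared file system and reading only the slices it needs, rather than copying the whole file across the network:

    import numpy as np

    # Hypothetical volume geometry and location on the shared (SAN) file system.
    N_INLINES, N_XLINES, N_SAMPLES = 2000, 1500, 1000
    VOLUME_PATH = "/shared/projects/survey_a/amplitude.f32"  # illustrative path

    # Memory-map the volume: the file stays on shared storage, is never
    # replicated, and only the pages actually touched are read.
    volume = np.memmap(VOLUME_PATH, dtype=np.float32, mode="r",
                       shape=(N_INLINES, N_XLINES, N_SAMPLES))

    # Pull a single inline section for display; only this slice is paged in.
    inline_section = np.array(volume[1000, :, :])
    print(inline_section.shape)  # (1500, 1000)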

Fig. 02

Exciting new concepts and approaches to supporting Remote Operations Centers are under development or being integrated. Principal among them are innovative ways of fusing media in a collaborative format, bringing data from multiple sources into single, common views. This updates the approach in which individual monitors or screens house windows dedicated to separate applications. The Media Fusion approach brings multiple data sources together into a single windowed view where teams can assess, interact with and evaluate the data, and make decisions.

New approaches to monitoring both subsurface and surface operations will generate new streams of data that mandate real-time assimilation into distributed decision models via graphical presentation to distributed teams. Examples include self-organizing networks that link widely yet densely distributed sensors into a mesh network of monitors in the field, 4D seismic and new methods for passive (always-on) seismic.
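On the receiving side, the pattern these networks imply is many independent sensors pushing timestamped readings to an aggregation point that assembles them into a time-ordered stream for the decision models and displays. The sketch below is a purely illustrative, self-contained simulation of that pattern; the sensor names, rates and values are invented:

    import queue, random, threading, time

    readings = queue.Queue()  # stand-in for the field network feeding the operations center

    def sensor_node(sensor_id: str, interval_s: float) -> None:
        """Hypothetical downhole pressure sensor pushing periodic readings."""
        for _ in range(5):
            readings.put({"sensor": sensor_id,
                          "time": time.time(),
                          "pressure_kpa": 20000 + random.uniform(-50.0, 50.0)})
            time.sleep(interval_s)

    # A few self-reporting nodes stand in for a real self-organizing mesh.
    nodes = [threading.Thread(target=sensor_node, args=(f"node-{i}", 0.1)) for i in range(3)]
    for n in nodes:
        n.start()
    for n in nodes:
        n.join()

    # Aggregate into a time-ordered stream for downstream models and displays.
    stream = sorted((readings.get() for _ in range(readings.qsize())), key=lambda r: r["time"])
    print(f"{len(stream)} readings collected from {len(nodes)} sensors")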

Application and middleware fusion software will drive the decision environments, linking operations teams with reservoir management and modeling teams. Scalable, extensible, reconfigurable, secure, distributed and system-balanced storage, computing and visual serving systems will be the infrastructure upon which existing and new powerful software drives this new business process. This business process will be the catalyst for dramatically reducing operating costs while simultaneously increasing production, extending the economic life of the fields and increasing ultimate recovery from the reservoirs.

New Data Sources, New Transmission Backbones

A virtual onslaught of sensors, such as fiber-optic arrays, wireless battery-powered sensors in self-organizing networks, 4D seismic and passive seismic, will challenge the capacity of every aspect of the infrastructure, but the business benefits they deliver will justify the investments in maturing and deploying these technologies and the infrastructure. Broadband wireless infrastructure on the horizon and the current capacity for self-organizing mesh networks will facilitate the deployment of much of this equipment in operations where running copper or fiber cable is prohibitively expensive. As these technologies mature, they will create economically viable deployments of the infrastructure in areas never before possible, and will even replace aging wired infrastructure as a more economic solution.

Fig. 03

A few forward-looking researchers are envisioning and designing deployments of IPv6 which, when coupled with broadly deployed, wireless, self-organizing nano-sensors, will deliver real-time information from virtually every square meter of the earth's surface, subsurface, air and space. GPS-aware systems will communicate with data management environments to query both static and time-variant data associated with locations or regions. Queries will be analyzed by high-performance computers and the results transferred to visualization servers for transmission to decision makers or operators. The productivity of this system will depend on the performance of the information technology infrastructure, which will be defined by its smallest bottleneck. IT departments will need to assess where these bottlenecks are, what they cost the business and what it costs to eliminate them, thus providing a measurable ROI for the solution.
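The location- and time-aware query described above reduces, at its simplest, to filtering timestamped, geolocated readings by a bounding box and a time window. The sketch below illustrates that idea only; the field names, coordinates and in-memory store are hypothetical stand-ins for a real data management environment:

    from dataclasses import dataclass

    @dataclass
    class Reading:
        lat: float    # degrees north
        lon: float    # degrees east
        t: float      # seconds since some epoch
        value: float  # whatever the sensor measures

    # Tiny in-memory stand-in for a geospatial, time-variant data store.
    store = [
        Reading(29.75, -95.36, 1000.0, 0.41),
        Reading(29.76, -95.35, 2000.0, 0.44),
        Reading(30.10, -94.90, 1500.0, 0.39),
    ]

    def query(store, lat_range, lon_range, t_range):
        """Return readings inside a geographic bounding box and time window."""
        return [r for r in store
                if lat_range[0] <= r.lat <= lat_range[1]
                and lon_range[0] <= r.lon <= lon_range[1]
                and t_range[0] <= r.t <= t_range[1]]

    hits = query(store, lat_range=(29.7, 29.8), lon_range=(-95.4, -95.3), t_range=(0.0, 1800.0))
    print(len(hits))  # -> 1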

Challenges – Not All Technical

It is clear that many aspects of the sensor-network-enabled oilfield are not mature, and the eventual total solution may not be something we can see clearly today. It is certain that as these technologies develop, we will face unforeseen technical challenges that appear to threaten the viability of the approach. Identifying these roadblocks and mitigating them early will be a great but necessary challenge.

Solutions are never broad-brush. They must apply to specific real-world problems with measurable and predictable results. But every intervention into established routines carries risk, and that risk must be assessed and managed relative to the value it might create. The typical risk evaluation focuses on technical risk, which can sometimes be significant. But more often than not, the major risk is not the technology but rather its successful deployment, including and especially the transformation of the work process and company culture needed to take the greatest advantage of it. Smart people have tackled difficult technical problems time and again in the energy industry, delivering extraordinary value. Moving these breakthroughs from prototype to standard operating procedure, however, can take decades, even after the technology is well demonstrated to create significant value.

Especially in operational settings, companies do not want serial number 1 of any product. And in today’s oil business, research and development has been largely outsourced to software, hardware and service vendors. Commercialization of new technologies must occur if the service industry is to conduct future research; thus robust demonstrations of value to the industry must be made as early as possible to a willing audience. Early adoption will be mandatory for operating companies that want to ride an earnings growth curve fueled by innovation. Discovering appropriate and affordable ways to demonstrate both value and potential in the early phases of commercialization will be a shared challenge for the providers and consumers of new approaches.

Tomorrow and Beyond

The oil reservoir remains vastly under-constrained by data, even after a century and a quarter of drilling, mapping, and collecting and analyzing data. Seismic provides high spatial resolution compared with wells, while wells provide high vertical resolution compared with seismic. The combination is a sparsely sampled subsurface world, yet we make, in some cases, multi-billion-dollar decisions from it. Manufacturing process improvement has conclusively shown that measuring a process improves its efficiency.

Even with the tsunami of new data on the horizon, the fields will remain under-constrained, but this new information can and will create tangible and measurable business benefits if it is managed, processed and delivered to the decision makers in a fast, secure, robust and collaborative manner. How do we add high-density, high-resolution constraints to the subsurface? How do we measure the behavior of the reservoir in enough detail to modify it to meet business models? How do we manage, in a distributed work process, the enormity of the data that will come from successfully answering these first two questions?

Tomorrow’s information technology will deliver this, and deliver it transparently to the user. It must. Otherwise, it will not be used.

End

About the Author(s)

Bill Bartling is Senior Director of Market Strategy for SGI, responsible for SGI’s strategy and position in the oil and gas industry. Prior to joining SGI in April 2001, he was President/CEO of SciFrame, Inc., a software company providing leading-edge collaborative solutions to the energy and pharmaceutical markets. Prior to SciFrame, he was manager of technical computing for Occidental Oil and Gas, where he set strategy, architected solutions and implemented E&P technology to pioneer the use of collaborative volume earth models in large-format visualization centers. Bartling has also served as senior vice president of software engineering for CogniSeis Development, where he oversaw the development of their flagship products Focus, VoxelGeo and GeoSec. Before joining CogniSeis, he spent 15 years in exploration, production and research with Chevron Corporation, where he was involved in the use, testing and design of computer interpretation and analysis products. Bartling holds an M.S. degree in geology from San Diego State University and a B.A. in biology from the University of California at Santa Barbara.

David Hughes has a background in product design, structural analysis and the flight simulator business, where he developed some of the world’s first simulation software products and solutions on the first SGI systems. He joined SGI in 1987 as a technical specialist, supporting the simulation and animation marketplace.

David’s involvement in 3D simulation has continued throughout his career at SGI, where he has held roles in support, sales and global marketing, including the position of global market manager for the simulation industry, when he was based at SGI’s headquarters in Mountain View, California. David returned to the UK in 1993, when he conceived, developed and then operated the world’s first SGI Reality Centre™, a centre of excellence for 3D visualisation. He then managed and oversaw the SGI Reality Centre™ solution, working to help business and industry solve their toughest business challenges through visualisation technology. After establishing this technology as a global business, he has been developing strategies that enable customers to use Reality Centre technology for remote collaborative working on a global basis, resulting in higher availability, increased productivity and wider access. As part of this, and unusually for someone in market development, he filed for a patent on a key innovation for intuitive user interaction in virtual environments. David is currently responsible for managing the growth of SGI’s graphics and visualisation business across the EMEA region.

David recently received the UK Information Technology Professional of the Year award for his work in pioneering the Reality Centre concept and its business success. He is a Fellow of Teesside University and was recently appointed Honorary Professor at Glasgow University.

He lives in the UK with his wife and 2 sons aged 9 and 11.

Eric Frost has been on the faculty of the Geological Sciences department at San Diego State University for 25 years, working on a variety of structural geology, remote sensing and visualization projects. He is currently Co-Director of the Center for Information Technology and Infrastructure (CITI) and the SDSU Visualization Center. His work focuses on the use of real-time sensor networks that are aggregated into a command and control center, where data fusion products are developed and visualized for decision support. These decisions are then returned to the field for action in disciplines such as Earth science, Homeland Security and law enforcement. He has worked with the US IPv6 consortium and is helping implement applications in remote sensing and Homeland Security. Frost is also involved in several major NSF projects, such as the OptIPuter, to help build the optical networks and Internet capabilities of the future, working with the California Institute for Telecommunications and Information Technology, or Cal-(IT)2. Frost also works extensively with areas such as Central Asia (Kazakhstan, Kyrgyzstan…) and SE Asia to help with humanitarian and development uses of computing, visualization and communication. He teaches primarily graduate classes in Collaborative Visualization, Sensor Networks, and Earth System Science, and taught Extensional Tectonics for the Chevron Structure school for ten years.

About SGI

SGI, also known as Silicon Graphics, Inc., is the world’s leader in high-performance computing, visualization and storage. SGI's vision is to provide technology that enables the most significant scientific and creative breakthroughs of the 21st century. Whether it's sharing images to aid in brain surgery, finding oil more efficiently, studying global climate or enabling the transition from analog to digital broadcasting, SGI is dedicated to addressing the next class of challenges for scientific, engineering and creative users within the government and defense, sciences, manufacturing, energy, and media markets. With offices worldwide, the company is headquartered in Mountain View, Calif., and can be found on the Web at www.sgi.com.

