Philippe Doyen is known for his early work on the determination of porosity from seismic data using co-kriging techniques. Since then he has made several technical contributions in reservoir modelling, uncertainty analysis and quantitative 4-D seismic interpretation. After graduating from Stanford University in 1987, Philippe worked for Western Geophysical in Houston and London. Shortly after the merger of Western Geophysical and Geco-Prakla, in January 2002 he was transferred to Schlumberger and appointed Research Director for Schlumberger Information Solutions. Philippe joined CGG in October 2003 and is currently Product Manager for Seismic Reservoir Characterization in London.
Please tell us about your educational background and your work experience.
I first got a Mining Engineering degree from the University of Louvain in Belgium. I always liked mathematics, physics and rocks, so studying geophysics was a natural step for me to combine the things I enjoy. Opportunities for graduate studies in geophysics were limited in Belgium, so I obtained a fellowship to pursue a Masters degree in the US. I chose to go to Stanford because the university has a great school of earth sciences. I ended up doing a Masters in exploration geophysics and a PhD in rock physics there. Upon graduating in late 1987, I joined Western Geophysical, where I started as a scientist in a small reservoir geophysics R&D group in Houston. I was transferred to London in late 1989 and worked with Western as an R&D manager for several years, managing a small team of research geoscientists and software engineers. We were working on the development of interactive software tools for seismic reservoir characterization using geostatistics and rock physics. My team developed the SigmaView® system, one of the first commercial tools available for seismic-guided mapping of rock properties using geostatistical techniques. The tool was developed in collaboration with Mobil. The software also included a unique dynamic graphics environment for data analysis and well-seismic calibration. With the same team, I worked on the development of a 3-D seismic-based geomodeling package called EarthGM. I also participated in a large number of integrated reservoir studies involving the construction of 2-D and 3-D geomodels by seismic and well data integration. It was really fun to be involved in these projects from different oil provinces in the North Sea, Middle East and South America and to successfully apply the tools we had developed.
So you stayed with Western Geophysical. How did the move to Schlumberger take place?
Just after the creation of WesternGeco through the merger of Western Geophysical and Geco-Prakla, I was transferred to Schlumberger, where I became research director for Schlumberger Information Solutions (SIS). It was an interesting and challenging job where I was managing different research programs with activities in geometric modeling, uncertainty quantification, reservoir geology and geophysics, production optimization, financial risk analysis and petrophysical laboratory measurements. It was hectic as I was commuting between Cambridge in England and Ridgefield, Connecticut, where the different departments were located.
In 2003, you moved to CGG. What responsibilities does your present job entail?
I am currently product manager at CGG for all software products related to seismic reservoir characterization. I manage R&D activities related to the development of new tools involving seismic inversion, rock physics, geostatistics, 4-D interpretation and seismic fracture prediction. A lot of the tools we develop are integrated in a software platform called StrataVista, which we use for example for prestack elastic inversion. We are working on the development of a new petrophysical inversion technique, which allows us to directly invert pre-stack seismic data for rock properties such as porosity. This is the technique I presented during my talk at the CSEG. We are also working on stochastic inversion and uncertainty analysis.
What areas of geophysics fascinate you in particular and why?
I have always been interested by the issue of data integration. In particular, how can we combine seismic and log data to construct improved reservoir or earth models. Workflows in oil companies are becoming more and more earth model–centric in that a lot of the decisions about field management such as planning of new wells are made with the help of a numerical earth model. There is therefore a strong demand for better integration of geophysical information in earth modeling workflows. Recently, we see great interest in issues such as rapid update of earth models, integration of 4-D seismic information and better quantification of subsurface uncertainties. All these topics are also of great interest to me.
Your earlier work on the determination of porosity from seismic data using collocated cokriging was interesting. Tell us about the research work you did later.
As I said before, my main research interest has always been improved seismic reservoir characterization and earth modeling. Over the years, I was involved in the development of a number of techniques and workflows linked to this theme. I have worked on a number of new stochastic simulation and estimation techniques that can be used to combine seismic attributes and log data for improved rock property modeling. For example, with my colleagues, I developed Bayesian updating techniques that go beyond traditional sequential simulation schemes and allow flexible incorporation of seismic constraints. I also worked on the issue of seismic downscaling; that is, how to constrain a fine-scale 3-D reservoir model with band-limited seismic data. To me this remains an important topic that is only partially solved. If you talk to engineers, you find that they are often skeptical about using seismic to constrain reservoir models because of the limited vertical resolution and their need to construct vertically detailed models for flow simulation. In the late 1990s, I was also involved in an interesting multi-year project with Statoil where we worked together on the development of the “4-D earth model” concept. The idea was to see how we could link 4-D seismic data with flow simulator outputs. We also worked on the problem of estimating changes in saturation and pressure from time-lapse data using rock physics and geostatistics. At Schlumberger, I was involved in an interesting project on seismic pore pressure prediction and the development of uncertainty propagation techniques. The idea was to build a 3-D probabilistic mechanical earth model where we can store uncertainties about all the variables used in predicting pore pressure. We developed stochastic simulation techniques to propagate all uncertainties in the rock physics models that are used to link seismic attributes to pore pressure.
This was done in full 3-D so we could assess uncertainties in the mud weight window along any proposed well trajectory. At CGG, there are several exciting R&D projects in which I am involved, including stochastic inversion, direct inversion for rock properties and a new Bayesian lithology classification tool that uses a flexible non-parametric modeling technique.
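To make the Bayesian updating idea concrete, here is a minimal sketch of the Gaussian cell-by-cell update rule behind seismic-guided mapping: a kriged well-data estimate (the prior) is combined with a collocated seismic-derived estimate (the likelihood). The function name and all numbers are illustrative, not from any actual product.

```python
import numpy as np

def bayesian_update(prior_mean, prior_var, seis_mean, seis_var):
    """Gaussian Bayesian update: precision-weighted combination of a kriged
    prior and a collocated seismic-derived likelihood, cell by cell."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / seis_var)
    post_mean = post_var * (prior_mean / prior_var + seis_mean / seis_var)
    return post_mean, post_var

# Two cells: kriging variance is high far from wells (cell 1), low nearby (cell 2)
prior_m = np.array([0.20, 0.22])    # kriged porosity
prior_v = np.array([0.010, 0.002])  # kriging variance
seis_m = np.array([0.30, 0.30])     # porosity from a seismic attribute regression
seis_v = np.array([0.010, 0.010])   # calibration variance of the regression

post_m, post_v = bayesian_update(prior_m, prior_v, seis_m, seis_v)
# Far from the well, the estimate moves halfway toward the seismic value;
# near the well, it stays close to the kriged value.
```

The precision weighting makes the seismic constraint strongest exactly where well control is weakest, which is the practical appeal of this family of techniques.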
Tell us about the most challenging project you have done and how it turned out to be one you won’t forget.
It is difficult to choose a single most challenging project. To me, as an R&D guy, the main challenge in seismic reservoir characterization comes when you try to apply the technology you have developed to real-life reservoir projects. Invariably you find that reality is more complicated than you imagine. You also find that every reservoir characterization project is different and you need to customize the workflow each time. When I was working at Western Geophysical, I was leading a team of geoscientists and software specialists. We were doing R&D in seismic reservoir characterization, developing interactive software tools and applying these tools in the context of complex reservoir characterization or monitoring projects. Integrating these tasks in a single team was very challenging but also rewarding. It meant that the software tools were developed with a good practical knowledge of real problems. We ended up developing software systems such as SigmaView and EarthGM that were very flexible and could cope with difficult problems. In fact, this relates to a concern I have with some geomodeling tools that are on the market today. They tend to be push-button applications supporting linear workflows. If you try to deviate from the set workflow, which you often need to do, you find that you cannot do what you want.
To give you a concrete example of a challenging and rewarding project, I would like to mention a study I did with Saudi Aramco whilst working at Western Geophysical around 2002. The goal of the project was to construct a large reservoir model of the Khuff gas reservoirs in the northern part of the Ghawar Field by combining log, core and seismic data. Lennert den Boer (a colleague of mine at the time) and I had to spend several weeks in Dhahran to work on the project using the EarthGM software. In a few weeks and in close collaboration with Saudi Aramco staff, we were able to construct a detailed seismically constrained model of lithology, porosity and permeability. The project was really challenging as we had to combine many different types of data, including vast amounts of core data, which are very important for complex carbonate reservoirs. The project had an important impact on the calculation of gas reserves for the field and was used to plan several successful producers. We actually ran our software in Saudi Aramco’s virtual reality room and used it to plan the new well trajectories and extract predictions of the reservoir properties that were later compared to the logged results from the new wells. The project was really enjoyable and involved close integration of several disciplines. We had many meetings around the workstation and in the virtual reality room with Aramco petrophysicists, geologists, geophysicists and reservoir engineers, and to me this was a great example of an integrated project.
What advances are being made in the area of seismic reservoir characterization?
In the last few years, we have moved from acoustic inversion to elastic inversion where a number of angle stacks are inverted simultaneously to provide estimates of P- and S-wave impedances as well as density when large offset data are available. Using simultaneous elastic inversion we are now able to better estimate rock properties such as lithology, porosity and fluid saturation. We now have 2 or 3 inverted elastic attributes that we can use together to reduce uncertainty in rock property predictions.
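As a toy illustration of using two inverted elastic attributes together to reduce uncertainty, here is a minimal Bayesian (Gaussian) lithology classifier in P- and S-impedance space. The facies statistics and priors are invented for the example; a real tool would derive them from well-log training data.

```python
import numpy as np

def gaussian_loglik(x, mean, cov):
    """Log-density of a multivariate Gaussian evaluated at x."""
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.inv(cov) @ d + logdet + len(x) * np.log(2 * np.pi))

# Invented training statistics for two facies in (Ip, Is) space
facies = {
    "shale": {"mean": np.array([9.0, 4.5]),
              "cov": np.array([[0.4, 0.1], [0.1, 0.2]]), "prior": 0.6},
    "sand":  {"mean": np.array([7.5, 4.0]),
              "cov": np.array([[0.3, 0.05], [0.05, 0.15]]), "prior": 0.4},
}

def classify(x):
    """Most probable facies for one inverted (Ip, Is) pair."""
    scores = {name: np.log(f["prior"]) + gaussian_loglik(x, f["mean"], f["cov"])
              for name, f in facies.items()}
    return max(scores, key=scores.get)
```

Because the two facies overlap in Ip alone but separate better in the joint (Ip, Is) plane, adding the second attribute sharpens the classification, which is the point made above.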
We also see a trend in the industry towards better integration of seismic interpretation and inversion workflows with geomodeling applications. In the past, seismic interpretation was done on specialized 3-D interpretation workstations with no link to reservoir modeling tools. 3-D geomodeling tools have become mature and it is now possible to link, in the same workflow, the structural and stratigraphic interpretation of seismic data with the construction of the stratigraphic grids that provide geomodel frameworks. This link allows a faster update of the geomodel as the seismic interpretation is refined or changed, for example when a new fault is identified from 4-D data. At CGG, we are also pushing to integrate inversion and geomodeling workflows. We perform seismic inversion in a stratigraphic grid that can be used directly by downstream reservoir modeling applications. We have also developed a new interpretation procedure, based on the calculation of seismic dips, which automatically extracts a micro-layer stratigraphic framework from seismic data. This layered framework, which closely follows seismic events, provides a better framework for inversion.
Another recent trend in seismic reservoir characterization is the increased use of rock physics. If you look at (C)SEG or EAGE conferences a few years ago, rock physics was seen as the domain of lab specialists and there were relatively few applications of rock physics models to field projects. Today rock physics has become mainstream and is used routinely to calculate rock property cubes from inversion results. We also see more and more applications where rock physics is combined with geostatistics to evaluate subsurface uncertainties. For example, in a 4-D context, when we want to quantitatively interpret changes in elastic properties in terms of changes in reservoir pressure and fluid saturations, we use rock physics to define a petro-elastic transform. We know that the transform is non-unique in that many different scenarios of reservoir porosity, lithology, fluid and saturation changes will yield the same 4-D elastic response. The solution is then to combine rock physics with geostatistical techniques to look at how uncertainties propagate through the rock physics transforms.
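The propagation idea can be sketched with a simple Monte Carlo loop: sample the uncertain coefficients of a petro-elastic transform and carry each sample through the inverse mapping. The linear transform and all its numbers below are hypothetical stand-ins for a calibrated rock physics model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000  # Monte Carlo samples

# Hypothetical linear petro-elastic transform Ip = a - b*phi, with uncertain
# calibration coefficients reflecting scatter in the well crossplot
a = rng.normal(12.0, 0.3, n)      # intercept
b = rng.normal(18.0, 1.5, n)      # slope per unit porosity
noise = rng.normal(0.0, 0.2, n)   # residual scatter around the trend

ip_obs = 8.5                      # one inverted impedance value
phi = (a - ip_obs + noise) / b    # invert the transform, one sample at a time

p10, p50, p90 = np.percentile(phi, [10, 50, 90])
# The (p10, p90) spread quantifies the non-uniqueness of the transform
```

Instead of a single porosity answer, the observed impedance maps to a distribution of porosities, which is exactly the uncertainty the geostatistical treatment is meant to expose.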
In general, uncertainty assessment is a big topic in the industry today. As a result, we see a lot of interest in stochastic inversion approaches. This technology has been around for several years, since the early work done by Dubrule and Haas at Elf in the mid-nineties. In the last few years, we have seen a lot of interesting work in this area, for example the work by Buland at Statoil, who uses an elegant Bayesian framework and has developed a very fast stochastic inversion scheme. At CGG, we have a collaboration with Total on this topic and we are introducing a new, fast stochastic inversion scheme that inverts multiple angle stacks simultaneously and works directly in stratigraphic grids. This facilitates the integration of the inversion results with downstream workflows, where different realizations from the stochastic inversion can be used to construct multiple geomodel scenarios for uncertainty analysis.
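In the spirit of such linearized Bayesian schemes (a toy sketch only, not any vendor's actual algorithm), a Gaussian posterior for a log-impedance trace can be formed analytically from a convolutional forward model and then sampled to produce the multiple realizations used downstream. All operators and numbers here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40  # samples in one trace

# Linearized convolutional forward model d = G m + e, where m is the
# log-impedance trace and G combines differencing with wavelet convolution
wavelet = np.array([-0.2, 0.9, -0.2])
D = np.diff(np.eye(n), axis=0)                 # reflectivity ~ 0.5 * diff(log Ip)
G = 0.5 * np.column_stack(
    [np.convolve(D[:, j], wavelet, mode="same") for j in range(n)])

# Gaussian prior on log-impedance with exponential spatial correlation
m0 = np.full(n, 9.0)
idx = np.arange(n)
Cm = 0.01 * np.exp(-np.abs(idx[:, None] - idx[None, :]) / 5.0)
Ce = 1e-4 * np.eye(n - 1)                      # data-error covariance

# Synthetic "observed" data generated from one prior realization
m_true = m0 + np.linalg.cholesky(Cm) @ rng.standard_normal(n)
d_obs = G @ m_true + rng.multivariate_normal(np.zeros(n - 1), Ce)

# Analytical Gaussian posterior, then multiple realizations by Cholesky sampling
Cmi, Cei = np.linalg.inv(Cm), np.linalg.inv(Ce)
Cpost = np.linalg.inv(G.T @ Cei @ G + Cmi)
Cpost = 0.5 * (Cpost + Cpost.T)                # enforce symmetry numerically
mu = Cpost @ (G.T @ Cei @ d_obs + Cmi @ m0)
L = np.linalg.cholesky(Cpost)
realizations = mu[:, None] + L @ rng.standard_normal((n, 25))
```

Each column of `realizations` is one equally probable impedance trace consistent with the seismic data and the prior, which is what feeds multiple geomodel scenarios for uncertainty analysis.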
In your experience what would be a good example of technology integration?
A good example of technology integration at CGG in the area of seismic reservoir characterization is the recent development of a technique for direct petrophysical inversion. This technique was developed in the context of a collaborative R&D project with Marathon. It is a good example of combining different technologies related to 3-D gridding, time-to-depth conversion, seismic inversion and rock physics. Traditionally, integration of seismic data in a geomodel is achieved using a cascaded workflow. First, we invert seismic data into estimates of elastic properties. This elastic inversion is typically band-limited and performed in the time domain. In a second step, we convert the inverted data cube from time to depth. Finally, we resample the inverted data into the 3-D stratigraphic grid of the reservoir model and use the seismic attributes to guide the interpolation of well data using techniques such as cokriging.
There are several difficulties in the traditional cascaded workflow. The first one is time-to-depth conversion: the well-seismic petrophysical calibration is very sensitive to depthing errors and usually the velocity field used for depth conversion is not consistent with the velocities obtained from the elastic inversion. Another problem in the traditional workflow is the downscaling of the elastic inversion results: the thickness of individual cells in the geomodel is typically a few meters whilst the inversion results typically have a vertical resolution of tens of meters. This difference of scale is often ignored in traditional cascaded workflows.
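The scale mismatch is easy to illustrate: two very different fine-scale profiles can be indistinguishable once averaged to seismic resolution, which is why downscaling is ill-posed when the scale difference is ignored. The cell sizes and porosity values below are purely illustrative.

```python
import numpy as np

dz_fine, dz_seis = 2.0, 20.0           # metres: geomodel cell vs. seismic resolution
n_avg = int(dz_seis / dz_fine)         # fine cells per seismic-scale sample

# Two very different fine-scale porosity profiles...
prof_a = np.array([0.05] * 5 + [0.25] * 5)   # tight streak over a porous streak
prof_b = np.full(10, 0.15)                   # homogeneous interval

# ...average to the same value at the seismic scale, so the seismic
# constraint alone cannot tell them apart
coarse_a = prof_a.reshape(-1, n_avg).mean(axis=1)
coarse_b = prof_b.reshape(-1, n_avg).mean(axis=1)
```

Both intervals carry the same seismic-scale average, yet their flow behaviour would be completely different, which is exactly the engineer's concern about seismic-constrained reservoir models.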
The new workflow for direct petrophysical inversion is quite different. We start with an existing, fine-scale 3-D geomodel in depth, constructed, for example, by well interpolation. We then use a simulated annealing inversion process to update the properties of the geomodel – for example the porosity or fluid saturations – to ensure that 3-D synthetics generated from the reservoir model match the observed seismic amplitudes. In this new application, we combine a number of new technologies. For example we use a multi-domain, multi-scale stratigraphic grid as an inversion framework. Using this grid structure we can automatically map the model attributes from depth to time using the velocities stored in the geomodel. We can also automatically upscale the model attributes from the fine-scale geomodel to the seismic scale. Another key technology involved in this new inversion scheme is rock physics: we use a Petro-Elastic Model (PEM) to link the reservoir properties to the seismic response. This model can be customized in the inversion loop and we can easily test different scenarios.
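A minimal sketch of such a simulated annealing loop, on a single 1-D trace with a toy linear stand-in for the Petro-Elastic Model: perturb the geomodel porosity, generate synthetics, and accept or reject with a Metropolis rule under a cooling temperature. Everything here (the PEM, wavelet, trace length, schedule) is illustrative, not the actual CGG implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
wavelet = np.array([-0.3, 1.0, -0.3])

def pem(phi):
    """Toy linear petro-elastic model: impedance from porosity (illustrative)."""
    return 12.0 - 18.0 * phi

def synthetic(phi):
    """Reflectivity from impedance contrasts, convolved with the wavelet."""
    ip = pem(phi)
    refl = np.diff(ip) / (ip[1:] + ip[:-1])
    return np.convolve(refl, wavelet, mode="same")

# "Observed" amplitudes generated from a known porosity trace
phi_true = 0.15 + 0.08 * np.sin(np.arange(30) / 3.0)
d_obs = synthetic(phi_true)

phi = np.full(30, 0.15)                  # starting model, e.g. well interpolation
misfit0 = np.sum((synthetic(phi) - d_obs) ** 2)
misfit, T = misfit0, 1e-4                # initial misfit and annealing temperature
for _ in range(4000):
    trial = phi.copy()
    i = rng.integers(30)                 # perturb one cell at a time
    trial[i] = np.clip(trial[i] + rng.normal(0.0, 0.02), 0.02, 0.35)
    m_trial = np.sum((synthetic(trial) - d_obs) ** 2)
    # Metropolis rule: always accept improvements, occasionally accept setbacks
    if m_trial < misfit or rng.random() < np.exp((misfit - m_trial) / T):
        phi, misfit = trial, m_trial
    T *= 0.999                           # cooling schedule
```

The key structural point is that the model being updated is the reservoir property itself (porosity here), with the rock physics model inside the loop, so a different PEM scenario can be tested simply by swapping the forward function.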
What is the present status on determination of permeability from seismic data? Do you think we can do this? Can we trust it?
We cannot estimate permeability directly from seismic data. If we are lucky and work hard on the data, we can sometimes estimate changes in porosity. Unfortunately, permeability is usually only weakly correlated with porosity, especially in carbonate rocks where other factors such as rock fabric control the connectivity of the pore space and its flow potential. Time-lapse seismic is probably the tool that gets us closest to obtaining information about rock permeability. With 4-D seismic, we are able to image preferential flow paths for fluids that are injected and produced in the reservoir. We are also able to image flow barriers such as faults that divide the reservoir into separate flow compartments. Again, 4-D provides only indirect information about permeability, and this information is usually used only in a qualitative manner to update the flow model; for example, based on 4-D interpretation, the reservoir engineer may decide to introduce a new flow barrier into the model by playing with fault transmissibility multipliers.
In the industry we now see a lot of interest in more quantitative 4-D workflows where time-lapse data are used directly to update the 3-D permeability fields used by flow simulators. The basic idea is to jointly invert well production and 4-D seismic data and obtain a permeability model that best matches both sets of measurements. This is sometimes called 4-D seismic history matching. To my knowledge, one of the first people to attempt this was Xuri Huang, as part of his PhD work at the University of Tulsa and subsequent work at Western Geophysical in the late nineties. A number of universities and oil companies are now working actively on this topic. An interesting approach based on Ensemble Kalman filtering has been developed at the Centre for Integrated Petroleum Research (CIPR), which is part of Bergen University. This is an elegant technique based on sequential Bayesian updating which allows continuous update of the reservoir model as new production or 4-D data become available. This “real time” data assimilation capability is particularly desirable for permanent seismic installations, where new seismic images may be obtained on a monthly basis and have to be used to update the reservoir model. At CGG, we have a collaboration with Bergen University on this topic and are applying the EnKF technique to update a 3-D permeability model of a large North Sea field by combining 4-D pre-stack seismic inversion with Ensemble Kalman filtering. The technique is also able to provide uncertainty estimates on production forecasts.
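The EnKF update step itself is compact. Here is a sketch with a toy linear observation operator standing in for the flow simulator and seismic forward model; the state is a small ensemble of (log-)permeability-like values, and all dimensions and numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_ens, n_state = 100, 5

# Ensemble of (log-)permeability states, e.g. from geostatistical simulation
X = rng.normal(3.0, 1.0, size=(n_state, n_ens))

def forward(x):
    """Toy observation operator: stands in for the flow simulator plus the
    petro-elastic forward model that would predict production / 4-D data."""
    return np.array([x.mean(), x[0] - x[-1]])

d_obs = np.array([3.5, 0.8])            # "observed" production / 4-D data
R = np.diag([0.05, 0.05])               # observation-error covariance

# Predicted data for each member, plus perturbed observations
HX = np.column_stack([forward(X[:, j]) for j in range(n_ens)])
D = d_obs[:, None] + rng.multivariate_normal(np.zeros(2), R, n_ens).T

# Kalman gain estimated from ensemble covariances
Xa = X - X.mean(axis=1, keepdims=True)
Ha = HX - HX.mean(axis=1, keepdims=True)
Cxh = Xa @ Ha.T / (n_ens - 1)
Chh = Ha @ Ha.T / (n_ens - 1)
K = Cxh @ np.linalg.inv(Chh + R)
X_updated = X + K @ (D - HX)            # updated ensemble, ready for new data
```

Because the gain is estimated from the ensemble itself, no adjoint of the simulator is needed, and the same update can be repeated each time a new monthly survey arrives, which is what makes the method attractive for "real time" assimilation.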
What other interests do you have?
I enjoy teaching. Over the years, I have given a number of industry courses on reservoir geostatistics and seismic reservoir characterization. It is always fun and challenging. I find that you learn a lot yourself from the audience when you give these classes and develop a deeper understanding of the stuff you teach and the technical limitations of the methods you present.
On a more personal basis, when I get a chance, I like to travel with my family to visit new places and enjoy outdoor activities such as hiking, especially in tropical areas of Central America. I have a particular interest in cave diving. I have been doing it for the last 10 years or so. I try to go at least once a year to the Yucatan peninsula in southern Mexico where you can explore miles of underground rivers, beautifully decorated with amazing speleothems. It’s a fascinating place and also quite a challenge to dive safely in this overhead environment.
What would be your message for young geophysicists entering our profession?
With the current boom in exploration and persistent high oil prices, it is probably a good time for young scientists and engineers to enter the oil business and work as geophysicists. The geophysical business is very high-tech and there are enough technical challenges ahead in our industry to keep the next generation busy. I personally have always been amazed by the number of different scientific and engineering disciplines that play a role in our business from applied math, physics and geology to computer science, mechanical engineering and material science to name a few. I also think that a geophysical career in the oil and gas industry is great if you like to travel and discover new places.