Workshop on Seismic Attributes
Compiled by Satinder Chopra

Chairpersons: Brian Russell & Eric Andersen

Introduction by Session Chair

Seismic Waveform Classification: Techniques and Examples
Eric Andersen and John Boyd

Seismic attribute analysis with well to seismic validation
Eileen Huang

Reconnaissance of geological prospectivity and reservoir characterization using multiple seismic attributes on 3-D surveys: an example from hydrothermal dolomite, Devonian Slave Point Formation, northeast British Columbia, Canada
Strecker, U., Knapp, S., Smith, M., Uden, R., Carr, M., Taylor, G.

Predicting stratigraphy with spectral decomposition
Matt Hall and Eric Trouillot

Azimuthal NMO as an indicator of natural fracturing
Bob Parney, Edward Jenner and Marty Williams

Discussion

The workshop on Seismic Attributes, held May 12 at the 2004 CSEG National Convention, attracted a lot of interest and was well attended. A profusion of seismic attributes has been defined in the last two decades, and this has created problems for seismic interpreters, the foremost being what these different attributes mean and how to extract meaningful detail from the different volumes. The goal of the workshop was to address some of these issues through presentations followed by questions and discussion, which form a valuable part of any workshop. The many questions and answers exchanged during the discussion period confirmed the high level of interest this topic generates.

The first presentation was given by Eric Andersen, who showed that seismic waveform shape and character can define facies and reservoir parameters in far greater detail than traditional time and amplitude mapping. Using examples of a Slave Point carbonate play and a Lower Cretaceous channel play in Western Canada, Eric showed how two different types of waveform classification can be combined with multi-attribute analysis, which concurrently evaluates trends in numerous seismic measurements such as instantaneous attributes, semblance, acoustic impedance and AVO.

The next speaker was Eileen Huang. She elaborated on some of the factors responsible for unsuccessful prediction of the distribution of well log properties in the subsurface, and then discussed how to approach seismic attribute analysis to avoid pitfalls. Using examples from the WCSB and the North China Basin, Eileen recommended a five-step workflow: 1. well-to-seismic calibration using synthetics and study of the energy distribution in the synthetic by reflection decomposition; 2. forward modeling to create 2-D models with fluid substitution or interval velocity substitution; 3. attribute extraction from the seismic model; 4. qualitative and/or quantitative comparison of the seismic data with the model; 5. multi-attribute analysis.
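The first few steps of such a workflow lend themselves to a very small convolutional sketch. What follows is not Eileen’s workflow or software, just a minimal illustration in Python; the wavelet frequency, sample interval and impedance values are assumptions chosen only for the example, and the “fluid-substituted” case is simply a lowered impedance standing in for a proper fluid substitution.

    import numpy as np

    def ricker(f_peak, dt, length=0.128):
        """Ricker wavelet with peak frequency f_peak (Hz), sampled at dt (s)."""
        t = np.arange(-length / 2, length / 2, dt)
        return (1.0 - 2.0 * (np.pi * f_peak * t) ** 2) * np.exp(-(np.pi * f_peak * t) ** 2)

    def synthetic_from_impedance(impedance, f_peak=30.0, dt=0.002):
        """Reflection coefficients from an impedance-vs-time log, convolved with a wavelet."""
        rc = np.diff(impedance) / (impedance[1:] + impedance[:-1])
        return np.convolve(rc, ricker(f_peak, dt), mode="same")

    # Hypothetical 1-D model: a reservoir interval embedded in a uniform background
    imp = np.full(500, 9.0e6)            # background acoustic impedance (assumed value)
    imp[200:230] = 7.5e6                 # brine-filled reservoir (assumed value)
    synthetic_brine = synthetic_from_impedance(imp)
    imp[200:230] = 6.8e6                 # "fluid-substituted" case (assumed value)
    synthetic_gas = synthetic_from_impedance(imp)

Comparing attributes extracted from the two synthetics then shows which, if any, respond to the substituted interval, which is the spirit of steps 3 and 4.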

The next presentation was by Steve Knapp, who advocated the use of geometric attributes in conjunction with relative acoustic impedance and frequency-derived seismic attributes. In a case study from the Slave Point in British Columbia, Steve showed how to use post-stack seismic attributes to identify “sweet spots” in the subsurface. Potential benefits include better prospect economics through early successes, reduced finding costs, and increased value of new and vintage 3-D data.

Matt Hall spoke next and showed how spectral decomposition can be used to understand and predict stratigraphy. Because stratigraphy resonates at frequencies that depend on bedding thickness, and because the high-frequency response of reflectors is attenuated by the presence of compressible fluids, the process helps the interpreter not only image subtle thickness variations and discontinuities and predict bedding thickness quantitatively, but also assists in the direct detection of hydrocarbons. Presenting examples from the Williston Basin and the WCSB, Matt showed how modeling the frequency-dependent responses of different thicknesses, geometries and fluids, including fluid-substitution scenarios, significantly enhances the seismic interpretation.
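The basic mechanics of a short-window spectral decomposition can be sketched in a few lines. This is a generic illustration, not the specific algorithm behind the talk; the window length and the thin-bed rule of thumb in the closing comment are assumptions.

    import numpy as np

    def spectral_decomposition(trace, dt, window=0.064):
        """Sliding-window amplitude spectra of a single trace (a crude time-frequency map)."""
        nwin = int(round(window / dt))
        half = nwin // 2
        taper = np.hanning(nwin)
        freqs = np.fft.rfftfreq(nwin, dt)
        padded = np.pad(trace, (half, nwin - half), mode="constant")
        tf = np.zeros((len(trace), len(freqs)))
        for i in range(len(trace)):
            tf[i] = np.abs(np.fft.rfft(padded[i:i + nwin] * taper))
        return freqs, tf

    # For an idealised thin bed, the spacing of notches in the amplitude spectrum is
    # roughly 1 / (two-way bed thickness in time), which is the basis for quantitative
    # thickness prediction from the decomposed volume.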

The last presentation, by Bob Parney, was about using azimuthal NMO as an indicator of natural fracturing. Fractures occur in directionally aligned sets, so their effects can be measured, and corrected for, by azimuthal approaches. On an offset-sorted gather, the fracture effects show up as apparent scatter on the far traces, and stacking these offsets degrades the stack. If the far offsets are instead sorted by azimuth, a sinusoidal pattern appears, indicating that seismic waves travelling perpendicular to the fractures are slowed down. These sinusoidal patterns can be flattened, and the time shifts stored and converted to velocities. On the one hand this enables correction for azimuthal velocity effects; on the other, the least-squares fit can be converted into a number of velocity volumes that describe the anisotropy, and hence the fracturing, within the reservoir. The methodology was illustrated with a tight gas sand reservoir example from the Rocky Mountains.
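The sinusoidal azimuthal behaviour reduces to a simple least-squares fit. The sketch below is a generic illustration of that idea (a cos 2φ parameterization of residual moveout), not the actual algorithm or products used in the presented work.

    import numpy as np

    def fit_azimuthal_moveout(azimuths_deg, residual_times):
        """Fit dt(phi) = a0 + a1*cos(2*phi) + a2*sin(2*phi) by least squares."""
        phi = np.radians(azimuths_deg)
        G = np.column_stack([np.ones_like(phi), np.cos(2 * phi), np.sin(2 * phi)])
        (a0, a1, a2), *_ = np.linalg.lstsq(G, residual_times, rcond=None)
        min_delay_azimuth = 0.5 * np.degrees(np.arctan2(-a2, -a1))  # azimuth of minimum delay
        magnitude = np.hypot(a1, a2)                                # size of the sinusoid
        return a0, min_delay_azimuth, magnitude

The mean term is the azimuth-independent correction, while the magnitude and orientation of the sinusoid map the anisotropy, and by inference the fracturing, gather by gather.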

Discussion:

In addition to the six presenters, Jazz (R and R. GeoTech AS) and Dave Gray (Veritas) were the participants from the audience.

Brian: The talks in the sessions focussed on different attributes, but we did not have one that focussed on a combination of attributes.

Eric: It is interesting; until a few years ago there were only three attributes, viz. instantaneous amplitude, phase and frequency. It is amazing how many attributes are coming through now.

Dave: There has been some discussion that if you look at too many attributes, you come up with spurious correlations. With such a large number of attributes how do you ensure that the attributes make geologic sense?

Eric: There has to be some intelligence involved in any such correlation with multiple attributes. There may be many random correlations but they need to make sense, for example if there are some zero crossings in instantaneous frequency sections, clearly these need to be backed off. Next those attributes need to be correlated with well logs.

Bob: There were two geophysicists in fairly large oil companies, who gave the two ends we can cite as answers. One of them was theoretically oriented and he said unless I have a sound physical explanation, or something that makes sense, I am not even touching it. The other one said I’d just cross-plot everything. Whatever seems to work, I’ll use that. My personal thought is that I’ll start with cross-plotting. But unless I can come up with a story that lets me believe why things are the way they are, I will not use it. My story then has to be backed up by theory. So, that’s the way I would go.

Steve: I think the key is to go back to the geology, the rocks and the wireline data and we try to look for some sort of discrimination of rock and fluid properties at the wireline scale or seismic scale. Also, a little bit of fundamental understanding of the rocks is helpful for understanding the seismic data. Evaluate what you need to know to nail down the prospect better than what you knew before. By combining the different classes of attributes (like geometric, wavelet, instantaneous Q-based), one can achieve a better understanding of their properties as they relate to interfaces or layers.

Brian: I’d like to add that by using neural-network-trained analysis, we can try to calibrate our seismic attributes with logs. We can then use cross-validation analysis, in which we leave the wells out one by one and redo the analysis. This allows us to see which of the attributes are statistically significant.
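[A minimal sketch of the leave-one-well-out idea follows, using ordinary linear regression in place of a neural network and hypothetical attribute/target arrays; the point is only to show how per-well validation errors expose attributes that do not generalise. – Ed.]

    import numpy as np

    def leave_one_well_out_errors(attributes, target):
        """attributes: (n_wells, n_attrs) attribute values extracted at the wells;
        target: (n_wells,) log property to predict.
        Returns the prediction error at each well when that well is excluded."""
        n = len(target)
        errors = np.zeros(n)
        for i in range(n):
            keep = np.arange(n) != i
            G = np.column_stack([np.ones(keep.sum()), attributes[keep]])
            coef, *_ = np.linalg.lstsq(G, target[keep], rcond=None)
            errors[i] = np.concatenate(([1.0], attributes[i])) @ coef - target[i]
        return errors

An attribute that shrinks the fit error at the training wells but inflates these validation errors is exactly the kind of spurious correlation Dave warned about.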

Jazz: The number of parameters we can cross-plot and play around with would give an even bigger set of attributes. Is there a way of eliminating some of the attributes which do not yield anything meaningful? The other part of my question has to do with the quality of the seismic data being used for generating attributes. Is it possible to determine the quality of the seismic data on a scale of, say, 0 to 1? Then we could forget about using seismic data with a quality in the range 0 to 0.3 and use data in the range, say, 0.5 to 1. How do we define that quality of seismic data?

Brian: The question is ‘will we ever have a standardised set of attributes?’ My feeling is ‘no’.

Steve: I think Eric touched on this aspect in his talk; one way is to use waveform classification to combine attributes and the other way is to use neural network technology to come up with a meaningful attribute volume that has more information than the single attributes.

The other thing to say is to look at pre-stack data as well. Fluid and lithology prediction are based upon variations as a function of reflection angle, and the stack can sometimes mask these variations. Also, you are right that if, during processing, we realise that the quality of the seismic data is not adequate or optimum for the type of detail we are looking for, we need to stop there: garbage in, garbage out. As a member of the SEG Interpretation Committee, I can say we are actually trying to bring out a volume focusing on how poor seismic data quality can affect the results derived from it.

Jazz: In the last 30-40 years the number of attribute volumes has multiplied, so there should be a concerted effort in this direction to meaningfully reduce them and also to standardise the quality of the input seismic data.

Brian: As geophysicists we do understand terms like CDP and stack, but there is a lot of terminology coming from multi-variate statistics which is not understood very well by geophysicists. Until we understand this terminology it is going to be difficult to communicate.

Dave: As regards the quality of the seismic data two statistical attributes, standard deviation and generalised correlation coefficient, are useful. So there are tools that can help in deciding if AVO is going to be useful.

Bob: The quality of the data also depends on the problem you are trying to resolve. The best example I can give is about a client who said: “we want you to process our data for fractures; since there is not much structure.” We processed the data and I think it was done well, as the faults could be seen clearly and the frequency content was good. The client then said: "I can’t interpret this data. This is too complex". We went back and smoothed out the fault block and diffractions and came up with auto pickable reflections. Now where is the quality for that particular client? And there are many people like that.

Eric: Another thing we need to be reminded of is that all the processing of data is subjective. If the same data is sent to two different processing shops, you may get quite different looking data sets and the attributes generated from them will also be different. So, it will depend on what the interpreter wants to see. There is a subjective side to the interpretation of attributes and it might be difficult to come up with just one standard attribute. One could follow a set of guidelines and come up with an interpretation strategy.

Dave: Addressing your comment Eric, one thing that is useful is standard deviation. If you got the sections back from two different companies and they both gave you standard deviation plots, then you might want to see if the two sections are within the error bars. This would be one way that you might want to assess the quality. If the error bars are large then you will see different results. If they are small, and if you get the same results from the two contractors, you can have more confidence that they both did it right.

Brian: Attributes is such a wide topic that we could have a full day workshop. There is no doubt that attributes are here to stay. There is a growing trend. People who like to do deterministic analysis think attribute determination should be done based on geophysical models alone. But there is also a growing statistical trend. The optimum interpretation result is somewhere in between.

I thank all the presenters. It has been an interesting discussion.

Workshop on Heavy Oil/Oil Sands
Compiled by Helen Isaac

Chair: Doug Schmitt

Introduction by Session Chair

A brief overview of the geology of heavy oil, bitumen and oil sand deposits
Murray Gingras and Dean Rokosh

In situ recovery methods for heavy oil and bitumen
Ron Sawatsky

Rock physics of heavy oil reservoirs
Doug Schmitt

A case study: QC analysis of time-lapse monitoring in a heavy oil reservoir
Yayun Zhang and Doug Schmitt

Do wormholes play a role in heavy oil cold production?
Sandy Chen, Larry Lines and Pat Daley

VSP study of attenuation on oil sands
Gabriel Solano and Doug Schmitt

Discussion

The session opened with the paper "A brief overview of the geology of heavy oil, bitumen and oil sand deposits", given by Murray Gingras. Dr Gingras described the three major oil sand deposits: the Athabasca, Cold Lake and Peace River. He explained the major lithologies, stratigraphy and environments of deposition.

Ron Sawatsky talked about the in situ production of heavy oil and bitumen. The most promising method is SAGD (steam-assisted gravity drainage), which has been tested successfully but is energy intensive. Other thermal techniques under testing include VAPEX (vapour extraction) and hybrid steam/solvent processes. Cold production is also a viable technology but the percentage of recovered oil is low. Ron concluded by observing that seismic methods offer complementary approaches that could be integrated with reservoir engineering tools to monitor heavy oil recovery technologies more effectively.

Doug Schmitt discussed the rock physics of heavy oil reservoirs and the response of the rocks to pressure and temperature changes, especially in terms of P-wave velocity. He stressed the importance of understanding the effects of fluid substitution on the seismic response and stated that each case is unique but laboratory measurements allow the seismic response to be predicted.

The QC analysis of time-lapse seismic data was presented by Yajun Zhang. A feasibility study based on rock physics predicted that time-lapse information might be derived from reflection amplitude changes but not from traveltime increases in a thin reservoir. In the field study, great care was taken to achieve repeatability in both acquisition and processing of the time-lapse data. The final sections displayed only small variations, which were attributed to changes in the steam chamber.

Sandy Chen described the numerical modelling she produced to study the seismic response to wormholes. Wormholes are very thin, high porosity channels generated during cold production. They provide a high permeability network which boosts oil recovery but also trigger a drop in pressure, causing dissolved gas to come out of solution to form foamy oil. The 2D numerical models were designed to examine the footprints of wormholes on both PP and PS data. The results suggest that amplitude anomalies and traveltime delays might be more readily observed on PS data than on PP data because of the greater contrasts in Vs than Vp.

Gabriel Solano used VSP data to estimate attenuation of seismic energy through the McMurray Formation. Such data are useful because they may provide information about lithology or fluid content. Gabriel’s work investigated the effects of different filters commonly used in VSP processing on the attenuation estimates. The results from f-k and median filtered data are fairly stable and indicate that these oil sands are relatively attenuating, having an average Q value of 13.
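The method behind the Q estimate is not detailed above; one common VSP approach, offered here only as an assumed illustration, is the spectral-ratio method, in which the log of the ratio of amplitude spectra at two receiver depths decays linearly with frequency at a rate set by Q.

    import numpy as np

    def spectral_ratio_q(spec_shallow, spec_deep, freqs, delta_t, band=(10.0, 80.0)):
        """Effective Q between two receiver levels from the log spectral ratio.
        Constant-Q model: ln(A_deep/A_shallow) = const - pi * f * delta_t / Q,
        where delta_t is the traveltime between the levels."""
        mask = (freqs >= band[0]) & (freqs <= band[1])
        ratio = np.log(spec_deep[mask] / spec_shallow[mask])
        slope, _ = np.polyfit(freqs[mask], ratio, 1)
        return -np.pi * delta_t / slope

With a Q as low as 13, that spectral slope is steep, which is one reason the filtering applied before the spectra are formed has such a visible influence on the estimates.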

Discussion:

Doug Schmitt (chairman) offered several topics to stimulate discussion:

  • What roles can geophysics play in exploration and development?
  • Time-lapse is seen as a quirky experiment – how can we better show its value to the community, especially with regards to monitoring of expensive production techniques?
  • What geophysical techniques need further development?

Kim Head (ConocoPhillips) remarked that it was beneficial to have speakers in this session from outside the discipline of geophysics to give insight on how to relate geophysics to other disciplines.

Uli Theune stated that we need to bridge the gap between reservoir engineering and geophysics. Each professional does his own job but does not know what the other does.

Murray Gingras remarked that geologists serve as the interface between engineers and geophysicists. He then addressed Doug’s third question by stating that our long-term goal is to increase the subsurface resolution of geophysical methods, and asked whether the practical limits are getting smaller.

Larry Lines (U. of C.) reminded us that vertical resolution depends on frequency and velocity and since we can’t change the velocity (well, sometimes we do by injection of fluids in enhanced recovery schemes but the decrease in velocity doesn’t provide a drastic shortening of the wavelength – Ed.), the only way to shorten the wavelength is by increasing the frequency content of the data. The near-surface acts as a low-pass filter so we need to exclude it. We could use borehole techniques; crosswell surveys are expensive but provide the high frequencies needed for short wavelengths.
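[To put illustrative numbers, not figures quoted in the discussion, on Larry’s point – Ed.]

    # Quarter-wavelength rule of thumb for vertical resolution; all values are assumptions.
    v = 2500.0                       # m/s, an assumed interval velocity
    for f in (30.0, 80.0, 200.0):    # surface seismic, broadband, and crosswell-type frequencies
        wavelength = v / f
        print(f"{f:5.0f} Hz -> wavelength {wavelength:6.1f} m, ~resolution {wavelength / 4.0:5.1f} m")

At a fixed velocity, only a jump in frequency content, for example from borehole methods, shortens the wavelength enough to change what can be resolved.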

Murray Gingras suggested that we might perhaps create a hybrid of reflection seismic and radar. There are also other techniques such as gravity, magnetics and resistivity. How might they be used effectively?

Ron Sawatsky said that reservoir engineering models are based on simulations without using a seismic image. That information could be used in forward seismic modelling to show the implications of seismic reservoir monitoring. We need a collaborative and iterative partnership between reservoir engineering and geophysics. Geophysics brings the imaging capability that should be integrated with reservoir engineering to determine the reservoir parameters.

Keith Hirsche’s (Veritas) experience has taught him that the seismic response is often non-unique, depending on, for example, temperature, gas saturation and pressure. The reservoir engineering models give constraints on such parameters that could be integrated into the seismic data interpretation.

In summary, a recurring message was that geophysicists must work more closely with reservoir engineers, learn to talk in their parameters (do not even mention milliseconds) and address the questions that the engineers need answered. Reservoir engineering data must be integrated with the seismic data before we can convince reservoir engineers of the benefits of seismic monitoring.

Workshop on Reservoir Characterization and Integration of Disciplines
Compiled by Jason Noble

Chairpersons: Larry Lines & Rainer Tonn

Introduction by Session Chair

Deep Panuke: The Integration of Geology, Geophysics and Reservoir Engineering for Field Appraisal
Rainer Tonn, Steven Brown, Robert Riddy, Rick Wierzbicki

Geostatistical Simulation for Reservoir Characterization
John Pendrel, Miles Leggett, Peter Mesdag

Reservoir Characterization and Heavy Oil Production
Larry Lines, Joan Embleton, and Ying Zou

Characterization of a heavy-oil reservoir at Pikes Peak, SK
Ian A. Watson and Laurence R. Lines

Integrating reservoir simulation with time-lapse interpretation: An example from the Weyburn field
Keith Hirsche and Ryan Adair

Discussion

The afternoon of May 12 at the 2004 CSEG convention provided the workshop on Integration of Disciplines and Reservoir Characterization chaired by Larry Lines and Rainer Tonn. The introduction by the session chairs highlighted the cross disciplinary nature of the reservoir characterization process involving geology, geophysics, and engineering. The tone for the afternoon was set with a keen look forward to some interesting examples of this integration.

Rainer Tonn started things off with a discussion of Deep Panuke. It was a very interesting look at the full process, from initial consideration of the geology through to full reservoir characterization, complete with a discussion of the role each discipline plays in the process. The benefits of a team approach were clearly evidenced, as was the effectiveness of the method; a major offshore prospect clearly demands the rigor of a complete examination.

Next up was John Pendrel with a discussion on the use of geostatistics for reservoir characterization. Once I had managed to resist the urge to run for the door (that geostatistics word is a little frightening), I was pleasantly surprised by an understandable presentation of how geostatistics can actually be made to work using real data. It was pointed out that geostatistics can provide accurate estimates of reserves, which is important these days. A brief discussion on the importance of approaching your results and methods cautiously to ensure believable results qualified the impressive examples; the constraints used, however, seemed very realistic.

Larry Lines highlighted the need to use every available product in his presentation regarding heavy oil production. To be effective, all of the information needs to be integrated: rock physics, AVO and time-lapse seismic, to name a few. The differences and similarities between the requirements for hot and cold flow production were examined in light of the importance and prevalence of heavy oil in Western Canada. The talk then turned to wormholes, the difficulty of finding them seismically, and their importance to the production properties of the reservoir.

As we came out the other end of the wormhole discussion, Ian Watson gave an interesting example of a practical characterization of an actual onshore producing reservoir. We had started the afternoon with an approach that would frighten off most exploration managers, but Ian showed how similar results can be achieved practically for smaller-scale producing fields. Using mostly 2-D seismic, well logs and geology, a very thorough characterization of the reservoir was obtained. It provided a very good counter to the misconception that reservoir characterization requires years of effort and countless dollars in high-tech expenditures.

We closed out the afternoon as we started it, and indeed as the entire afternoon had gone, with an interesting example not only of looking at the reservoir but of the importance of cross-disciplinary integration. The first half of the discussion, presented by Ryan Adair, an engineer, covered background on the Weyburn field and the use of time-lapse seismic and modeling to optimize production. Keith Hirsche wrapped things up by showing how the reservoir model could be used to improve the understanding of the reservoir: as the time-lapse seismic was reconsidered, the model could be updated to fit. Not only was the need for effective cooperation within the team demonstrated, this example also showed the benefits of interdisciplinary feedback: as the interpretation improved, the model improved, which in turn led to a better understanding of what was going on.

The informal discussion following the workshop was relatively brief but informative.

It was pointed out that most of the geophysical tools employed tend to be somewhat band limited, and it was asked whether it is possible or desirable to obtain higher-frequency results. Several people mentioned that frequency content had never been an issue in their work; the bandwidth they had was more than adequate for the task at hand. It was generally felt, however, that higher-bandwidth data would not pose a problem.

One of the key problem areas identified was the set of inherent problems in the collaborative process itself. While we tend to think of the various disciplines as distinct, the workshop served to show the blurring of the line between the various fields. Traditionally the geologists, geophysicists and engineers worked individually, in a sense passing their work along to the next in line as they finished their part. The workshop showed, and the discussion highlighted, that in fact it continues to be a work in progress; at any time any member of the team can have more to add from their specialty. One of the biggest obstacles to this end brought forward in the discussion was hardware and software incompatibility. The engineers are using PCs, the geophysicists are using UNIX workstations, and the geologists are split between the two, with all three using completely different software. Surprisingly, this creates a major stumbling block, as the work of the individuals involved cannot be easily transferred. Also pointed out were problems of geography: the geoscience groups were simply not located on the same floor, or perhaps not even in the same building, as the engineers and other members of the team. While some companies see the process as a team effort, others with a different corporate culture don’t see the benefit of reorganizing.

On the topic of corporate culture it was asked why the process of reservoir characterization is not done more in Western Canada. Simple economics and necessity were identified as the most likely culprits. Normal Western Canadian plays don’t justify the cost and time involved; the exceptions seem to be the more complex production environments such as major offshore projects and complex heavy oil fields. While there is a benefit to the process itself, a single well in a known field doesn’t justify the time and effort required to map out the reservoir. It was generally felt that this was not necessarily a bad thing. Time can be a crucial factor in Western Canada with land sales and drilling commitments, and committing several key staff members full time to such a project is simply not economical.

What further information can be brought in, and what benefits it brings, was also discussed briefly. Whether non-seismic geophysical methods, such as gravity surveys, can add clear value to the picture was left inconclusive. It was mildly debated that non-seismic methods are often the origin of the geological model and so, if only indirectly, do contribute to the overall picture, whether it be an anomaly from an EM survey or an oil seep in the jungle.

One of the key points mentioned as a potential hazard is the need for a cautious eye on the uncertainty of estimates used at various points. The non-uniqueness of solutions is an unavoidable problem in the modeling and can lead to unrealistic results. A further danger identified here is the uncertainty and instability that can be introduced by poor estimates, and the need to be aware of this pitfall.

Workshop on AVO/Rock Physics/Lithology Prediction
Compiled by Oliver Kuhn

Chairpersons: John Logel & Mike Burianyk

Introduction by Session Chair

Pitfalls in Rock Properties and Lithology Prediction in Dolomite
John D. Logel

Calibrated Three-term AVO to Estimate Density and Water Saturation
Jon Downton, and Alvaro Chaveste

Porosity-Thickness Prediction by Application of AVO/LMR Analysis to Seismic Data: A Case Study from a Clastic Lower Cretaceous Gas Reservoir at Crossfield, Southern Alberta
Holger Mandler & Len Stevens

AVO investigation of the Ben Nevis reservoir at the Hebron asset
Andrew J. Royle, John D. Logel and Laurence R. Lines

Application of Rock Physics to an Exploration Play: A Case Study from the Brazeau River 3D
Heath Pelletier, Jay Gunderson

Case Study: Seismic Reservoir Characterisation of a Deep Water Prospect Offshore Newfoundland
Rainer Tonn and Daria Pusic

Discussion

Thursday’s AVO workshop ended with a lively 45-minute discussion. After a significant amount of editing and paraphrasing, some of the highlights of the discussion are featured below, including Lee Hunt’s now famous (or infamous) ungulate parable. Our apologies to people who participated but remain unidentified, or whose comments were entirely unintelligible due to poor recording quality.

Michael Burianyk: If we could have all the speakers come up to the front please. I would encourage anyone who wants to stick around to move forward and we can have a good long discussion on these subjects. Lee, would you like to start off by asking the questions that did not get answered in the regular session and maybe generate some discussion?

Lee Hunt: At these conventions I always hope to walk away with a better idea of what my approach to my work should be. I expect to have to continually adjust my workflow. There is no universal answer to the question, since the challenges and environments that each of us are in differ significantly.

I work at a small E & P company, and as my V. P., Darryl Metcalfe, always tells me, our environment is like that of the ungulate, short on food and water, and having to choose the best path through the forest the first time or risk starving. And so it is with technological choices. We must choose the approach that is going to actually materially benefit us very quickly. If we waste time doing too much, or the wrong things, we will miss opportunities and our company could perish. If, on the other hand, we fail to employ advantageous new technology, some other (smarter) beast will steal our meal!

So, if people from small companies like mine come to these conventions looking for efficient, effective advantages, and to estimate quickly what technology may be of help to them (and what will not be helpful), what have we seen this year to help them? Limiting the observation to AVO or elastic methods, I think that some of the petrophysical comments that Keith made hit closest to the mark.

Keith wanted us to think about petrophysics. Physics has been a rallying cry by others such as Dave Gray in this convention (“Please for the love of God think about the physics!”), and for good reason. Keith’s approach did not involve a reckless stampede into attribute crossplot space, either. He very sensibly displayed petrophysical attributes versus depth (log views) first. I think that is a key in understanding if AVO type methods are likely to be of much use. The reason is that this display of Keith’s (Jon Downton has also been very fond of that kind of display) gives us a feel for both sensitivity and resolution. If we have a feel for the resolution that our data quality will support, then these displays should give us enough information to immediately make a guess as to whether AVO methods are likely to be able to add value.

There was a talk earlier this afternoon on a Cretaceous channel during which I found myself asking, do I need to be in LMR space? Was this a resolution problem that should’ve been attacked, or was it an AVO problem? I wasn’t really sure about that because we were shown cross plot space before any logs. I mention this just to emphasize the point that comparing discrimination alone is not enough. We need to consider discrimination, resolution (or scale), and attribute certainty all at the same time, and all very early if we want to reach a decision about technological approaches quickly.

So, at the end of the convention the biggest thing on my mind is approach over algorithm. And this is not to say we did not see some great work from people (like Downton with his stretchless AVO inversion - something I hope people will ask about), but that first we must decide on approach or strategy before getting into tactics. This means thinking about the petrophysics in a way that relates to the resolvability and discrimination in real data. If contractors can help the E&P companies to arrive at a reasonable feel for these things early enough, we will continue to see these technologies used more to practical advantage. I’d be interested to hear other comments on this.

Jon Downton: Responding to Lee’s comments about certainty or uncertainty. I think there are two issues concerned with certainty. One is how accurately we can predict these parameters for customers – there are always issues with data quality, noise, and systematic errors. But the second issue is one thing that I think the interpretation team needs to address, and that is the uniqueness between rock properties and elastic properties, and acquiring data so we can get a better understanding of this transform. I think there is a lot of non-uniqueness there. We have done a lot of work to address that issue. So with certainty, I think there are two parts to it, and I think the second issue is that with physical analysis, mapping elastic properties to rock properties, we need the data to do that, and a lot of that is not available. That’s the big issue when making our predictions.

Bill Goodway: I have a question for John Logel about the origin of that LMR anomaly he presented, and I hope it will provoke some discussion. This was the anomaly that was apparently the result of multiple contamination. What type of AVO class anomaly were you looking for?

John Logel: Predominantly it’s a class IV type anomaly that we’re looking for.

Bill Goodway: And would you say it was a Q phase rotation problem you encountered?

John Logel: Very much so.

Bill Goodway: It would be my expectation, looking at either the multiples or in fact your gather displays, that presumably there is some residual moveout on the multiples. They couldn’t be dead flat, could they?

John Logel: They’re awfully close to dead flat, if you look across there. I mean there’s some amplitude variation in them, but we can attribute that directly to noise. We are still looking at very small things.

Bill Goodway: So I think my perspective would be that I can’t understand this, because your multiples are unique in terms of having no AVO; they’re actually flat in terms of amplitude variation versus offset. Generally we don’t see that. Usually a lot of interference in gathers which is varying with offset could be multiples, but I don’t understand how a multiple could so systematically mimic a good AVO type IV anomaly.

John Logel: I think it’s bound to, if you think about where it’s at, higher up in that higher velocity zone. It’s bouncing around in a higher velocity zone. So its angle is actually much smaller than what we expect the angle to be down in our zone of interest. So it’s really a near trace, it’s really a smaller angle reflection masquerading at higher angles. And when we model this in our zone it’s exactly what we get. If you remember the one model I had, it had the multiples and you can see them directly moving across there with basically the same gradient. We used a full elastic model, with higher order multiples and converted waves.

Bill Goodway: So did you try and take the model gather through the inversion and see whether it actually did create an anomaly?

John Logel: It created the exact anomaly, even in cross plot space. That’s the scariest part of all this - even after the fact, even going back to the same data set to explore one more time, you can’t say definitively we’d get rid of the problem. I mean, we’d have to go back to older techniques or to something that’s radically different now, to try to get rid of it.

Bill Goodway: Well I think if you have a multiple that is flat after NMO correction and has no variation in its amplitude with offset, and it shows up as a strong trough anomaly, which is what you’re looking for, then I don’t think there’s any hope, as that’s exactly what you’re looking for. It’s just hard for me to understand how it could be so systematically correct in terms of being a good anomaly, and so wrong in terms of being what you were looking for. Most multiples I’ve seen, even in a land situation, do not create an anomalous type IV AVO gradient as they have variation with offset, be it residual moveout or amplitude. Generally de-multiple methods exploit these variations.

John Logel: I fully agree with you – this has been a head scratcher for quite a while now. And as I said, the cross plots right now, everything we have, looks exactly the same as our prospects.

Dave Gray: I’d like to get more general, and address the question of how many attributes you really need, and which ones. What I like to do when I’m looking at a play I’m not familiar with is to just ask some basic questions – What type of reservoir might you actually have? What might the cap rock be? What could some of the possible scenarios be? Get an idea of the two-way velocities that might be associated with those different scenarios, and the densities that might be associated with a porous reservoir rock versus a non-reservoir rock. If you can, get information on the shear wave, and if not, use some kind of petrophysical relationship that’s appropriate, like mudrock properties. Then just try to examine the various elastic properties, which we know we can get from seismic data, to determine which one is going to give you the biggest contrast. That’s where I’m coming from in the work that I’m doing.

Unidentified person: I feel one of the biggest challenges we face is how do we take all these smart ideas, and get our management to do something about it? How do we invoke a run stream of interpretation so we can get all this stuff? And of course I would start out by saying that we’re missing a step by not looking at pre-stack data, but of course as was mentioned earlier, I’m kind of preaching to the converted, so to speak, with this crowd! But I’ll put myself in a hypothetical situation, where I’ve got a landsale in two weeks and I’ve somehow got to do all the work involved with that, plus I’ve got to satisfy all the gentlemen up here, to tell them that their algorithms are all solved perfectly. So the question to me is, “How can I get the job done on time?” I think we need to concentrate more on the pre-stack data, and get an overall philosophy into management’s heads, and then we can start with the LMR work, and the AVO and so on.

We’re about to drill a well, so what do we do about that? Well I’ve run a VSP, and then a dipole shear, and then the engineers come and ask me why should they pay for a dipole shear twice. Is anyone aware that they run a dipole shear all the time when they frac, in the zone of interest? How can we very quickly go from a quick pre-analysis that’s done to get some sort of reconnaissance investigation indicating possible AVO, and then push it over to a contractor like Jon, and say help me, get me some kind of AVO analysis quickly? So we’re talking about third order this, and fourth order that, but how do we change the philosophy of the oil companies, the big oil companies, so that it’ll help us poor interpreters make decisions in order to minimize risk. Am I on topic here; is this making any sense?

Lee Hunt: Can I try to answer here? Well one trick I tried just recently was not to get after management but the next best thing is the engineers you’ve been talking about. Those guys spend a lot of money and they’re no nonsense types, generally. And so the trick was to invite some of the processing experts in to give geophysical talks, and we managed to get our drilling and operations engineers in there. By the end of it, they were demanding to do some of this stuff all the time, which was perhaps a bit overenthusiastic! I guess what I’m getting at is one way to change the cult of management is to change the entire culture. Instead of just trying to educate just the geophysicists and geologists, get the engineers on your side too.

Jay Gunderson: I’m Jay Gunderson, petrophysicist at Veritas. One of the things I’d like to comment on is the quality of the log data. With most or all of the examples we saw today it wasn’t an issue, but it can be, and that’s why it’s important to have a petrophysicist involved in the process. Log analysts are typically looking at one small zone instead of a large sequence of zones, so when a petrophysicist cleans up curve data it’s a much different mind set. Log data in general is in pretty poor shape in certain areas. If you go to the Mackenzie Delta, 30 to 50 percent of the density curves are lousy, and maybe 20 or 25 percent of the sonic logs. I think that holds true in other young clastic basins - the Gulf of Mexico, probably the East Coast as well. In Western Canada it’s probably not such a problem, but when you get to the deeper carbonates there are all kinds of zones that are flowing; you don’t get density, you don’t get sonic, but I’ve seen these curves being used routinely by geophysicists. I’m not pointing a finger at anyone, I’m just saying I’ve seen this happen a lot. I’ve seen sonic curves that were digitized wrong being used for velocity models, and unedited density logs being used for Lambda-Mu-Rho work, and that’s from service companies all the way to majors. So I think it really is important to have somebody involved who understands logs, because they can provide a sense of what’s real and what’s not, and how to correct it. They also bring core and evaluation expertise. We can incorporate the core, VSP, and mud information to best figure out what the lithology is, and to get the best idea of how attributes compare to reservoir properties - you need to know the lithology, porosity, and so on.

John Logel: Wouldn’t you say though, that it doesn’t take that long for a geophysicist to get comfortable with logs? Usually they can tell if the logs are unreliable.

Jay Gunderson: Well, I’d like to say you’re right. And in a lot of cases you are right because it’s easy to tell there is a river there. But some of the subtle things, like whether it’s bad hole, whether it’s coal, whether it’s a volcanic sand, that’s tricky. Sometimes the density comes out very complicit like it did in the synthetic example, and you might not believe that’s real. So you certainly have a point, geophysicists can recognize some of the poor quality curve data, but just from the things I’ve seen, I know that the opinion of a log analyst can really help, because they understand the tools and the geophysics.

Andrew Royle: With my project I was fortunate to actually work closely with a petrophysicist, Mike Donovan at ChevronTexaco. (And Jay actually did some of the log editing work prior to my starting the project.) I found this to be very important, because as I went along I’d do my inversions, and then afterwards we’d find that there were spots that didn’t seem right. So I’d go back to Mike and he’d actually double-check the logs. This was actually a little bit shallower than the zones Jay had worked on, and so Mike would correct the curves, and then I’d do the inversion over again and the results were greatly improved. And Mike was really helpful in many other ways too. So I think it’s really important that you look really closely at your logs when you’re doing this kind of work, and it is really, really beneficial to work closely with a petrophysicist too.

Mike: Jay, Andrew – you make a good point about the importance of petrophysicists, but I know of only two in town that work with geophysicists. Where are these people going to come from? Two isn’t enough.

Jay Gunderson: That’s a good question. There needs to be a meeting in the middle between log analysts and geophysicists. Log analysts need to start understanding what the geophysicists do, how they create attributes. I mean creating attributes is pretty easy from the bottom side certainly, but what’s important is what those attributes mean in terms of lithology, what kind of AVO anomaly to expect, what kind of impedance change, and so on. Likewise, the geophysicists need to meet the log analysts half way. Log analysis is one of those fields where it takes years of experience to really recognize what those things mean, so I don’t think this is going to happen over night, but it needs to be a joint effort.

John Pendrel: I am John Pendrel from Jason Geosystems. In the Jason Houston office, there are two full time petrophysicists, and about 5 people doing geophysical projects. But demand is so heavy, that we need another full time person doing petrophysics. The Canadian office has zero petrophysicists in the office, and there’s just not that much demand. We do a little bit of work ourselves, but I wouldn’t call us petrophysicists. I think that the reason it’s hard to find petrophysicists in Calgary is simply a lack of demand thing.

Unidentified person: Everyone in the conference is talking about petrophysics and logs. I find that a lot of companies want those answers but they are not willing to pay for them. It can take a long time to weed through a whole bunch of well logs; it’s not glamorous, the hours are long, and it takes a lot of experience as well. I just don’t see people doing that kind of work when they’re not getting paid properly for it. Do you think people will start paying more for higher quality?

Penny Colton: I’m Penny Colton with APEGGA. I’d like to comment on this issue of petrophysics in Canada versus the United States. Let’s not forget Canadians, or at least Albertans, have had all the EUB logs and cores available for years and years, while Americans had to really hustle to find core information and well information. So I think a lot of Canadians, whether they were called petrophysicists or not, have worked with logs, sonic, density, gamma ray, whatever, for a long time. A lot of Canadian geophysicists have had access to well logs, so we’ve gained a lot of expertise ourselves, and I think a lot of us Canadians are really good at looking at logs. That’s why the Canadian Well Logging Society is the oldest petrophysical well logging society any where in the world.

I have a question for you John (Logel) concerning your question about how can we tell if our pre-stack data is good enough for AVO: Do you remember when we had a discussion at a CSPG luncheon where somebody, a geophysicist I believe, was suggesting that geologists have access to pre-stack data on their workstations, for instant QC checks? Do you have pre-stack data on your workstation, so you can get a feel for the levels of noise, and multiples, and so on?

John Logel: I would say we have pre-stack data for most of our active projects. And we have it somewhat local, in an AVO package that someone can look at it with. We’re going a step further – we’re a Landmark shop, so we’re trying to integrate a Landmark product that would give most of our interpreters online access to pre-stack data.

Dave Mackidd: I’d just like to offer a couple of comments regarding the question about which petrophysical parameters you should focus on to get a quick answer. My opinion is you can’t, unless you focus on the geophysics first. If the data is unsuitable for AVO analysis then there’s no point in doing it. There are a couple of very quick tests that you can run to see whether this is the case or not, and I don’t see them very often, at least published, or in AVO talks that I’ve seen. The first of these is to look at spectra plotted as a function of offset. Now if you’ve got a Landmark system you can do that in Promax, and I’m sure the other systems have something similar. But if your data’s frequency content is varying dramatically with offset then you’re pretty well hooped, unless you correct for that. The second test is to look at the gradients that you’re getting out of your data. There are theoretical limits on what they should be.

For instance you can take RS and RP and cross plot them; they should fall between certain maximum slopes. If they fall outside these slopes, you’ve got a problem. These two simple tests will tell you whether your data is suitable for AVO analysis in the first place. And if it’s not, then you need to fix the data before you go and worry about the petrophysics.
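[The first of these tests is easy to prototype outside a processing system. The sketch below is a generic, assumed implementation, not the Promax display referred to above: average the amplitude spectrum of the traces in each offset bin and compare. – Ed.]

    import numpy as np

    def spectra_by_offset(gather, offsets, dt, nbins=5):
        """Mean amplitude spectrum per offset bin for a gather of shape (nsamples, ntraces).
        A quick QC that frequency content is not collapsing with offset before AVO work."""
        freqs = np.fft.rfftfreq(gather.shape[0], dt)
        edges = np.linspace(offsets.min(), offsets.max(), nbins + 1)
        spectra = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            sel = (offsets >= lo) & (offsets <= hi)   # crude binning; edge traces may repeat
            spectra.append(np.abs(np.fft.rfft(gather[:, sel], axis=0)).mean(axis=1))
        return freqs, np.array(spectra)

The second test, crossplotting the reflectivities (or intercept against gradient) and checking that the trend falls within physically reasonable slopes, follows the same spirit: a few minutes of QC before any petrophysical interpretation is attempted.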

Dennis Couturier: Hi, I’m Dennis Couturier with Petro-Canada. I’d like to start out with a statement – “Even the rocks can be wrong!” Some of you might remember some interesting talks and papers on Hibernia. I had always naively thought the rocks are always right - if you have well logs that are edited by a very competent petrophysicist, correcting for all washout zones and other problems, and calibrated to an extensive amount of core and core plugs, then you’ve got correct well logs. Well, when we were drilling our first one, two, three, four wells, the porosity came out much higher than in any of the wells that had previously been drilled. And eventually we got to thinking, “It can’t be that much.” To make a long story short, what had happened was that there was an unknown systematic trend in the laboratory measurements on the cores and core plugs. Simply put, the reservoir rock was so good that the whole core and the plugs became invaded, and they weren’t cleaned out properly before the measurements were taken. So one has to remember that there are a lot of insidious things that can happen, mainly things a geophysicist might not be aware of. We think the rock measurements carried out by experts are absolutely foolproof, but as with anything, they’re not. We feel that if a competent petrophysicist has calibrated the logs with core, and restored them to proper pressure and reservoir conditions, then the result is going to be correct, but it’s possible that it’s not. So I can say that there are some things even more insidious than what you’re talking about now. Even the rocks can be wrong, because they weren’t the rocks that you thought you had in the reservoir.

Michael Burianyk: I think that we’ve had an excellent session here for about 45 minutes, but probably everybody’s getting tired and we should call it a day. I just want to thank everyone; thanks for hanging around here on a Calgary Flames playoff hockey night in the city! <Applause>

End

     
