Jerry Harris is Professor and Head of the Department of Geophysics at Stanford University. While in Calgary recently to deliver the SEG/AAPG 2002 Fall Distinguished Lecture on "Crosswell Seismic Profiling: The Decade Ahead", Jerry was kind enough to spare some time for an interview for the RECORDER. His impressions and opinions on different aspects of his favourite topic are contained in the following excerpts from the interview.
Jerry, tell us about your educational background and experience?
I did my undergraduate studies at the University of Mississippi in the early 1970s, and then went to the California Institute of Technology and majored in electrical sciences. After my master's degree, I worked for 3 years in atmospheric geosciences, looking at microwave attenuation due to rain. You may say I am an electrical engineer at heart, but I have done wave propagation all my professional career, first electromagnetics and now seismic.
Why did you decide to go in for a teaching career? What do you like best about this profession, and what is the most difficult thing about being a Professor?
In a research university like Stanford, we teach in many different ways. Most of us like the variety. It is not just in a classroom but all the interaction we have with students. For example, in a Ph.D. program we are really involved in research. And through our research we are educating and teaching our students. What I really like about it is that there is always an opportunity to learn yourself and an opportunity to work with smart students. So it is an environment that you can never really outgrow, because you are continually renewing yourself. I really enjoy a teaching career in a research institution. There are no boundaries as to what you can work on; in my case electromagnetics, seismic imaging, laboratory, field, etc. So it is always challenging and interesting. The biggest challenge is balancing all the demands on your time, such as teaching, your own research, proposal writing, and of course project administration.
At times I hear people say ‘teaching is a thankless profession, because teachers do a lot for the students and when the students do well, they get the credit’. What do you have to say about it?
In some ways it is like working with your own children, your own family. You teach them and you feel happy when they grow up and do good things. You always know that you were a part of the beginning, for example when you taught them how to solve a problem or code a solution or whatever. In our case, we teach them to ask questions and to become critical thinkers. So when our students graduate and move on and succeed in their careers in industry or academia, we get some personal satisfaction for being a part of their training.
How did you get started with crosswell imaging? Who is credited with this idea? Was it yours?
The idea of crosswell imaging is certainly not mine. Crosswell seismic imaging dates back to the 1940s and '50s. In the early days, small explosions were used to locate the boreholes themselves, and there was always lots of talk about imaging. But there were limitations: there were no practical downhole sources, and data acquisition was too slow to make it a practical imaging technology. So my contribution was really to introduce a modern downhole source, rapid data acquisition, and modern processing methods.
I think what I am going to ask you now you may have already touched on in your talk, but it is just to place on record for the benefit of the members not able to attend your talk. What is the power of crosswell imaging?
I think its primary utility is to produce true high-resolution images at a scale not available to other methods. By high resolution I mean an order of magnitude higher than what is available from conventional surface seismic surveys. As I pointed out in my talk, resolution is an issue when it comes to reservoir management, particularly when you want to understand the detailed stratigraphy, to develop reservoir models for flow simulation and of course monitoring. When you need to estimate small-scale features, crosswell may be the only technology capable of producing the kind of reservoir models that are needed for flow simulation. Crosswell images are actually pushing the limits in terms of resolution of grid blocks in the flow simulator itself. Where else do you get this small-scale information between wells? So crosswell profiling is the only direct measurement technology available today for this problem.
What type of sources are used in crosswell (imaging) data acquisition?
I have used a number of different sources: airguns were probably the first modern sources to be used, but piezoelectric sources are probably the best and easiest to use; they are also relatively powerful and easy to operate. The piezoelectric sources run sweep waveforms or pulse codes that are easy to distinguish from background noise, so high peak pressures are not required.
There are other sources of course. One approach was to take low frequency surface seismic sources like hydraulic or pneumatic vibrators and repackage them for downhole use. Others have tried this with limited success. Instead, I took the approach of repackaging high frequency logging technology, making it more powerful and lower frequency. With logging-based technologies, I could build on all the expertise and experience of logging operations, that is, high temperature, high pressure, and wireline operation. Of course the traditional sonic logging tool is not capable of transmitting over the long distances required in crosswell, say up to 1000 m. So we had to re-engineer the piezoelectric devices to produce the right frequencies, 100 to 2000 Hz, a feature that repackaged surface seismic technology was never able to accomplish.
You spent a lot of time as you were developing this technology. You would have many stories to tell.
Not only about sources, but downhole detectors and operations were issues as well. Some of the first downhole receivers we used were actually Teledyne marine streamers and hydrophones.
There were some good stories and bad stories about them. They worked very well as detectors but sometimes we would get things stuck in the borehole and some of the stuff we put in first actually came out first.
Apart from the bad stories, we had some good times as well, for example the first time we detected a 1000 Hz signal over 500 m or so. Others could not believe that we could see those high frequencies over those distances. It was pleasantly surprising for all of us. In fact the piezoelectric source that we use now was primarily built as a reference source in many of those early tests. We were comparing a number of different sources and needed one as a reference. I built the piezoelectric because it was very repeatable and reliable. And it turned out to be the best of all the sources. So, what started off as a reference tool ended up as a tool of choice. And we compared that source to hydraulic vibrators, airguns, explosives, all kinds of fancy exotic sources literally from around the world. From Norway to Japan and other places, the piezoelectric always emerged as the best source.
What type of source spacing are we talking about?
Typically, 1 m; it depends on the imaging objective, that is, whether only tomography or reflection processing is needed. The sampling interval is dictated by the frequency content and the desire to avoid spatial aliasing of the low velocity events. Another problem is the depth control needed for small source intervals. Imagine now that you are trying to position the source at 1 m intervals at 5000 m depth. How do you do it? If you just start at 5000 m and call for the source to move 1 m at these depths, sometimes it moves, sometimes it sticks and doesn't move the entire 1 m. So having a source that operates like a logging tool, keeping it moving continuously in the borehole, takes care of the problem and keeps the source on depth with repeatability of a few inches. As the source moves, the computer is telling it to fire on depth. Again we borrowed that operation from well logging, but we at Stanford were the first to use it with a modern crosswell source, while others were operating by stopping, starting, and sometimes clamping the source at depth. This shooting on the fly became very important to data quality because we have to go back and occupy those shot points several times. By keeping the downhole source continuously moving, we found we could repeat locations, and the event moveouts in common-shot gathers became much smoother. We introduced that approach, shooting on the fly, at Stanford with the piezoelectric source. Now it's used with other downhole sources, such as the airguns.
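The spatial-aliasing constraint mentioned above can be sketched with the usual Nyquist criterion. This is not a formula from the interview itself, just an illustrative assumption: the source interval should be no larger than half the shortest wavelength of the slowest events you wish to keep unaliased.

```python
# Hypothetical sketch (not from the interview): a source interval that
# avoids spatial aliasing must satisfy dz <= v_min / (2 * f_max), where
# v_min is the velocity of the slowest event of interest and f_max the
# highest frequency to be preserved.

def max_source_spacing(v_min_m_s: float, f_max_hz: float) -> float:
    """Largest source interval in metres that avoids spatial aliasing."""
    return v_min_m_s / (2.0 * f_max_hz)

# Example with assumed numbers: slow events near 1500 m/s and an upper
# band edge of 750 Hz call for roughly 1 m spacing, consistent with the
# typical interval quoted above.
print(max_source_spacing(1500.0, 750.0))  # 1.0
```

The specific velocity and frequency values here are placeholders; in practice they would come from the survey design for a given well pair.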
Is there a limit to the distance between the wells to get good results?
Certainly there is a limit. But again, it depends on what you are trying to image and how you hope to accomplish that. Eventually high frequencies are going to be attenuated to a level below detectability. So if you are willing to use the same frequency as surface seismic, say 100 Hz, then you will be able to see the signal over distances comparable to the distances you see in surface seismic, say thousands of metres. But in my opinion, the advantage of crosswell comes when you use the higher frequencies. The higher frequencies will still have good signal-to-noise ratio over the shorter raypaths that are set by well spacing, not the target depth. A rule of thumb is that you are going to see your signal up to about 200 wavelengths. So given the frequency and the velocity of sound, figure that you will see adequate signal over paths of about 200 wavelengths. If that's 5000 m at 100 Hz, it's probably only 500 m at 1000 Hz.
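The 200-wavelength rule of thumb quoted above can be turned into a quick back-of-the-envelope calculation. The sketch below assumes a representative velocity of 2500 m/s, which is not stated in the interview but makes the quoted figures come out.

```python
# Hypothetical sketch of the rule of thumb: usable signal persists out
# to roughly n_wavelengths * (velocity / frequency).

def max_range_m(velocity_m_s: float, frequency_hz: float,
                n_wavelengths: float = 200.0) -> float:
    """Approximate maximum usable propagation distance in metres."""
    return n_wavelengths * velocity_m_s / frequency_hz

# Assuming a 2500 m/s medium, the rule reproduces the distances quoted:
print(max_range_m(2500.0, 100.0))   # 5000.0 -> ~5000 m at 100 Hz
print(max_range_m(2500.0, 1000.0))  # 500.0  -> ~500 m at 1000 Hz
```

The same function shows why well spacing, not target depth, sets the usable frequency: halving the path length doubles the frequency that still arrives with adequate signal.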
Since we are using high frequency sources and because we are recording close to the borehole in the zone of interest, there are smaller associated Fresnel zones. All these contribute to the high resolution images that we see in crosswell data. Is there anything more to that? For processing crosswell data, do you use VSP processing techniques or are there special techniques?
I have already said that the distances the waves have to propagate are shorter. So you can transmit and receive higher frequencies over the shorter distances. There isn't any magic here. The geometry of the survey is also different. You are propagating, say, parallel to bedding in most cases rather than perpendicular to bedding. We can borrow all the processing concepts from seismic and VSP technology except the details are different because of the frequency content and geometry. Conceptually one could process crosswell data like the offset VSP, except in practice you would have to process 300-500 offset VSPs for each pair of wells. Certainly, you cannot do it one offset at a time. So the way you do it is to borrow the concepts and algorithms from VSP but organize the data in a form suited to the crosswell geometry.
The problem with migration of crosswell data is the limited aperture of the crosswell survey. Unlike medical imaging or surface seismic, we are trying to image a complicated structure from data collected along two lines. We don't have the vertical aperture you would ideally like to have for migration, so what we use is a limited-aperture migration. Well, if I keep limiting the migration aperture more and more, the migration turns into a CDP reflection mapping process. So instead we tend to start with a CDP reflection mapping and then open the aperture until we start seeing unacceptable migration artifacts. One last comment is that we have incorporated Fresnel zones into the tomography algorithms, though not routinely, to capture finite bandwidth effects on the inversion.
So it is a sort of trade off?
It is a trade-off between wanting to collapse the Fresnel zone effects with a migration technique and being forced to control the aperture to reduce the artifacts of a limited migration aperture. Again, the conceptual advantages and disadvantages of migration, say regarding Fresnel zone effects, are the same as for surface seismic, but the practice is different.
You referred to anisotropy determination in your talk. How much effort is being put into including anisotropy in processing crosswell data?
Not nearly enough in my opinion. We can extract anisotropy in the plane of the survey, but it is not done routinely. Azimuthal anisotropy is a different beast. Tomoseis is recording crosswell using multiple wells, so they can in principle detect azimuthal anisotropy. One challenge, though, is to separate anisotropy from heterogeneity. More sophisticated modeling is required. Nevertheless, this is one area where the detailed understanding that you get from the crosswell survey could be used to enhance the value of surface seismic by unraveling how scale affects anisotropy. As you know, heterogeneity below the scale of resolution, say due to aligned fractures, may appear as seismic anisotropy. In the simple cases of laminated shales that I have shown, high frequency crosswell data can resolve some scales of heterogeneity while other smaller scales still appear as anisotropy. Of course, the scales we resolve are an order of magnitude smaller than surface seismic, so we may be able to resolve heterogeneity that appears as anisotropy in lower frequency seismic data.
Are we in a position to do 3D imaging with crosswell technology and if yes, does that justify the cost that may be incurred?
We do have the technology to do 3D imaging, at least the velocity and attenuation tomograms in 3D. The issues with migration are more complicated. Now that we can survey several wells simultaneously, the aperture issues improve but sampling is still a problem. The basic technology is there but data are scarce and the devil is in the details.
Now, I’ll move to the other part of your question. The cost is acceptable if the result you produce has value or answers the questions being asked. For example, someone might ask, “Do I have to shut the wells in to do this, because that will cost me money?” The answer is, yes you must make the wells available to do this imaging. But shutting the wells is not a problem if the results add value. If the engineer asks the operator to shut in for a well test, there wouldn’t be any question because they know the value. So, if we can establish the value crosswell brings to reservoir analysis and monitoring, the cost for 2D or even 3D will not be an issue.
With this 3D coverage, is it possible to get an idea about azimuthal anisotropy?
The limitation of crosswell in this respect is constrained by where the boreholes are located; you will not be able to get uniformly sampled azimuthal data, and this lack of uniform coverage will be a problem for estimating azimuthal anisotropy. From the point of view of tomography, the rates of convergence of the different components of the anisotropy model may be horribly different.
It will depend on the location of the borehole.
Yes, and you'll want to keep the geometry as uniform as possible. You do not want to shoot between wells a few hundred metres apart and interpret anisotropy with data from other wells that are 1000 m away. So what it means is you are forced to work with the geometric pattern you have for the wells.
Remember too, in surface seismic this anisotropy you are seeing may be due to heterogeneity. It appears as anisotropy because you are not resolving the heterogeneity, say fractures. It may not be intrinsic anisotropy at all. If I have the higher resolution imaging that crosswell brings you may actually image the isotropic zones between the fractured zones or see other wave phenomena associated with the fractures like guided waves, etc. So it is no longer an anisotropy problem, it is a heterogeneity problem.
Apart from Tomoseis, what other companies offer crosswell imaging service?
There is Schlumberger and Paulsson Geophysical and companies that do 3D VSP, though others have not been doing surveys for as long a time or as routinely as Tomoseis. The national laboratories have active programs. In addition to their work in oil and gas, the labs also work on near surface shallow environmental applications.
Your present assignment as SEG/AAPG Distinguished Lecturer takes you to different places in North America and the UK. How does this help you professionally?
It is very interesting. In some ways, it is an opportunity for me to see what other people are doing, what their research interests are. I am enjoying it, so far, though I am only on my 7th or 8th stop. I have 26 in all.
Will this take you beyond North America and the UK?
This program is limited to North America and Western Europe, even though I had requests from South America and the Far East. I think the SEG should be responding to these distant requests, but there may be a combination of financial and political reasons for not doing them this year.
I was wondering if you would like to share with us what research you are doing now?
I am giving this lecture on crosswell imaging now, but in fact much of this work was done 5 or more years ago at Stanford and elsewhere. Most of the work we do in crosswell at Stanford is related to environmental and near surface applications. Moreover, my personal interest is mostly in the attenuation of seismic waves. This interest is driven by the observations of attenuation that we see in crosswell data. The problem is that even when we produce an attenuation tomogram, we have no good idea about how to interpret it. We cannot go to the laboratory and measure attenuation on a piece of rock, because the way attenuation is typically measured on a core is with ultrasound waves in the megahertz range. Attenuation scaling is just not as reliable as velocity scaling. I have developed a technique called Acoustical Resonance Spectroscopy for measuring attenuation, or Q, of a small sample of rock in the lab at frequencies of 1000 Hz or so. This will provide some ground truth for later being able to interpret the in situ attenuation data. We can also measure Q and velocity on irregularly shaped pieces of rock, like cuttings from a borehole. It is difficult to measure those in the lab even at high frequencies because of their irregular shape. We are looking for a technology that we can potentially locate at the well site, so that as the cuttings come out from the well, we can determine their acoustic properties. So we spend most of our time working and thinking about numerical and physical models for attenuation in porous media, and how to process data to produce attenuation images.
The other area where we have funding is carbon sequestration, that is, using geophysical methods to monitor the injection of carbon dioxide into depleted oil and gas fields and aquifers. So you might say that crosswell data, because the quality is so good, opens the door for a lot of basic research in terms of how waves propagate in porous media. I am spending more of my time looking at those basic issues than, say, specific applications.
Jerry, what message do you have for young geophysicists entering our industry?
It is a question that I hear often. My response is that technology is important. They need to understand it to some extent and not fear using it when a problem calls for leading-edge technology. Moreover, they must support its development and think ahead longer term than just a few months. The majors need to be competitive and leaders in developing and applying technology, and should support research and education. Young geophysicists should follow their hearts in choosing a career path. They should build on the basics and strive to be the best at what they do.
Jerry, we thank you for sharing your experience and spending this time with us. We wish you good luck with your work.
I enjoyed it. Thank you.