Let us begin by asking you about your educational background and your work experience. How did you get into geophysics?
From as far back as I can remember, I’ve been interested in science. When I started university, I was leaning towards computer science but was a little disillusioned with the dry 1st year programming courses. My father, a geologist, suggested I consider geophysics as it was a good blend of computers, physics, and earth science. I took his advice.
Doug, you joined Chevron in 1997 and have stayed put ever since. This is quite unusual, so my question to you is, how did this happen? Usually, professional or financial considerations lure people to different opportunities, but it seems all of that was immaterial to you.
Yes, I’ve been with Chevron for over 20 years now, but it doesn’t seem that long. Out of university, I joined CGG in Calgary back in 1989 and then moved to Chevron. With Chevron being a large, global company, I’ve had the chance to move around internationally; work in acquisition, processing, and interpretation; and get perspectives from both the asset team and R&D/tech services. With the wide range of opportunities I’ve had, I’ve never really felt the need to change companies. Of course, through the ups and downs of the industry you need to keep your options open – in fact, back in 2004 when Chevron divested most of its Canadian and US land assets, I thought the writing was on the wall for our Calgary office. That is when I accepted an assignment with Chevron in the UK, which in hindsight was a great move. Following my UK posting, I transferred to an R&D position at Chevron’s head office in California. I then returned to Calgary in the summer of 2014 as Chevron was staffing up for our unconventional assets in Western Canada.
You have gained extensive experience in the acquisition, processing, and interpretation of seismic data, not only from in and around Canada, but also from Alaska, Cuba, Argentina, Australia, Turkey, Greenland, the North Sea, Brazil, West Africa, and the Gulf of Mexico. Tell us about some of your experiences, the challenging geology in certain areas, the quality of the seismic data you acquired and processed, and the ease or difficulty of interpretation in those areas.
I can give you several examples. In the North Atlantic/Norwegian Sea, the main challenge was imaging within and below the thick basalts. These basalts have a substantial velocity contrast with the surrounding sediments and a very low effective Q due to internal scattering – standard marine seismic surveys returned data with very limited bandwidth and distorted ray paths. In order to improve interpretability in these areas, many new technical methods were tested (like broadband source and receiver techniques, ocean bottom nodes, advanced processing tools, and joint CSEM/seismic imaging) – several with good results.

A particularly challenging project was in the thrust belts of SE Turkey. Interpretation in this structurally complex zone was very difficult because we initially had to rely on old, sparse 2D data – reprocessing with modern pre-SDM workflows helped. We also attempted to acquire new surveys in key areas, but this was quite difficult due to access restrictions, both physical and military. We pushed the processing envelope to try to compensate for missing data in the field, but this was only partially successful.

Another challenging project was exploring in the deep water offshore Nova Scotia back in the early 2000s – this huge area has a rugose seafloor and complex salt tectonics, and there was little well control at the time. We relied on marine 2D and narrow-azimuth 3D datasets and tried to handle the complex multiples and ray path distortions as best we could. Surface-related multiple attenuation was one new tool we tested, along with Chevron’s proprietary Gaussian beam pre-SDM (adapted from GOM use). In hindsight, we were severely hampered by the limited-offset and narrow-azimuth nature of the data we had.

One unique project I was fortunate to participate in early in my career was a 2D PP, SS, PS seismic experiment conducted in Western Canada by Bill Goodway (at PanCanadian). This was a set of orthogonal lines acquired in the early 1990s with an inclined impulsive source (fired inline and crossline and combined to generate P & S) and 3C receivers. CGG had a good set of shear-wave processing tools based on the earlier work of CGG pioneer Robert Garotta and others. Under the guidance of my CGG mentors, and with the excellent data quality, I honed my skills in pure shear and converted waves.
Chevron has been making use of 3C3D seismic data for the characterization of conventional as well as unconventional plays, including oil sands and shales. From your experience, tell us how much value is added by using prestack joint inversion in the analysis of these plays.
While it is difficult to clearly quantify the additional value derived from using prestack PP-PS joint inversion, in the case of oil sands, there have been many studies presented which highlight the incredible success that converted wave data has brought to the static and dynamic characterization of the reservoir for in-situ production. I’ve found similar results in the projects I’ve been involved with. As for shales, I don’t think we have been successful in clearly demonstrating differentiated value by including the converted wave in the same way as for heavy oil – but I’m hopeful that we will see examples in the future as data quality and processing/analysis methods improve. Where I do think converted waves add value in the shale case is by exploiting shear-wave splitting for fracture characterization.
What precautions do you observe in carrying out ‘true amplitude and phase processing’ of your seismic data?
There are three main aspects to this which I try to address as much as the data allow: 1) identify the main sources of distortion for the type of data and geological setting I am dealing with – these may include variable water depth, variable coupling on land, heterogeneities in the overburden, illumination, processing algorithms, etc.; 2) deal with each as appropriately as possible by employing deterministic corrections when available and then resorting to statistical means (being careful to avoid any biases introduced by noise) – calibration to synthetic data from well control helps determine whether the data are moving in the right direction; and 3) perform careful QC in the appropriate domains to verify that the corrections employed are working as they should and to catch any unexpected data corruption which may occur. While we still have a way to go to achieve fully “true” amplitude and phase processing, we continue to get closer. One thing I’d like to see matured in the near term is a practical way to derive useful Q models for land data which account for heterogeneities in the overburden, and to apply these Q corrections in imaging – I’ve seen great examples of this in marine data.
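To make the deterministic-correction idea concrete, here is a minimal sketch of one such step: a constant-Q, amplitude-only compensation applied trace by trace in the frequency domain. The single effective Q, the window length, and the gain cap are illustrative assumptions, not a production workflow.

```python
import numpy as np

def inverse_q_gain(trace, dt, q=80.0, max_gain_db=30.0):
    """Amplitude-only inverse-Q compensation for a single trace.

    trace       : 1-D array of samples
    dt          : sample interval in seconds
    q           : assumed constant effective Q (an assumption, not a derived model)
    max_gain_db : cap on the applied gain so high-frequency noise is not boosted
    """
    n = len(trace)
    t = np.arange(n) * dt                     # two-way time of each sample
    freqs = np.fft.rfftfreq(n, dt)            # positive frequencies in Hz
    max_gain = 10.0 ** (max_gain_db / 20.0)
    out = np.zeros(n)

    # Crude time-variant implementation: process short windows and evaluate the
    # attenuation compensation exp(pi * f * t / Q) at each window's centre time.
    win = max(1, n // 20)
    for start in range(0, n, win):
        stop = min(start + win, n)
        seg = np.zeros(n)
        seg[start:stop] = trace[start:stop]
        t_mid = t[(start + stop) // 2]
        gain = np.minimum(np.exp(np.pi * freqs * t_mid / q), max_gain)
        out[start:stop] += np.fft.irfft(np.fft.rfft(seg) * gain, n)[start:stop]
    return out

# Toy usage: three spikes; the later arrivals receive progressively more gain
dt = 0.002
trace = np.zeros(1000)
trace[[100, 400, 800]] = [1.0, 0.5, 0.2]
compensated = inverse_q_gain(trace, dt, q=60.0)
```

In a real land dataset the Q model would of course be spatially variable and ideally applied inside the imaging step, which is exactly the maturation called for above.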
In a talk I attended at the GeoConvention this year, the speaker, an experienced geophysicist from Calgary, very emphatically said that seismic data being processed today never has a zero-phase wavelet embedded in it – it is always mixed phase. I am also aware of some big oil companies in Calgary that go for ‘controlled amplitude, controlled phase’ processing flows and ensure that the wavelet embedded in the final migrated data volume is close to zero phase. Could you tell us what your take on this issue is?
To the degree that the precautions I’ve outlined in the previous question are effectively carried out, I would hope that we achieve data which is close to zero-phase and stable across the survey. At the end of the day, it is critical that we try to measure what phase we have ended up with and adjust accordingly in a residual sense.
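As one concrete way to “measure what phase we have ended up with”, here is a minimal sketch, assuming a well-tie synthetic is available at the same sample rate, that scans constant phase rotations of a seismic trace and keeps the one that best correlates with the synthetic. The data and names are purely illustrative.

```python
import numpy as np
from scipy.signal import hilbert

def phase_rotate(trace, angle_deg):
    """Rotate a trace by a constant phase angle using its analytic signal."""
    a = np.deg2rad(angle_deg)
    analytic = hilbert(trace)
    return np.real(analytic) * np.cos(a) - np.imag(analytic) * np.sin(a)

def residual_phase(seismic, synthetic, angles=range(-180, 181, 5)):
    """Return the constant phase rotation (degrees) that maximizes the
    normalized cross-correlation between the seismic trace and the synthetic."""
    best_angle, best_corr = 0, -np.inf
    for ang in angles:
        rot = phase_rotate(seismic, ang)
        corr = np.dot(rot, synthetic) / (np.linalg.norm(rot) * np.linalg.norm(synthetic))
        if corr > best_corr:
            best_angle, best_corr = ang, corr
    return best_angle, best_corr

# Toy example: a 30-degree rotated copy of the synthetic should scan back to about -30
synthetic = np.convolve(np.random.randn(500), np.hanning(21), mode="same")
seismic = phase_rotate(synthetic, 30.0)
angle, corr = residual_phase(seismic, synthetic)
```

The estimated angle can then be applied as a residual correction, exactly in the spirit of adjusting “in a residual sense”.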
What differences did you perceive in the interpretation of surface and downhole microseismic data?
While I wouldn’t consider myself an expert interpreter of microseismic data, I do have some experience with both surface and downhole data. The fundamental geometry differences strongly influence how you approach interpretation. For example, in the downhole case it is very difficult or impossible to determine focal mechanisms unless you are fortunate enough to have more than just a basic single vertical receiver array. Surface data are better suited for establishing focal mechanisms, but of course the challenge there is recording in a generally noisier environment and thus not detecting the lower-magnitude events that downhole configurations excel at. In the downhole case, one must also account for the sampling bias introduced by the sparse listening location(s) relative to the areal extent of the microseismic events. Another difference is in the sensitivity of event locations to the velocity model used – in the downhole case, the impact of velocity anisotropy (VTI) is usually amplified by the more horizontal ray paths.
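To illustrate that last point numerically, here is a small sketch using the standard Thomsen weak-anisotropy approximation with assumed, purely illustrative parameter values; it shows how much faster P-waves travel along near-horizontal paths, which is why downhole event locations are so sensitive to the VTI model.

```python
import numpy as np

vp0 = 3500.0                 # vertical P velocity in m/s (illustrative)
epsilon, delta = 0.15, 0.05  # assumed Thomsen parameters

def vp_weak_vti(theta_deg):
    """Thomsen weak-anisotropy P-wave phase velocity vs angle from vertical."""
    th = np.deg2rad(theta_deg)
    return vp0 * (1 + delta * np.sin(th)**2 * np.cos(th)**2 + epsilon * np.sin(th)**4)

for angle in (0, 30, 60, 90):
    v = vp_weak_vti(angle)
    print(f"{angle:3d} deg from vertical: {v:7.1f} m/s "
          f"({100*(v - vp0)/vp0:+.1f}% vs isotropic)")
```

With these values the near-horizontal path is roughly 15% faster than the vertical one, a bias that maps directly into event mislocation if ignored.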
Seismic data acquisition using the compressive sensing technique has been in the news in geophysical circles. Tell us about your personal experience with this. Does it hold promise, or is there more hype around it?
My introduction to this emerging technique began with discussions with my friends and colleagues Gilles Hennenfent and Sam Kaplan – two technically astute individuals who have each had a pioneering role in adapting this new paradigm in sampling theory to the geophysical case. I personally think it holds enormous promise if we are bold enough to apply it and learn what its capabilities and limitations are in a practical sense. I understand it this way: If we can substantially compress our seismic data after we record it (which we typically can), then we should be able to achieve the same signal fidelity by recording it using a compressive sensing scheme right at the start. We can’t ignore the value that compressive sensing has brought in other fields such as medical imaging, A/D converters, etc.
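A toy numerical sketch of that reasoning, with made-up signals and parameters: a signal that is sparse in the Fourier domain is sampled at only about 30% of the nominal locations, then recovered by iterative soft-thresholding (a POCS-style scheme). This illustrates the principle only, not any particular acquisition design or product.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
t = np.arange(n) / n

# "Fully sampled" signal: a few sinusoids, so it is sparse in the Fourier domain
signal = (np.sin(2*np.pi*12*t)
          + 0.6*np.sin(2*np.pi*37*t)
          + 0.3*np.sin(2*np.pi*80*t))

# Compressive measurement: keep only ~30% of the samples, at random locations
mask = rng.random(n) < 0.3
observed = np.where(mask, signal, 0.0)

def soft_threshold(coeffs, lam):
    """Complex soft-thresholding: shrink coefficient magnitudes by lam."""
    mag = np.abs(coeffs)
    return coeffs * np.maximum(1.0 - lam / np.maximum(mag, 1e-12), 0.0)

# Iterative recovery: enforce Fourier-domain sparsity, then re-insert the
# recorded samples each pass, with a threshold that relaxes towards zero.
estimate = observed.copy()
niter = 200
for it in range(niter):
    coeffs = np.fft.fft(estimate)
    lam = 0.5 * np.max(np.abs(coeffs)) * (1.0 - it / niter)
    estimate = np.real(np.fft.ifft(soft_threshold(coeffs, lam)))
    estimate[mask] = signal[mask]            # honour what was actually recorded

rel_error = np.linalg.norm(estimate - signal) / np.linalg.norm(signal)
print(f"relative reconstruction error from ~30% of the samples: {rel_error:.3f}")
```

On field data the choice of sparsifying transform, sampling pattern, and solver all require care, which is where the practical learning of capabilities and limitations comes in.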
What machine learning applications have you employed in the interpretation of conventional and unconventional plays that you may be engaged in?
I have dabbled in machine learning several times over the years, starting back when I was finishing my undergraduate degree: at that time, I wrote a paper on the theory and application of fuzzy logic to geologic classification. Later, in the mid-1990s while I was at CGG, I was fortunate to be an early user of the neural-network-based seismic waveform classification application “Stratimagic” for conventional plays. More recently I’ve been using neural network tools for facies classification and property prediction in unconventional plays. It is important to remember that machine learning methods are not miraculous black-box solutions – the old adage “garbage in, garbage out” very much applies.
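For context, below is a minimal sketch of the kind of neural-network facies classification referred to above, using scikit-learn on entirely synthetic attribute data; the attribute names, labels, and network size are illustrative assumptions, not a description of any specific internal tool.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic "samples": rows are trace locations, columns are seismic attributes
# (stand-ins for P-impedance, Vp/Vs, and a curvature attribute); labels are
# interpreter-assigned facies classes.
n_samples = 600
X = np.column_stack([
    rng.normal(7500, 800, n_samples),   # stand-in for P-impedance
    rng.normal(1.8, 0.15, n_samples),   # stand-in for Vp/Vs
    rng.normal(0.0, 1.0, n_samples),    # stand-in for a curvature attribute
])
# Toy labelling rule so the classifier has something learnable
y = (X[:, 0] > 7500).astype(int) + (X[:, 1] > 1.8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = make_pipeline(
    StandardScaler(),   # attributes have very different scales
    MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
# "Garbage in, garbage out": the held-out score is only as meaningful as the
# labels and attributes that went into training.
```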
You seem to be conversant with a range of software interpretation packages. Tell us which one you like the best and why.
I’m not going to endorse any package by name but I can make some general comments. A key feature for me is to be able to efficiently visualize the pre-stack seismic along with final image volumes. Another feature is the flexibility to incorporate and visualize 2D & 3D seismic and a diverse array of non-seismic information in a unified environment. It is also important that the interpretation package communicates with or is a component of a larger subsurface environment which encompasses reservoir modelling, geomechanics, reservoir engineering and petrophysics. On the interpretation side, as we achieve better seismic and its derivative attributes, it will become desirable for packages to include effective machine learning tools to assist with fault framework building, multi-attribute visualization optimization, seismic stratigraphy, geobody characterization, etc.
What motivates or influences you and your work?
I guess one common thread throughout my career has been to seek out challenging or novel projects. These force you to push yourself beyond your comfort zone and often require out-of-the-box thinking. I’ve enjoyed that my work has been stimulating and non-routine. I had my fill of non-stimulating, routine work when I first started – I was tasked with setting up acquisition geometry definitions for 2D land and marine processing (not using punched cards, but close). That drudgery prompted me to seek out and apply myself to more interesting projects.
Doug, with a long and varied professional experience in acquisition, processing and interpretation of seismic data, have you not thought of documenting these experiences for others to benefit from?
Good question. In fact, I have documented and shared my learnings for others, though unfortunately this has been mainly for internal audiences. In the early 1990s at CGG, just as the WWW and HTML were getting going, I recognized the value of using web pages with hyperlinks (an early form of wiki) to document and share information and workflows with my colleagues. Since joining Chevron, I have been active in presenting talks at our internal, international geoscience meetings. Up until the mid-2000s, Chevron hosted a dedicated internal geophysical conference every year, which had run continuously since the early 1940s; since then it has merged into a broader subsurface forum. For the external audience, I have contributed to several industry papers, either as a co-author or in other ways. For a variety of reasons, I have found it easier to share my work internally rather than externally.
If I were to ask you for your ‘off-the-cuff’ assessment, how would you rate yourself on a scale of 1 to 10, in terms of (a) communication skills (b) willingness to admit mistakes (c) extempore speaking and (d) awareness about the tricks of the trade?
With 10 being the highest, I would say (a) = 7, (b) = 9, (c) = 7, and (d) = 8. My wife may disagree with me on a few of these…
What has been your philosophy towards your professional growth?
I’m not sure if I have a philosophy per se – but I do have several observations. At some point most earth scientists will be faced with a decision on whether to stay on a more technically focused career or to move into a supervisor/manager role. In my case, each time I found myself drifting too far away from technical work, I made a conscious effort to get back to the technical side – this is more fulfilling to me. It should go without saying that life-long learning is a key to growth. Embrace every type of learning that you have access to: textbooks, journals, classes, workshops, mentors, and working with colleagues. This feeds into the idea that an open mind is more prepared to take advantage of opportunities that may present themselves. And if opportunities are not forthcoming, then be a catalyst to generate them.
Would you consider yourself an extrovert or an introvert? A follow-up question then is: it is usually said that ‘introverts make good leaders’. How would you react to that statement?
I have taken the Myers-Briggs personality assessment and a few others and usually come out as a solid introvert. Regarding the idea that introverts make good leaders, I agree – but that is not to say that they are necessarily better than extroverts. I also believe it depends on the personalities of those being led. I’ve read that for employees needing direction, extroverts can be most effective, while employees who are more proactive respond better to the leadership style of introverts. Both introverts and extroverts have equally important roles to play in the leadership of any organization.
What are your other interests?
I love spending time outdoors with family and friends. I’m drawn to most forms of self-propelled recreation, including hiking, snowshoeing, cross-country skiing, road biking, mountain biking, fat biking, and canoeing/kayaking. Since returning to Canada, I’ve been promoting cycling as a viable and beneficial form of year-round, active commuting. I also fundraise for causes such as the Alberta Cancer Foundation and the MS Society. I read science fiction and non-fiction. Music – both as a listener and a performer (I’ve played saxophone professionally in the past; in fact, that is where I first met Rob Stewart – we played sax in the same band). And travelling and photography.
What would be your message for young entrants in our industry?
Our industry is entering a new, transitional phase unlike anything we’ve seen in the past 50 years. While it is difficult to predict the exact pace and detailed changes this transition will entail, very few question that it is underway and will manifest itself over the coming few decades. Young entrants must be prepared to adapt as they pilot their careers through these significant changes. I’m reminded of one article which attempted to describe some of the key drivers (and strategies) for this transition – to paraphrase: 1) resource abundance leading to a critical focus on cost, efficiency, and speed; 2) disruptive technical advances changing how we work and interact with our machines and the vast volumes of data; and 3) demographic shifts impacting the work environment and society’s view of the oil and gas industry at a fundamental level. While oil and gas consumption is not going away overnight, we must acknowledge society’s need and desire to move to low-carbon energy and the profound impact this will have on our industry over the span of a young entrant’s career.