Of all the innovations to take place in land processing over the past 20 years, I believe that 5D interpolation has emerged as the single most important algorithm. Whether or not you agree with me, you cannot deny the importance of 5D interpolation (or “5D” for short) in today’s production processing flows. Along with fame, of course, comes a cost. In the case of 5D, I believe a significant price has been paid in recent years in terms of algorithm over-hype, with the end result that many practitioners have experienced disappointment as 5D failed to deliver on certain unrealistic promises. Today, however, I believe that most of us have a well-grounded set of expectations surrounding 5D’s realm of utility. For example, we all seem to agree that 5D is well suited to minimizing PSTM artifacts; likewise, most of us now appreciate that 5D is incapable of improving resolution by somehow synthesizing new geological information from existing sparse field soundings. Still, if you consider those last two examples as end-members of a spectrum of potential algorithm applicability, you’ll surely acknowledge that debate and confusion persist regarding precisely where the tool resides on that spectrum. This Focus session will hopefully shed some light on this question, as the following articles are aimed at gauging 5D algorithm effectiveness via practical field and synthetic experiments and decimation testing.
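
Since several of the articles lean on decimation testing, a brief illustration may help readers unfamiliar with the idea: remove traces from a well-sampled data set, re-create them by interpolation, and measure how well the originals are recovered. The Python sketch below is purely illustrative; the linear trace interpolator is a toy stand-in for a real 5D algorithm, and all names are my own invention:

```python
# Toy decimation test: remove traces from a well-sampled gather,
# re-create them with a (toy) interpolator, and score the result.
import numpy as np

rng = np.random.default_rng(0)
n_traces, n_samples = 48, 256
x = np.arange(n_traces, dtype=float)   # trace coordinate (e.g. CMP position)
t = np.arange(n_samples, dtype=float)  # time sample index

# Synthetic "acquired" data: a dipping event plus weak random noise.
data = np.sin(2 * np.pi * (t[None, :] - 2.0 * x[:, None]) / 40.0)
data += 0.05 * rng.standard_normal(data.shape)

# Decimate: keep every third trace, a crude stand-in for sparse acquisition.
kept = np.arange(0, n_traces, 3)

# "Interpolate" back onto the full grid, one time sample at a time
# (a real 5D algorithm would work in all spatial dimensions at once).
recon = np.empty_like(data)
for i in range(n_samples):
    recon[:, i] = np.interp(x, x[kept], data[kept, i])

# Score the test: normalized RMS error on the traces that were removed.
missing = np.setdiff1d(np.arange(n_traces), kept)
nrms = (np.linalg.norm(recon[missing] - data[missing])
        / np.linalg.norm(data[missing]))
print(f"NRMS error on reconstructed traces: {nrms:.3f}")
```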

The first article is by Daniel Trad, whose pioneering contributions in extending the so-called MWNI kernel to five dimensions laid the groundwork for all the 5D algorithmic enhancements that have followed. This paper provides a contextual backdrop for the other articles by giving us a snapshot of the current state of the art in 5D, as well as by providing a list of outstanding challenges and likely future directions. In particular, Trad discusses 5D’s ability to preserve subtle effects like AVO, azimuthal velocity variation, and diffractions. He also discusses mitigation of errors related to binning irregular data onto a regular grid and optimal selection of output target geometries and input data processing blocks, and he describes a strategy for improving interpolation of noisy data via incorporation of a clean “guide” data set. Although his figures show that a sound algorithmic implementation can produce great results, he also comments that “traces created from data interpolation contain only a portion of the information that a real acquired seismic trace would have…”, a remark consistent with the notion of 5D’s realm of applicability residing somewhere on the spectrum mentioned above.
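
As an aside, the binning errors Trad mentions are easy to picture with a toy example: when irregularly positioned traces are snapped to the nearest centre of a regular grid, each trace inherits a positioning error bounded by half the bin size. The Python sketch below is only a schematic of that nearest-bin assignment, with all names invented for illustration:

```python
# Toy illustration of binning irregular midpoints onto a regular grid:
# each trace is assigned to the nearest bin centre, and the residual
# (trace position minus bin centre) is the positioning error that a
# regularization step must account for.
import numpy as np

bin_size = 25.0  # bin interval in metres (illustrative value)
origin = 0.0     # grid origin
rng = np.random.default_rng(1)
midpoints = origin + rng.uniform(0.0, 1000.0, size=200)  # irregular positions

bin_index = np.round((midpoints - origin) / bin_size).astype(int)
bin_centre = origin + bin_index * bin_size
residual = midpoints - bin_centre  # signed binning error per trace

print(f"max |binning error| = {np.abs(residual).max():.1f} m "
      f"(bound is bin_size/2 = {bin_size / 2:.1f} m)")
```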

The second article, by Duncan et al., treats the topic of interpolation effectiveness as a function of input acquisition geometry. The authors examine an orthogonal survey and a parallel survey of similar field effort, and they study the evolution of the image through the 5D process. Real data results before and after 5D are examined for both acquisition types in order to assess the degree to which AVO and AVAz responses are preserved post-interpolation. Migrated stacks with and without 5D are also compared to one another, and to synthetics at a well tie. The authors emphasize that many QC techniques exist for appraising algorithm performance, and they illustrate some of the challenges associated with interpreting the results of any one technique. They also make the very important point that not all 5D interpolations are created equal, since implementation details undoubtedly vary from contractor to contractor.

The third article, by Charles et al., examines some coarsely acquired oilsands legacy 3D data. The authors observe that 5D serves as a useful tool for reducing migration noise on these legacy data, with the PSTM image showing uplift over its poststack-migrated counterpart; they also show that a subsequent anisotropic PSDM gave further improvements still. Despite 5D’s success at minimizing migration noise, they show that it was unable to recover fine details in the signal, as evidenced by a comparison between the post-5D PSTM volume and a high-resolution 2D test line, as well as by an additional comparison generated via decimation and interpolation of the high-resolution 2D data. Their carefully controlled experiments clearly reveal interpolation’s utility as a tool for minimizing migration artifacts, as well as its lack of the “magic” needed to overcome the need for fine sampling in the field. Like Duncan et al., they recognize that 5D interpolation quality may vary depending on the details of algorithm implementation.

The fourth article, by Bellman, examines the effects of imperfect spatial sampling on quantitative interpretation (QI) and explores whether such effects can be mitigated through prestack interpolation. The first part of her article features a synthetic modelling study in which the spatial sampling of a finely acquired control data set is perturbed in a controlled fashion. She concludes that the minimum offset available in each CMP gather has a significant influence on the quality of P- and S-impedance estimation. In the second part of her article, she decimates, then interpolates, a well-sampled real oilsands data set and submits the interpolated output to a QI workflow. Multiple QI realizations are performed, corresponding to the various decimated and interpolated volumes, and facies and fluid estimations are compared to well control. This analysis yields many specific conclusions, but her main takeaway is clear: sampling matters when it comes to QI, and we should try to improve it in any way possible.
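
For readers who want to reproduce the flavour of this kind of sampling QC, a natural starting point is a minimum-offset attribute: for each CMP bin, record the smallest source-receiver offset present, the quantity Bellman links to impedance-estimation quality. The Python sketch below is a toy illustration of that attribute only, with all names invented:

```python
# Toy minimum-offset QC attribute: for each CMP bin, record the smallest
# source-receiver offset present among its traces.
import numpy as np

rng = np.random.default_rng(2)
n_traces, n_bins = 5000, 100
cmp_bin = rng.integers(0, n_bins, size=n_traces)   # CMP bin index per trace
offset = rng.uniform(50.0, 3000.0, size=n_traces)  # source-receiver offset (m)

# Per-bin minimum via an unbuffered in-place reduction; bins with no
# traces would remain at inf.
min_offset = np.full(n_bins, np.inf)
np.minimum.at(min_offset, cmp_bin, offset)

print(f"worst-sampled bin: nearest offset {min_offset.max():.0f} m; "
      f"best-sampled bin: {min_offset.min():.0f} m")
```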


About the Author(s)

Mike Perz received his BSc in Physics from the University of Toronto in 1990 and his MSc in Geophysics from the University of British Columbia in 1993. Shortly thereafter, he joined Pulsonic Geophysical, where his main focus was writing prestack time migration software. Four years later he moved to Geo-X Systems Ltd, later part of Divestco Inc., where he spent thirteen years, first writing geophysical applications software and then managing the R&D group. In 2010, Mike moved over to Arcis Corporation, where he was employed as Vice President of Technology and Integration at the time the organization was purchased by TGS. His current title is Manager Technology, Arcis Processing and Reservoir Services, and his role is two-tiered, focusing on both geophysical software and business development. His research interests span all elements of land processing, including data regularization, prestack migration, fracture detection, deconvolution, tomostatics and multiple attenuation.
