Mostafa Naghizadeh received a BSc in mining engineering from the University of Kerman, Iran, and an MSc in geophysics from the University of Tehran in 2003. He received his PhD in geophysics from the University of Alberta in 2009. He worked as a postdoctoral researcher with CREWES at the University of Calgary from January 2010 until December 2011. His interests are in seismic data reconstruction methods, sampling theory, and seismic imaging. He currently holds a postdoctoral fellow position with the Signal Analysis and Imaging Group (SAIG) at the University of Alberta. In 2011, Mostafa received the J. Clarence Karcher Award from the SEG.
If I were to answer the question ‘Which single mathematical idea has contributed most to the modern world of technology?’, I would reply, without any hesitation, ‘The Fourier transform’. No one can deny the great role the Fourier transform has played in telecommunications, the electronics industry, and signal-processing methods over the past century. It is therefore no surprise that the Fourier transform also has an indispensable part to play in geophysical data analysis. While the presence of Fourier can be felt in every discipline of geophysics, here are some highlights from exploration seismology:
Data acquisition. Almost all of the electronic and mechanical devices used in field data acquisition are designed and tested using Fourier analysis. The recorded signals also have to be checked with Fourier analysis to ensure that a proper bandwidth of data has been acquired. Bandwidth is directly related to the strength and resolution of a signal; it is a good measure of the signal’s capacity to recover information from the desired depths of the subsurface. The Fourier transform also plays a prominent part in data compression and storage.
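As a minimal sketch of such a bandwidth check, the snippet below computes the amplitude spectrum of a synthetic trace with NumPy’s FFT and reports where it exceeds a chosen level; the wavelet frequency, sampling interval, and -20 dB criterion are hypothetical illustration choices, not field parameters.

```python
import numpy as np

# Synthetic trace: a 30 Hz Ricker wavelet sampled at 2 ms (illustrative values only)
dt = 0.002                      # sampling interval in seconds
t = np.arange(-0.128, 0.128, dt)
f0 = 30.0                       # dominant frequency in Hz
trace = (1 - 2*(np.pi*f0*t)**2) * np.exp(-(np.pi*f0*t)**2)

# Amplitude spectrum via the fast Fourier transform
spectrum = np.abs(np.fft.rfft(trace))
freqs = np.fft.rfftfreq(trace.size, d=dt)

# A simple bandwidth estimate: frequencies where the amplitude stays within
# 20 dB of the peak amplitude
threshold = spectrum.max() * 10**(-20/20)
band = freqs[spectrum >= threshold]
print(f"Approximate -20 dB bandwidth: {band.min():.1f} to {band.max():.1f} Hz")
```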
Noise elimination. The collected seismic data are often contaminated with random or coherent noise. In the traditional time-space coordinates in which seismic data are recorded, one cannot easily distinguish between noise and signal. However, by transforming the data to the frequency-wavenumber domain using the Fourier transform, a significant portion of the noise separates from the signal and can be eliminated easily.
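A minimal sketch of this idea is shown below: a dip (apparent-velocity) mute applied in the frequency-wavenumber (f-k) domain of a synthetic record. The grid sizes, velocities, and 1500 m/s cut-off are hypothetical choices meant only to illustrate separating energy by dip.

```python
import numpy as np

# Synthetic 2-D record: time x offset (illustrative sizes and values only)
nt, nx, dt, dx = 512, 64, 0.004, 25.0
t = np.arange(nt)[:, None] * dt
x = np.arange(nx)[None, :] * dx

# "Signal": a fast event (4000 m/s); "noise": a slow, steeply dipping event (800 m/s)
def event(v, t0, f0=25.0):
    arg = np.pi * f0 * (t - t0 - x / v)
    return (1 - 2*arg**2) * np.exp(-arg**2)          # Ricker wavelet along a linear moveout

data = event(4000.0, 0.4) + event(800.0, 0.2)

# Transform to the f-k domain
fk = np.fft.fft2(data)
f = np.fft.fftfreq(nt, d=dt)[:, None]                # temporal frequencies (Hz)
k = np.fft.fftfreq(nx, d=dx)[None, :]                # spatial wavenumbers (1/m)

# Dip mute: keep energy with apparent velocity |f/k| above 1500 m/s,
# which passes the fast event and rejects the slow "noise"
v_app = np.abs(f) / np.maximum(np.abs(k), 1e-12)
mask = v_app >= 1500.0
filtered = np.real(np.fft.ifft2(fk * mask))
print("energy before/after:", np.sum(data**2), np.sum(filtered**2))
```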
Interpolation. Seismic records contain specific features of simplicity, such as a stable wavelet and spatial continuity of events. This simplicity has even more interesting consequences in the Fourier domain. Seismic data composed of linear events will have only a few non-zero wavenumber coefficients at each frequency. This sparsity, as it is called, is used in most seismic data interpolation methods to produce a regular spatial grid from irregular spatial sampling scenarios. Also, a simple relationship between the low and high frequencies of seismic records helps to overcome spatial aliasing by effectively reducing the spatial sampling interval.
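Below is a minimal sketch of how that sparsity can be exploited: a simple iterative thresholding (POCS-style) loop in the f-k domain that fills in randomly missing traces. The decimation pattern, threshold schedule, and iteration count are hypothetical choices for illustration, not a specific published recipe.

```python
import numpy as np

# Synthetic section of two linear events (illustrative sizes only)
nt, nx, dt, dx = 256, 64, 0.004, 12.5
t = np.arange(nt)[:, None] * dt
x = np.arange(nx)[None, :] * dx
ricker = lambda arg: (1 - 2*arg**2) * np.exp(-arg**2)
data = ricker(np.pi*30*(t - 0.3 - x/3000)) + ricker(np.pi*30*(t - 0.6 + x/5000))

# Randomly remove about half of the traces to simulate irregular sampling
rng = np.random.default_rng(0)
live = rng.random(nx) > 0.5
observed = data * live[None, :]

# POCS-style interpolation: threshold in f-k, then re-insert the known traces
recon = observed.copy()
for it in range(50):
    fk = np.fft.fft2(recon)
    thresh = np.abs(fk).max() * (1.0 - it / 50.0)      # linearly decreasing threshold
    fk[np.abs(fk) < thresh] = 0.0                       # keep only the strong (sparse) coefficients
    recon = np.real(np.fft.ifft2(fk))
    recon[:, live] = observed[:, live]                  # honour the recorded traces

err = np.linalg.norm(recon - data) / np.linalg.norm(data)
print(f"relative reconstruction error: {err:.3f}")
```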
Time-frequency analysis. Seismic traces can exhibit non-stationary properties; in other words, the frequency content of the data varies with arrival time. Likewise, spatially curved seismic events have non-stationary wavenumber content. In order to analyse these signals, time-frequency analysis is required. Several transforms, such as the wavelet, Gabor, Stockwell, and curvelet transforms, have been introduced to perform these analyses. In essence, all of these transforms are windowed inverse Fourier transforms applied to the Fourier representation of the original data. These Fourier-based transforms can be used for interpolation and noise elimination of very complex seismic records.
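As a minimal sketch of windowed Fourier analysis, the snippet below applies a short-time Fourier transform (standing in for the Gabor/Stockwell family) to a hypothetical non-stationary sweep and tracks how its dominant frequency changes with time; the sweep, window length, and overlap are illustration choices.

```python
import numpy as np
from scipy.signal import stft

# A non-stationary synthetic trace: frequency increases with time (illustrative only)
dt = 0.002
t = np.arange(0, 2.0, dt)
trace = np.sin(2*np.pi * (10*t + 15*t**2))        # chirp sweeping roughly 10-70 Hz

# Short-time Fourier transform: a sliding-window Fourier analysis
freqs, times, S = stft(trace, fs=1.0/dt, nperseg=128, noverlap=96)

# The dominant frequency in each window tracks the sweep
dominant = freqs[np.argmax(np.abs(S), axis=0)]
print("dominant frequency near t=0.2 s:", dominant[np.searchsorted(times, 0.2)])
print("dominant frequency near t=1.8 s:", dominant[np.searchsorted(times, 1.8)])
```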
Migration and imaging. Migration of seismic records is a necessary processing step to obtain the true depth and dip information of seismic reflections. The Fourier transform is a very common mathematical tool for solving wave-equation problems. Many wave-equation migration techniques deploy multidimensional fast Fourier transforms inside a least-squares fitting algorithm for imaging the subsurface. Fourier-based wave-equation migration algorithms also avoid a common pitfall of ray-tracing migration algorithms known as ray crossing.
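Below is a minimal sketch of the Fourier-domain ingredient used by such algorithms: constant-velocity phase-shift extrapolation of a wavefield one depth step downward. The velocity, grid, and depth step are hypothetical, and a full migration would loop this over depth and extract an image at each step.

```python
import numpy as np

# Hypothetical recorded wavefield: time x space (illustrative values only)
nt, nx, dt, dx = 256, 128, 0.004, 12.5
v = 2000.0                                   # constant velocity (m/s), assumed
dz = 10.0                                    # depth step (m), assumed
wavefield = np.random.default_rng(0).normal(size=(nt, nx))   # stands in for recorded data

# Transform to the frequency-wavenumber domain
D = np.fft.fft2(wavefield)
f = np.fft.fftfreq(nt, d=dt)[:, None]        # temporal frequency (Hz)
kx = np.fft.fftfreq(nx, d=dx)[None, :]       # horizontal wavenumber (1/m)

# Vertical wavenumber from the dispersion relation; evanescent energy is muted
omega = 2*np.pi*f
kz2 = (omega / v)**2 - (2*np.pi*kx)**2
propagating = kz2 > 0
kz = np.sqrt(np.where(propagating, kz2, 0.0))

# Phase-shift extrapolation: one complex multiplication per (f, kx) pair
D_down = np.where(propagating, D * np.exp(-1j * kz * dz), 0.0)
wavefield_at_dz = np.real(np.fft.ifft2(D_down))
print(wavefield_at_dz.shape)
```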
Q&A:
In your article you mention Fourier analysis being used just about everywhere in geophysical data analysis, encompassing seismic data acquisition, noise elimination, interpolation, time-frequency analysis, as well as migration and imaging. Can you think of another mathematical tool that has such versatile application in geophysics?
I think the most important mathematical tool in geophysics, especially seismic data processing, is summation. Nothing replaces the power of stacking for enhancing the signal. In some sense, the Fourier transform is also a combination of summations. In addition to the Fourier transform, I like the Radon transform, which is a kind of directional summation. The Radon transform provides the ability to separate entangled seismic events based on their slope and curvature, which is very beneficial in seismic processing. What makes the Fourier transform special is its orthogonality, i.e. the inverse and adjoint Fourier transforms are equivalent. The Radon transform, however, is not an orthogonal transform, and applying it requires extra care. To better understand the importance of forward and adjoint operators, I recommend reading Jon Claerbout’s online book “Basic Earth Imaging”.
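As a small numerical illustration of that orthogonality claim, the sketch below runs a dot-product test on a unitary-scaled FFT and checks that its adjoint coincides with its inverse; the unitary scaling (norm="ortho") is an assumption made to keep the transform orthogonal in the strict sense.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256

# Unitary-scaled forward and adjoint FFTs (norm="ortho" makes the transform orthogonal)
forward = lambda x: np.fft.fft(x, norm="ortho")
adjoint = lambda y: np.fft.ifft(y, norm="ortho")      # for a unitary F, adjoint == inverse

# Dot-product test: <F x, y> should equal <x, F^H y> for any x, y
x = rng.normal(size=n)
y = rng.normal(size=n) + 1j * rng.normal(size=n)
lhs = np.vdot(forward(x), y)          # note: vdot conjugates its first argument
rhs = np.vdot(x, adjoint(y))
print("dot-product test mismatch:", abs(lhs - rhs))   # numerically zero

# And the adjoint really is the inverse: F^H(F x) returns x
print("inverse == adjoint check:", np.allclose(adjoint(forward(x)), x))
```

A non-orthogonal operator such as a Radon transform still passes the dot-product test for its true adjoint, but applying the adjoint does not recover the input, which is why its inversion needs the extra care mentioned above.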
Your PhD work focused on seismic data regularization issues and how they can be addressed. What do you believe has been your succinct message?
My PhD work focused on interpolating multidimensional seismic records. I tried to come up with robust reconstruction algorithms to interpolate severely aliased, irregularly sampled, and curved seismic events. To achieve this goal I utilized various mathematical tools such as prediction filters, the Fourier transform, the S-transform, and the curvelet transform. In brief, my PhD work showed that the low-frequency portion of seismic records can be utilized to overcome severe aliasing in interpolation applications.
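A small worked check of that point: for a linear event with slope p (in s/m) recorded with trace spacing dx, the wavenumber at temporal frequency f is k = p·f, so spatial aliasing only begins above f = 1/(2·dx·|p|); below that frequency the wavenumber spectrum is clean and can guide reconstruction of the aliased band. The slope and spacing in the sketch are hypothetical.

```python
# Spatial aliasing onset for a linear event (illustrative numbers only)
dx = 50.0            # trace spacing in metres (assumed)
p = 1.0 / 1500.0     # event slope in s/m, i.e. apparent velocity 1500 m/s (assumed)

k_nyquist = 1.0 / (2.0 * dx)          # Nyquist wavenumber in cycles/m
f_alias = k_nyquist / abs(p)          # frequency where k = p*f reaches Nyquist

print(f"Nyquist wavenumber: {k_nyquist:.4f} cycles/m")
print(f"Aliasing starts at about {f_alias:.1f} Hz")   # ~15 Hz for these numbers
# Frequencies below f_alias are unaliased and can steer the interpolation
# of the aliased frequencies above it.
```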
While it is generally said that 5D interpolation for predicting missing data is not a substitute for field data, 5D interpolated data still has advantages. Could you elaborate on these?
Many seismic data processing sequences, such as multiple elimination and migration algorithms, could benefit from data that are regularly distributed and sampled on a sufficiently fine grid. However, you can rarely find real data that meet these conditions. Therefore, having a mathematically robust intermediate step, like a 5D interpolation algorithm that can create regularly sampled data from real data, is advantageous. Some might say that the irregularity and aliasing issues can be addressed by designing proper operators and weights inside migration algorithms. But this statement ignores the non-linear nature of interpolation algorithms, which are applied using several adaptive iterations. Expecting to interlace such a non-linear operation into an already complicated migration algorithm is a bit ambitious, and it will not be as effective as applying them separately. Yes, interpolation does not add any new information, but the same might be asserted about migration too. Seismic data go through numerous mathematical operations before they are finally transformed into a usable image. It is important to make sure that these mathematical tools do not leave unwanted footprints on the final image. Interpolation methods alleviate these kinds of harmful artifacts and footprints.
5D interpolation techniques have been applied to seismic data where field data have also been acquired, and the two compared. The cited examples look favourable, but what do you think needs to be done to do an accurate job with the former?
The ultimate purpose of interpolation algorithms is to estimate data at new positions using the available data. The estimated data should resemble, as closely as possible, the real data we would have obtained had we been able to acquire them at those locations. To achieve this goal it is crucial to know the distribution, complexity, and severity of aliasing of the original real data. Once you completely understand the condition of the original data and the issues that need to be resolved, you can go ahead and pick the proper interpolation technique. It is also important to be aware of artifacts that might leak into the data due to the nature of the mathematical tools used in interpolation algorithms. For example, with the Fourier transform, edge effects (the Gibbs phenomenon) need to be addressed carefully by proper zero-padding or tapering of the data at the boundaries. Feeding aliased data to an interpolation algorithm that does not have anti-aliasing capability, or irregularly sampled data to an interpolation method that cannot handle non-uniform sampling, could result in degraded interpolated volumes. So you need a full understanding of both your data and your interpolation methods to create a decent and accurate interpolated volume.
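A minimal sketch of the boundary treatment mentioned above: tapering the edges of a trace and zero-padding it before the FFT to reduce edge artifacts. The taper length, padding factor, and test frequency are arbitrary illustration choices.

```python
import numpy as np

# A sinusoid that does not complete an integer number of cycles in the window,
# so a plain FFT sees an artificial jump at the edges (Gibbs / spectral leakage)
dt = 0.002
t = np.arange(0, 1.0, dt)
trace = np.sin(2*np.pi*23.7*t)

# Cosine taper over the outer 10% of samples, then zero-pad to twice the length
ntaper = trace.size // 10
taper = np.ones_like(trace)
ramp = 0.5 * (1 - np.cos(np.pi * np.arange(ntaper) / ntaper))
taper[:ntaper], taper[-ntaper:] = ramp, ramp[::-1]
prepared = np.pad(trace * taper, (0, trace.size))

# Fraction of spectral energy leaked more than 5 Hz away from the true 23.7 Hz
def leakage(signal):
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=dt)
    far = np.abs(freqs - 23.7) > 5.0
    return np.sum(spec[far]**2) / np.sum(spec**2)

print("leakage without taper:", leakage(trace))
print("leakage with taper   :", leakage(prepared))   # noticeably smaller
```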
What are some of the challenges that still exist in terms of seismic data regularization? Do you see light at the end of the tunnel?
I think interpolation methods have been developed to near perfection with all the research conducted on this topic in recent years. The issue is no longer the interpolation methods but rather finding data acquisition designs that are compatible with the latest advances in sparse Fourier reconstruction algorithms. I think it is time to come up with seismic acquisition layouts that are specifically designed with the requirements of 5D interpolation methods in mind. For example, traditional orthogonal acquisition designs are not suitable for sparse Fourier reconstruction algorithms; sparse Fourier interpolation algorithms prefer a random distribution of the available data samples for robust reconstruction. It is also possible to find semi-regular patterns that are suitable for multidimensional sparse Fourier interpolation methods. I have shown an example of the latter in my article “Sparsity and Band-limitation: Two sides of the same coin” in the December 2013 issue of the Recorder.
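A small illustration of why random sampling is preferred: regularly decimating a signal folds its energy into coherent alias spikes, whereas randomly keeping the same number of samples spreads the missing-data artifacts into weak, noise-like energy that sparsity-promoting reconstruction can suppress. The signal and decimation factor below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 512
x = np.arange(n)
signal = np.cos(2*np.pi*20*x/n) + 0.5*np.cos(2*np.pi*56*x/n)   # two clean "spatial" harmonics

# Keep one sample in four, either on a regular grid or at random positions
regular = np.zeros(n)
regular[::4] = signal[::4]
mask = np.zeros(n, dtype=bool)
mask[rng.choice(n, size=n//4, replace=False)] = True
random_keep = np.where(mask, signal, 0.0)

def top_peaks(y, k=6):
    spec = np.abs(np.fft.rfft(y))
    return np.sort(spec)[-k:][::-1].round(1)

print("full signal peaks :", top_peaks(signal))
print("regular decimation:", top_peaks(regular))      # strong alias spikes appear
print("random decimation :", top_peaks(random_keep))  # leakage is weak and noise-like
```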
Some of the challenges with seismic data that have been enunciated include, among others, achieving high resolution in the acquired data. Similarly, noise elimination, amplitude-friendly migration, and determination of accurate velocity fields are some other challenges. We have made progress in all these areas, but with increasing expectations from seismic data, such progress has fallen short. What is your take on these challenges?
While seismic is the most accurate and reliable geophysical prospecting method for oil exploration, there is still a limit on how much information we can squeeze out of seismic data. So maybe we need to adjust some of our expectations of seismic data. However, as you pointed out correctly, it is crucial to acquire the best quality seismic data from the field. The success of seismic processing methods relies heavily on the quality of the input data.
In the first edition of his book on seismic data processing, Oz Yilmaz had mentioned that one challenge in seismic data processing is to select an appropriate set of parameters in the processing sequence and another one is to evaluate the resulting output from each processing step. In your expert opinion, is this being done in our industry?
I believe so. Our industry benefits from very skilled and talented seismic data processors who take extreme care to get the best quality output from every seismic data processing step. It is also very important to have robust and comprehensive quality control (QC) measures to achieve the best results from processing techniques. A seismic data processor should have sufficient knowledge about the assumptions and prerequisites of data processing sequences.
In the company you work for, or in our industry at large, do you think borehole data, i.e. well-log curves and VSP data, are being used appropriately during processing so that the processed data represent an accurate cross-section or volume of the subsurface, which in turn can lead to a more meaningful interpretation?
Definitely yes. Borehole data, with their important core and well-log information, provide the much-needed calibration component for seismic data. The depth information in seismic data cannot be trusted until the data are tied to wells. Horizon-top information from well logs is very important for initial velocity model building in depth migration of seismic data, as well as for subsequent velocity updates and corrections. Borehole data become even more important for AVO inversion tasks in quantitative interpretation (QI).
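As a minimal sketch of the calibration step being described, the snippet below turns a hypothetical well-log impedance profile into reflection coefficients and convolves them with a wavelet to produce the synthetic trace that is compared against the seismic at the well; all numbers are illustrative.

```python
import numpy as np

# Hypothetical blocky acoustic impedance log, already converted to two-way time
dt = 0.002
n = 400
impedance = np.full(n, 4.0e6)           # (kg/m^3)*(m/s), illustrative values
impedance[150:260] = 5.5e6              # a faster/denser interval
impedance[260:] = 4.8e6

# Reflection coefficients from impedance contrasts
rc = np.zeros(n)
rc[1:] = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

# 25 Hz Ricker wavelet (assumed), convolved with the reflectivity -> synthetic trace
tw = np.arange(-0.1, 0.1, dt)
wavelet = (1 - 2*(np.pi*25*tw)**2) * np.exp(-(np.pi*25*tw)**2)
synthetic = np.convolve(rc, wavelet, mode="same")

print("strongest reflections at samples:", np.argsort(np.abs(rc))[-2:])
print("synthetic trace length:", synthetic.size)
```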
On the lighter side, Mostafa, why do you think common sense is called common when it is not common?
The term “common sense” was coined by philosophers in the early stages of human civilization to rule some “obvious” concepts out of debate. It is based on the presumption that our senses are reliable. With the advancement of modern science, we now know that none of our senses are reliable, and we cannot rely on them to push the frontier of our knowledge! So “common sense” no longer makes sense. We can still use the phrase, but it should be used to point to scientifically proven concepts. For example, it should be common sense that evolution by natural selection is a fact. But a large percentage of people, even in the most developed countries, deny it! So until people take science seriously and give her the respect she deserves, “common sense” will be a rare phenomenon in this millennium.