It’s a little cliché to say, but technology is changing so fast that it’s difficult to keep up. Computer storage, processing speeds and visualization capabilities continue to grow exponentially; “integration” (however difficult it is to define) is the dream of most geoscience departments, and generic “big data” analysis techniques are evolving rapidly – this is indeed an exciting time to be a geophysicist, a scientist, a human being.
Geoscientists are the Google Translate of the oil industry; our job is to translate the coded information in the data we gather into English (or another language of choice). We used to do this with creatively hand-drawn and coloured maps and a generous amount of intuition – now we use sophisticated algorithms and powerful visualization, although intuition and creativity are still most definitely required. With all of these new tools, what is the current state of our effectiveness as translators? What are some of the exciting developments? What is the future vision, and how do we get there?
Most companies and research groups are working, to some degree, to incorporate multiple data types (such as seismic data, well logs, production information and microseismic) in their interpretations of, and predictions from, the data they acquire; the more pieces of the puzzle we add, the clearer the picture. Integration methods range from simple visual comparisons of maps made from two different data sources to numerical modelling and statistical procedures at a basic data level. Quantitative Interpretation (QI) is a broad approach that encompasses many linked techniques in its aim to extract geological properties from seismic data. These geological properties can then be fed into analytical methods to determine the key factors in predicting the future performance of a hypothetical well or field.
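To make “integration at a basic data level” a little more concrete, the sketch below joins hypothetical well measurements with seismic attributes extracted at the same well locations, which is the usual starting point for any statistical comparison between data types. This is a minimal illustration only: the column names, values and the use of pandas are assumptions, not a description of any particular company’s workflow.

```python
# Minimal sketch of integration at a basic data level: joining hypothetical
# well results with seismic attributes extracted at the well locations.
# All names and numbers are illustrative, not from any real project.
import pandas as pd

# Hypothetical well results (e.g., average porosity over a zone of interest)
wells = pd.DataFrame({
    "well":     ["W-1", "W-2", "W-3"],
    "porosity": [0.08, 0.12, 0.10],
})

# Hypothetical seismic attributes at the same well locations
seismic = pd.DataFrame({
    "well":      ["W-1", "W-2", "W-3"],
    "impedance": [9800.0, 8900.0, 9300.0],  # inverted acoustic impedance
    "vp_vs":     [1.95, 1.80, 1.88],        # Vp/Vs ratio from prestack inversion
})

# One table per well: the common ground on which the data types can be compared
merged = wells.merge(seismic, on="well")
print(merged.corr(numeric_only=True)["porosity"])
```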
QI involves a series of analysis steps, each requiring input datasets, mathematical functions and parameter selections. Choices made at each point affect the outcome to some extent, so the more experienced the practitioner – the more times they have seen a particular situation and learned from the results – the better the choices and the higher the chance of a satisfactory outcome. By satisfactory, I mean a realistic prediction of geology from seismic data that not only matches the existing wells, but also predicts geological conditions in an undrilled location that turn out to be correct. This is an accepted and effective process that has been adopted around the world to reduce exploration and development risk.
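One simple, deliberately idealized way to test whether a prediction would hold up at an “undrilled location” is blind-well validation: fit the attribute-to-property relationship on all wells but one, then check the prediction at the held-out well. The sketch below shows the idea with synthetic data and scikit-learn; the attribute, the property and the linear relationship are all assumptions made for illustration.

```python
# Minimal sketch of blind-well (leave-one-out) validation: fit a simple
# attribute-to-property relationship on all wells except one, then predict
# the held-out "undrilled" well. Data are synthetic; in practice the same
# idea is used to test parameter choices and workflow decisions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
impedance = rng.uniform(8500, 10500, size=12)                    # hypothetical seismic attribute
porosity = 0.35 - 2.5e-5 * impedance + rng.normal(0, 0.01, 12)   # synthetic "truth" at the wells

X = impedance.reshape(-1, 1)
errors = []
for train_idx, test_idx in LeaveOneOut().split(X):
    model = LinearRegression().fit(X[train_idx], porosity[train_idx])
    prediction = model.predict(X[test_idx])
    errors.append(abs(prediction[0] - porosity[test_idx][0]))

print(f"mean blind-well porosity error: {np.mean(errors):.3f}")
```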
Much of the credit for the degree of success of a QI outcome can be attributed to factors beyond our control, such as the inherent elastic contrasts and intrinsic properties of the rock, the conditions in the near surface (on land or in water), or the weather on the day the seismic data were acquired. However, if we look beyond face value, the seismic data – regardless of quality – always contain more information than we think. QI encompasses the best methods currently available to dig deeper and reveal that hidden information. So, how do we make it better?
Aside from ongoing improvements in the theory, we make QI better by increasing the quality of the inputs, ensuring the appropriateness of the assumptions underlying the mathematical functions, testing the correctness of the parameter choices, and doing everything faster than ever. This plan sounds straightforward, but it’s almost never obvious how to make these workflow improvements. This is where big data analytical techniques, together with corresponding increases in computer processing speed and capability (eventually even quantum computing), can be introduced. Seismic data have always been big, but seismic analysis is mostly linear: the output from one process is the input to the next. Analytical techniques allow the kind of lateral analysis that geoscientists are only just starting to touch on. Statistics and machine learning are much more mathematical than most of us are comfortable with, and the approach doesn’t necessarily come easily to geoscientists who are used to seeing a direct cause and effect in their analysis. However, as long as we maintain a good balance of objective mathematical process and subjective geological sense, this new direction should reveal new insights and enhanced efficiencies and, perhaps most importantly, be a catalyst for integration.
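As a small illustration of that “lateral” analysis, the sketch below trains a random forest on several seismic attributes at once and then inspects which attributes the model leans on – one point where the objective mathematics and the interpreter’s geological judgement can meet. Everything here is synthetic; the attribute names, the choice of a random forest and the use of scikit-learn are assumptions for the sake of the example, not a recommended recipe.

```python
# Minimal sketch of multi-attribute ("lateral") analysis with machine learning:
# several seismic attributes are used together to predict a rock property, and
# feature importances indicate which attributes carry the predictive weight.
# All data are synthetic and the attribute names are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 200  # pseudo well-tie samples

attributes = {
    "impedance": rng.normal(9500, 400, n),
    "vp_vs":     rng.normal(1.9, 0.08, n),
    "amplitude": rng.normal(0.0, 1.0, n),
    "frequency": rng.normal(30.0, 5.0, n),
}
X = np.column_stack(list(attributes.values()))

# Synthetic target: mostly controlled by impedance and Vp/Vs, plus noise
porosity = (0.4 - 2.0e-5 * attributes["impedance"]
            - 0.05 * (attributes["vp_vs"] - 1.9)
            + rng.normal(0, 0.01, n))

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, porosity)
for name, importance in zip(attributes, model.feature_importances_):
    print(f"{name:10s} {importance:.2f}")
```

In this toy setup the importances should point back to impedance and Vp/Vs, which is the kind of sanity check – does the mathematics agree with the geology? – that keeps the subjective and objective sides in balance.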
But how do we get there? It’s hard to argue with the potential benefits of a more complete and thorough analysis of the range of available data, but there is plenty of debate about appropriate and effective procedures, near-term objectives and, in a business environment, the best use of limited money. Shortcuts are tempting. Instead of saving money, however, shortcuts usually expose large gaps or inaccuracies in our knowledge – which is not always a bad thing. Collaboration, integration, models of all kinds (scientific and business) and a little bit of faith are therefore necessary to understand, effectively communicate and eventually achieve the ultimate benefits of a significant paradigm shift.
My presentation will not necessarily answer all the questions posed in this abstract, but there will be explanations, examples and opinions.