Matt Hall is the founder of Agile Geoscience. A sedimentologist who found geophysics later in his career, Matt has worked at Statoil in Stavanger, Norway, Landmark and ConocoPhillips in Calgary, Alberta, and is now running Agile from a coworking space he co-founded in Mahone Bay, Nova Scotia. He is passionate about communicating science and technology, and especially about putting specialist knowledge into the hands of anyone who needs it. Find Matt on Twitter as @kwinkunks or at email@example.com.
How many times have you heard these?
- The signal:noise is lower (or higher/improved/reduced)
- It’s too thin to see (interpret/detect) on seismic
- You can’t shoot seismic in the summer (winter/snow/wind)
- More fold (bandwidth/signal:noise/data) is too expensive
- That won’t (will/can/can’t) work
- It looks geological (ungeological/right/wrong)
I say these sorts of things all the time. We all do. We feel like we’re bringing our judgment to bear, we’re exercising our professional insight and experience. It’s part of the specialist advisor role, which many of us play, at least from time to time. Sometimes, when time is short or consequences are slight, this is good enough and we can all move on to more important things.
Often though, we do have some time, or the consequences are substantial, and we need a more considered approach. In those situations, at least for most of us, it is not enough to trust our intuition. Our intuition is not a convincing enough reason for a decision. Our intuition is unreliable (Hall 2010).
Science is reliable. So challenge your intuition with a simple task: prove it. The bed is sub-resolution? Prove it. More fold costs too much? Prove it. This attribute is better than that? Prove it.
First, gather the evidence. Find data, draw pictures, make spreadsheets, talk to people. What were the acquisition parameters? What actually happened? Who was there? Where are the reports? Very often, this exercise turns up something that nobody knew, or that everyone had forgotten. You may even find new data.
Next, read up. Unless you’re working on the most conventional play in the oldest basin, there has almost certainly been recent work on the matter. Look beyond your usual sources – don’t forget the Society of Petroleum Engineers (spe.org) and the Society of Petrophysicists and Well Log Analysts (spwla.org), for example. Talk to people, especially people outside your organization: what do other companies do?
Then, model and test. If you’re measuring signal:noise, seismic resolution, or the critical angle, you’re in luck: there are well-known methods and equations for estimating those things. If you want to shoot cheaper data, or operate out of season, or convince your chief that vibroseis is better than dynamite, you’ll have to get creative. But only experiments and models – spreadsheets, computer programs, or even just mind-maps – can help you explore the solution space and really understand the problem. You need to understand why shots cost more than receivers (if they do), why you can mobilize in June but not July, and why those air-guns in the swamp were a waste of time and money. Modelling and testing take time, but the time is an investment. Most projects are multi-year, sometimes multi-decade, initiatives. The effort you spend may change how you operate for many years. It’s almost always worth doing. If your boss disagrees, do it anyway. You will amaze everyone later.
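For instance, two of the quantities mentioned above – vertical seismic resolution and the critical angle – come straight from textbook formulas. A minimal sketch in Python (the velocities and frequency are illustrative, not from any particular survey):

```python
import math

def tuning_thickness(velocity, peak_frequency):
    """Approximate vertical resolution limit as a quarter wavelength (metres)."""
    wavelength = velocity / peak_frequency
    return wavelength / 4.0

def critical_angle(v1, v2):
    """Critical angle of incidence in degrees, from Snell's law; requires v2 > v1."""
    return math.degrees(math.asin(v1 / v2))

# Illustrative values: 3000 m/s interval velocity, 30 Hz dominant frequency.
print(tuning_thickness(3000.0, 30.0))    # 25.0 m
print(critical_angle(2500.0, 3500.0))    # roughly 45.6 degrees
```

Even a back-of-the-envelope calculation like this turns “it’s too thin to see” into a number you can defend or attack.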
Finally, document everything. Better yet, do this as you go. But do wrap up and summarize. You aren’t just doing this for the geoscientist picking up your project in five years, you’re also doing it for your future self. Since most interpretation tools don’t have built-in documentation capabilities, you’ll have to find your own tools – look for ones that let you add links and comments for a rich report you can easily share. Wikis are perfect.
At each step, either find or pretend to be the most skeptical person in the world. Ask the tough questions. Doubt everything. Then prove it.
Hall, M (2010). The rational geoscientist. The Leading Edge 29 (5), 596. DOI 10.1190/1.3422460
Matt, you specialized in sedimentology and then worked as an explorationist and as a geophysical advisor. Tell our readers how it has worked out for you. What exactly are you doing these days? The blog called ‘Agile Geoscience’ that you and Evan Bianco run is quite popular and interesting. Do you both do that as a hobby, or as part of some broader objective?
I got into sedimentology because I was interested in stratigraphy and found a great field project in the Spanish Pyrenees, an area I love. But eventually I found geology too qualitative and subjective, with lots of semantic squabbling (“Is that a sequence boundary or a regressive surface of erosion?”). So when I went to work at Statoil, and discovered seismic data, I was hooked right away. Seismic geophysics, especially the analysis side – inversion, essentially – seems more objective. And I’m a nerd at heart, so the computational side appeals to me.
Today, we do four things at Agile:
- Integrated interpretation and analysis.
- Knowledge sharing consulting, especially around wikis.
- Development of web and mobile apps for geoscientists.
- Scientific publishing, including blogging.
We blog for lots of reasons: for fun, to maintain links to the world beyond Nova Scotia, to hone the discipline of writing. But above all, the blog feels like the best way for us to deliver on our central business aspiration: to be useful to geoscientists.
There are three of your articles included in the book, but we are including only one of them with this interview. You opine that ‘our intuition is unreliable’ but that ‘science is reliable’, so geoscientists should challenge their intuition with a simple task: prove it. This seems like a good message. Do you think our geoscientists are doing this? What would it take to do it better?
Like the rest of the book, the message is aimed at those of us in petroleum geoscience. There’s a special kind of optimism in this pursuit, especially in exploration – we’re always looking for the upside, for the opportunity. But if we’re not careful, we can fool ourselves, and this leads to bad decisions. (Luckily we’re very good at forgetting our bad decisions in this business, but that’s another story!)
Here’s the sort of thing I mean – and once you start looking for it you’ll see it regularly. A researcher comes up with a new seismic attribute or inversion algorithm to predict something geologically interesting – porosity, say. The work produces a map, ostensibly of porosity. This is compared with the old way of doing it, an amplitude map perhaps. The reader or audience is asked to compare the maps and observe how much more geological and convincing the new map looks. We marvel at how much easier it is to draw a polygon on the new map. But these superficial comparisons are not science, they’re just optimistic arm-waving. In the porosity example, blind wells or new wells are the best way to provide proof. A crossplot of predicted and measured porosity is essential, along with the expected error in the estimate. Better still, show how the reliability depends on data quality or proximity to calibration wells.
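To make the point concrete, here is a hypothetical sketch of the quantitative check described above – a root-mean-square error and a correlation coefficient for predicted versus measured porosity at blind wells. All values are invented for illustration:

```python
import math

def prediction_error(measured, predicted):
    """RMS error and Pearson correlation between measured and predicted values."""
    n = len(measured)
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    mean_m = sum(measured) / n
    mean_p = sum(predicted) / n
    cov = sum((m - mean_m) * (p - mean_p) for m, p in zip(measured, predicted))
    var_m = sum((m - mean_m) ** 2 for m in measured)
    var_p = sum((p - mean_p) ** 2 for p in predicted)
    r = cov / math.sqrt(var_m * var_p)
    return rmse, r

# Hypothetical porosity fractions at five blind wells: measured vs predicted.
measured = [0.12, 0.18, 0.22, 0.15, 0.25]
predicted = [0.14, 0.17, 0.20, 0.18, 0.23]
rmse, r = prediction_error(measured, predicted)
print(f"RMSE: {rmse:.3f}  r: {r:.2f}")
```

Numbers like these, stated with their limitations, are what turn a pretty map into a testable claim.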
How do we get better? First, by not just letting it go. We should hold ourselves and others in our profession to high standards of proof. I recommend reading Blau’s brilliant paper of 1936 – the first paper in the journal Geophysics. Before publishing anything, I like to imagine the most skeptical person in the world reading my work. After every sentence, she looks at me with a doubtful expression and says, “Are you sure? Convince me.”
Read, listen and learn, and write, talk and teach – these are good tips that you give to geoscientists. We all realize that technical communication is paramount. Many of our geoscientists are good researchers, but lack the skills required for effective speaking, and are not good at documenting their work. This is particularly true for those geoscientists whose first language is not English. What would be your message or suggestion to them?
That’s easy: take my geoscience writing course: agilegeoscience.com/courses
Seriously though, I think we have to lighten up a bit. We scientists have built publishing up into a big deal, with too much prestige. So we have academics publishing anything and everything in the scramble for funding, oil company scientists afraid of over-sharing, and service company scientists treating it as a marketing channel.
But publishing is really not a big deal, it’s just part of the conversation we have every day about our profession. It’s important that the quality of that conversation is high, but it’s also important that it’s inclusive and that it generates new ideas. And the only way for that to happen is for more people to do it, more of the time, with letter-style articles, short case-study papers, blog posts, and so on. I believe this would massively increase our community’s ability to communicate and collaborate, and our rate of innovation with it.
Learning to program is a valuable skill, as it allows one to think through solutions and automate them. However, it does require time, as programmers spend long hours working on their programs. For those taking it up as a career, the job could be very tedious. Would you agree?
Our profession tolerates a lot of tedious and unproductive things: manually picking thousands of tops, laboriously interpreting seismic horizons, constructing maps in PowerPoint, attempting multivariate data analysis in Excel, attending endless meetings, and so on. I’m not saying that all of these things are always unnecessary, or that programming skills can free us from all of them, but I believe that those hours spent improving code are more likely to result in smarter science, and a smarter industry.
I’m also not saying we all need to be professional programmers. Just that learning how to approach problems like coders do, to rapidly prototype solutions, and to communicate with them so they can help us better, are useful skills – and they start with learning the basics of programming. Once you have the basics, you quickly find that creating new tools is easy – drop in on a hackathon some time and see for yourself.
You have been very active in technical communication, with your blog running continuously and your book entitled ’52 things…’ being well received. This in itself should be an inspiration for geoscientists. How did you and Evan come up with these ideas, and what challenges did you face initially in getting them off the ground? What keeps you going now?
It’s an exciting time for scientists. We depend on publishing for progress, but because of the expense of printing we have lived with publishing gatekeepers for centuries. But now we have this new world, in which anyone can publish a blog, paper, or book – for free, or almost free. With this democratization of the written word, the roles of academic publishers and technical societies are rapidly changing. It’s fascinating to watch, and be a part of.
There are, of course, challenges. There is prosaic stuff like learning new software, and esoteric stuff like copyright law and open licensing. There is the community engagement aspect of getting 40 authors to write a book together. There is the creative bottleneck of coming up with something to blog about twice a week. And there’s the small problem that there’s no money in technical publishing – at least not without rebuilding the same industry we’re trying to disrupt!
What keeps us going? The usual: naivety and stubbornness.
You and Evan organized a session at the 2013 GeoConvention called the ‘Unsession’, and you followed it up at the recently held 2014 GeoConvention. Tell us about your objective for holding it, and whether it was achieved. How did it go this year?
We hosted the first Unsession (ageo.co/unsession2013) for two reasons:
- We wanted to make a list of the most pressing questions – the unsolved problems – in applied subsurface geoscience.
- We wanted to explore how a different kind of conference session might work: a session with no experts (just expertise), no lectures (just conversation), and no prepared material (just outcomes).
The most pressing problem, rather surprisingly, was ‘Too much secrecy’. So this year we opted to continue the experiment and dig into that problem, asking what more openness in the industry might do for us, what new opportunities it might create. You can read about the event on our wiki at ageo.co/unsession2014. The punchline: we need more public seismic data. This will be the subject of a future discussion.
If we restrict ourselves to applied geophysics, what according to you are the three leading unsolved problems? How and when do you think we will be able to surmount them? Are there any revolutionary ideas taking shape, or may be around the corner?
Great question! This is one of my favourite subjects – see ageo.co/unsolvedproblems.
My vote for the most pressing technical question in our field is “How can we represent and quantify error and uncertainty throughout the geophysical workflow?” Today, some parts of the workflow capture uncertainty well – geomodeling and reservoir simulation, for example – others ignore it almost completely – well logs, well ties, horizon interpretation, and so on. We have to figure out how to not only capture uncertainty, but also how to carry it through the workflow in a consistent way.
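As a toy illustration of what carrying uncertainty forward might look like, here is a hedged sketch that propagates an uncertain average velocity through a simple time-to-depth conversion by Monte Carlo. The horizon time and velocity distribution are made up:

```python
import random
import statistics

random.seed(42)

def depth_samples(twt_s, v_mean, v_sd, n=10000):
    """Monte Carlo depths from a two-way time and a Gaussian average velocity."""
    return [random.gauss(v_mean, v_sd) * twt_s / 2.0 for _ in range(n)]

# Hypothetical horizon at 2.0 s two-way time; velocity 2500 +/- 100 m/s.
depths = depth_samples(2.0, 2500.0, 100.0)
print(f"depth: {statistics.mean(depths):.0f} +/- {statistics.stdev(depths):.0f} m")
```

The point is not the arithmetic but the habit: every estimate leaves this step as a distribution, not a single number, and the next step in the workflow can consume it as such.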
After that I think it’s “How can we visualize and exploit the full complex spectrum throughout the geophysical workflow?” Spectral decomposition scratches the surface of the spectral dimension, and 5D interpolation takes it a step further, but I think we are far from truly exploiting the frequency and phase content of our data.
The last one might strike some people as being ‘solved’, and in a strict sense it is. But I’ve seen it messed up too many times to say it’s not really still a problem: “What is the best way to tie a well to seismic?” If this seems too narrow, I think it’s really part of a broader problem that interpreters have with forward modeling in general. Lots of interpreters never make even simple models, other than 1D synthetics, and this seems like a huge missed opportunity to get better at interpretation.
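For readers who have never built one, a 1D synthetic really is only a few lines of code: compute reflection coefficients from an impedance log, then convolve them with a wavelet. A minimal sketch, with a hypothetical blocky impedance log and a Ricker wavelet (all parameters illustrative):

```python
import math

def ricker(peak_frequency, dt, length=0.128):
    """Ricker (Mexican hat) wavelet sampled at interval dt seconds."""
    n = int(length / dt)
    t = [(i - n // 2) * dt for i in range(n)]
    return [(1 - 2 * (math.pi * peak_frequency * ti) ** 2)
            * math.exp(-(math.pi * peak_frequency * ti) ** 2) for ti in t]

def reflectivity(impedance):
    """Normal-incidence reflection coefficients from an impedance log."""
    return [(z2 - z1) / (z2 + z1) for z1, z2 in zip(impedance, impedance[1:])]

def convolve(r, w):
    """Plain full convolution of the reflectivity series with the wavelet."""
    out = [0.0] * (len(r) + len(w) - 1)
    for i, ri in enumerate(r):
        for j, wj in enumerate(w):
            out[i + j] += ri * wj
    return out

# Hypothetical blocky impedance log, 2 ms sampling: three layers.
imp = [6000.0] * 50 + [8000.0] * 50 + [7000.0] * 50
synthetic = convolve(reflectivity(imp), ricker(25.0, 0.002))
```

Playing with the layer thicknesses and wavelet frequency in a model like this is exactly the kind of cheap experiment that sharpens an interpreter’s eye.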
What are some of your future plans that you would like to share with us?
In line with the modeling problem I just mentioned, we have big plans for our new forward modeling tool Modelr (modelr.io). We want to bring simple forward models to every interpreter’s desktop and mobile device. So that promises to be an obsession for the next several years. It’s all part of our quest to be useful.
What else? We’re currently compiling contributions to two new ‘52 Things’ books, one on Rock Physics and one on Biostratigraphy. Soon I will start soliciting chapters for one on Geocomputing. And we’ll be hosting another geophysics hackathon – an exploratory idea generation and prototyping session – around the SEG Annual Meeting in Denver. I think it’s going to focus on another unsolved problem: insufficient seismic resolution.
On a different note, Matt, let me ask you this: we tend to think about only where we have reached, but not where we could have reached. If we also consider the latter, it might serve as an inspiration to excel even more. What would be your take on this?
I like it. We could have been worse off, and must take time to appreciate what we achieve. But we could have been better off, and I believe we must – as individuals, as a community of practice, and as an industry – articulate our greatest aspirations. ‘Good enough’ is not good enough. It’s not even close.