Recent advances in geophysical workstation hardware and software allow interpreters not only to view pre-stack data, but also to manipulate it. Are the days of processing numbered?

Marlo Baydala is a geophysicist with Post Energy, based in Calgary. While recently working on a 3D survey in the Triangle zone of the Foothills, the 12-year veteran turned up an interesting anomaly. “It was a bright spot, but the area was noisy, and we wanted to get a qualitative look, so we decided to go to the pre-stack data,” he recalls.

Rather than troop down to the processor, Post Energy loaded the pre-stack data onto their in-house workstation and examined and manipulated the voluminous data right in the office. “From the pre-stack data, it was obvious that it was difficult to get a good velocity function, and the data had stacked as an anomaly,” says Baydala. “We discarded the lead.”

Post Energy is one of a dozen companies in the oil patch that operate Genetek’s EarthWorks geophysical workstation. In addition to integrated 2D and 3D seismic analysis, modeling, mapping and 3D visualization, the system now manipulates pre-stack data in real time.

Mark Sun is president of Genetek Earth Research. For the last decade, the 40-year-old geophysicist has been putting his experience in the oil patch into the development of what he considers his ‘dream machine’: an inexpensive system that allows the geoscientist to fully manipulate the fundamental parameters of both stacked and pre-stack data in real time. “Up until now, there was no easy way to do it,” says Sun. “We can now do it because of the high-power 64-bit machines and software out there. Our entry-level system, hardware and software included, is less than $60,000.”

Although Genetek is based in Vancouver, much of its R&D work is done in the Calgary office. Sitting in front of dual monitors, Sun activates the 64-bit Compaq Alpha computer that drives the system and calls up a test program showing a 2D stacked section. “The original reason for stacking in the late 1950s and early 1960s was to improve the reflector strength,” says Sun. “When you have a poor signal, stacking is the best way to go. It can reduce the interference from multiples and improve signal-to-noise ratios by statistically averaging the data.”
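The mechanics behind that statement are easy to sketch. What follows is a minimal, hypothetical illustration of stacking as trace averaging (plain NumPy, not Genetek’s implementation): the reflection adds coherently across the gather, while zero-mean noise partially cancels, so signal-to-noise improves by roughly the square root of the fold.

```python
import numpy as np

def stack_cmp_gather(gather: np.ndarray) -> np.ndarray:
    """Stack an NMO-corrected CMP gather by averaging across traces.

    gather -- array of shape (n_traces, n_samples): one trace per
    source-receiver offset, already NMO-corrected so that each
    reflection is flattened across the gather.
    """
    # Signal adds in phase; zero-mean noise averages toward zero,
    # so S/N improves by roughly sqrt(n_traces).
    return gather.mean(axis=0)

# Example: a 24-fold gather with one flattened reflector buried in noise.
rng = np.random.default_rng(0)
reflector = np.zeros(1000)
reflector[500] = 1.0
gather = reflector + 0.5 * rng.standard_normal((24, 1000))
stacked = stack_cmp_gather(gather)  # the spike at sample 500 now stands out
```

The same averaging, of course, is what discards the offset-dependent information Sun goes on to describe.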

Over the last three decades, however, advances in acquisition and processing have alleviated much of the noise problem. “Our data in Western Canada is now generally good. The original motivation, simply being able to see the reflectors, is much less of an issue; stacking is now actually removing valuable data.”

Sun offers an analogy. “Let’s say you’re having a party and you want to hire a caterer. So, you call the caterer and you tell him the average age of the people in the room is 40. Now, there could be 2-year-old kids and 95-year-old grandparents, but if all the caterer knows is that the average age is 40, he’s going to bring inappropriate food for many of the people.”

To illustrate his point further, the geophysicist points to a lateral amplitude variation on the 2D seismic line on his workstation. “As an interpreter, I see this anomaly, and I can come up with a model to explain how porosity accounts for the change in character. But, if I were to drill this anomaly, it would come up dry.”

To explain why, Sun activates the workstation’s pre-stack analyzer and drags it over the post-stack anomaly. The display expands to show the individual traces. The dipping 2D pre-stack gather over the anomaly clearly indicates a geometry problem. “Someone put the shot or receivers in the wrong position,” he explains. “It’s a simple geometry clerking error that is undetectable on post-stack data. Visualizing the pre-stack data in real time is a great way to check the quality of processing; it helps identify static, velocity and mute problems at a glance.”

Genetek didn’t write the pre-stack imaging tools simply to catch errors, however. “We wrote them so that interpreters would be able to learn more from the data. In the first cut, we said, let’s just view it, so that we can learn about the quality of processing and spot problems. In the second stage, we move up the processing ladder.”

Using the pre-stack facilities, Sun quickly images Ostrander supergathers, range-limited stacks and AVO difference stacks in real time. “Dragging the analysis window allows you to see how the pre-stack data changes as you move into and out of stacked anomalies.”
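As a rough sketch of what two of those displays compute (illustrative code, not the EarthWorks API), a range-limited stack averages only the traces within an offset window, and an AVO difference stack subtracts a near-range stack from a far-range one:

```python
import numpy as np

def range_limited_stack(gather, offsets, o_min, o_max):
    """Average only the traces whose offsets fall in [o_min, o_max).

    gather  -- (n_traces, n_samples) NMO-corrected CMP gather
    offsets -- (n_traces,) source-receiver offset of each trace
    """
    mask = (offsets >= o_min) & (offsets < o_max)
    return gather[mask].mean(axis=0)

def avo_difference_stack(gather, offsets, split):
    """Far-range stack minus near-range stack: a crude AVO indicator.

    A reflector whose amplitude varies with offset leaves a residual;
    one with no offset dependence largely cancels out.
    """
    near = range_limited_stack(gather, offsets, 0.0, split)
    far = range_limited_stack(gather, offsets, split, offsets.max() + 1.0)
    return far - near
```

Dragging the analysis window is, in effect, re-running computations like these over whichever gathers lie under it.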

The pre-stack imager can be used to draw more value out of the data. “Pre-stack is really good information mixed in with poor information,” says Sun. “The detailed information is in near-offset traces; these traces give you higher resolution, in general. You can pinpoint smaller beds, and see more detail in those beds.”

That detail can allow determination of lithological data. “AVO (amplitude variation with offset), for instance, exploits the fact that amplitude does change with the angle of offset, and can be used to distinguish phase, porosity and rock type.”
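The textbook footing for this claim (standard AVO theory, not spelled out in the article) is the two-term Shuey approximation to the Zoeppritz reflection coefficient:

$$R(\theta) \approx R_0 + G\,\sin^2\theta$$

where $R_0$ is the normal-incidence reflectivity (the intercept), $G$ is the gradient, and $\theta$ is the angle of incidence. The intercept is governed by the acoustic impedance contrast across the interface, while the gradient is sensitive to the contrast in Poisson’s ratio; that extra sensitivity is what lets AVO flag fluid and lithology changes that look identical on a full stack.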

Marlo Baydala knows the advantages of having in-house access to pre-stack data. Post Energy is involved with a variety of plays throughout the plains and Foothills of Alberta, with 25 exploration wells planned for 2000. A well in the Foothills represents a significant percentage of their $30 million drilling budget.

Post purchased a Genetek system in early 1999, and installed the pre-stack feature in January 2000. “The ease of manipulating pre-stack data in-house has helped us high-grade drilling locations,” says Baydala, who has five years’ experience as a processor with Shell. “If you’re going to drill an anomaly on stack, and you can quickly take it apart in pre-stack, you can eliminate a potential dry hole. It’s made a significant difference to us already.”

And, says Sun, the development of that advantage is still in its infancy. “In ten years, we’ll have immense computing power that will allow interpreters to go further and further into the processing stream. We will be able to do seismic processing, interpretation and reservoir analysis simultaneously.”

But, does this mean the end of processing as we know it?

Rob Vestrum is the manager of R&D at Kelman Seismic. Located on the fifth floor of a downtown Calgary office tower, the seismic processing company handles roughly 15% of the seismic processing in Canada. “In the Calgary office alone, we’ve got three terabytes of disk space, 55 workstations to process data, four Enterprise 4000 supercomputers with 12 CPUs in each, and a few dozen processors with something like 400 years of combined experience,” says Vestrum.

Like the other major processing houses in town, Kelman takes the gigabytes of raw data generated by acquisition crews and massages them into workable form.

Activating his computer, Vestrum calls up the batch stream, showing 30 different surveys being simultaneously processed in their system. “There are a dozen different parameters with subtle but significant interactions,” he explains. “You look at a printout and say: I need to change the band-pass filter, or the decon, or the statics. It’s not a simple thing to do; it takes two or three years of experience before you get the hang of it.”

According to Vestrum, processing houses exist and thrive due to a combination of expertise and economics. “I think Shell Canada is the only oil company in Canada that still has in-house processing, but the economies of scale aren’t there for most E&P companies,” he notes. “We’ve got dozens of processors and a deep knowledge base. It takes about two weeks to turn around a large land 3D; it would take a non-specialist several months.”

Kelman has noted that, in recent years, more and more geophysicists are coming down to their shop to work with the pre-stack data. “I’d say about 80% of Foothills interpreters pick or QC their own imaging velocities, and 50% of plains interpreters do the same,” reckons Vestrum. “We regularly have clients fly in from South America to our Houston office just to look at the pre-stack data.”

Vestrum is intrigued to learn that Genetek has developed the ability to manipulate pre-stack data in real time on a workstation, but doesn’t see it as a threat to business; rather, the opposite. “As interpreters become more aware of pre-stack issues, there’s going to be more of a demand for higher-end pre-stack services. For processing companies with a strong R&D focus like us, that’s going to be a bonus.”

The service-oriented company is already moving to satisfy customer demand for remote access to pre-stack data. “We’re currently developing an on-line system called Seismic Viewer so that clients can do pre-stack quality control in a web browser. They can even log in from home and use our processing system.”

Victoria Schut is a geophysicist with 28 years’ experience who consults with several exploration companies in Calgary. “Most of my work over the last 20 years has been in the Foothills,” says Schut, who spent five years processing with Western Geophysical and GSI. “I realized quickly that migration velocities matter, and I’ve been picking my own velocities for over 12 years.”

When Schut first started working with Kelman, they would load the pre-stack data and she would sit at a workstation in their office. “In 1994, they set me up to work off-site at my clients’ offices, and in 1998, from home.”

Using a high-speed cable line and a desktop PC, Schut plugs into Kelman’s system remotely from her home office. “I start when the data gets to the mute stage,” she says. “I pick my own mutes, migration velocities and filters.”

Working in the Foothills, Schut sees a wide variety of unusual anomalies. “Sometimes, I don’t know if it’s a processing artifact or a real geological anomaly,” she says. “Is this reflection in-line, or off-line? Let’s say you’re running a line along a subsurface cliff: are the reflections coming back from the cliff, or from somewhere else? It’s easy to resolve if you have 3D, but if you’ve only got 2D, you don’t necessarily know where it’s coming from. When you look at pre-stack data, however, you have a better chance of being able to differentiate. You can also achieve better noise management.”

As a result, Schut spends a significant amount of time on pre-stack. “Frankly, I’ve seen the canned product versus what you can achieve if you get involved. If I have to stand there in front of management and defend my interpretation of the data, I want to know what went into it.”

The Future

As the demand for access to pre-stack data grows, processors and workstation designers will work to improve the interpreter’s ability to massage the data. “We ask how fast the hardware will be in three or four years, and what we can do with that,” says Sun. “Ostrander supergathers and range-limited stacks are just two different ways of looking at pre-stack information. We can do 20 different displays or functions; it’s just the tip of the iceberg. There are algorithms we haven’t even begun to look at; we’ll be able to deduce a lot more information on lithology.”

Knowledge about the advantages of working pre-stack will also become more widespread in the profession. “We can’t just be the hardware and software provider; we have to train the geophysicist in why and when to use the technology,” says Sun. “The tools you use define how you search for oil and gas: a simple workstation allows you to look for oil and gas simply; a sophisticated workstation allows you to search in a more sophisticated way.”

“Some companies think this is for R&D guys, that it’s specialty stuff,” says Sun. “I strongly believe it should be used on every project. The hand-off should be in the pre-stack domain. Just before stacking, they should peel off a tape. Sure, it’s 20 to 30 times the volume of post-stack, but it will easily fit on a hot-swappable tower.”
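The 20-to-30 multiplier is essentially the nominal fold of the survey: stacking replaces each CMP gather with a single averaged trace, so keeping the gathers multiplies the volume by the fold. A worked example, with assumed and purely illustrative survey figures:

$$\underbrace{500{,}000}_{\text{CMP bins}} \times \underbrace{1{,}000}_{\text{samples/trace}} \times \underbrace{4}_{\text{bytes/sample}} \approx 2\ \text{GB post-stack}; \qquad 2\ \text{GB} \times 25\ (\text{fold}) = 50\ \text{GB pre-stack}$$

Fifty gigabytes was, by 2000 standards, comfortably within the capacity of the hot-swappable disk tower Sun describes.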

Does that concern the processor? “Employers don’t want their interpreters to do basic processing,” says Vestrum. “It’s a lot of work, and a poor use of their time.”

“I think we’ll see interpreters working closer with processors, but certainly not replacing them,” agrees Schut. “We each have strengths that can contribute to the final product.”

“As basins mature, you’re looking for more and more subtle plays,” says Vestrum. “Exploration companies are having to squeeze every last ounce of information out of the data, and it’s going to keep both processors and interpreters busy for a very long time.”

Sun agrees that, whether it is the processor or the interpreter at the workstation who handles the work, it will benefit the industry in the end. “It all comes down to putting a spot on the map marked ‘drill here’,” he says. “You need to squeeze more information out of your data, because your entire company is depending on you to put the dot in the right spot.”
