One of the more notable trends in computing over the past few decades is that technology has grown exponentially more powerful while simultaneously shrinking in size and cost. This innovation has created unparalleled system efficiencies that reduce total cost of ownership, and lower-cost IT brings vast computational resources within reach of smaller organizations. For Canada’s upstream oil and gas industry, this has the potential to dramatically expand the competitive playing field: newer, smaller, more agile entrants can move information, process seismic data more quickly and accelerate research.

Sun Microsystems’ Andrew Morrant, Western Regional Manager, offers perspectives on how his company’s leading-edge IT, among other advances, can and will shift the balance of competition on an international scale.

Since Charles Nelson Tripp led Canada’s first official oil exploration efforts in 1854, the country’s oil and gas industry has focused on identifying and tapping petroleum-bearing rock and turning it into a resource for export and domestic use. Historically, the search for oil and gas has been a calculated game of hide and seek: time and money were expended on examining ground surface features and rock formations, and on drilling dozens of holes in the hope of hitting pay dirt.

Flash forward 150 years and Canada’s oil production is estimated to be upwards of three million barrels per day. In 2005, domestic gas production reached nearly 180 billion cubic metres. Clearly, the industry is doing something right, and while exploration still involves some guesswork, considerable advancements in technology have ushered in newer, more accurate ways of tapping into Canada’s underground riches.

These new technologies have started to dramatically speed up the discovery process by eliminating time-consuming steps.

A typical 3-D seismic project can create anywhere between eight and 1,000 gigabytes of raw data (a single gigabyte is the equivalent of roughly 1,000 novels or almost 20 hours of music in MP3 format). This data only becomes valuable to geophysicists after it is handed off to a processing team that uses number-crunching programs, from basic processing all the way to advanced algorithms such as pre-stack depth migration (PSDM), to turn it into a three-dimensional image that helps pinpoint gas or oil reservoirs.
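To put those figures in perspective, a back-of-envelope calculation shows how quickly survey parameters turn into gigabytes. The sketch below uses entirely hypothetical numbers for trace count, record length and sample size; it illustrates the scale of the problem rather than describing any particular acquisition program.

```python
# Back-of-envelope estimate of raw 3-D seismic data volume.
# All survey parameters below are hypothetical, chosen only to
# illustrate how quickly trace counts turn into gigabytes.

BYTES_PER_SAMPLE = 4          # 32-bit floating-point amplitude samples
SAMPLES_PER_TRACE = 1500      # e.g. a 3 s record at a 2 ms sample interval
TRACES = 5_000_000            # total traces in a modest 3-D survey

raw_bytes = TRACES * SAMPLES_PER_TRACE * BYTES_PER_SAMPLE
raw_gigabytes = raw_bytes / 1e9

print(f"Estimated raw data volume: {raw_gigabytes:.0f} GB")
# -> roughly 30 GB, comfortably inside the 8 to 1,000 GB range cited above
```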

Historically, the challenge for many companies has been that these processing algorithms are not only complex, but also require a considerable amount of high-end computing horsepower to get the job done. Fortunately, the technology is available, and recent developments in software licensing, hardware performance and other areas have lowered total cost of ownership (TCO) to within the grasp of smaller energy industry players. Consider the following examples:

Smaller servers = Less real estate – Servers and storage devices take up valuable square footage, but like cell phones, they have shrunk considerably over the years. Ten years ago, a single high-end server was about the size of a pair of refrigerators; today, servers with a comparable amount of power can fit under your arm. In turn, this gives companies more flexibility to expand as business grows, and/or reduces their costs.

Collaboration and visualization – Quickly solving complex problems is best accomplished when they are tackled from several angles by several people, but traditionally in the oil and gas sector this was more easily said than done. High performance computing resources and team knowledge often exist in multiple locations, making collaborative efforts slow and difficult. In recent years, however, solutions have entered the market that enable interconnected and highly collaborative computing environments. The Sun Visualization System, for example, helps oil and gas researchers by integrating servers and enabling users to simultaneously and remotely access 3-D graphics applications. These tools support increased collaboration, delivering faster results.

Open source for closed wallets – Open source operating systems can give exploration firms a competitive edge by affording them the flexibility to create highly customized software applications for UNIX, Linux or Windows environments. Many organizations run on Solaris 10, Sun’s platform-agnostic operating system, which allows developers to tailor applications to the task at hand without having to rely on third-party proprietary software or tangle with prohibitive, expensive licensing. And because crude oil and gas exploration is a continually evolving and internationally competitive business, an open source system ensures a company has the freedom to take on bigger seismic data processing projects without concerns about software capital costs.

Virtualization – Data centres like those used in the oil and gas industry are filled with costly servers and storage. On average, system utilization sits between five and 15 percent, meaning that the vast majority of server capacity goes unused. This results in high costs for hardware, operations and management, not to mention energy consumption. Virtualization technology “pools”, or consolidates, resources, allowing the same number of applications to run on less equipment. Not only does this mean companies can extract more value from existing IT investments, it also enables more efficient and less costly system management.
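The consolidation arithmetic behind that claim is straightforward. The following sketch assumes a hypothetical fleet of 100 lightly loaded servers and a conservative utilization target for the pooled hosts; the exact numbers are illustrative only.

```python
import math

# A minimal sketch of the consolidation arithmetic behind virtualization.
# Server counts and utilization figures are assumptions for illustration.

physical_servers = 100        # dedicated boxes, one application each
avg_utilization = 0.10        # 10%, within the 5-15% range cited above
target_utilization = 0.60     # a conservative ceiling for pooled hosts

# Total work, expressed in "fully busy server" equivalents
workload = physical_servers * avg_utilization

# Hosts needed once the same workload is pooled onto shared hardware
hosts_needed = math.ceil(workload / target_utilization)

print(f"Workload: {workload:.0f} server-equivalents of actual demand")
print(f"Hosts after consolidation: {hosts_needed} (down from {physical_servers})")
# -> 17 hosts instead of 100, before allowing for headroom and redundancy
```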

Consuming Less Energy While Searching for Its New Sources

Data centre power consumption doubled from 2000 to 2005, and North American analysts predict that powering an IT system may end up costing more over its lifetime than the initial cost of the hardware. Upstream oil and gas organizations, like any bottom-line-focused business, understand the issue of energy consumption and have begun turning to energy-efficient IT systems to reap benefits, including increased performance and reduced power use.
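As a rough illustration of the analysts’ point, the sketch below compares a server’s lifetime electricity bill to its purchase price. Every figure in it (price, power draw, facility overhead, tariff, service life) is an assumption chosen for illustration, not a vendor quotation.

```python
# Rough comparison of a server's lifetime electricity bill to its purchase
# price. Every figure below is a hypothetical assumption for illustration.

purchase_price = 5_000.0      # hypothetical price of a mid-range server (CAD)
server_draw_kw = 0.5          # average electrical draw of the server itself
pue = 2.0                     # facility overhead: cooling, power conversion
electricity_rate = 0.12       # CAD per kWh, hypothetical tariff
service_life_years = 5

hours = service_life_years * 365 * 24
energy_cost = server_draw_kw * pue * hours * electricity_rate

print(f"Lifetime energy cost: ${energy_cost:,.0f}")
print(f"Purchase price:       ${purchase_price:,.0f}")
# -> about $5,300 in electricity at these rates, already edging past the
#    hardware cost, which is the scenario the analysts describe.
```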

For example, Sun Microsystems’ X64 servers offer the ability to run an extremely high performance cluster that consumes dramatically less energy than other servers in the same class. Not only do the servers run more efficiently, they require less cooling, which translates to shrinking electricity overhead costs. A peripheral benefit is that reduced energy consumption is better for the environment and makes a strong statement from a corporate branding perspective.

Complementing Sun’s chip performance is HyperTransport technology, developed by processor manufacturer AMD. This innovation helps eliminate performance bottlenecks and avoid the bandwidth and latency limitations that typically slow performance during peak load times – a critical consideration for any company performing seismic data analysis.

One prominent Alberta-based seismic solutions provider to the energy industry experimented with various computing models in an attempt to improve system performance and reduce overhead costs for space and electricity. Soon after implementing Sun Fire servers, the company realized a significant performance improvement: servers in its array consume about one-third the power and are one-quarter the size of similar technology from competing vendors. Overall, this meant the company could process seismic data more quickly while simultaneously reducing its space and electricity overhead.


So, What’s Next?

Time is money. Even though crude oil has been in the ground for millions of years, the oil and gas industry relies heavily on speed and is driven by the imperative to get information into the hands of those who need it. When seismic data is acquired in the field, it must be analyzed wherever there is technology capable of handling the job. For example, data gathered in remote Alberta is typically ‘crunched’ hundreds of kilometres away in Calgary, or even Houston. The time between capturing the data and processing it into critical mapping information varies with the amount of data, and this extra transfer step can be costly to any company working to compete on a global scale.
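A simple estimate shows why that transfer step matters. The sketch below assumes a hypothetical mid-sized dataset and a hypothetical wide-area link speed; real field links and survey sizes vary widely.

```python
# A quick sense of why shipping raw field data to a distant processing centre
# adds delay. Dataset size and link speed are hypothetical assumptions.

dataset_gb = 500              # hypothetical mid-sized 3-D survey
link_mbps = 100               # hypothetical wide-area link from the field

dataset_bits = dataset_gb * 1e9 * 8
transfer_seconds = dataset_bits / (link_mbps * 1e6)
transfer_hours = transfer_seconds / 3600

print(f"Transfer time: {transfer_hours:.1f} hours")
# -> just over 11 hours before processing can even begin; computing at the
#    job site removes this step entirely.
```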

The world’s first portable data centre became a reality in late 2006 when Sun Microsystems revealed Project Blackbox. Intended to address data centre space and energy constraints, Blackbox is a virtualized, full data centre housed in a standard shipping container. A single Blackbox has capacity for over 700 CPUs, 2,000 cores, or 8,000 compute threads. More importantly, it is designed to be deployed anywhere, be it on an oil rig or in the field where seismic testing takes place. This gives any oil and gas firm unprecedented agility to speed time to results and to take its IT system to the job site.

Of the many assets that form the foundations of the world economy, few are as important as oil and gas. It is apparent that technological advances and new methods of exploration have given the industry significant competitive advantages and, in turn, thrust countries like Canada onto the global oil and gas producing stage.

End
