December 2018

What's new in exploration

Working to prevent a total collapse of the geophysical industry onshore
William (Bill) Head / Contributing Editor

This year’s SEG annual convention in California was a “success,” if your metric is technology advancement. Unless you have been living in a geo-steer shack outside Dickinson, N.D., you have noticed that the geophysical service industry, and geophysicists within oil companies, have been marginalized for a long time.

As is typical when the direction of market/price uptrends seems uncertain, geophysical resources retreat to conversations among tech-noids rather than applied practitioners, the segment of geophysicists and geologists holding the money. It is common wait-it-out behavior. Technology developers, not interpreters, are now the largest segment of geophysicists; workstation interpreters, not struc/strat/paleo people, are the largest segment of geologists.

In this slide from a 2009 RPSEA conference, the distribution shows that the vast majority of operators poorly forecasted reserves in the GOM.

The SEG has always been heavy on theory and operates a very good forum, but the problem today is that not very many technical people [with the bucks] are that deep “technically.” The majors long ago gave up their role in controlling the intellectual property of the exploration business. The SEG technical program is available online.

The SEG program is worth a look, especially if you already know which techniques will help solve your exploration issues. I do not want our scientists to stop, but it would help SEG to see that it is talking to an ever-shrinking audience [about 4,400 persons this year vs. over 7,200 last year; to be fair, there is an “oily” difference between La La Land and Houston].

So, I offer a discussion of two central issues that control exploration technology problems worldwide.

First, offshore. We either drill too much or not enough. At a RPSEA public technical symposium in August 2009, a technical paper was presented under the title, “Deepwater Improved Recovery, can technology overcome economics?” Great question. Results of an extensive study of GOM drilling history showed a disturbing, but common-sense, trend in exploration activity. Since drilling decisions define costs and recovered reserves, the impact on cost per barrel is significant.

Gavin Longmuir, then of Knowledge Reservoir, showed a slide (see chart on this page), funded by RPSEA/DOE/the Mineral Severance Tax (secured under the Energy Policy Act of 2005), which revealed what we always suspected: no one drills the “right” number of wells or predicts reserves to any accuracy. Budgets, then, were off, as were actual SEC-booked reserves. Technology, alone, could not define risk. The distribution shows that most operators miscalculated reserves. They did not drill enough up front.

Onshore exploration. Most money is focused on unconventional plays, where tech has been slow to provide real advances in science. We seem stuck on pretty, non-vector microseismic displays to say where fracs and faults are, or are not. Some growth is occurring in watching CO2 move, but most of that work is qualitative and in carbonates. More investment in unconventionals is needed for conventional technologies to survive. Let’s improve technology that might apply to both shale and sand.

Perhaps regulators of water disposal are providing incentives: “No water removal, no shale.” By law, we are forced to look at efficient oil/gas/water ratios. That’s something we used to do conventionally, and commonly. New work can be found in this webinar: “Water Avoidance, Landing, and Sweetspotting Solutions for the Permian Basin: The Role of Surface Seismic,” by David Paddock, WesternGeco, and six co-authors, on the website.
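The oil/gas/water ratios mentioned above are simple field arithmetic. As a refresher, a minimal sketch of the standard produced-fluid ratios; the daily test volumes below are hypothetical, invented purely for illustration:

```python
# Basic produced-fluid ratios used in conventional water-handling work.
# The rates below are hypothetical daily well-test volumes, not real data.

oil_bopd = 450.0      # oil, bbl/day
water_bwpd = 1_350.0  # water, bbl/day
gas_mscfd = 900.0     # gas, Mscf/day

wor = water_bwpd / oil_bopd                        # water-oil ratio
water_cut = water_bwpd / (water_bwpd + oil_bopd)   # water fraction of total liquids
gor = gas_mscfd * 1_000 / oil_bopd                 # gas-oil ratio, scf/bbl

print(f"WOR = {wor:.1f}, water cut = {water_cut:.0%}, GOR = {gor:.0f} scf/bbl")
# prints: WOR = 3.0, water cut = 75%, GOR = 2000 scf/bbl
```

A rising WOR or water cut against flat oil rates is exactly the kind of efficiency signal the regulators' incentives push operators to track.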

Twenty-two Delaware basin Wolfcamp “A” lateral wells in Reeves County, where lease costs are highest, were examined for profitability, and seismic attributes were examined for associated causality. A B30 (best 30 days of production) cutoff of 15,000 bbl was determined and used as a break-even metric. Only ten wells passed that hurdle.
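The screening logic is straightforward to sketch. The 15,000-bbl B30 cutoff comes from the study; the well names and volumes below are hypothetical placeholders, not the actual Reeves County data:

```python
# Illustrative B30 (best-30-days cumulative production) break-even screen.
# The 15,000-bbl cutoff is from the Paddock et al. webinar; the well
# volumes below are invented for illustration only.

B30_CUTOFF_BBL = 15_000

def passes_break_even(b30_bbl: float, cutoff: float = B30_CUTOFF_BBL) -> bool:
    """Return True if a well's best-30-days production meets the cutoff."""
    return b30_bbl >= cutoff

# Hypothetical B30 volumes (bbl) for a handful of laterals.
wells = {"well_1": 22_400, "well_2": 9_800, "well_3": 15_000, "well_4": 12_300}

economic = [name for name, b30 in wells.items() if passes_break_even(b30)]
print(economic)  # → ['well_1', 'well_3']
```

In the study itself, the interesting step was not the filter but tying the failing wells back to seismic attributes, as described next.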

Poor wells were found to be afflicted by one or more of the following: landing [ed. the drill bit TD] 200 ft or more too low, fracing into poor reservoir, and/or producing excessive water associated with faulting, either above or below the Wolfcamp “A” target stratigraphic level. Use of the seismic data would mitigate the risk of landing too low to hydraulically fracture up into the reservoir. Seismic data also could be used to predict the adequacy of reservoir quality. Finally, surface seismic data could be used to detect the risk of faults that can bring water from either above or below the lateral. The use of seismic data was crucial for ensuring economic success in the Wolfcamp play.

I could not agree more, except to add that, without real, azimuthal subsurface velocity data, your seismic image is still only informative, not declarative.

About the Author
William (Bill) Head
Contributing Editor
William (Bill) Head is a technologist with over 40 years of experience in U.S. and international exploration.