March 2014
Shale Technology Review

Advances made in shale-specific geology and geophysics

G&G technology continues to “zero in” on the unique challenges of shale exploration and production.

Weatherford’s ResSure Live service uses unique data-interpretation software to analyze formation evaluation, surface logging and microseismic data.

While integration continues to permeate every part of oil and gas operations, one discrete component of the industry stands alone for its outsized contributions to the wildly successful shale developments we see today. Think of geology and geophysics as the “eyes” of the industry. Their vision is becoming ever more acute, as recent technical innovations attest.

As Nathan Meehan, Baker Hughes’ senior executive advisor, and a World Oil editorial advisor, said in a recent blog post, “Major risks exist for the development of shale resources outside Canada and the U.S. None of these seem individually insurmountable; however, in aggregate, they suggest a much more G&G process-driven model, to eliminate a significant fraction of the poor wells present in even the best North American play.

“The author would be personally surprised, if there were not multiple international technical successes comparable to the Eagle Ford, Bakken or Marcellus. However, modeling the actual results of these plays at international costs, and the sorts of timing, costs, prices and terms and conditions of various countries, suggest that countries either must do much better that [sic] these plays geologically, find a way to mitigate the risks (variation in return) or have more attractive cost and pricing models.”

Meehan thinks the technology to stimulate wells is not optimized for local stress conditions. In his blog, he states, “This concern was fairly localized to issues in the Sichuan basin, with high pore pressures and minimum horizontal compressive stresses approaching the vertical stress. Complex fractures that include horizontal fractures have been known to develop in coals, and are often referred to as T-shaped fractures. Stress conditions and layering have generated lab examples of such fractures. Many failed hydraulic fracture treatments have been blamed on T-shaped fractures, often without conclusive proof. Proper geomechanical models are necessary to address this issue; geomechanical models, driven primarily from acoustic logs, are unlikely to solve this concern.”


“There are a lot of challenges, but probably at the top is understanding the variability of the reservoirs. There is a perception that unconventional reservoirs are blankets; that when you start drilling and producing, it is somewhat of a factory-type of an approach,” said Herb Martin, Devon Energy’s V.P. for reservoir optimization and technology. “What we are learning is they are actually quite variable. Unlike conventional reservoirs we have studied for decades and understand the variability, the variability in unconventional rocks is quite subtle. So our techniques for understanding that variability are still emerging.”

“Beyond 3D seismic, I think it is integration of data that gives us the best overall answer,” said Martin. “The information we get from wellbores tends to be point data—quite detailed information about that one point in space, but limited in its areal applicability. The best data we get out of wells is rock data—the core itself. We only core a limited number of wells, but we log many more. We can tie the cores and the logs together, and begin to extrapolate and build a network. Finally, we tie that network to the 3D seismic and build the integrated 3D picture.”

“I think our understanding of the fluid dynamics of these reservoirs—how the molecules work their way through the pores—is not all that great,” Martin continued. “We need a way to understand these forces in nanometer-size pores, and a way to scale up these observations in three-dimensional space.”


According to David Rainey, president of Petroleum Exploration and chief geoscientist, BHP Billiton, “The first challenge to overcome is actually the recognition that this is a geoscience issue. I think there has been a perception in our industry that this is an engineering issue—that there is no geological risk. I think we’re all finding that that’s not the case.”

“Doing the geoscience basics is the first step. That means understanding the regional geology of the basin you’re operating in. It means understanding the regional variation of the shale that you’re exploring or developing or producing. One point to note is that not all shales are shales. For example, neither of the two biggest liquids plays in the U.S. right now—the Eagle Ford and the Bakken—are shales. Shale is actually a very loose geoscience term. It’s not a term that we particularly like as geoscientists. It’s a field geology term. Both the Eagle Ford and the Bakken are actually carbonates.”

Rainey continued, “So it’s about describing the pore system in rocks that have very small pores. The physics of that is actually much more complex than it is in the conventional world. It’s about understanding the storage in the reservoir, and that’s about describing porosity, and petroleum saturation, and the impact of pressure on fluid volume. It’s then about understanding the producibility of the petroleum fluid, and that’s about the thermal history of the reservoir and the source (if they are different), and how that controls viscosity and gas-oil ratio, and it's about over-pressure and how all of these things convolve to control the producibility of the fluid. And then it’s about understanding the fracability of the reservoir—how susceptible is the reservoir to being enhanced by the application of fracturing technology? And that's about mineralogy and overburden pressure and the natural stress regime in which the reservoir sits.

“Finally, we need to consider how these different factors reinforce each other in a positive way to form the sweet spot in the play; alternatively, they may cancel or reinforce negatively to form the marginal area of the play. So, it’s not about finding or recovering hydrocarbons, but whether the geology concentrates them in economic quantities.

“There are other challenges,” Rainey said. “We’ve got some issues in the Permian—just some drilling issues in the shallow horizons, because it’s buried limestone, so it’s a cost. There are big cavities down there, just like caves. You go through underground caves, and those are things that are difficult to drill through. There are a number of seismic applications—shallow, high-resolution seismic is one—to try and predict where those might be. We’ve just kicked off a team to see what other geophysical methods can be brought to bear on the problem. For example, there is a technology called airborne gravity gradiometry. It measures the gradient of the gravity field, as opposed to the total gravity field, and it’s more sensitive to shallow perturbations in the gravity field than conventional methods.”

“The technology was originally developed for defense applications, and was then adapted for resource exploration by BHP Billiton. It’s been used on the minerals side, because, relative to the petroleum industry, our activities tend to be closer to the surface. We’re not drilling to 10,000 or 20,000 ft in the minerals business. The things we are interested in are all relatively close to the surface.”


Landmark Software provides well planning software to meet another familiar challenge: collaboration. A recent Landmark Software white paper describes the problem: “With increasing pressures to cut costs, plan complex wells more efficiently, and drill more safely, oil and gas companies worldwide are urgently seeking ways to improve collaboration across traditional gaps between the E&P disciplines. One of the most important—and critical—is the chasm between geoscientists and drilling engineers, who usually tackle technical challenges with disparate software tools and platforms. To unlock step changes in performance, operators must employ technological innovations that seamlessly bridge these domains.”

According to Halliburton, developing shale, or other tight reservoirs, requires special treatment. The company says the processes and technologies, which have worked for many years in conventional reservoirs, often are not adequate for the complexities of unconventional reservoirs. Shale and tight reservoirs can contain multiple sweet spots, mixed with nonproductive rock. Misplaced fracturing zones, or missed production opportunities, can result in expensive wells and sub-optimized well economics. Cumulative production can vary widely in a field, depending on well placement.

A new service from the company is intended to address this problem. The service is a collaborative workflow that integrates geoscience, reservoir, and drilling and completion engineering, to allow operators to better predict and produce unconventional reserves. The service is supported by applications that include earth modeling, formation evaluation, horizontal drilling, complex fracture modeling and reservoir stimulation. An iterative process identifies the best well placement and stimulation designs, which are critical to the total well cost and ultimate production.


Shale opportunities in the U.S. will remain vibrant, well into the future. The U.S. Energy Information Administration (EIA) projects that domestic natural gas production will increase from 23.0 Tcf in 2011 to 33.1 Tcf in 2040, a 44% increase. Almost all of this increase is due to projected growth in shale gas production, which is expected to grow from 7.8 Tcf in 2011 to 16.7 Tcf in 2040.

Of course, G&G technology will play a leading role in bringing this remarkable forecast to reality. However, success has its obstacles. Data management—essentially, transforming data into actionable intelligence—is an ongoing challenge for G&G. From seismic to core samples, new G&G tools increase the acuity of our vision; however, they also generate huge volumes of data.

One operator, Devon Energy, has chosen to meet this challenge with the establishment of a formal data management program. The initiative was established to enable agile business operations, by treating data as a corporate asset and providing timely, qualified, integrated and harvested data, from the rig to the desktops of geoscientists. Devon’s program seeks to ensure sustainable data management practices; deliver trusted and reliable information to core G&G business processes and applications; and improve the management of data quality between business applications to ensure that it is timely, consistent and accurate. This practice, which could be called “strategic data management,” is proliferating, as a tsunami of data washes over shale operators.


Microseismic technology is becoming a key success factor in shale operations. A new development, the TerraLocate microseismic event location system, from Transform Software and Services, is intended to meet industry needs for downhole microseismic monitoring of oil and gas wells, Fig. 1. The software directly connects with leading microseismic interpretation software and services.


Fig. 1. TerraLocate is used to visualize and interpret raw microseismic traces, interactively position microseismic events and validate previous microseismic positioning (image courtesy of Transform Software and Services).


Starting with raw, downhole microseismic recordings, the system automatically identifies potential fracture events, positions them in the subsurface and, optionally, correlates them with previously located events while providing quality control capabilities. Interactive hodogram analysis and velocity modeling help to maximize microseismic positioning accuracy. Iterative, automated velocity modeling can be used to refine starting velocity information, if perforation shots, or other calibrating microseismic sources, are available. Statistical tools identify poor or unreliable data, and characterize the uncertainty associated with the positioning or repositioning of microseismic events.
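As a rough illustration of the grid-search principle behind event positioning—not TerraLocate’s proprietary algorithm—the sketch below locates a synthetic event from P-wave arrival picks at a hypothetical downhole geophone array, assuming a constant velocity. Real workflows use calibrated, layered velocity models refined with perforation shots, plus hodogram analysis, as described above.

```python
import numpy as np

# Hypothetical downhole geophone positions (m); geometry is illustrative only.
receivers = np.array([
    [0.0, 0.0, 2500.0],
    [100.0, 0.0, 2520.0],
    [200.0, 0.0, 2540.0],
    [300.0, 0.0, 2560.0],
    [400.0, 0.0, 2580.0],
])
vp = 4000.0  # assumed constant P-wave velocity, m/s

def travel_times(source, recs, v):
    """Straight-ray P-wave travel times from a source to each receiver."""
    return np.linalg.norm(recs - source, axis=1) / v

def locate(picks, recs, v, grid):
    """Grid-search the source position minimizing RMS arrival-time residuals.
    The origin time is unknown, so residuals are demeaned before scoring."""
    best, best_rms = None, np.inf
    for candidate in grid:
        resid = picks - travel_times(candidate, recs, v)
        resid = resid - resid.mean()      # absorb the unknown origin time
        rms = np.sqrt((resid ** 2).mean())
        if rms < best_rms:
            best, best_rms = candidate, rms
    return best, best_rms

# Synthetic event; its true travel times serve as the arrival "picks."
true_src = np.array([300.0, 100.0, 2550.0])
picks = travel_times(true_src, receivers, vp)

# Coarse search grid around the treatment zone.
grid = np.array([[x, y, z]
                 for x in range(0, 501, 50)
                 for y in range(0, 301, 50)
                 for z in range(2400, 2701, 50)], float)
est, rms = locate(picks, receivers, vp, grid)
print(est, rms)
```

In practice, the residual statistics computed this way are also what feed the uncertainty characterization the article describes.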

“Uncertainty in microseismic locations degrades the value of survey investment and erodes the confidence of engineers wanting to make use of microseismic fracture detection,” said Transform CEO Dean Witte. “Our vision … was to create a reliable and independent tool to enable E&P operators to gauge the quality of contractor microseismic results and, as required, take remedial action to reposition any or all microseismic events.”

Microseismic also plays a role in a new seismic vector attribute, introduced and applied by Schlumberger, to map small-scale discontinuities, which the company calls rock fabric, on 3D reflection seismic data. Rock fabric and other properties—such as stress anisotropy, and intrinsic anisotropy of shales and faults—play a crucial role in understanding the behavior of fracture propagation, during hydraulic stimulation.

In a recent paper,1 the company presented a methodology for integrating 3D reflection seismic and passive seismic. Rock fabric is at the limit of seismic resolution, which makes this type of mapping a challenging task. The company is evaluating to what extent seismic moment and microseismic event location, as determined from hydraulic fracture monitoring, can be correlated with rock fabric observations, from 3D reflection seismic.

Another service from the company enhances microseismic data. The moment tensor inversion service provides enhanced analysis of the dynamics of hydraulic fracture propagation, Fig. 2.


Fig. 2. A unique, intuitive display of moment tensor inversion (MTI) results represents the source mechanism for one stage of microseismic events in an expansion, opening and slip glyph (courtesy of Schlumberger).


“When applied in unconventional reservoirs, the MTI service provides information about the orientation, volume and proppant placement associated with the hydraulic fracture,” said Joseph Elkhoury, V.P. and general manager, Schlumberger Microseismic Services. “This provides a framework for building and interpreting geomechanical models, and enables our customers to improve well completion design for improved production.”

The proprietary processing used in the MTI service accounts for anisotropy. As the microseismic monitoring industry moves toward quantitative source inversion, the rigorous incorporation of anisotropy in unconventional reservoir models becomes more important, to accurately process and interpret the valuable information contained in the microseismic signals.

Analysis of MTI processing, during field trials in the Permian basin, confirms that the incorporation of anisotropy leads to improved interpretation of microseismic data and more robust geomechanical models.


One aspect, which is critical to a successful shale development program, is the definition and refinement of a reservoir model. A reservoir model, defined at the outset of the project, identifies what is already known about the play; what data are needed; and how best to integrate the data acquired from formation evaluation, drilling, stimulation and production results.

A new “reservoir-centric” service from Weatherford aims to link completion efficiency with reservoir drainage. ResSure Model services leverage all of the data collected during various drilling, completion and production activities. An area-specific reservoir model aids in the understanding of variations in reservoir quality and rock quality, within a horizontal well and across the study area. This improves the reliability of individual well forecasts and develops a knowledge base to continually improve field operations, particularly fracing, Fig. 3.


Fig. 3. Early mapping of core and non-core areas in the Haynesville shale (courtesy of Weatherford).


A magnitude-based, calibrated, Discrete Fracture Network (DFN) methodology, based on microseismicity, induced during stimulation of a wellbore, has been developed by Microseismic, Inc., Fig. 4. It incorporates the magnitude of the event (and associated microseismic moment), rock rigidity, injected fluid volumes and fluid efficiency. Calculated fracture volumes are then scaled to account for any missing portion of the seismic population. The calibrated DFN can then be filled with the measured, injected proppant volume on a stage-by-stage basis, by initially filling fractures nearest the wellbore and then systematically filling fractures outward from the wellbore, until all deposited proppant volumes have been depleted.
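The stage-by-stage proppant filling step can be sketched directly; the distances and volumes below are illustrative values, not field data.

```python
import numpy as np

# Hypothetical calibrated DFN for one stage: each fracture has a distance from
# the wellbore (m) and an already-scaled fracture volume (m^3).
distances = np.array([12.0, 35.0, 8.0, 60.0, 22.0])   # m
volumes   = np.array([4.0,  6.0,  3.0, 5.0,  2.0])    # m^3

def fill_proppant(dist, vol, proppant_volume):
    """Fill fractures nearest the wellbore first, moving systematically
    outward, until the stage's injected proppant volume is depleted."""
    filled = np.zeros_like(vol)
    remaining = proppant_volume
    for i in np.argsort(dist):          # nearest fracture first
        take = min(vol[i], remaining)
        filled[i] = take
        remaining -= take
        if remaining <= 0:
            break
    return filled

filled = fill_proppant(distances, volumes, proppant_volume=10.0)
print(filled)  # fractures at 8, 12 and 22 m fill fully; the 35-m fracture takes the remainder
```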


Fig. 4. Using reservoir characteristics from microseismic data allows for mapping the network of propped, and productive, fractures (courtesy of Microseismic Inc.).


Fundamentally, for every microseismic hypocenter, fracture areas are calculated from the associated microseismic moment, rock rigidity and displacement along the slip plane. Since displacement along the slip plane is not directly measured, it is initially estimated using an empirical relation, as a function of the associated microseismic moment, and corrected using scaling factors of fracture volumes, fluid volumes and fluid efficiency. The scaling factor is calculated by comparing fracture volumes to fluid volumes and fluid efficiency, following hydraulic fracture stimulation of the individual stage where the sum of the seismic moments is greatest (compared to all stages monitored), and where the energy released is associated only with fluid injection (e.g., not tectonic activity).
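The relation underlying this calculation is the definition of seismic moment, M0 = rigidity × area × slip, with M0 obtained from event magnitude. The sketch below uses the standard Hanks–Kanamori magnitude conversion; the slip relation and all numerical values are hypothetical placeholders for the empirical relation and scaling correction described above.

```python
def moment_from_magnitude(mw):
    """Seismic moment (N*m) from moment magnitude via the standard
    Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

def fracture_area(m0, rigidity, slip):
    """Invert the definition of seismic moment, M0 = rigidity * area * slip,
    for fracture area (m^2)."""
    return m0 / (rigidity * slip)

# Illustrative values: a magnitude -2.0 microseismic event in rock with
# 15-GPa rigidity. The slip estimate is a stand-in for the empirical,
# moment-dependent relation (and volume-based correction) in the text.
m0 = moment_from_magnitude(-2.0)
slip = 1e-6 * m0 ** (1 / 3)     # hypothetical empirical slip estimate, m
area = fracture_area(m0, 15e9, slip)
print(m0, slip, area)
```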

Although the model is simple, it accounts for limitations in measurable data and the missing seismic population. Improved estimates of the displacement along the slip plane and fluid efficiency, as well as improved microseismic detection algorithms, will improve results. Pressure-dependent fluid efficiency can also be incorporated into the methodology, since the bottomhole pressure (BHP), at the time of the microseismic event occurrence, can be calculated. Average propped half-lengths, using the method described in this article, were compared to average propped half-lengths calculated via a technique by McKenna et al. (2013), and the results were within 15% of one another. This suggests that when these two techniques are combined, the distribution of proppant deposited in a formation, following a hydraulic fracture stimulation, can be well-constrained to yield good estimates.


Indirect measurement and observation continue to expand their capabilities; in the meantime, however, direct measurement and observation are not standing still. Powerful tools are being utilized to see formation properties in gas-bearing shale rocks.

To understand gas and liquid flow through shales, an understanding of their pore structure is required. For common earth media, the pore space is easily conceived as the space between the assorted grains that make up the medium. While still conceptually true for shales, imaging of the structure, at the pore scale, is necessary to understand this space.

As part of a larger study, sponsored by the Research Partnership to Secure Energy for America (RPSEA),2 researchers are using the Advanced Light Source (ALS) micro-computed tomography facility and Focused Ion Beam (FIB) technology, at the Molecular Foundry and the National Center for Electron Microscopy at the Lawrence Berkeley National Laboratory (LBNL), to develop high-resolution, 3D descriptions of the pore space. Images are analyzed, using Maximal Inscribed Spheres-type methods, to estimate gas shale and tight sand flow properties at different conditions, including in situ conditions.

These approaches were developed at LBNL and the University of California at Berkeley, and they have been successfully applied to studies of chalk, diatomite and sandstone. Researchers investigated the impact of pore-space geometry, in different formations, on flow properties, including absolute and relative permeabilities, capillary pressure and the Klinkenberg coefficient.
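The core idea of a Maximal Inscribed Spheres analysis can be sketched on a synthetic segmented image: at each pore voxel, the radius of the largest sphere that fits within the pore equals the distance to the nearest grain voxel. The toy example below assumes a single spherical pore standing in for a real FIB or micro-CT volume; production codes use fast distance transforms rather than this brute-force search.

```python
import numpy as np

def inscribed_sphere_radii(pore):
    """Euclidean distance from every pore voxel to the nearest grain voxel,
    i.e., the maximal inscribed sphere radius centered at that voxel.
    Brute force (O(N^2)) for clarity on a small synthetic grid."""
    pore_xyz = np.argwhere(pore)
    grain_xyz = np.argwhere(~pore)
    # distance from each pore voxel to every grain voxel; keep the minimum
    d = np.linalg.norm(pore_xyz[:, None, :] - grain_xyz[None, :, :], axis=2)
    return d.min(axis=1)

# Synthetic segmented volume: a spherical pore of radius 4 voxels,
# centered in a 15^3 block of grain.
n = 15
zz, yy, xx = np.indices((n, n, n)) - n // 2
pore = xx**2 + yy**2 + zz**2 <= 4**2
radii = inscribed_sphere_radii(pore)
print(radii.max())   # largest inscribed sphere, at the pore center
```

The histogram of these radii is one common pore-size characterization; connectivity analysis of the spheres then supports flow-property estimation of the kind described above.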

Researchers acknowledge concerns about nanometer-scale examinations of shale samples. These include the scale of observation, as compared to the scale of interest (the researchers’ sample was roughly 27 orders of magnitude smaller than the shale zone), sampling bias (shales are anisotropic and contain heterogeneities across a variety of scales), the effects of sampling, and so on.

Even so, researchers assert a number of important reasons to perform this imaging. From the images, one can often get an understanding of the 3D nature of the pore space, its connectivity, and the location and distribution of mineral and organic phases. In addition, the images provide a baseline for conceptual model building.


Researchers report that in spite of the rich variety of the shale gas structures, as revealed by the imaging study, all studied samples show very little permeability of the rock matrix. This observation led to a model of gas flow to a fractured well. In this model, the permeability of the pristine reservoir is negligible, and gas flow occurs only in the limited stimulated reservoir volume, adjacent to the fracture. A multi-stage fractured horizontal well, employed in this study, crossed a series of such fractures.

The stimulated volumes connected to neighboring principal fractures are separated by no-flow boundaries. This no-flow boundary is determined either by the depth of the stimulated volume, or by the separation between two fracture feeding zones. Researchers hypothesize that the geometric form of the stimulated zone feeding a fracture is a slab parallel to the fracture face. Researchers assumed that the permeability within the hydrofracture is much larger than that of the stimulated reservoir volume. Consequently, they neglected the pressure gradient within the fracture, relative to the gradient in the formation.

Monthly production data from five Barnett shale gas wells were used to test the model. The lengths of the intervals for which data were available differ from well to well, varying between 68 and 80 months of production. Considerable fluctuations in the data introduce uncertainty into the outcome of the fitting. Nevertheless, the fitting indicates that the transition from square-root-of-time production scaling to exponential decline, for the analyzed wells, happened after about 12 months of gas recovery operations.

To evaluate the predictive capabilities of the model, curve-fitting was applied to early data, and the model-based estimates of the total production over the entire period of 87 months were compared to the measured values. The relative errors for most ultimate recovery predictions are within 5%. The general trend of error decline with increasing data time interval is not surprising. However, the error does not decline monotonically. This behavior can be explained by fluctuations in the monthly rates resulting, possibly, from temporary production disruptions during the respective months.
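The two-regime model described above—rate falling as one over the square root of time, then declining exponentially after a transition—can be fit to monthly rate data by simple least squares. The sketch below is one plausible reading of that model, exercised on synthetic data with a known 12-month transition; it is not the researchers’ actual fitting code.

```python
import numpy as np

def rate(t, q1, tau, T):
    """Two-regime rate model: q = q1/sqrt(t) for t <= tau, then an
    exponential decline matched continuously at the transition time tau."""
    t = np.asarray(t, float)
    q_tau = q1 / np.sqrt(tau)
    return np.where(t <= tau, q1 / np.sqrt(t), q_tau * np.exp(-(t - tau) / T))

# Synthetic monthly rates from known parameters (transition after 12 months),
# mimicking the Barnett-well behavior described in the text.
months = np.arange(1, 81)
data = rate(months, q1=100.0, tau=12.0, T=30.0)

# Brute-force least-squares search over a small parameter grid.
best, best_err = None, np.inf
for tau in range(6, 25):
    for T in range(10, 61, 5):
        # with tau and T fixed, the model is linear in q1, so the
        # best-fitting q1 has a closed form
        basis = rate(months, 1.0, tau, T)
        q1 = (data @ basis) / (basis @ basis)
        err = ((data - q1 * basis) ** 2).sum()
        if err < best_err:
            best, best_err = (q1, tau, T), err

q1_fit, tau_fit, T_fit = best
print(tau_fit)   # recovered transition month
```

With noisy field data, the fluctuations noted in the article would show up here as a shallow error surface around the best-fit transition month.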


  1. Haege, M., S. Maxwell, L. Sonneland and M. Norton, “Rock fabric characterization using 3D reflection seismic integrated with microseismic,” Progress Energy Resources, presented at the EAGE Conference and Exhibition, London, June 10-13, 2013.
  2. U.S. Department of Energy, Research Partnership to Secure Energy for America.