
DISTINGUISHED AUTHOR SERIES

Probabilistic Subsurface Forecasting: What Do We Really Know?


Martin Wolff, SPE, Hess Corporation

Abstract

The use of single-valued assessments of company portfolios and projects continues to decline as the industry accepts that strong subsurface uncertainties dictate an ongoing consideration of ranges of outcomes. Exploration has pioneered the use of probabilistic prospect assessments as the norm, in both majors and independents. Production has lagged, in part because of the need to comply with US Securities and Exchange Commission (SEC) reserves-reporting requirements that drive a conservative deterministic approach. Look-backs continue to show the difficulty of achieving a forecast within an uncertainty band as well as the difficulty of establishing what that band should be. Ongoing challenges include identifying relevant static and dynamic uncertainties, efficiently and reliably determining ranges and dependencies for those uncertainties, incorporating production history (brownfield assessments), and coupling subsurface with operational and economic uncertainties. Despite these challenges, none of which are fully resolved, a systematic approach based on probabilistic principles [often including design-of-experiment (DoE) techniques] provides the best auditable and justifiable means of forecasting projects and presenting decision makers with a suitable range of outcomes to consider.

Introduction

Probabilistic subsurface assessments are the norm within the exploration side of the oil and gas industry, both in majors and independents (Rose 2007). However, in many companies, the production side is still in transition from single-valued deterministic assessments, sometimes carried out with ad-hoc sensitivity studies, to more-rigorous probabilistic assessments with an auditable trail of assumptions and a statistical underpinning.
Martin Wolff, SPE, is a Senior Reservoir Engineering Advisor for Hess Corporation. He earned a BS degree in electrical engineering and computer science and an MS degree in electrical engineering from the University of Illinois and a PhD degree in petroleum engineering from the University of Texas at Austin. Previously, Wolff worked for Schlumberger, Chevron, Fina, Total, and Newfield. He has served as a Technical Editor and Review Chairperson for SPE Reservoir Evaluation & Engineering and has served on steering committees for several SPE Forums and Advanced Technology Workshops.

Reflecting these changes in practices and technology, recently revised SEC rules for reserves reporting (effective 1 January 2010) allow the use of both probabilistic and deterministic methods, in addition to allowing reporting of reserves categories other than proved. This paper presents some of the challenges facing probabilistic assessments and offers practical considerations for carrying out the assessments effectively.

Look-Backs: Calibrating Assessments

Look-backs continue to show the difficulty of achieving a forecast within an uncertainty band along with the difficulty of establishing what that band should be. Demirmen (2007) reviewed reserves estimates in various regions over time and observed that estimates are poor and that uncertainty does not decrease over time. Otis and Schneidermann (1997) describe a comprehensive exploration-prospect-evaluation system that, starting in 1989, included consistent methods of assessing risk and estimating hydrocarbon volumes, including post-drilling feedback to calibrate those assessments. Although detailed look-backs for probabilistic forecasting methodologies have been recommended for some time (Murtha 1997) and are beginning to take place within companies, open publications on actual fields with details still are rare, possibly because of the newness of the methodologies or because of data sensitivity.

Identifying Subsurface Uncertainties

A systematic process of identifying relevant subsurface uncertainties and then categorizing them can help by breaking down a complex forecast into simple uncontrollable static or dynamic components that can be assessed and calibrated individually (Williams 2006). Nonsubsurface, controllable, and operational uncertainties also must be considered, but the analysis usually is kept tractable by including them later with decision analysis or additional rounds of uncertainty analysis.

Copyright 2010 Society of Petroleum Engineers. This is paper SPE 118550. Distinguished Author Series articles are general, descriptive representations that summarize the state of the art in an area of technology by describing recent developments for readers who are not specialists in the topics discussed. Written by individuals recognized as experts in the area, these articles provide key references to more definitive work and present specific details only to illustrate the technology. Purpose: to inform the general readership of recent advances in various areas of petroleum engineering.


Grouping parameters also can reduce the dimensionality of the problem. When parameters are strongly correlated (or anticorrelated), grouping them is justifiable. In fact, not grouping such parameters (i.e., "Balkanizing" them into their constituents) could cause them to be dropped in standard screening methods such as Pareto charts. For example, decomposing a set of relative permeability curves into constituent parameters such as saturation endpoints, critical saturations, relative permeability endpoints, and Corey exponents can cause them all to become insignificant individually. Used together, relative permeability often remains a dominant uncertainty.

Assigning Ranges to Uncertainties

Once uncertainties have been identified, ranges for each must be quantified, which may appear straightforward but contains subtle challenges. Breaking down individual uncertainties into components (e.g., measurement, model, or statistical error) and carefully considering portfolio and sample-bias effects can help create reasonable and justifiable ranges. Some uncertainties, especially geological ones, are not handled easily as continuous variables. In many studies, several discrete geological models are constructed to represent the spectrum of possibilities. To integrate these models with continuous parameters and generate outcome distributions, likelihoods must be assigned to each model. Although assigning statistical meaning to a set of discrete models may be a challenge if those models are not based on any underlying statistics, the models do have the advantage of more readily being fully consistent scenarios rather than a combination of independent geological-parameter values that may not make any sense (Bentley and Woodhead 1998). As noted previously, validation with analog data sets and look-backs should be carried out when possible because many studies and publications have shown that people tend to anchor on what they think they know and to underestimate the true uncertainties involved. Therefore, any quantitative data that can help establish and validate uncertainty ranges are highly valuable.

Assigning Distributions to Uncertainties

In addition to ranges, distributions must be specified for each uncertainty. There are advocates for different approaches: naturalists strongly prefer realistic distributions that often are observed in nature (e.g., log normal), while pragmatists prefer distributions that are well-behaved (e.g., bounded) and simple to specify (e.g., uniform or triangular). In most cases, specifying ranges has a stronger influence on forecasts than the specific distribution shape, which may have little effect (Wolff 2010). Statistical correlations between uncertainties also should be considered, although these too are often secondary effects.
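To illustrate the point about ranges versus distribution shapes, the hedged Python sketch below (with hypothetical volumetric-style inputs and ranges, not values from any field discussed here) samples the same P10/P90 input ranges under triangular and log-normal assumptions and compares the resulting outcome percentiles. With matched ranges, the P10/50/90 of the product typically differ far less than they would if the ranges themselves were changed.

```python
# Hedged sketch: distribution shape vs. range for a simple multiplicative (volumetric-style) forecast.
# All ranges below are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
Z90 = 1.2816  # standard-normal z-score for the 90th percentile

# Illustrative P10-P90 ranges for three multiplicative inputs (area, net pay, recovery-adjusted yield).
ranges = {"area": (2.0, 6.0), "net_pay": (10.0, 30.0), "yield": (0.05, 0.15)}

def sample_lognormal(p10, p90, n):
    # Fit a log-normal so its 10th/90th percentiles match the given range.
    mu = 0.5 * (np.log(p10) + np.log(p90))
    sigma = (np.log(p90) - np.log(p10)) / (2 * Z90)
    return rng.lognormal(mu, sigma, n)

def sample_triangular(p10, p90, n):
    # Crude triangular analog: min/max slightly outside the P10-P90 range, mode at the midpoint.
    half = 0.5 * (p90 - p10)
    return rng.triangular(p10 - 0.3 * half, 0.5 * (p10 + p90), p90 + 0.3 * half, n)

def forecast(sampler):
    vols = np.ones(N)
    for p10, p90 in ranges.values():
        vols *= sampler(p10, p90, N)
    return np.percentile(vols, [90, 50, 10])  # oilfield convention: P10 = high case

print("log-normal inputs  P10/P50/P90:", np.round(forecast(sample_lognormal), 2))
print("triangular inputs  P10/P50/P90:", np.round(forecast(sample_triangular), 2))
```

Running the two cases side by side gives similar central percentiles; widening or narrowing one of the input ranges shifts the results far more than swapping the distribution shape does.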

Uncertainty-to-Forecast Relationships

Once uncertainties have been identified and quantified, relationships must be established between them and the forecasts. These relationships sometimes can be established from analytical and empirical equations but also may be derived from models ranging from simple material-balance through full 3D reservoir-simulation models. When complex models are used to define relationships, it often is useful to apply DoE methods to investigate the uncertainty space efficiently. These methods involve modeling defined combinations of uncertainties to fit simple equations that can act as efficient surrogates, or proxies, for the complex models. Monte Carlo methods then can be used to investigate the distribution of forecast outcomes, taking into account correlations between uncertainties.

DoE methods have been used for many years in the petroleum industry. The earliest SPE reference found was a perforating-gun study by Vogel (1956), the earliest reservoir work was a wet-combustion-drive study by Sawyer et al. (1974), and early applications to 3D reservoir models include Chu (1990) and Damsleth et al. (1992). These early papers all highlight the main advantage of DoE over traditional one-variable-at-a-time (OVAT) methods: efficiency. Damsleth et al. (1992) give a 30 to 40% advantage for D-optimal designs compared with OVAT sensitivities. For an extensive bibliography of papers showing pros and cons of different types of DoE and applications of DoE to specific reservoir problems, see an expanded version of this paper, Wolff (2010).

Model Complexity

Given that computational power has increased vastly since the 1950s and 1970s, with ever-more-powerful multicore processors and cluster computing, an argument can be made that computational power should not be regarded as a significant constraint for reservoir studies. However, Williams et al. (2004) observe that gains in computational power generally are used to increase the complexity of the models rather than to reduce model run times. Most would agree with the concept of making things no more complex than needed, but different disciplines have different perceptions regarding that level of complexity. This problem can be made worse by corporate peer reviews, especially in larger companies, in which excessively complex models are carried forward to ensure buy-in by all stakeholders. Highly complex models also may require complex logic to form reasonable and consistent development scenarios for each run. Finally, the challenge of quality control (QC) of highly complex models cannot be ignored: garbage in, garbage out applies more strongly than ever. Launching directly into tens to hundreds of DoE runs without ensuring that a base-case model makes physical sense and runs reasonably well often leads to many frustrating cycles of debugging and rework. A single model can readily be quality controlled in detail, while manual QC of tens of models becomes increasingly difficult. With hundreds or thousands of models, automatic-QC tools become necessary to complement statistical methods by highlighting anomalies.
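As one hedged illustration of such automatic QC, the sketch below (with assumed column names and thresholds, not tied to any specific simulator) scans a table of per-run summary results and flags runs whose material-balance error or outcome values look anomalous relative to the ensemble, so that only the flagged cases need manual review.

```python
# Hedged sketch of automatic QC across many simulation runs.
# Column names ("mb_error_pct", "cum_oil", "run_id") and thresholds are assumptions for illustration.
import numpy as np
import pandas as pd

def flag_anomalous_runs(results: pd.DataFrame,
                        mb_error_limit: float = 1.0,
                        z_limit: float = 3.0) -> pd.DataFrame:
    """Return runs that fail basic physical checks or are statistical outliers."""
    checks = pd.DataFrame(index=results.index)
    # Physical checks: material-balance error and non-physical outputs.
    checks["mb_error"] = results["mb_error_pct"].abs() > mb_error_limit
    checks["negative_volume"] = results["cum_oil"] < 0
    # Statistical check: robust z-score of the forecast outcome vs. the ensemble.
    med = results["cum_oil"].median()
    mad = (results["cum_oil"] - med).abs().median() or 1e-12
    checks["outlier"] = 0.6745 * (results["cum_oil"] - med).abs() / mad > z_limit
    flagged = results[checks.any(axis=1)].copy()
    flagged["reasons"] = checks[checks.any(axis=1)].apply(
        lambda row: ", ".join(c for c in checks.columns if row[c]), axis=1)
    return flagged

# Example usage with a small synthetic results table.
demo = pd.DataFrame({"run_id": range(5),
                     "mb_error_pct": [0.1, 0.2, 3.5, 0.1, 0.05],
                     "cum_oil": [120.0, 135.0, 118.0, -5.0, 500.0]})
print(flag_anomalous_runs(demo)[["run_id", "reasons"]])
```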



[Fig. 1: Proxy surfaces of varying complexity. Four panels: Linear Terms Only; Linear and Interaction Terms; Linear, Interaction, and Lumped Second-Order Terms; Full Second-Order Polynomial. Image not reproduced.]

Proxy Equations

Fig. 1 shows proxy surfaces of varying complexity that can be obtained with different designs. A Plackett-Burman (PB) design, for example, is suitable only for linear proxies. A folded Plackett-Burman (FPB) design (an experimental design with double the number of runs of the PB design, formed by adding runs that reverse the +1s and -1s in the design matrix) can provide interaction terms and lumped second-order terms (all second-order coefficients equal). D-optimal designs can provide full second-order polynomials. More-sophisticated proxies can account for greater response complexities, but at the cost of additional simulation runs for refinement. These more-sophisticated proxies may be of particular use in brownfield studies, in which a more-quantitative proxy could be desirable, but they may not always add much value to greenfield studies in which the basic subsurface uncertainties remain poorly constrained.
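The following hedged sketch (a generic construction, not code from the paper) builds the classical 8-run PB design from its cyclic generator row, folds it to obtain the FPB design by appending the sign-reversed runs, and verifies orthogonality; it is meant only to make the run-count and folding ideas concrete.

```python
# Hedged sketch: constructing a Plackett-Burman (PB) design and its folded version (FPB).
import numpy as np

def plackett_burman_8() -> np.ndarray:
    """8-run PB design for up to 7 two-level factors, built from the classical generator row."""
    generator = np.array([1, 1, 1, -1, 1, -1, -1])
    rows = [np.roll(generator, shift) for shift in range(7)]  # cyclic shifts of the generator
    rows.append(-np.ones(7, dtype=int))                       # final row of all -1s
    return np.array(rows, dtype=int)

def fold(design: np.ndarray) -> np.ndarray:
    """Folded design: append the sign-reversed runs, doubling the number of runs."""
    return np.vstack([design, -design])

pb = plackett_burman_8()
fpb = fold(pb)

# Orthogonality check: columns of a PB design are mutually orthogonal.
assert np.all(pb.T @ pb == 8 * np.eye(7, dtype=int))

print("PB runs:", pb.shape[0], " FPB runs:", fpb.shape[0])   # 8 and 16 runs for 7 factors
print(pb)
```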




A recognized problem with polynomial proxies is that they tend to yield normal distributions because their terms are added (a consequence of the central limit theorem). For many types of subsurface forecasts, the prevalence of actual skewed distributions, such as log-normal, has been documented widely. Therefore, physical proxies, especially in simple cases such as the original-oil-in-place (OOIP) equation, have some advantages in achieving more-realistic distributions. However, errors from the use of nonphysical proxies are not necessarily significant, depending on the particular problem studied (Wolff 2010). A question raised about computing polynomial proxies for relatively simple designs such as FPB is that there often are, apparently, too few equations to solve for all the coefficients of the polynomial. The explanation is that not all parameters are equally significant and that some parameters may be highly correlated or anticorrelated. Both factors reduce the dimensionality of the problem, allowing reasonable solutions to be obtained even with an apparently insufficient number of equations.

Proxy Validation

At a minimum, a set of blind-test runs, which are not used in building the proxy, should be compared with proxy predictions. A simple crossplot of proxy-predicted vs. experimental results for points used to build the proxy can confirm only that the proxy equation was adequate to match the data used in the analysis; it does not prove that the proxy is also predictive. In general, volumetrics are more reasonably predicted with simpler proxies than are dynamic results such as recoveries, rates, and breakthrough times.

Moving Toward a Common Process

Some standardization of designs would help these methods become even more accepted and widespread in companies. The reader of this paper likely is a reservoir engineer or earth scientist who, by necessity, dabbles in statistics but prefers not to make each study a research effort on mathematical methods. Another benefit of somewhat standardized processes is that management and technical reviewers can become familiar and comfortable with certain designs and will not require re-education with each project they need to approve. However, because these methodologies are still relatively new, a period of testing and exploring different techniques is still very much under way. The literature shows the use of a wide range of methodologies. Approaches to explore uncertainty space range from use of only the simplest PB screening designs for the entire analysis, through multistage experimental designs of increasing accuracy, to bypassing proxy methods altogether in favor of space-filling designs and advanced interpolative methods. A basic methodology extracted from multiple papers listed in Wolff (2010) can be stated as follows (a minimal end-to-end sketch of Steps 2 through 5 appears at the end of this subsection):
(1) Define subsurface uncertainties and their ranges.
(2) Perform a screening analysis (e.g., two-level PB or FPB), and analyze it to identify the most-influential uncertainties.
(3) If necessary, perform a more-detailed analysis (e.g., three-level D-optimal or central-composite).
(4) Create a proxy model (response surface) by use of linear or polynomial proxies, and validate it with blind tests.
(5) Perform Monte Carlo simulations to assess uncertainty and define distributions of outcomes.
(6) Build deterministic (scenario) low/mid/high models tied to the distributions.
(7) Use the deterministic models to assess development alternatives.

However, variations and subtleties abound. Most studies split the analysis of the static and dynamic parameters into two stages with at least two separate experimental designs. The first stage seeks to create a number of discrete geological or static models (3, 5, 9, 27, or more are found in the literature) representing a broad range of hydrocarbons in place and connectivity (often determined by rapid analyses such as streamline simulation). The second stage then takes these static models and adds the dynamic parameters in a second experimental design. This method is particularly advantageous if the project team prefers to use higher-level designs such as D-optimal to reduce the number of uncertainties in each stage. However, this method cannot account for the full range of individual interactions between all static and dynamic parameters because many of the static parameters are grouped and fixed into discrete models before the final dynamic simulations are run. This limitation becomes less significant when more discrete geological models are built that reflect more major-uncertainty combinations.

Steps 2 and 3 in the base methodology sometimes coincide with the static/dynamic parameter split. In many cases, however, parameter screening is performed as an additional DoE step after a set of discrete geological models has already been determined. This culling to the most-influential uncertainties again makes running a higher-level design more feasible, especially with full-field, full-physics simulation models. The risk is that some of the parameters screened from the analysis as insignificant in the development scenario that was used may become significant under other scenarios. For example, if the base scenario was a peripheral waterflood, parameters related to aquifer size and strength may drop out. If a no-injector scenario is later examined, the P10/50/90 deterministic models may not include any aquifer variation. Ideally, each scenario would have its own screening DoE performed to retain all relevant influential uncertainties.

An alternative is running a single-stage DoE including all static and dynamic parameters. This method can lead to a large number of parameters, but the analysis is made more tractable by use of intermediate-accuracy designs such as FPB. Such compromise designs do require careful blind testing to ensure accuracy, although proxies with mediocre blind-test results often can yield very similar statistics (P10/50/90 values) after Monte Carlo simulation when compared with higher-level designs. As a general observation, the quality of proxy required for quantitative predictive use such as optimization or history matching usually is higher than that required only for generating a distribution through Monte Carlo methods.
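The hedged sketch below strings Steps 2 through 5 together on a synthetic toy response standing in for a simulator (the response function, factor list, and run counts are all assumptions for illustration): it screens with an FPB design, fits a linear proxy, checks it against blind-test runs, and then runs a Monte Carlo on the proxy to report P10/50/90.

```python
# Hedged end-to-end sketch of Steps 2 through 5 on a synthetic toy response.
# The response function, factor names, and run counts are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(7)
names = ["perm", "poro", "aquifer", "kv_kh", "relperm", "skin", "owc"]

def simulate(x):
    # Toy "simulator" response in coded units (-1..+1), standing in for a reservoir model.
    perm, poro, aquifer, kv_kh, relperm, skin, owc = x.T
    return 100 + 18*perm + 12*poro + 6*aquifer + 5*perm*poro + rng.normal(0, 1, len(x))

# Step 2: screening with a folded Plackett-Burman design for 7 factors (16 runs).
gen = np.array([1, 1, 1, -1, 1, -1, -1])
pb = np.vstack([np.roll(gen, s) for s in range(7)] + [-np.ones(7, dtype=int)])
fpb = np.vstack([pb, -pb]).astype(float)
y = simulate(fpb)

# Step 4 (simplified): fit a linear proxy by least squares, rank factors, and validate with blind tests.
X = np.column_stack([np.ones(len(fpb)), fpb])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("screening ranking:", [names[i] for i in np.argsort(-np.abs(coef[1:]))])

blind = rng.uniform(-1, 1, size=(10, 7))            # blind-test runs not used in the fit
pred = np.column_stack([np.ones(10), blind]) @ coef
y_blind = simulate(blind)
r2 = 1 - np.sum((y_blind - pred)**2) / np.sum((y_blind - y_blind.mean())**2)
print("blind-test R^2:", round(float(r2), 3))

# Step 5: Monte Carlo on the proxy (uniform coded inputs used purely for illustration).
samples = rng.uniform(-1, 1, size=(100_000, 7))
outcomes = np.column_stack([np.ones(len(samples)), samples]) @ coef
p10, p50, p90 = np.percentile(outcomes, [90, 50, 10])   # oilfield convention: P10 = high case
print("proxy P10/P50/P90:", round(p10, 1), round(p50, 1), round(p90, 1))
```

In a real study, the blind-test runs would be additional simulation cases rather than evaluations of a known function, and the Monte Carlo inputs would use the distributions and correlations discussed earlier rather than uniform coded values.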


Determining which development options (i.e., unconstrained or realistic developments, including controllable variables) to choose for building the proxy equations and running Monte Carlo simulations also has challenges. One approach is to use unconstrained scenarios that effectively attempt to isolate subsurface uncertainties from the effects of these choices (Williams 2006). Another approach is to use a realistic base-case development scenario if such a scenario already exists, or to make an initial pass through the process to establish one. Studies that use DoE for optimization often include key controllable variables in the proxy equation despite knowing that this may present difficulties such as more-irregular proxy surfaces requiring higher-level designs. Integrated models consider all uncertainties together (including surface and subsurface), which eliminates picking a development option. These models may be vital for the problem being analyzed; however, they present additional difficulties. Either computational costs will increase or compromises to subsurface physics must be made, such as eliminating reservoir simulation in favor of simplified dimensionless rate-vs.-cumulative-production tables or proxy equations. That reopens the questions: What development options were used to build those proxies, and how valid are those options in other scenarios?

Deterministic Models

Short of using integrated models, there remains the challenge of applying and optimizing different development scenarios to a probabilistic range of forecasts. Normal practice is to select a limited number of deterministic models that capture a range of outcomes, often three (e.g., P10/50/90) but sometimes more if testing particular uncertainty combinations is desired. Normal practice is to match probability levels of two outcomes at once (e.g., pick a P90 model that has both P90 OOIP and P90 oil recovery). Some studies attempt to match P90 levels of other outcomes at the same time, such as discounted oil recovery (which ties better to simplified economics because it puts a time value on production), recovery factor, or initial production rate. The more outcome matches that are attempted, the more difficult it is to find a suitable model. Selecting the subsurface uncertainties used to create a deterministic model, and deciding how much to vary each of them, is a subjective exercise because there are an infinite number of combinations. Williams (2006) gives guidelines for building such models, including trying to ensure a logical progression of key uncertainties from the low to the high models.

If a proxy is quantitatively sound, it can be used to test particular combinations of uncertainties that differ from those of the DoE before building and running time-consuming simulation models. The proxy also can be used to estimate response behavior for uncertainty levels between the two or three levels (-1/0/+1) typically defined in the DoE. This can be useful for tuning a particular combination to achieve a desired response, and it allows moderate combinations of uncertainties. Such moderate combinations, rather than the extremes used in many designs, will be perceived as more realistic. This choice also solves the problem of not being able to set all key variables to -1 or +1 levels and still follow a logical progression of values to achieve P90 and P10 outcomes.
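As a hedged illustration of tuning a deterministic model with a proxy (the proxy coefficients, target value, and factor list are invented for the example), the sketch below does a coarse grid search over intermediate coded levels of a few uncertainties to find a combination whose proxy-predicted response lands near a desired P90 target.

```python
# Hedged sketch: using a fitted linear proxy (coded units, -1..+1) to tune a deterministic model
# toward a target response (e.g., a P90 recovery value). All numbers are illustrative assumptions.
import itertools
import numpy as np

# Assumed proxy: response = intercept + sum(coef_i * x_i), with x_i in coded units.
intercept = 100.0
coefs = {"perm": 18.0, "poro": 12.0, "aquifer": 6.0, "relperm": 4.0}
target_p90 = 78.0          # desired low-case response taken from the Monte Carlo distribution

def proxy(levels: dict) -> float:
    return intercept + sum(coefs[k] * levels.get(k, 0.0) for k in coefs)

# Coarse grid of intermediate levels (moderate combinations, not just the -1/0/+1 design extremes).
grid = np.linspace(-1.0, 1.0, 9)
best = min(
    (dict(zip(coefs, combo)) for combo in itertools.product(grid, repeat=len(coefs))),
    key=lambda levels: abs(proxy(levels) - target_p90),
)
print("suggested coded levels:", {k: round(v, 2) for k, v in best.items()})
print("proxy-predicted response:", round(proxy(best), 1), "(target", target_p90, ")")
```

In practice, many combinations satisfy the same target, so additional constraints (e.g., requiring a logical low-to-high ordering of levels across the P90/P50/P10 models) would be applied before committing to a simulation run.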

However, interpolation of uncertainties can sometimes be
• Challenging (some uncertainties, such as permeability, may not vary linearly compared with others, such as porosity)
• Challenging and time-consuming (e.g., interpolating between discrete geological models)
• Impossible [uncertainties with only a discrete number of physical states, such as many decision variables (e.g., 1.5 wells is not possible)]

Finally, selecting the deterministic models to use is usually a whole-team activity because each discipline may have its own ideas about which uncertainties need to be tested and which combinations are realistic. This selection process achieves buy-in by the entire team before heading into technical and management reviews.

Probabilistic brownfield forecasting has the additional challenge of needing to match dynamic performance data. Although forecasts should become more-tightly constrained with actual field data, data quality and the murky issue of what constitutes an acceptable history match must be considered. History-match data can be incorporated into probabilistic forecasts through several methods. The traditional and simplest method is to tighten the individual uncertainty ranges until nearly all outcomes are reasonably history matched. This approach is efficient and straightforward but may eliminate some more-extreme combinations of parameters from consideration. Filter-proxy methods that use quality-of-history-match indicators (Landa and Güyagüler 2003) will accept these more-extreme uncertainty combinations. The filter-proxy method also has the virtue of transparency: explanation and justification of the distribution of the matched models is straightforward, as long as the proxies (especially those for the quality of the history match) are sufficiently accurate. More-complex history-matching approaches such as genetic algorithms, evolutionary strategies, and ensemble Kalman filters are a very active area of research and commercial activity, but going into any detail on these methods is beyond the scope of this paper.
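A hedged sketch of the filter-proxy idea follows (the proxies, misfit threshold, and input distributions are all invented for illustration): one proxy predicts the forecast outcome, a second proxy predicts a history-match misfit, and Monte Carlo samples are kept only where the predicted misfit is acceptable, so the forecast distribution is built from history-consistent combinations.

```python
# Hedged sketch of a filter-proxy workflow: filter Monte Carlo samples by a proxy of history-match
# quality before building the forecast distribution. All proxies and thresholds are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
# Coded uncertainties in -1..+1 (e.g., permeability multiplier, aquifer strength, relperm exponent).
x = rng.uniform(-1, 1, size=(n, 3))

# Assumed proxies fitted earlier from DoE runs (coefficients invented for the example):
forecast = 100 + 15*x[:, 0] + 8*x[:, 1] + 5*x[:, 2]              # e.g., cumulative oil
misfit = 1.0 + 2.0*(x[:, 0] - 0.3)**2 + 1.5*(x[:, 1] + 0.2)**2   # e.g., normalized history-match error

accepted = misfit < 1.5          # keep only combinations the match-quality proxy deems acceptable
prior_p = np.percentile(forecast, [90, 50, 10])
post_p = np.percentile(forecast[accepted], [90, 50, 10])

print(f"accepted {accepted.mean():.1%} of samples")
print("unfiltered P10/P50/P90:", np.round(prior_p, 1))
print("filtered   P10/P50/P90:", np.round(post_p, 1))
```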



Conclusion: What Do We Really Know?

In realistic subsurface-forecasting situations, enough uncertainty exists about the basic ranges of parameters that absolute numerical errors of less than 5 to 10% usually are considered relatively minor, although it is difficult to give a single value that applies for all situations. For example, when tuning discrete models to P10/50/90 values, a typical practice is to stop tuning when the result is within 5% of the desired value. Spending a lot of time to obtain a more precise result is largely wasted effort, as look-backs have shown consistently. Brownfield forecasts are believed to be more accurate than greenfield forecasts that lack calibration, but look-backs also suggest that it may be misleading to think the result is the correct one just because a great history match was obtained. Carrying forward a reasonable and defensible set of working models that span a range of outcomes makes much more sense than hoping to identify a single true answer. As George Box (an eminent statistician) once said, "All models are wrong, but some are useful."

In all these studies, there is a continuing series of tradeoffs to be made between the effort applied and its effect on the outcome. Many studies have carried simple screening designs all the way through to detailed forecasts, with well-accepted results based on very few simulation runs. These studies tend to examine the input uncertainty distributions in great depth, often carefully considering partial correlations between the uncertainties. Although the quality of the proxies used in these studies may not be adequate for quantitative predictive use, it still may be adequate for generating reasonable statistics. Other studies use complex designs to obtain highly accurate proxies that can be used quantitatively for optimization and history matching. However, many of these studies have used standardized uncertainty distributions (often discrete) with less consideration of correlations and dependencies. Higher-speed computers and automated tools are making such workflows less time-consuming, so that accurate proxies and a thorough consideration of the basic uncertainties should both be possible.

Whichever emphasis is made, the models that are used should be sufficiently complex to capture the reservoir physics that significantly influences the outcome. At the same time, they should be simple enough that time and energy are not wasted refining something that either has little influence or remains fundamentally uncertain. In the end, probabilistic forecasts can provide answers with names like P10/50/90 that have specific statistical meaning. However, it is a meaning that must consider the assumptions made about the statistics of the basic uncertainties, most of which lack a rigorous statistical underpinning. The advantage of a rigorous process for combining these uncertainties through DoE, proxies, Monte Carlo methods, scenario modeling, and other techniques is that the process is clean and auditable, not that the probability levels are necessarily quantitatively correct. However, they are as correct as the selection and description of the basic uncertainties. Having broken a complex forecast into simple assumptions, it should become part of a standard process to refine those assumptions as more data become available. Ultimately, like the exploration example mentioned at the beginning, we hope to calibrate ourselves through detailed look-backs for continuous improvement of our forecast quality.

Acknowledgments

The author thanks Kaveh Dehghani, Mark Williams, and John Spokes for their support and stimulating discussions. Thanks also to Hao Cheng for our work together on the subject and for supplying the graphics for Fig. 1.

References
Bentley, M.R. and Woodhead, T.J. 1998. Uncertainty Handling Through Scenario-Based Reservoir Modeling. Paper SPE 39717 presented at the SPE Asia Pacific Conference on Integrated Modeling for Asset Management, Kuala Lumpur, 23–24 March. doi: 10.2118/39717-MS.
Chu, C. 1990. Prediction of Steamflood Performance in Heavy Oil Reservoirs Using Correlations Developed by Factorial Design Method. Paper SPE 20020 presented at the SPE California Regional Meeting, Ventura, California, USA, 4–6 April. doi: 10.2118/20020-MS.
Damsleth, E., Hage, A., and Volden, R. 1992. Maximum Information at Minimum Cost: A North Sea Field Development Study With an Experimental Design. J Pet Technol 44 (12): 1350–1356. SPE-23139-PA. doi: 10.2118/23139-PA.
Demirmen, F. 2007. Reserves Estimation: The Challenge for the Industry. Distinguished Author Series, J Pet Technol 59 (5): 80–89. SPE-103434-PA.
Landa, J.L. and Güyagüler, B. 2003. A Methodology for History Matching and the Assessment of Uncertainties Associated With Flow Prediction. Paper SPE 84465 presented at the SPE Annual Technical Conference and Exhibition, Denver, 5–8 October. doi: 10.2118/84465-MS.
Murtha, J. 1997. Monte Carlo Simulation: Its Status and Future. Distinguished Author Series, J Pet Technol 49 (4): 361–370. SPE-37932-MS. doi: 10.2118/37932-MS.
Otis, R.M. and Schneidermann, N. 1997. A process for evaluating exploration prospects. AAPG Bulletin 81 (7): 1087–1109.
Rose, P.R. 2007. Measuring what we think we have found: Advantages of probabilistic over deterministic methods for estimating oil and gas reserves and resources in exploration and production. AAPG Bulletin 91 (1): 21–29. doi: 10.1306/08030606016.
Sawyer, D.N., Cobb, W.M., Stalkup, F.I., and Braun, P.H. 1974. Factorial Design Analysis of Wet-Combustion Drive. SPE J. 14 (1): 25–34. SPE-4140-PA. doi: 10.2118/4140-PA.
Vogel, L.C. 1956. A Method for Analyzing Multiple Factor Experiments: Its Application to a Study of Gun Perforating Methods. Paper SPE 727-G presented at the Fall Meeting of the Petroleum Branch of AIME, Los Angeles, 14–17 October. doi: 10.2118/727-G.
Williams, G.J.J., Mansfield, M., MacDonald, D.G., and Bush, M.D. 2004. Top-Down Reservoir Modeling. Paper SPE 89974 presented at the SPE Annual Technical Conference and Exhibition, Houston, 26–29 September. doi: 10.2118/89974-MS.
Williams, M.A. 2006. Assessing Dynamic Reservoir Uncertainty: Integrating Experimental Design with Field Development Planning. SPE Distinguished Lecturer Series presentation given for Gulf Coast Section SPE, Houston, 23 March. http://www.spegcs.org/attachments/studygroups/11/SPE%20Mark%20Williams%20Mar_06.ppt.
Wolff, M. 2010. Probabilistic Subsurface Forecasting. Paper SPE 132957 available from SPE, Richardson, Texas.
