A Geostatistical Uncertainty Analysis Package Applied to
Groundwater Flow and Contaminant Transport Modeling
Developed By:
William L. Wingle, Eileen P. Poeter, Sean A. McKenna
Department of Geology and Geological Engineering
Colorado School of Mines
Golden, Colorado 80401
Funded By:
United States Bureau of Reclamation
United States Army Corps of Engineers: Waterways Experiment Station
Sandia National Laboratories
Table of Contents

CHAPTER 1   Introduction
CHAPTER 2   Installation
Mainmenu (uncert)
    Geostatistics
    Modeling
    Visualization
    Help
    Running from the Command Line
Grid
Contour
Surface
Block
    Help
    Example of Using block
    Running From the Command Line
    Setting up the Input File
        Equal Dimensions
        Non-Equal Dimensions
        Non-Gridded Files
        Blanking Files
    Block Mathematics
    Bibliography (block)
Bibliography (mt3dmain)
Utilities
    calc
        Numerical and Operation Keypad
    lpr_ps
    ps_merge
    editor
    C Print Formats
        Examples
APPENDIX A
APPENDIX B
    Makefile
    Source Code
        Module Specific Files
        Shared Files
    Included Example Data Files
        block, contour, grid, histo, modmain, mt3dmain,
        plotgraph, sisim, surface, vario, variofit
APPENDIX C
    Preference Files
    X Resource Files
    Help Files
APPENDIX F
    Bibliography
APPENDIX G
CHAPTER 1
Introduction
UNCERT is a software package designed to aid hydrogeologists in using computers to simulate the
distribution of materials and material properties in the subsurface, evaluate groundwater flow and
contaminant transport, and design and evaluate alternative contaminant remediation schemes.
The package is also designed so that users in other research disciplines may find various
portions of it useful, and hooks are provided so that new software packages may be
easily incorporated as new developments are made and the need arises.
A number of software modules are associated with this package. These modules allow the
modeler to 1) input raw field data or data from a pre-existing database, 2) analyze the data using
classical statistics, 3) evaluate trends, 4) evaluate the data using geostatistical techniques such as
semivariogram analysis, various kriging techniques (simple, ordinary, indicator, and Bayesian), and
stochastic simulation, 5) run groundwater flow and contaminant transport models, and 6) evaluate
both the results of individual runs and the composite results of multiple model simulations. When
the data are analyzed, or when data are prepared from other sources, graphical tools are available to
view the results in two, two-and-a-half, and three dimensions. Once the spatial variation of
materials has been determined, tools are available to automatically generate finite-difference grids
for groundwater flow and contaminant transport models such as MODFLOW and MT3D.
Development of this toolkit is important because of the inherent difficulty in describing the
subsurface. For any given set of data there are a multitude of possible interpretations of the
subsurface which honor the raw data. To evaluate the alternatives manually would take considerable
time and still only a small portion of the possibilities could be evaluated. This is true even when the
subsurface configuration is relatively simple. In Figure 1.1, for example, where there are only two
materials present (sand and silt), three alternate interpretations are suggested based on data from
two wells.

FIGURE 1-1: For a given set of drill logs, where material properties were identified, multiple
realizations of the subsurface may be envisioned. Each interpretation honors the hard well-log
data exactly, but each realization is significantly different from the others. Based on the available
data, no one is better than another, and all should be evaluated.

Each description honors the raw data exactly, but groundwater flow and contaminant
transport through each would vary significantly. In more complicated situations where materials
grade into one another (Figure 1.2), alternative interpretations are much more complicated and
varied, but still honor the data.

FIGURE 1-2: For given drill logs defining information about gradual hydraulic conductivity.

In order to evaluate this inherent uncertainty, computers can be used
to create the multiple alternative realizations of the subsurface. The process can be forced to honor
the hard data (well logs, etc.) by using indicator kriging techniques, and incorporate more uncertain
data (soft data - data with a range of uncertainty, e.g. seismic information, geophysical well logs,
expert opinion) through Bayesian kriging. By automating this process, much of the uncertainty can
be characterized with comparatively little time invested by the hydrogeologist. Once multiple
realizations are created, groundwater flow and contaminant transport models can be executed to
compare modeled and field conditions. When a model response clearly doesn't match field
observations, that possible subsurface configuration can be disregarded (invalidating 90% of the
realizations might not be unreasonable). For the remaining realizations that appear reasonable, the
distribution of contaminants may be evaluated, for the time already modeled or for
future conditions. Based on the results of flow and transport modeling in these remaining
configurations, the probable locations of contamination may be identified. Also, the probable
effectiveness of remediation facilities designed to contain the contamination, can be evaluated. A
computer can evaluate only a limited number of realizations, but the number is so large, relative to
that which can be accomplished manually, that a representative assessment of the reasonable
alternatives will be realized.
This process is illustrated in Figures 1.3 through 1.5. At this hypothetical site there are two leaking
storage tanks.

FIGURE 1-3: For a given set of well data (clay and gravel borings), how a geologist might interpret
the subsurface, and how a resultant simulation of contaminant migration might appear. For the
given data, this description may be reasonable, but it is only one possibility, and may be incorrect.
The description and results would be correct only by chance, and the results are not associated with
a quantified level of uncertainty for the predicted distribution or magnitudes of concentrations.

For simplicity we will assume a two-dimensional system (i.e., lithology is constant
with depth) and assume exploratory drilling has cored six clay holes and five gravel wells. One
interpretation a hydrogeologist might make appears in Figure 1.3a. The material between wells is
assumed to be fairly uniform; this yields several large clay zones separated by a gravel channel.
Contaminants migrating through the subsurface might form a plume similar to that in Figure 1.3b
over a given time interval. On the other hand, a less intuitive description of the subsurface is
offered in Figure 1.4a. This realization exactly honors the data as in the first example, but is
substantially different. In this case it was assumed that the spatial continuity of units was not as
great, and as a result the contaminant plume would be significantly different (Figure 1.4b) over the
same time interval.

FIGURE 1-4: For a different interpretation of the subsurface, the resultant contaminant plume is
distinctly different from the one shown in Figure 1.3, even though both realizations exactly honor
the data. For the given data, both descriptions are reasonable, and neither is likely to be exactly
correct. Comparing Figures 1.3 and 1.4 demonstrates why it is important to evaluate multiple
descriptions of the subsurface. Without doing so, the risk involved in defining the contaminant
distributions and levels cannot be assessed.

With no further data, both these solutions are equally probable, but a
hydrogeologist probably would not consider both equally, and yet there are many more possibilities to
be considered. This software package is designed to help identify these possibilities in a timely
fashion. Finally, based on the multiple possibilities, maps can be made showing the probability that
the contaminant plume will exceed a given concentration at a particular location at a given time
(Figure 1.5). From this risk map, remediation facilities can be designed, and going through a
similar process, the likely effectiveness of each design can be evaluated.
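The risk map described here is simple counting over realizations: at each map location, the
exceedance probability is the fraction of model runs whose simulated concentration exceeds the
threshold. A minimal shell sketch for a single map cell (the five concentration values and the
1 ppm threshold below are made up for illustration only):

```shell
# Five hypothetical simulated concentrations (ppm) at one map cell,
# one value per subsurface realization.
printf '%s\n' 0.4 1.7 2.3 0.9 1.2 |
awk -v thresh=1.0 '
    $1 > thresh { hits++ }                 # count realizations exceeding the threshold
    END { printf "P(exceed) = %.2f\n", hits / NR }'
```

Three of the five hypothetical runs exceed 1 ppm, so this cell would be mapped at a 60%
probability of exceedance; repeating the count at every cell yields a map like Figure 1.5.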
In this chapter several points have been made about the types of data that can be evaluated, and
how they are integrated into the characterization of the subsurface. One of the goals of this project
is to allow the model reasonable variation in the subsurface while constraining results as much as
possible with available data. This can be done by integrating the available data: although one
particular data set may suggest a wide range of alternatives, when all the available data are
combined, the population of possible solutions should be greatly reduced (Figure 1.6). The data may
be divided into two basic types: hard data and soft data. Hard data are information that can be
directly examined and evaluated; drill core data are an example of data that explicitly define
material types. Soft data are less precise; there is uncertainty associated with the values.
UNCERT Users Manual
FIGURE 1-5: A risk assessment map of the contaminants reaching specified zones over a given
time interval at a concentration level of 1ppm is based on the results of contaminant transport
models through a number of alternative descriptions of the subsurface similar to those described in
Figures 1.3 and 1.4. This map is a composite of model results; for example, a location defined as
having a 50% probability of exceeding the 1ppm concentration level would have exceeded 1ppm in
half the simulations modeled (not exceeded in the remaining half). This analysis provides the
modeler with a measure of risk in evaluating where the contaminant will travel.
FIGURE 1-6: Interpretation of each set of data reveals a range of possible subsurface
configurations. When the data are integrated, only the subsurface configurations which are possible
interpretations of all the data remain as possible interpretations of the site. Each data set that is
added to the integrated interpretation further reduces the zone of overlap, thus reducing the overall
uncertainty.
Seismic data are an example; seismic exploration measures the velocity of shock waves through
materials, but because different materials have similar velocities, and because the degree of fluid
saturation in a single material also affects velocity, only estimates can be made about material type
and location. As a result, there is error associated with the interpretation of seismic data (there are
also errors in hard data, but they are considered small enough to be ignored).
To accomplish the tasks of data entry, data evaluation, subsurface characterization, groundwater
flow and contaminant transport modeling, and data visualization, many steps are required. A
simplified flow chart (Figure 1.7) outlining these steps shows how a modeler can start with field
data, or data prepared by other products, and be guided through statistical analysis of the data,
generation of multiple realizations of each data property, development of model grids, kriging of
data properties into model grids, generation of input files for flow and contaminant transport
models, execution of models, and visualization of model results. A more complete flow chart is
presented in Figure 1.8, but it will not be discussed in detail.

FIGURE 1-7: Simplified flow chart of the analysis process: collecting raw data, analyzing the
data, making stochastic simulations of the site, and manipulating the data into formats acceptable
for various groundwater flow and contaminant transport models. Once the models are executed, the
output can be viewed by various methods, parts of the model can be interactively modified, and the
models can be re-run. This process and these tools are useful in designing remediation facilities
that recover or contain contaminants from/in the groundwater.
CHAPTER 2
Installation
Warranty
The UNCERT package and the program modules within are distributed in the hope that they will be
useful, but WITHOUT ANY WARRANTY. No author or distributor accepts any responsibility to
anyone for the consequences of using them or for whether they serve any particular purpose or
work at all, unless stated so in writing by the authors. No author or distributor accepts
responsibility for the quality of data generated, nor the damage to existing data. Everyone is
granted permission to copy, modify, and redistribute the UNCERT program, but only under the
condition that the copyright notice in the software remain intact. The software is provided "as is"
without express or implied warranty.
Acquiring Software
The software may be acquired on the internet, by using anonymous ftp. The ftp site name is:
uncert.mines.edu
The UNCERT software is stored in the directory:
/pub/uncert
In this directory, releases with executables are available for several UNIX platforms, but only the
file:
uncert.tar.Z
is guaranteed to be current. I have limited access to most platforms, and all versions may not be
current. Check the dates on the files. The file 'uncert.tar.Z' contains the full UNCERT release, but
no executable files. You will have to compile UNCERT yourself if you retrieve this file. There are
also several files kept in:
/pub/misc
1. X-windows is a graphical user interface (GUI) developed at MIT, largely for use on multi-tasking UNIX
workstations. It was developed as a standard user interface with standard graphical libraries, so that applications developed on one platform would be easily portable to other platforms using X-windows.
2. Motif was developed by the Open Software Foundation (OSF) as an add-on window manager for X-windows. The X-windows GUI has a number of shortcomings with regard to developing a user-friendly interface. Motif is an extension of X-windows, built with X-windows libraries; it allows application
developers to easily generate attractive, user-friendly software, and gives different applications written by
different developers a similar look and feel.
that may be useful. These include public domain and shareware programs available from other
locations on the internet. These versions may not be the most recent, but they may save you time
trying to locate them elsewhere. These files include f2c (a FORTRAN to C preprocessor), gcc (an
ANSI C compiler. You need a C compiler to build it), gs (a Postscript previewer), xv (a GIF/JPEG
viewer), and gzip (a good compression utility). There are other files too.
A ftp session might look like:
prompt> ftp uncert.mines.edu
user name: anonymous
password: (your e-mail address, e.g., wwingle@mines.edu)
ftp> binary
ftp> cd /pub/uncert
ftp> get uncert.tar.Z
ftp> quit
Installation
Once you have downloaded the UNCERT software there are several steps you need to follow to
install UNCERT: 1) unpack the software, 2) compile all the UNCERT modules (this step can be
skipped if you downloaded a version with executables), and 3) set up user accounts.
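Assuming the distribution file sits in /usr/local and your platform is a Sun (both the path and the
platform choice are examples only; the set_make options for other systems are listed in the next
section), the unpack-and-compile sequence might look like:

```shell
cd /usr/local                  # directory holding uncert.tar.Z (example path)
uncompress uncert.tar.Z        # produces uncert.tar
tar xvf uncert.tar             # unpacks into the uncert directory
cd uncert
set_make sun                   # select Makefiles for your platform (example: Sun OS)
build                          # compile all UNCERT modules
```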
Compiling UNCERT
If you did not download a file with executables, or you have trouble with the executables you did
download (e.g. cannot find shared library ..., etc.), you will have to compile UNCERT. Once the
files are unpacked, change directories to the uncert directory, for example:
prompt> cd /usr/local/uncert
If you are not running on an IBM computer, you will need to set up the Makefiles for each
module. This is done using the following command (select the appropriate command based on
your system):
prompt> set_make ibm: IBM RS6000
prompt> set_make hp: HP
prompt> set_make sun: Sun OS
prompt> set_make sol: Sun Solaris
prompt> set_make sgi: Silicon Graphics
prompt> set_make sco: SCO
prompt> set_make linux: Linux/Slackware
If your machine type is not listed, you will probably need to modify each Makefile in the
directories:
~/uncert/src/*
This will mainly involve defining where the X-windows and Motif library and include files are
located. You may also have to define your C and FORTRAN compilers. Once the Makefiles are
correctly defined, type:
prompt> build
This script will go into each ~/uncert/src directory and try to make each program. This may or may
not work. Several things can go wrong.
1. The Makefiles do not have the right libraries specified. See if there is a Makefile
specific to your machine (e.g. Makefile.ibm). If there is, copy it to 'Makefile'. If a
correct Makefile does not exist, you may have to determine which libraries are missing.
NOTE: on some computers library order is important.
2. You do not have an ANSI C compiler, or your compiler is named something other than
'cc'. If you have a compiler other than 'cc', set the variable 'CC' to your compiler. If
you don't have an ANSI C compiler, you can get gcc from our ftp site. gcc is a
freely available C and C++ compiler. It may take some effort to compile.
Note: our posted version may not be the most recent.
3. You do not have a FORTRAN compiler, or your compiler is named something other
than 'xlf' (or 'f77'). If you have a compiler other than 'xlf' (or 'f77'), set the variable
'F77' to your compiler. If you don't have a FORTRAN compiler, you can get f2c from
our ftp site. It is a freely available FORTRAN-to-C conversion program; you then
compile the resulting C. Contact me (Bill Wingle) if you have this problem. I'm still
working on an instruction set.
4. The FORTRAN compiler does not recognize the -qextname compile option. Delete it.
This is an IBM FORTRAN/C compile option.
5. In block, you cannot find su.h, segy.h, libcwp.a, libpar.a, or libsu.a. Remove the -DSU
compile option. This is an option to compile SU (Seismic UNIX) which most users
probably won't have.
Once you get the Makefiles corrected, you can type make in each src directory, or you can type
'build' from the ~/uncert directory.
NOTE: When you start to port the code, you must do a make in the ~/uncert/src/Xs
directory first. This builds the library libXs.a, which most of the programs depend
on. You can then move to any of the other src directories and start compiling
code. I've only included programs that I consider stable.
If you are compiling on a non-supported system, I doubt that you will have to make more than a
few changes to get the UNCERT modules compiled. There are a couple of important notes though.
1) Many of the program directories repeat the same files. These are common tool object
files that will eventually go into a single library. Unfortunately I can't keep all of the
files current as I develop UNCERT, therefore, from directory to directory, files may
vary slightly and be incompatible. This means that if you find a problem in a common
file, you need to change each file, not copy the fixed file to the different directories.
2) On some computers (Solaris), you will get many warnings when compiling messages.c
about the pixmaps being incorrectly aligned. These warnings can be ignored.
On some systems you may need to set:

export XUSERFILESEARCHPATH=$XUSERFILESEARCHPATH:$UNCERT/ \
app-defaults/%N

On SGIs you must make this substitution. In general, if your platform supports this option, it is
better than XAPPLRESDIR.
You must also define a help browser. If you do not define a browser, you will still have on-line text
help, but no graphics for figures. We are currently developing the xhelp package, but we recommend
you use Netscape (Netscape Communications Corporation) or Mosaic (NCSA). These viewers may
be downloaded from:
http://home.netscape.com/
http://www.ncsa.uiuc.edu/SDG/Software/Mosaic/NCSAMosaicHome.html
A version of Mosaic (probably old) may be downloaded from our anonymous ftp site if you do not
have a browser:
uncert.mines.edu
/pub/misc/xmosaic-2.5.tar.gz
At this point Netscape has more features, but it is a commercial application, though they have been
letting educational institutions use unlicensed versions. To define a browser other than xhelp,
set the WWWVIEWER environment variable with one of the following
commands:
export WWWVIEWER=/usr/local/netscape
export WWWVIEWER=/usr/local/Mosaic
setenv WWWVIEWER /usr/local/netscape
setenv WWWVIEWER /usr/local/Mosaic
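For convenience, the environment settings used in this chapter can be gathered in a Bourne-shell
login script. A sketch for ~/.profile, assuming UNCERT is installed in /usr/local/uncert with
executables in a bin subdirectory, and Netscape in /usr/local/netscape (all three paths are
assumptions; adjust them for your site):

```shell
# Location of the UNCERT installation (site-specific).
UNCERT=/usr/local/uncert
export UNCERT

# Make the UNCERT executables available (bin subdirectory is an assumption).
PATH=$PATH:$UNCERT/bin
export PATH

# Let the modules find their X resource files.
XUSERFILESEARCHPATH=$XUSERFILESEARCHPATH:$UNCERT/app-defaults/%N
export XUSERFILESEARCHPATH

# Help browser used for figures in the on-line help.
WWWVIEWER=/usr/local/netscape
export WWWVIEWER
```

C-shell users would put the equivalent setenv commands in ~/.cshrc instead.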
At this point the UNCERT software should be installed, compiled, and ready for use. In order to set
some of the environment variables, it is suggested that you log out and then log in before you run
the applications.
CHAPTER 3
Motif is an extension of the X-windows window manager interface. It provides a visually pleasing
interface and allows the programmer to easily generate user-friendly features such as push buttons,
scrolled text windows, and pop-up dialogs. This section presents a brief overview of using Motif.
These Motif structures are used in all of the UNCERT modules, and once you are comfortable with
one module, the other modules will "feel" much the same.
The control button, when activated, displays a menu of window control options. However, most of
the options can be more easily controlled by methods discussed below. The one exception is the
item "close." Sometimes a window can "lock-up" and the application in the window no longer
responds to the user; by selecting the "close" menu option, X-windows destroys the window and
terminates the application process. This is not the recommended way to quit an application, but is a
reasonable approach when the application is no longer responding.
Pull-Down Menus
Most of the programs in the UNCERT package are controlled by pull-down menus off the main
menu bar. To activate a menu selection, point at the desired item on the menu bar and hold the left
mouse button down. When the item is selected a pull-down menu will appear. While holding the
mouse button down, point to the desired item, then release. Note that the currently selected item
appears raised (Figure 3.2). If you change your mind and don't want to select any item, move the
mouse pointer off all the menus and release. Some pull-down menu items have further options; these
are identified by a small arrow to the right of the selection (i.e. "Type" in Figure 3.2). When the
mouse selects such an item, another pull-down menu is displayed.
FIGURE 3-2: X-windows/Motif pull-down
menu. This is a typical menu structure for an
UNCERT program module, with the main
menu bar, the pull-down menu, and the
sub-pull-down menu. The active selection is
highlighted by appearing as the forwardmost
raised button. Note the underlined characters;
these are hot-keys which allow the user to
select most menu items using key strokes
alone.
It is also possible to use the menus largely without the mouse, using keyboard commands instead.
As Figure 3.2 shows, most menu items have one character underlined. The underlined character
identifies the key stroke that will activate each selection. Items on the main menu bar are selected
by holding the <ALT> key with the appropriate character; items from sub-menus are selected by
typing the character only. For example, to select the GRIDZO file type, as shown in Figure 3.2,
the correct key stroke sequence is:
<ALT> F T G
For the rest of this manual, a menu naming convention will be used. For this example, the text
would refer to the menu item as File:Type:GRIDZO.
Some sub-menu items have "..." following them. This indicates that a pop-up dialog window
will be displayed upon selection.
Pop-up Dialogs
The pop-up dialog is a temporary window created by the application and is used to gather
information from, or deliver a message to the user. The general format of a dialog window is shown
in Figure 3.3. In the dialog response area there may be a series of buttons, text, text fields, scrolled
text, slider bars, toggles, and toggle menus (each to be discussed). Below the response area there
will be one to four buttons (Figure 3.3 shows OK, Apply, and Cancel; Help is also a commonly used
button). OK is used to accept all the values defined in the dialog response area and remove the
dialog. Apply also accepts all the values defined in the dialog response area, but the pop-up dialog
is not removed; this option is often used in graphics programs to visually check how the graphic is
modified by the parameter modifications before the dialog is removed. Cancel removes the dialog,
and tries to discard changes made to values in the dialog response area. If Apply has been pressed,
modified values cannot be canceled. If a <RETURN> has been used in a text field, the value cannot
be canceled. Also, once a slider bar has been moved, its value cannot be canceled.
FIGURE 3-3: X-windows/Motif pop-up dialog.
A simple dialog box may be composed simply of a message and an OK button. Generally the
message will inform the user that some other task must be performed before the desired selection
can be made, or that the application does not currently support that feature. After the
user has read the message, pressing the OK button or hitting return will remove the message dialog.
Buttons
Buttons are used to respond to a question, or request further information. As seen in Figure 3.3,
OK, Apply, and Cancel are responding to an implied question for the dialog: "Are you done with the
dialog and do you want the values saved?". If a Help button had been present, by pressing it the
user would be indicating information was desired about options in the pop-up dialog. The format
of buttons is shown in Figure 3.4. Buttons contain a text description and appear raised when they are
ready to be pushed. They appear recessed when they are pressed. Buttons may also be deactivated
by the program. Often the function of a button is inappropriate because of how other program
inputs have been defined. By deactivating the button the user cannot perform illegal operations and
is not tempted to supply more information than required. A deactivated button looks like a
normal button, but the text descriptor is faded (Figure 3.4c).
FIGURE 3-4. X-windows/Motif push buttons. As shown here push buttons have three main states:
active, pressed and selected, and inactive, respectively from left to right.
Scrolled Text
Scrolled text areas are used to display textual information (such as a file) or display a list of options
the user can select from (such as a list of all the files in the current directory). Often there is more
information available than can be presented in the area supplied; when this occurs scroll-bars are
attached to the bottom and side of the text area (Figure 3.5).
In the middle of the scroll-bar is a black button; at the ends are two arrows. If the black button
extends from arrow to arrow, all the text is visible with regard to that scroll-bar (the horizontal
scroll-bar controls visible columns, the vertical scroll-bar controls visible rows or lines). To move
the view area one line or column at a time, one presses the appropriate scroll-bar arrow with the
mouse pointer. To page down one presses the grayed area immediately below the scroll-bar button.
To page up one presses the grayed area immediately above the scroll-bar button. A similar logic
applies to the horizontal scroll-bar. Text can also be scrolled by pressing the appropriate scroll-bar
button, holding the mouse button, and dragging the button up and down (or left and right) as
desired.
If the text field is a selection area, as with a file selection dialog (to be discussed below), to select
the desired item, point at the appropriate line with the mouse pointer, and press the left mouse
button. The selected item will become highlighted. Note: many programs will let one double-click
with the mouse on the selected item to execute the dialog's function (e.g., in a file selection dialog
double-clicking on a filename will highlight the selection and pass the filename back to the program
for further processing).
Slider Bar
A slider-bar is used to allow the user to select a value from a closed range of continuous values. For
example, a user might want to select a view direction over a surface; because view directions vary
from 0-360 degrees only, a slider-bar can be reasonably used to select a desired variable value. A
typical slider-bar is shown in Figure 3.6. A slider-bar has a title, a position indicator, and a slider
button. To move the slider-bar, the slider-bar button is pressed and held with the left mouse button,
and dragged to the desired position. It is also possible to press on the grayed area to either side of
the slider-bar button. This will move the slider-bar button 1/10 of the total range of the slider-bar in
the indicated direction (i.e. +/-36 degrees for the above example).
FIGURE 3-6. X-windows/Motif slider bar. Slider bars are useful for selecting variable values
which fall within a specified range. For example, the slider bar may be used to specify the percent
(0% to 100%) red that should be used when defining a new color. The range of possible answers is
restricted.
Note that with slider-bars, the desired value is not always exactly attainable. Slider-bars are limited
to 100 positions. This means that with the view directions mentioned above, the smallest angular
step is 3.6 degrees.
Text Fields
Text fields are used to enter new variable values. The format of a text field is shown in Figure 3.7.
It is composed of the variable name and a recessed text entry box. To enter a new value in the entry
box, select the text field with the mouse button, erase the previous entry and type in the new entry.
To enter the new value press <RETURN> when done, or press OK or Apply if the text field is part
of a pop-up dialog box. To erase a previous entry, one can 1) position the insert cursor at the end of
the current entry and backspace over the old entry, 2) quickly double-click with the left mouse
button in the text field, which highlights the entire entry so that the next typed key replaces the
highlighted selection, or 3) press and hold the left mouse button at one end of the entry and drag the
mouse over the rest of it; the highlighted text will likewise be replaced by the next typed key.
FIGURE 3-7. X-windows/Motif text field. A text field is used to display current variable values
and to enter new ones.
Toggles
Many program options are ON or OFF, TRUE or FALSE. Toggle buttons are very useful for
defining these variables or parameters. Two toggle buttons are shown in Figure 3.8: the top one is
OFF/FALSE and appears raised, and the bottom button is ON/TRUE, appearing recessed. To the
right of the toggle is its description. Toggle buttons are always square. The toggle is changed by
pressing the toggle button.
FIGURE 3-8. X-windows/Motif toggles. Toggles are used to set variables to either TRUE or
FALSE, or ON or OFF. A raised button indicates the variable state is FALSE or OFF. A depressed
button indicates the variable is TRUE or ON.
Toggle Menus
Many variables must be set to exactly one of several possible options (e.g. the line
color can be 1) red, 2) blue, or 3) green). These variables are defined with a toggle menu. A toggle
menu can have only one active selection at a time. An example toggle menu is shown in Figure
3.9. It has a menu title followed by a series of diamond-shaped toggles, the active toggle or
selection appearing recessed, the rest raised. To select a menu option press the toggle next to the
desired description; that option will be activated, and the previous selection will be turned off.
FIGURE 3-9. X-windows/Motif toggle menu. Toggle menus allow the
user to select one preferred state, from among a list of options. Only
one option may be chosen at a time. The recessed toggle button is the
active option.
Mainmenu (uncert)
CHAPTER 4
The mainmenu (uncert) program is a simple user interface to run the different modules in the
UNCERT software package. It is designed to be a user-friendly interface for UNCERT, so that
users can progress through the software in evaluating their field data and modeling the site of
concern.
Currently it is a very simple interface which allows the user only to execute the different software
modules within UNCERT. As it stands now, mainmenu is used mainly as a convenience in
executing software which the user may not be familiar with, and allowing the user to minimize
working in the UNIX command line environment. It is a simple attempt to bring the entire
UNCERT package together into a unified window-based environment. Exclusive use of this
interface is not recommended, however, as a great deal of the software's functionality would be
lost.
FIGURE 4-1. This is an example of the mainmenu application window. The main menu-bar is used
to group various application programs available in UNCERT. This application is used to run other
UNCERT modules.
File
The File menu option has only one sub-menu item: Quit. Selecting File:Quit will terminate
mainmenu.
UNCERT Users Manual
Data Analysis
Array
Data Analysis:Array runs the array application. Array is a program for performing arithmetic
operations on two- and three-dimensional grids. For a full description, see Chapter 17.
Distcomp
Data Analysis:Distcomp runs the distcomp application. Distcomp is a program designed to
compare populations of different data sets. It can also calculate basic statistics such as the mean,
median, standard deviation, variance, and skew for a data set. It plots the population distributions
using histograms, cumulative histograms, probability plots, P-P plots, and Q-Q plots. For a full
description, see Chapter 7.
Graph
Data Analysis:Graph runs the plotgraph application. Plotgraph is an X-Y graphing package. It
supports normal, semi-log, log-log, and probability plots. For a full description, see Chapter 5.
Histogram
Data Analysis:Histogram runs the histo application. Histo is a basic statistics program which
calculates the mean, median, standard deviation, variance, and skew for single parameter data. It
also plots the frequency distribution of the data using histograms, cumulative histograms,
probability plots, and box and whisker plots. For a full description, see Chapter 6.
Geostatistics
Semivariogram Analysis
Geostatistics:Semivariogram Analysis runs the vario application. Vario is used to generate
experimental and jackknifed experimental semivariograms, cross-semivariograms, correlograms,
semiradograms, semimadograms, etc. For a full description, see Chapter 8.
Semivariogram Fit
Geostatistics:Semivariogram Fit runs the variofit application. Variofit is used to calculate and
model experimental semivariograms (etc.) using least-squares estimation, latin-hypercube
sampling, or manual techniques. For a full description, see Chapter 9.
Grid
Geostatistics:Grid runs the grid application. Grid is a gridding package which uses least-squares,
kriging, and trend-surface analysis estimation algorithms. For a full description, see Chapter 10.
SISIM3D
Kriging:SISIM3D runs the application, sisim. Sisim is an X-windows/Motif user interface which
assists in running sisim3d, an indicator kriging, conditional simulation program developed at
Stanford and modified at the Colorado School of Mines to handle soft data. For a full description,
see Chapter 14.
Modeling
MODFLOW
Modeling:MODFLOW runs the application modmain. Modmain is a user interface, pre- and post-processor for the USGS groundwater flow model, MODFLOW. For a full description, see Chapter 15.
MT3D
Modeling:MT3D runs the application mt3dmain. Mt3dmain is a user interface, pre- and post-processor for the US EPA contaminant transport model, MT3D. For a full description, see
Chapter 16.
Visualization
Block
Visualization:Block runs the application block. Block is a three-dimensional visualization tool for
examining regularly and irregularly spaced rectangular grids, and 3D data sets. For a full description,
see Chapter 13.
Contour
Visualization:Contour runs the application contour. Contour is a two-dimensional contouring and
gradient analysis package used to examine regularly gridded data. For a full description, see
Chapter 11.
Surface
Visualization:Surface runs the application surface. Surface is a two-and-a-half-dimensional
visualization tool for examining regularly gridded data. For a full description, see Chapter 12.
Help
The help menu item allows the user to get information about what UNCERT is, how to use
mainmenu, and general information about all of the modules callable from mainmenu.
CHAPTER 5
X-Y Graphing:
Plotgraph
The plotgraph application is used for plotting two-dimensional X-Y graphs. The application
allows the user to plot lines, points with various symbols, and calculate regression lines. The data
can be plotted using normal, semi-log, and log-log axes.
The plotgraph application is composed of three sections (Figure 5.1): the main menu-bar, the status
and log text area, and the drawing or graph area. The menu-bar is used to select all plotgraph
commands, the log/status area is used by the program to report important messages or results, and
the drawing area is the display area for the graphs.
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Open, Add, View, Save, Save as, Save Preferences, Print Setup, Print, Quit, and Quit
Without Saving.
FIGURE 5-1. This is an example of the plotgraph application window. The main menu-bar is on
the top of the application window, with the drawing or graph area below.
Open
Selecting File:Open generates a pop-up dialog (Figure 5.2) which allows the user to select an
existing data file. The dialog is composed of five parts: the filter, directory list, file list, selection,
and button command row.
The Filter is used to define which files are displayed. Generally in a directory there are many files,
but only a small subset of them are usable by the application. If standard file conventions are used,
plotgraph expects files of the form "*.dat" (the "*" is a wild card - this format identifies all files
ending with a "dat" extension). The default filter is the current directory path followed by "/*.dat".
Often the user uses a different file extension, or the desired file is in a different directory; by
modifying the filter appropriately and hitting <RETURN> or pressing the Filter button at the
bottom of the dialog, the files meeting the filter specifications will be shown.
The Directories area shows the directories immediately adjacent to the selected directory. The first
two directories in the list are special: the entry ending in "/." refers to the current directory, and the
entry ending in "/.." refers to its parent.
The remaining directories in the list are sub-directories of the current directory. By double-clicking
on a directory with the left mouse button, or marking the directory and hitting the <RETURN> key,
the program will search the indicated directory using the filter.
The Files area shows the list of files that pass the specifications of the Filter. Note that only the
filename is shown, but by moving the horizontal scroll-bar, the full directory path can be viewed.
To select a file, mark it with the mouse; double-clicking on the selection, hitting <RETURN> after
making the selection, or pressing the Open button completes the selection process.
If the desired file is not shown, the user can type in the file name in the Selection text field. If the
file is in the current directory the path need not be given, otherwise the full path is required. Hitting
the <RETURN> key or the Open button completes the file selection process.
If the user decides not to open a new file, the Cancel button will destroy the pop-up dialog. In this
program the Help button is inactive, i.e. no help is available directly from this dialog.
Add
In many cases, it is useful to plot more than one line on a graph, or lines from several different data
files on one graph. File:Add serves this function. Before File:Add can be used, however, one file
must already have been opened (with the File:Open command, passed on the command line, or
defined in the preference file (Appendix C)). Once a graph has been created additional lines may be
added. To do this select the Add option. A pop-up dialog (similar to Figure 5.2) will appear
requesting the desired data file (any legitimate plotgraph file format is acceptable). When the data
file is selected the line(s) will be superimposed over the preexisting graph.
In adding two or more lines to one graph, plotgraph assumes that a new file should be created
before the program is terminated. The graph may be saved by using the Save or the Save as menu
options. If the program is terminated before the graph is saved with the Quit menu option, the user
will be queried for a file name. To avoid saving the graph, select Quit Without Saving. To clear a
graph that has been "added" together, Open a new file. Note: once new lines have been added to a
graph, they cannot be deleted (it is possible to hide them however, and they can be edited out of the
newly created data file. See the main menu option Graph:Style, or the "Setting up the Input File"
section).
View
File:View pops up a simple screen editor with the last saved/opened version of the file being
graphed.
Save
File:Save saves the lines in the current graph to a file. If a save file has already been opened, the
data are simply saved. If a save file has not been selected yet, a pop-up dialog similar to that used in
File:Open (Figure 5.2) is created. The main difference between the Open and the Save dialog is
that to save a file, the file does not have to pre-exist. For a description of how the dialog works, see
the Open section above and substitute the Save for Open wherever appropriate.
The graph lines will be saved using the MANY LINE format described in "Setting up the Input File"
section.
Save as
File:Save as is used to save the graph lines to a user specified file. A pop-up dialog similar to that
used in File:Open (Figure 5.2) is created. The main difference between the Open and the Save as
dialog is that to save a file, it does not have to pre-exist. For a description of how the dialog works,
see the Open section above and substitute the Save for Open wherever appropriate.
The graph lines will be saved using the MANY LINE format described in "Setting up the Input File."
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
acceptable default values for each parameter or input variable. For this reason, preference files
were created (See Appendix C). These allow the user to define a unique set of "defaults" applicable
to the particular project. When File:Save Preferences is selected, plotgraph determines how all the
input variables are currently defined and writes them to the file "plotgraph.prf."
WARNING:
If "plotgraph.prf" already exists, you will be warned that it is about to be overwritten. If you do not want the old version destroyed you must move it to a
new file (e.g. the UNIX command mv plotgraph.prf plotgraph.old.prf would be
sufficient). When you press OK, the old version will be over-written! This
cannot be done from within the application. To rename the file, you have to
execute the UNIX mv command from a UNIX prompt in another window.
If "plotgraph.prf" does not exist in the current directory, it is created. This is an ASCII file and can
be edited by the user. See Appendix C for details.
Print Setup
File:Print Setup allows the user to define the print destination, the number of copies that will be
made, the page orientation, the print margins, and whether the Postscript output will be in black and
white or color (Figure 5.3). The print destination can be a printer or a file. If the destination is a
printer, the print queue name must be specified.
FIGURE 5-3. Print Setup pop-up dialog. This dialog allows the user to direct print output to a
printer or a file, print variable numbers of copies, print a header page, orient the output portrait or
landscape, select Black & White or color output, and set the print margins.
NOTE: The default print queue name is "ps." Depending on your network configuration,
your Postscript print queue may be named something else.
If the destination is a file, the file name must be defined. To select an output file, enter the name in
the text field or press the Select button. A pop-up dialog similar to Figure 5.2 will prompt the user
for a Postscript file name. If multiple copies are desired, enter the desired number in the Number of
Copies text field. If a header page is desired, flip the Print Header Page toggle to TRUE. The page
margins can be set by pressing the Set Margins button. The margins are set using the dialog shown
in Figure 5.4.
FIGURE 5-4. Define Print Margins pop-up dialog. This dialog
allows the user to specify the print margins around the border of
the graph (not the extents of the text).
NOTE: The margins specify the distance to the edge of the graph, not to the extents of the
text.
The graph can also be printed in either a portrait (Figure 5.5) or a landscape orientation (Figure
5.6). The output can be set to either Black & White or Color. Note, color output can be printed on
a gray scale printer; the printer will dither the color to a gray scale.
FIGURE 5-5. This is an example multiple-line graph, printed using the portrait orientation format.
FIGURE 5-6. This graph shows the same data as in Figure 5.5, but it is presented differently. The
graph is printed with a landscape rather than portrait orientation. The data points indicated with
circles are also fitted with a fifth degree regression curve.
Print
File:Print generates a Postscript file of the graph and, depending on how the print options are
defined in Print Setup, directs this file to the specified print queue or to the specified file.
Quit
File:Quit terminates the program, but if additions have been made to the graph, the user will first be
queried to supply a file to save the changes in.
Data
When the appropriate file type is being read (multiple column data), selecting the Data:Modify
menu option allows the user to specify which columns will define the X and Y data (Figure 5.7).
FIGURE 5-7. Data Column pop-up dialog. For multiple column data
files, these variables specify which columns supply the X and Y
data.
Note that after these columns are defined the desired file must be loaded (or reloaded). These
options will not affect the current graph. Only the two active data columns are stored in memory by
the program, so if data columns are changed, the program must reread the data file.
Curve Fitting
After a data file is loaded, a least-squares regression can be calculated for any line by selecting the
Curve Fitting:Regression menu option. The options available are shown in Figure 5.8. Equations
FIGURE 5-8. Regression pop-up dialog. This dialog
allows the user to fit up to a 10th degree polynomial to any
line in the data set.
of order 1 to 10 are valid. The Line Number that the regression line will be calculated for can be
specified for any valid line. Once the regression order and data line are specified, pressing the
Calculate button calculates and plots the regression line. The regression line is plotted on the graph
and the regression formula is printed in the text log window.
The regression line is only a temporary feature of the graph and will be removed when the dialog
box is destroyed (by pressing the Done button). To make the line permanent, press the Add as
Permanent Line button.
Graph
Graph allows the user to specify various attributes of the appearance of the graph. These include
the graph Axes types, Border, Fonts, Labels, Mesh, and line Style.
Axes
Graph:Axes allows the user to specify the axes types for the x-axis and y-axis. The active axes
type is highlighted in red. The options are Normal, Log-X, Log-Y, and Log-Log. The Log-X and
Log-Y options are both semi-log graphs, but specify the transformed axis.
Border
Graph:Border (Figure 5.9) allows the user to specify the graph extents, tic spacing, and tic label format.
FIGURE 5-9. Border definition pop-up dialog. This dialog allows the user to specify tic spacing
and labeling.
Plotgraph will automatically scale the graph to the application window, and space minor
and major tic marks. These are often not exactly what the user wants. They are, however,
completely user specifiable. In the Border pop-up dialog, the axes dimensions, tic spacing, and tic
label format are defined. The limits of the raw data are displayed for the user's reference (Figure
5.9). The limits of the data are the default limits of the graph. To modify the extents of the graph
the X and Y maximum and minimum values can be changed in the Graph Limits section of the
dialog. The major tic frequency, by default, is defined so that there will be eleven labels on each
axis; these will be divided into five minor tic ranges (four tics). These parameters can be modified
in the Border Tic Spacing area.
NOTE: The Graph Limits and the Border Tic Spacing specifications are largely ignored for a
log axis. The axis will start at the first power of ten below the X or Y Minimum value
(e.g. if Y Minimum = 0.3, the minimum graph limit would be 0.1). The axis will
also end at a power of ten just larger than the X or Y Maximum value (e.g. if Y
Maximum = 32.1, the maximum graph limit would be 100.0). The major tic marks
will also fall on each power of ten, and there are ten log scaled sub-tic marks.
When a new graph is loaded, or when the axis type is changed (normal vs. log), often the predefined
graph limits or tic spacing is inappropriate. Two options are available to remedy this situation. By
pressing the Auto-Scale Axes Limits button new X and Y, minimum and maximum limits are
selected; these will match the extents of the Data Set Limits. The major tic frequency can also be
automatically selected by pressing the Auto-Scale Tics Frequency button.
Major tic mark labels appear with each major tic mark, and the format is specified in the Labels
section. The formats are C language formats for floating point numbers. The specifics are defined
in Appendix A, but are briefly described as:
Syntax: [width of field].[# of decimal places][e f g]
where:
e = exponential
f = fixed
g = general
The dimensions of the graph are controlled by the Axes Specifications: Y Exaggeration Scale and
X/Y Ratio. Both of these numbers control the same process, but describe it in a different manner.
Note that when one number is changed, the alternate number is also changed. In many cases it is
important that the X-scale and the Y-scale are the same (i.e. there is no directional distortion, or
differential scaling), where the Y Exaggeration Scale = 1.0. Sometimes it is convenient for the
Y-scale to be some user specified factor larger or smaller than the X-scale. In other cases, it may be
more important that the appearance of the graph be consistent (e.g. square, where the X-axis is the
same length as the Y-axis; here the X/Y Ratio would equal 1.0). As shown in Figure 5.9, the X-axis
will be 1.5 times the length of the Y-axis, or the Y-scale is exaggerated 19.373931 times.
WARNING: Because the X/Y Ratio and the Y Exaggeration Scale describe the same property,
when one number is changed the other number is also changed. If you wish to specify a
value for one parameter, you must hit <RETURN> after the entry. This is
because, when you hit the Done or Apply button at the bottom of the dialog,
each variable in the dialog is queried for its value and entered. This process is
done one variable at a time. If you do not hit <RETURN> after making an
entry, the query process may change the alternate parameter first; when this
happens the unentered entry is replaced and forgotten.
Fonts
Graph:Fonts is used to select the screen and Postscript text fonts used by the graph. Different fonts
may be specified for the Main Title, the Secondary Title, the Axes Labels, the axes Division Labels,
Annotations, and the screen mouse Position (Figure 5.10). The font name for each may be entered
FIGURE 5-10. Font Selection pop-up dialog. This dialog is used to define the X-windows and
Postscript text fonts and font sizes for different portions of the graph.
in the text field or by pressing the Select button. The font Size can also be modified.
NOTE: The font size is in pixels for the X-windows screen display and in 1/72" points for
Postscript output. Note that this also means that the screen graph and the Postscript
output graph will generally have the fonts scaled differently relative to the graph
(what you see is not necessarily what you get).
When the Select button is pressed, the dialog shown in Figure 5.11 will be generated. From this
dialog, the user can select the desired font.
NOTE and WARNING: Some font names that may be entered may not be supported by your
X-station or by your Postscript printer. If an invalid X font is selected, the X display
will select a default font. The program, however, cannot determine which fonts are
supported by the attached printer. As a result a printing error will occur: either the text
will not be printed, the text will be printed with a default font, or the printer will
abort the print job.
The available fonts are (specified in fonts.h):
Avant Garde-Book -Book Oblique -Demi -Demi Oblique
Bookman-Demi -Demi Italic -Light -Light Italic
Courier -Bold -Bold Oblique -Oblique
Helvetica -Bold -Bold Oblique -Oblique
Helvetica-Narrow -Narrow-Bold -Narrow-Bold Oblique -Narrow Oblique
New Century School Book-Bold -Bold Italic -Italic -Roman
Palatino-Bold -Bold Italic -Italic -Roman
Symbol
Times-Bold -Bold Italic -Italic -Roman
Zapf Chancery-Medium Italic
Zapf Dingbats
NOTE: Not all of these fonts, in all sizes, are available on all systems. Also, whether or not
X-windows supports a font suggests nothing about Postscript printer support.
Generally, a Postscript printer will support the listed fonts in any size. X-windows
will likely not support all these fonts, and sizes are limited.
Labels
Graph:Labels is used to annotate the graph title and each axis. An example dialog is shown in
Figure 5.12. The Main Title is centered over the graph and written in a larger font than the rest of
the text on the graph. The Secondary Title, if used, is centered over the graph, under the Main Title,
with a smaller font. If the Secondary Title is used, the Main Title is raised slightly. The X-Axis
Label and Y-Axis Label define the label for the respective axis.
FIGURE 5-12. Plot Labels & Titles pop-up dialog.
Mesh
Graph:Mesh is used to overlay a grid over the graph (Figure 5.13). The line type, mesh origin, and
the spacing of the grid lines for the graph can be specified with this dialog. The default is no grid, a
(0.0, 0.0) origin, and the calculated mesh frequency is the same as the calculated main tic frequency
defined in Border:Grid Parameters. The X or Y Mesh Frequency can be specified as desired.
FIGURE 5-13. Mesh pop-up dialog. This
dialog allows the user to specify X and Y
mesh or grid frequencies, whether the mesh
will be drawn, and whether the mesh lines
will be solid or dashed.
NOTE: When Use Mesh is selected both X and Y grid lines are generated; to hide one set of
grid lines (only mesh lines on one axis are desired), set the axis interval to something
larger than the width (or height) of the graph. If the graph has a log axis and the
grids are plotted, it is not possible to "hide" the grid lines for the log axis.
To reduce the "clutter" in the graph, selecting Dash Mesh is sometimes useful.
Style
Graph:Style is used to control the appearance of individual lines on the graph. When first selected,
the parameters for line #1 are displayed. If a different line is desired, the line ID can be
selected in the Define Line No. entry field, or the lines can be stepped through using the Previous and
Next buttons (the numbering of lines is described in "Setting up the Input File"). When the desired
line is selected, the pop-up dialog shown in Figure 5.14 is generated. This dialog allows the user to
FIGURE 5-14. Define Line style pop-up
dialog. This dialog allows the user to
modify the color, line type and thickness,
and symbol type and size for the given line.
specify the Line Color, Line Type, Symbol Type, Symbol Fill, Line Thickness, and Symbol Size. The
Line Thickness and Symbol Size define the height in "pixels" on the screen and the height in 1/72
inches for Postscript printing. The thickness or size for "pixels" must be greater than or equal to
one; for Postscript, values must be greater than 0.0.
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, Clear, and Print. View Log, Save,
and Save as are similar in operation to the menu options under File described above.
Plot
Plot controls when the graph is plotted and under what circumstances it is refreshed.
Now
Plot:Now means redraw the graph now!
Refresh
Under Plot:Refresh there are two options: on Exposure or on Update. Because of the multitasking
environment in X-windows, the user commonly has many windows open. When a portion of the
plotgraph application becomes covered and then uncovered, at least one exposure event is sent. On
complicated graphs it can become frustrating to wait for the graph to redraw (sometimes several
times) when nothing has been changed. This is what occurs under the on Exposure state. If on
Update is selected instead, the graph is only redrawn when it is initially loaded, when
some parameter is changed, or when Plot:Now is selected from the menu. Unfortunately, when
sections of the graph are covered, then uncovered, they are not necessarily redrawn, leaving
odd blanked areas; replotting the graph will correct the problem. Which state
the user uses is a matter of personal preference.
Help
Help lists topics about the program for which there is help. When an item is selected, a pop-up dialog
with a scrolled text area similar to Figure 5.15 is generated with the desired information.
NOTE: Only one help window may be open at a time.
Help files are editable ASCII data files; for further information see Appendix D.
Location
When the mouse pointer is in the active graph region (within the graph axes), the X and Y graph
position coordinates will be displayed in the upper-left of the graphics or drawing area. These
coordinates will be updated automatically as the mouse is moved. The coordinates are displayed
using the Position font described previously in the Font section (Figure 5.10).
FIGURE 5-15. Help pop-up dialog. The help window is a pop-up scrolled text window. Often not
all the text for the help selection can be placed in the window area, so the user can scroll vertically
or horizontally through the text using the scroll bars.
If you need to take a photograph of the screen, or for some other reason don't want the graph
coordinates displayed, simply move the mouse pointer out of the graph. The coordinates will
disappear.
Refresh
Sometimes the graphics display is not updated after a pop-up dialog or another window is removed
from, or uncovers, the display area. The graph can be redrawn using the Plot:Now menu option, or
by moving the pointer into the drawing area and clicking any mouse button (using the right
mouse button may produce undesirable results; see the Zoom section below).
Zoom
Often it's important to examine a particular portion of the graph more closely. New graph extents
can be specified using the Graph:Border menu option, but this can be slow and inaccurate. A fast
way to zoom into a location is to use the right mouse button. To zoom, follow these steps:
1). Visually pick out the rectangular region you want to zoom into. Move the mouse
pointer to one corner, and press and hold the right mouse button.
2). Move the mouse pointer, still holding the right button down, to the opposite visual
corner. As you move the mouse, a rectangle will be drawn showing the area that will
be zoomed into.
3). When the desired area has been selected, release the right mouse button. The graph
will be redrawn showing the desired region.
The zoom can be repeated within the zoomed region until the desired detail is acquired (Figure
5.16). To unzoom, point the mouse anywhere in the drawing area and press and release the right
mouse button without moving the mouse. This will redraw the graph to the maximum data extents
of the graph.
FIGURE 5-16. Example of a normal-axes graph, and then the same data zoomed into a selected
region.
NOTE: If the graph does not immediately redraw after you release the mouse button, move
the mouse slightly. This will cause the graph to redraw.
WARNING: When you zoom back out to the full extents of the graph, preselected graph
dimensions set in the Graph:Border menu option, passed on the command line,
or defined in the preference file are ignored. It is therefore important not to use
the right mouse button to refresh the graph if you are trying to specify grid
parameters to improve the appearance of the graph.
FIGURE 5-17. This is the same data as shown in Figure 5.5, except here the X-axis is logarithmic.
#    =  integer
#.#  =  float
""   =  character string
{}   =  variable is an array. Values must be separated by a ',' and no spaces
        are allowed. Do not use the "{ }" symbols on the command line.

NOTES:

-axes                                  default = 0
-esp                                   default = 0
-fnt1                                  default = Helvetica-Bold
-fnt2                                  default = Helvetica-Bold
-fnt3                                  default = Helvetica
-fnt4                                  default = Helvetica
-fnt5                                  default = Helvetica
-fnt6                                  default = Helvetica
-fnts1                                 default = 24.0
-fnts2                                 default = 15.0
-fnts3                                 default = 15.0
-fnts4                                 default = 12.0
-fnts5                                 default = 10.0
-fnts6                                 default = 12.0
-help      = give this help menu
-lc {}     = line color                default = variable
               0 = Black
               1 = White
               2 = Red
               3 = Green
               4 = Blue
               5 = Magenta
               6 = Yellow
               7 = Cyan
-lgf                                   default = log.dat
-lglp                                  default = 1
-lgmw                                  default = 200
-lpbm                                  default = 1.5
-lpc                                   default = 1
-lpd                                   default = 0
-lpf       = print filename            default = "junk.ps"
-lph       = print header page         default = 0
               0 = false
               1 = true
-lplm                                  default = 1.5
-lpo                                   default = 0
-lppsext                               default = "*.ps"
-lpq                                   default = "ps"
-lpr
-lprm                                  default = 1.5
-lps
-lptm                                  default = 1.5
-lsfl {}                               default = 0
-lsc {}    = symbol color              default = variable
               3 = Green
               4 = Blue
               5 = Magenta
               6 = Yellow
               7 = Cyan
-lssz {}                               default = 9.0
-lsty {}                               default = 0
-ltk {}    = line thickness            default = 1.0
-lty {}    = line type                 default = 0
              -1 = no line
               0 = solid
               1 = dashed
               2 = double dashed
-md        = dash mesh                 default = 0
               0 = false
               1 = true
-mox       = X mesh origin             default = 0.0
-moy       = Y mesh origin             default = 0.0
-ms        = use mesh                  default = 0
               0 = false
               1 = true
-mx        = X mesh frequency          default = 1/10 DX
-my        = Y mesh frequency          default = 1/10 DY
-prf       = preference file name      default = "plotgraph.prf"
-rgl       = regression line           default = 1
               0 = false
               1 = true
-rgo       = regression order          default = 1
-rgt       = regression type           default = 1
               1 = normal: phi = F(t)  DGECO & DGESL
               2 = normal: phi = F(t)  DQRDC & DQRSL
               3 = Legendre polynomial
-rfh       = screen refresh            default = 0
               0 = on exposure
               1 = on update
-sttl      = Secondary title
-ttl       = Main title
-xc        = X data input column
-xfmt      = Number of decimal places for X-axis
-xlabel    = X-axis label
-xmax      = Graph X-maximum
-xmin      = Graph X-minimum
-xMt       = X main tic frequency
-xmt       = Number of minor X tics
-xto       = X axis label origin
-xy        = X-Y ratio
-yc        = Y data input column
-yfmt      = Number of decimal places for Y-axis
-ylabel    = Y-axis label
-ymax      = Graph Y-maximum
-ymin      = Graph Y-minimum
-yMt       = Y main tic frequency
-ymt       = Number of minor Y tics
-ys        = Y-axis exaggeration relative to X-axis
-yto       = Y axis label origin
Basic
The basic file is simply a series of X and Y pairs:
1.0    23.23
2.21   12.34
3.31   12.98
4.56    8.21
5.12   10.92
 .       .
 .       .
 .       .
This data set would create one line. For this data format, no header information is required. A
similar file format that also needs no header information has multiple columns of data:
1.0    23.23   0.123     1.45
2.21   12.34   0.00123   1.56
3.31   12.98   0.231     2.34
4.56    8.21   0.345     1.76
5.12   10.92   0.456     1.43
 .       .       .         .
 .       .       .         .
 .       .       .         .
For this style of data file plotgraph will assume column 1 defines the X values and column 2
specifies the Y values, unless otherwise directed by the Data:Modify menu option (NOTE: This
format allows a maximum of 10 columns. The X and Y data columns must also be defined before
the file is loaded; if they are defined afterwards the file must be reloaded).
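As an illustration of how such a no-header column file can be consumed, the following Python sketch (ours, not part of UNCERT) reads the requested 1-based X and Y columns:

```python
def read_columns(path, x_col=1, y_col=2):
    """Read a no-header column data file; x_col and y_col are 1-based,
    matching the column numbering used by Data:Modify."""
    xs, ys = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue  # skip blank lines
            xs.append(float(parts[x_col - 1]))
            ys.append(float(parts[y_col - 1]))
    return xs, ys
```

As in plotgraph itself, the column choice must be known before the file is read; re-reading with different x_col/y_col values corresponds to reloading the file.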
X and Many Y
X AND MANY Y must be on the first non-commented line and be in all capital letters. On the
second line there are two integer values; the first is the number of lines (the number of Y columns,
i.e. the total number of columns - 1), and the second defines how many X sample values were taken
(NOTE: there must be one Y sample in each line for every X coordinate. There is no option for
NULL entries).
Multiple Lines
The MULTIPLE LINES format allows a file to describe several lines at once that do not have a
common X value. It also allows the file to specify line color, symbol type, and line widths (NOTE:
when a graph is saved using the File:Save as option, this is the format it is saved with). An example
file might look like:
MULTIPLE LINES
2
2 1 4 2.0 4 4 10.0 0 Lead (Pb)
0.0   80.0
100.0 80.0
5 0 5 1.0 3 5 8.0 0 Silver (Ag)
0.0   10.0
20.0  35.0
60.0  68.0
80.0  75.0
100.0 80.0
The format is: MULTIPLE LINES must be on the first non-commented line and be in all capital
letters. The second line specifies how many lines are in the data file. After this line is a series of
line data structures; each structure describes one line. The first line of each series describes the 1)
number of points in the line, 2) line type, 3) line color, 4) line thickness, 5) symbol type, 6) symbol
color (set to line color currently), 7) symbol size, 8) symbol fill (not installed), and 9) line label (not
installed). See the section on Command Line Options for numerical values of line type, line and
symbol color, and symbol type. The following specified number of lines define the X, Y coordinate
data (One X, Y pair is on each line). If more lines are to be read, the above series description is
repeated.
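The structure described above can be made explicit with a minimal Python reader (a hypothetical helper sketch, not the UNCERT source):

```python
def parse_multiple_lines(text):
    """Parse the MULTIPLE LINES format: a series count, then for each
    series a header row (point count, line type, line color, line
    thickness, symbol type, symbol color, symbol size, symbol fill,
    label) followed by that many X, Y coordinate pairs."""
    rows = [r for r in text.splitlines() if r.strip()]
    if rows[0].strip() != "MULTIPLE LINES":
        raise ValueError("not a MULTIPLE LINES file")
    n_series = int(rows[1])
    series, pos = [], 2
    for _ in range(n_series):
        header = rows[pos].split(None, 8)  # 8 numeric fields, then the label
        n_pts = int(header[0])
        pts = [tuple(float(v) for v in rows[pos + 1 + i].split())
               for i in range(n_pts)]
        series.append({"label": header[8], "points": pts})
        pos += 1 + n_pts
    return series
```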
GEO-EAS/GSLIB Files
GEO-EAS (Englund and Sparks, 1988) has become a popular 2D geostatistical software package
over the last several years, and its file format is commonly used. To avoid making users modify
their files, UNCERT accepts GEO-EAS files with one minor exception noted below. An example
GEO-EAS file might look like:
Well #23-12383
5
X-location
Y-location
Z-location
Hydraulic Conductivity (cm/s)
Effective Porosity
100.0 212.0 3454.2 10.23 0.23
100.0 212.0 3454.2  8.71 0.21
100.0 212.0 3454.2  1.12 0.15
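The GEO-EAS layout (title line, variable count, one variable name per line, then data rows) is straightforward to read; a Python sketch, assuming whitespace-separated numeric data rows:

```python
def read_geoeas(path):
    """Read a GEO-EAS file: a title line, the number of variables,
    one variable name per line, then whitespace-separated data rows."""
    with open(path) as f:
        title = f.readline().rstrip()
        n_var = int(f.readline())
        names = [f.readline().strip() for _ in range(n_var)]
        data = [[float(v) for v in line.split()]
                for line in f if line.strip()]
    return title, names, data
```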
Plotgraph Mathematics
Theory of Least-Squares Regression
Least-squares regression is a technique to fit a function to a given set of data. The data are not exact
(have some associated error) and the purpose of fitting a regression line is to define the nature of the
data with a function, rather than fit a function which exactly honors all the data. This is done by
fitting a function to the data which minimizes the squared estimation error:
E = \sum_{i=1}^{n} \left[ y_i - (a x_i + b) \right]^2    (5-1)
where n = number of data points, xi is the ith sample position, yi is the ith sample value, and a and b
are constants to be found which minimize the least-squared error (Burden and Faires, 1985).
In the general case where the data is approximated using a polynomial (P) of order M:

P_M(x) = \sum_{k=0}^{M} a_k x^k    (5-2)

the squared error to be minimized becomes:

E = \sum_{i=1}^{n} \left( y_i - P(x_i) \right)^2    (5-3)
For a first-order polynomial (the straight line of Equation 5-1), the normal equations can be
written in matrix form as follows:

\begin{bmatrix} \sum_{i=1}^{n} x_i^2 & \sum_{i=1}^{n} x_i \\ \sum_{i=1}^{n} x_i & n \end{bmatrix}
\begin{bmatrix} a \\ b \end{bmatrix}
=
\begin{bmatrix} \sum_{i=1}^{n} x_i y_i \\ \sum_{i=1}^{n} y_i \end{bmatrix}    (5-4)
For the general case of an order-M polynomial, the normal equations become:

\begin{bmatrix}
\sum_{i=1}^{n} x_i^{2M}  & \sum_{i=1}^{n} x_i^{2M-1} & \cdots & \sum_{i=1}^{n} x_i^{M} \\
\vdots                   & \vdots                    &        & \vdots                  \\
\sum_{i=1}^{n} x_i^{M+1} & \sum_{i=1}^{n} x_i^{M}    & \cdots & \sum_{i=1}^{n} x_i     \\
\sum_{i=1}^{n} x_i^{M}   & \sum_{i=1}^{n} x_i^{M-1}  & \cdots & n
\end{bmatrix}
\begin{bmatrix} c_M \\ \vdots \\ c_1 \\ c_0 \end{bmatrix}
=
\begin{bmatrix} \sum_{i=1}^{n} x_i^{M} y_i \\ \vdots \\ \sum_{i=1}^{n} x_i y_i \\ \sum_{i=1}^{n} y_i \end{bmatrix}    (5-5)

Writing this system compactly as:

A b = c    (5-6)

the vector of polynomial coefficients is found by inverting the matrix:

b = A^{-1} c    (5-7)
NOTE: Because this matrix is diagonally symmetric, using LU decomposition to solve
the system is more efficient than using the standard Gaussian method.
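The normal-equations approach of Equations 5-5 through 5-7 can be sketched in Python. Plotgraph itself calls the LINPACK routines named under the -rgt option; the function below is our illustration, solving the system by Gaussian elimination with partial pivoting rather than LU decomposition:

```python
def polyfit_normal(x, y, order):
    """Fit an order-M polynomial by building and solving the normal
    equations A b = c, where A[j][k] = sum of x_i^(j+k) and
    c[j] = sum of x_i^j * y_i."""
    m = order + 1
    A = [[sum(xi ** (j + k) for xi in x) for k in range(m)] for j in range(m)]
    c = [sum((xi ** j) * yi for xi, yi in zip(x, y)) for j in range(m)]
    # Forward elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        c[col], c[piv] = c[piv], c[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for k in range(col, m):
                A[r][k] -= f * A[col][k]
            c[r] -= f * c[col]
    # Back substitution.
    b = [0.0] * m
    for j in range(m - 1, -1, -1):
        b[j] = (c[j] - sum(A[j][k] * b[k] for k in range(j + 1, m))) / A[j][j]
    return b  # coefficients of b[0] + b[1]*x + ... + b[M]*x^M
```

For example, fitting a first-order polynomial to data lying exactly on y = 2x + 1 recovers an intercept of 1 and a slope of 2.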
Once the regression polynomial has been calculated, it is important to determine the goodness-of-fit.
To calculate this term, we first need to define the total sum of squares (SST) (Davis, 1986):

SST = \sum_{i=1}^{n} ( y_i - \bar{y} )^2    (5-8)
where \bar{y} is the mean y value for the data set. The second term is the sum of squares due to
regression (SSR) (Davis, 1986):

SSR = \sum_{i=1}^{n} ( \hat{y}_i - \bar{y} )^2    (5-9)

where \hat{y}_i is the calculated regression estimate at each x_i value. The goodness-of-fit (R^2)
can now be calculated as (Davis, 1986):

R^2 = \frac{SSR}{SST}    (5-10)
If the line estimates the data well, the value will be near unity (1.0); this term is generally
reported as a percentage. Another useful term is the multiple correlation coefficient (R) (Davis,
1986):

R = \sqrt{R^2}    (5-11)

Again, if the line estimates the data well, the value will be near unity (1.0).
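Equations 5-8 through 5-11 translate directly into code; a small Python sketch (our helper, not UNCERT code):

```python
def goodness_of_fit(y, y_hat):
    """Return (R^2, R) given observed values y and regression
    estimates y_hat (Equations 5-8 through 5-11)."""
    n = len(y)
    y_bar = sum(y) / n
    sst = sum((yi - y_bar) ** 2 for yi in y)      # total sum of squares
    ssr = sum((yh - y_bar) ** 2 for yh in y_hat)  # sum of squares due to regression
    r2 = ssr / sst
    return r2, r2 ** 0.5
```

A perfect fit (y_hat identical to y) yields R^2 = 1.0, i.e. 100% when reported as a percentage.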
Bibliography (plotgraph)
Burden, R.L. and J.D. Faires, 1985, Numerical Analysis, Third Edition, Prindle, Weber, and
Schmidt, Boston, pp 342-353.
Davis, John C., 1986, Statistics and Data Analysis in Geology, Second Edition, John Wiley &
Sons, New York.
Englund, E., and A. Sparks, 1988, GEO-EAS, U. S. Environmental Protection Agency,
Environmental Monitoring Systems Laboratory, EPA/600/4-88/033.
McCuen, R.H., 1989, Hydrologic Analysis and Design, Prentice-Hall, Englewood Cliffs, New
Jersey.
Moler, C., 1978, LINPACK (Linear algebra FORTRAN77 sub-routines), University of New
Mexico, Argonne National Lab.
Press, W.H., S.A. Teukolsky, W.T. Vetterling, and B.P. Flannery, 1992, Numerical Recipes in C, The
Art of Scientific Computing, Second Edition, Cambridge University Press, New York.
CHAPTER 6
Univariate Statistics:
Histo(gram)
The histo application is used to calculate and display univariate statistics for differing data sets. For
a sample population histo can be used to calculate basic statistics such as the mean and variance. It
may also be used to display the behavior of several different populations at once using stacked
histograms and box and whisker plots. Histo can also be used to infer information about the
population (i.e., is the population normal?) using a χ² (chi-squared) test or by plotting the frequency
distribution as a probability plot.
The histo application is composed of three sections (Figure 6.1); the main menu-bar, the status and
log text area, and the drawing or graph area. The menu-bar is used to select all histo commands, the
log/status area is used by the program to report important messages or results, and the drawing area
is the display area for the histograms.
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Open, View, Save, Save as, Save Preferences, Print Setup, Print, and Quit.
FIGURE 6-1. This is an example of the histo application window. The main menu-bar is on the top,
with the log/status area and the drawing area below it.
Open
Selecting File:Open generates a pop-up dialog which allows the user to select an existing data file.
This dialog functions exactly as the dialog in Figure 5.2 (plotgraph - Chapter 5). As with plotgraph
files, the default data file name extension is *.dat.
View
File:View pops up a simple screen editor with the current data file.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
reasonable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, histo determines how all the input
variables are currently defined and writes them to the file histo.prf.
WARNING: If histo.prf already exists, you will be warned that it is about to be overwritten.
If you do not want the old version destroyed, you must move it to a new file
(e.g. the UNIX command mv histo.prf histo.old.prf would be sufficient).
When you press OK, the old version will be overwritten! This cannot be done
from within the application. To rename the file you will have to execute the
UNIX mv command from a UNIX prompt in another window.
If histo.prf does not exist in the current directory, it is created. This is an ASCII file and can be
edited by the user. See Appendix C for details.
Print Setup
File:Print Setup works exactly as explained in Chapter 5.
Print
File:Print generates a Postscript file of the graph, and depending on how the print options are
defined in Print Setup, directs this file to the specified print queue, or to the specified file.
Quit
File:Quit terminates the program.
Data
When the appropriate file type is being read, selecting the Data:Modify menu option will pop up
the dialog shown in Figure 6.2. This dialog allows the user to select which Data Columns in the
data file will be evaluated (up to 20 columns can be selected; the number of toggles reflects the
number of columns in the data file). It also allows the user to specify the histogram Sizing Rule
(this is used for sizing histogram bars). The options are Division Width, Number of Divisions, or
Equal Percent. With the first two methods, the divisions are equally spaced; with the third method,
spacing is a function of the data distribution. For Division Width, the user must specify the
desired bar width (the default is 1/10 the data range). For Number of Divisions, the user specifies
how many equal divisions to divide the data range into (the default is 10). If the Equal Percent
Divisions rule is selected, the Number of Divisions text field is used again to enter how many
divisions to divide the data into. Instead of dividing the data by its range, though, the data is
divided by the number of points in the file. For example, if the data file has 100 points and the
number of divisions is 20, the histogram will show the extents of sorted groups of 5 data points.
The Starting and Ending Locations, by default, are the minimum and maximum extents of the data
file. These may be redefined to more appropriate values. If the values have been reset, or are set to
the bounds of a previous data set, pressing the Maximize Data Range button will reset the Starting
and Ending Locations values to the minimum and maximum extents of the current data set.
FIGURE 6-2. Data Column pop-up dialog. This dialog allows the user to specify the data column
in the data file, define the extents, and set the width or the number of divisions for the histogram
bars.
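The three Sizing Rules described above can be sketched as a small Python helper (a hypothetical illustration; histo's internal logic is not shown in this manual):

```python
def histogram_divisions(data, rule, value):
    """Return division boundaries under the three Sizing Rules:
    'width'   - equally spaced bars of a given width,
    'count'   - a given number of equally spaced bars,
    'percent' - an equal number of sorted points per bar."""
    lo, hi = min(data), max(data)
    if rule == "width":
        edges, x = [lo], lo
        while x < hi:
            x += value
            edges.append(x)
        return edges
    if rule == "count":
        w = (hi - lo) / value
        return [lo + i * w for i in range(value)] + [hi]
    if rule == "percent":
        s = sorted(data)
        step = len(s) // value  # e.g. 100 points / 20 divisions = 5 points each
        return [s[i * step] for i in range(value)] + [hi]
    raise ValueError("unknown rule: " + rule)
```

With the 'percent' rule the boundaries follow the data distribution rather than the data range, exactly as in the 100-point, 20-division example above.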
Style
The Style menu option allows the user to specify various attributes that control the appearance of
the histogram graph. These are divided into three sub-menus: Plot Type, Y-Axis Type, and
Transform Type.
Plot Type
Style:Plot Type allows the user to specify the type of graph that will be plotted. There are several
different types of plots supported by histo. There are five basic options, some with additional
options. The available graph types are: Histograms, Box and Whisker Plots, Cumulative
Distributions, 1.0 - Cumulative Distributions, and Probability Plots.
Histograms
There are two types of histograms, Single and Stacked. The Single histogram is the default and will
plot all the data onto a single graph. When just one data column has been selected for plotting (see
Data above), a plot similar to that in Figure 6.3a will be drawn. If more than one data column has
been selected, the histogram division width is divided by the number of active data columns. The
histogram bars for each range from the different data columns are then plotted side by side (Figure
6.3b). This kind of graph can become very busy and difficult to read if more than a few data
columns are plotted together. Instead of using the Single option, the histograms can be Stacked
(Figure 6.4). This style will generally be easier to interpret.
FIGURE 6-3. These histograms demonstrate several features of histo. The top histogram (a)
shows the distribution of a log-transformed data set (Au - gold). The height of individual bars
represents the percentage of all the data samples falling within the given log-value range. The
black dashed line in the center of the graph identifies the mean of the log-values. The shaded bars
in the background show 1, 2, and 3 standard deviation widths from the data set log-mean. This
histogram was also plotted using a portrait orientation. The bottom histograms (b) show how two
data sets can be displayed on the same graph.
FIGURE 6-4. This series of histograms shows how a series of related data sets within a single file
can be stacked in a single graph, with the statistics for each set graphically shown.
Cumulative Distributions
A cumulative distribution is similar to the histogram, but it starts at 0.0% on the left (the data
minimum value), and increases to 100.0% on the right (the data maximum value). At any point in
between, the percent (or number) of data values less than the X-axis value is plotted (Figure 6.6a).
Again, Single or Stacked plots can be used.
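One common convention for building such a curve pairs each sorted value with the fraction of samples at or below it; a Python sketch (ours, not histo's internal code):

```python
def cumulative_distribution(data, as_percent=True):
    """Pair each sorted value with the fraction (or percent) of the
    samples at or below it; the curve rises to 100% at the maximum."""
    s = sorted(data)
    n = len(s)
    scale = 100.0 if as_percent else 1.0
    return [(x, scale * (i + 1) / n) for i, x in enumerate(s)]
```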
FIGURE 6-5. Figure 6.5a shows the same data as Figure 6.4, but uses a box and whisker plot
instead of a histogram format. It is mainly used to show the distribution of different portions of the
data population. The meaning of each symbol is shown in Figure 6.5b.
Probability Plots
If a Probability Plot option is desired, select Set, or one of the Exceedence or Rank Order options
(Set is just a menu short-cut). The Exceedence Type may be specified, and the Rank Order Method
used to determine the frequency of occurrence of a variable value can be specified.
The Exceedence Type only affects the labeling on the X-axis. An Exceedence plot indicates the
percentage of points which exceed a specified value. A Nonexceedence plot indicates the
percentage of points which do not exceed a specified value. The appearance of the graphs
otherwise is identical.
The Hazen and Weibull methods are two methods for determining the Rank Order of one data value
within the data set. For further details see the histo Mathematics section (Equations 6-12 and 6-13),
or refer to McCuen (1989).
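The two Rank Order formulas can be written out as a small helper (the formulas are the standard Hazen and Weibull plotting positions; the function name is our own, and Equations 6-12 and 6-13 should be consulted for histo's exact definitions):

```python
def plotting_position(i, n, method="weibull"):
    """Rank-order frequency of the i-th smallest of n values (i = 1..n).
    Hazen: (2i - 1) / 2n;  Weibull: i / (n + 1)."""
    if method == "hazen":
        return (2 * i - 1) / (2.0 * n)
    if method == "weibull":
        return i / (n + 1.0)
    raise ValueError("unknown method: " + method)
```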
FIGURE 6-6. These graphs show a) the cumulative distribution, and b) 100% minus the
cumulative distribution for the log Au assay values shown in Figure 6.3a.
It is common in nature for the distribution of a measured parameter to have a log distribution (this
is set with the Style:Transform Type:Log menu option discussed below). If this is the case, a Normal
probability plot will show a curved line (Figure 6.7a). From the curved line one can say the data is
not normally distributed, but little more. If the line becomes straight when the data is Log
transformed, the probability plot suggests that the data is log-normally distributed (Figure 6.7b).
FIGURE 6-7. The upper graph (a) shows the probability distribution of non-transformed Au assay
values; the data is clearly not normally distributed. The lower graph (b) shows the probability
distribution of log-transformed Au assay values; the straightness of the line suggests that the data
is log-normally distributed.
Y-Axis Type
Style:Y-Axis Type allows the user to specify how the frequency distribution is presented on the
Y-axis. It can be specified by Count or by Percentage.
Transform Type
Style:Transform Type allows for either Normal or Log (Base 10) transforms (Normal implies the
data is unaltered). Transformed histograms are shown in Figure 6.8 (Normal) and Figure 6.3a
(Log). Transformed probability plots are shown in Figures 6.7a and 6.7b.
FIGURE 6-8. This histogram demonstrates several features of histo. This histogram shows the
distribution of a non-transformed data set (Au - gold). The height of individual bars represents the
percentage of all the data samples falling within the given value range. The black dashed line in the
center of the graph identifies the mean of the set. The shaded bars in the background show 1, 2, and
3 standard deviation widths from the data set mean. Note that this histogram is highly skewed.
Between Figure 6.4 and this figure, it is much more likely this data set is log-normally distributed
than normally distributed. This histogram was also plotted using the landscape orientation.
NOTE: The transforms use the log base 10, not natural log.
Statistics
There are two ways to display statistical information about data. One method is graphically, and
the other is numerically.
Display
The plotting routines in histo display various statistical information. Some of the items for
different style plots can be selected using the Statistics:Display menu option, which creates the
pop-up dialog in Figure 6.9. Note, these items do not apply to all plot styles. For histograms and
cumulative histograms, the Mean, Median, and Normal Distribution Curve can be turned on or off.
In a box and whisker plot, the Mean, Median, 25-75 Percentiles, 10-90 Percentiles, and the
Minimum and Maximum Extents can be turned on or off. In addition, zero to three Standard
Deviations can be displayed.
Tabulated
Statistics:Tabulated will display the informational dialog shown in Figure 6.10. This dialog gives
statistical information about the data set and the histogram: the data set minimum and maximum,
mean, median, variance, standard deviation, average deviation, skew, kurtosis, χ² test result, and
the 10th, 25th, 75th, and 90th percentiles (see the histo Mathematics section on the calculation of
these values). The dialog also displays the number of data points and the extents of the data set.
Note, if the data has been log transformed, these values represent the appropriate statistics or
range based on the log value of each data point. A copy of the statistics can be printed to the
log/status window by pressing the Post Statistics to Log/Status Window button. Pressing the Post
All Columns button will print the statistics for all the selected data columns (see Data section) to
the log/status window. To view different columns or data sets from the data file, press the Previous
or Next Active Data Set buttons.
Graph
Graph allows the user to specify various attributes about the appearance of the graph: the graph
Border, Fonts, Labels, Mesh, and Line Styles.
Border
Graph:Border is described in Chapter 5 in the Graph:Border section (Figure 5.9).
FIGURE 6-10. Statistics pop-up dialog. This dialog shows some of the basic statistical values for
the current data set.
Fonts
Graph:Fonts is described in Chapter 5 in the Graph:Fonts section (Figures 5.10 and 5.11).
Labels
Graph:Labels is described in Chapter 5 in the Graph:Labels section (Figure 5.12)
Line Styles
This option is similar to that used in plotgraph but instead of changing the attributes associated
with a line, attributes are changed with regard to the histogram bars, the mean data value line, and
the standard deviation bars. They are not truly lines but they are treated as such:
Line #1 = Mean Value Data Line
Line #2 = Standard Deviation Error-Bars
Line #3+ = Histogram bars
This dialog is described in the Graph:Style section of Chapter 5 (Figure 5.14).
Mesh
Graph:Mesh is described in Chapter 5 in the Graph:Mesh section (Figure 5.13).
Plot
Plot is described in Chapter 5 in the Plot section.
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, Clear, and Print. View Log, Save,
and Save as are similar in operation to the menu options under File described above.
When calculating the frequency distribution for the histogram, the division number (Pos.), X
position, and frequency of occurrence within each histogram bar are reported to the log window.
NOTE: All calculations are maintained in the log window, and the most recent are presented at
the top of the log. An example is shown below for data1.dat using 10 divisions:
Calculation #6
Number of Divisions = 10

Pos.      X      Frequency
--------------------------
  1:    -4.24        9
  2:    -3.69       29
  3:    -3.14       82
  4:    -2.6       132
  5:    -2.05      138
  6:    -1.51       89
  7:    -0.959      56
  8:    -0.413      27
  9:     0.133       9
 10:     0.68        5
Help
Help works exactly as explained in the Help section of Chapter 5 (plotgraph, Figure 5.15).
#    =  integer
#.#  =  float
""   =  character string
{}   =  variable is an array. Values must be separated by a ',' and no spaces
        are allowed. Do not use the "{ }" symbols on the command line.

NOTES:

-d2575                                  default = 1
               0 = false
               1 = true
-dc1 to -dc20  = active data column
               0 = false
               1 = true
-dive
-divn
-divr
-divs
-divw
-dm
-dme       = draw median                default = 1
               0 = false
               1 = true
-dmm                                    default = 1
-dsd                                    default = 1
-esp                                    default = 0
-exceed                                 default = 0
-fnt1                                   default = Helvetica-Bold
-fnt2                                   default = Helvetica-Bold
-fnt3                                   default = Helvetica
-fnt4                                   default = Helvetica
-fnt5                                   default = Helvetica
-fnt6                                   default = Helvetica
-fnts1                                  default = 24.0
-fnts2                                  default = 15.0
-fnts3                                  default = 15.0
-fnts4                                  default = 12.0
-fnts5                                  default = 10.0
-fnts6                                  default = 12.0
-ft        = frequency type             default = 0
-gst                                    default = 0
               0 = single
               1 = stacked
-help      = give this help menu
-lc {}     = line color                 default = variable
               0 = Black
               1 = White
               2 = Red
               3 = Green
               4 = Blue
               5 = Magenta
               6 = Yellow
               7 = Cyan
-lgf                                    default = log.dat
-lglp                                   default = 1
-lgmw                                   default = 200
-lpbm                                   default = 1.5
-lpc                                    default = 1
-lpd                                    default = 0
-lpf       = print filename             default = "junk.ps"
-lph       = print header page          default = 0
               0 = false
               1 = true
-lplm                                   default = 1.5
-lpo                                    default = 0
               0 = portrait
               1 = landscape
-lppsext                                default = "*.ps"
-lpq                                    default = "ps"
-lpr
-lprm                                   default = 1.5
-lps
-lptm                                   default = 1.5
-lsfl {}                                default = 0
-lsc {}                                 default = variable
-lssz {}                                default = 9.0
-lsty {}                                default = 0
-ltk {}    = line thickness             default = 1.0
-lty {}    = line type                  default = 0
              -1 = no line
               0 = solid
               1 = dashed
               2 = double dashed
-md        = dash mesh                  default = 0
               0 = false
               1 = true
-mox       = X mesh origin              default = 0.0
-moy       = Y mesh origin              default = 0.0
-ms        = use mesh                   default = 0
               0 = false
               1 = true
-mx        = X mesh frequency           default = 1/10 DX
-my        = Y mesh frequency           default = 1/10 DY
-nt        = show normal curve(s)       default = 1
               0 = show
               1 = hide
-prf       = preference file name       default = "histo.prf"
-pt                                     default = 0
-rfh       = screen refresh             default = 0
               0 = on exposure
               1 = on update
-ro        = rank order method          default = 1
               Hazen:   (2i - 1) / 2n
               Weibull: i / (n + 1)
-se
-ss
-sttl      = Secondary title
-tt
-ttl       = Main title                               default = Filename
-xfmt      = Number of decimal places for X-axis      default = ".2f"
-xlabel    = X-axis label                             default = "X"
-xmax      = Graph X-maximum                          default = Data Maximum
-xmin      = Graph X-minimum                          default = Data Minimum
-xMt       = X main tic frequency                     default = 1/10 DX
-xmt       = Number of minor X tics                   default = 5
-xto       = X axis label origin                      default = 0.0
-xy        = X-Y ratio                                default = 1.5
-yfmt      = Number of decimal places for Y-axis      default = ".2f"
-ylabel    = Y-axis label                             default = "Y"
-ymax      = Graph Y-maximum                          default = Data Maximum
-ymin      = Graph Y-minimum                          default = Data Minimum
-yMt       = Y main tic frequency                     default = 1/10 DY
-ymt       = Number of minor Y tics                   default = 5
-ys        = Y-axis exaggeration relative to X-axis   default = Calculated
-yto       = Y axis label origin                      default = 0.0
Column
No Header Column Data
Data files using the column format may have one or more columns of data. The first line in the data
file determines how many columns there will be throughout the file. Each following line must have
at least as many columns as the first line (extra columns are ignored). On each line, there must be
an entry for every column; there is no accommodation for NULL or blank values. Each value on a
line must be separated by a space; the file is unformatted. A sample data set might appear:
1.0    23.23   0.123     1.45
2.21   12.34   0.00123   1.56
3.31   12.98   0.231     2.34
4.56    8.21   0.345     1.76
5.12   10.92   0.456     1.43
 .       .       .         .
 .       .       .         .
 .       .       .         .
This data set has four columns. Each column could be analyzed by selecting and plotting the data
using the Data Column option under Data:Modify.
NOTE: This is the only file format available in histo which allows character data. See the
notes in Chapter 5 about file format restrictions.
GEO-EAS/GSLIB
As described in Chapter 5, histo will support GEO-EAS file formats.
Gridded
In UNCERT many gridded data sets are generated and used in modeling. Often it is important to
examine the statistics of these data sets or grids. These data sets are directly readable by histo. For
a complete description of the file formats for *.srf and *.bck files see Chapters 11 and 13.
Histo Mathematics
Classical Statistics
In examining the raw data there are a number of values which are of interest in this type of analysis.
These are the mean, variance, skew, and kurtosis (the 1st, 2nd, 3rd, and 4th order moments of the
data). Also of interest is whether the data are normally distributed, or can be transformed into a
normal distribution. This is a fundamental assumption of kriging, and significant violations may
lead to unreasonable results. To check that the data are normally distributed, this package supports
probability plots and the chi-squared (χ²) test. These tools allow the user to determine if the data
are likely to be normally distributed. If they appear not to be normally distributed, a logarithmic
transform or another type of transform may be appropriate. By viewing histograms of the data, a
bimodal distribution may be identified, which suggests there is more than one population of data in
the data set (i.e. more than one process controlled the values sampled). If the data is bimodally
distributed, it may be possible to separate the populations and check the normality of each
population.
For a data set with n samples, the sample mean (x̄) is calculated as:

x̄ = (1/n) Σ (i = 1 to n) x_i        (6-1)

where x_i = an individual sample value. The median (M) is calculated by (Press et al., 1992):

M = x_((n+1)/2)                       ; n odd
M = (x_(n/2) + x_(n/2+1)) / 2         ; n even        (6-2)

The sample variance (s²) is:

s² = Σ (i = 1 to n) (x_i − x̄)² / (n − 1)        (6-3)
The standard deviation (s) is the square root of the variance:

s = √s² = √[ Σ (i = 1 to n) (x_i − x̄)² / (n − 1) ]        (6-4)

The 3rd order moment, skew (g), is defined as (Press et al., 1992):

g = (1/n) Σ (i = 1 to n) [(x_i − x̄) / s]³        (6-5)
where the skew is a measure of symmetry. A symmetric distribution will have a skew of zero, and
non-symmetric skews will be positive or negative, as shown in Figure 6.11a (McCuen, 1989). The
4th order moment, kurtosis (k), is defined (Press et al, 1992):
FIGURE 6-11. a) and b): these graphs illustrate the meaning of positive and negative skew and
kurtosis.
k = (1/n) Σ (i = 1 to n) [(x_i − x̄) / s]⁴ − 3        (6-6)
The kurtosis is a measure of the peakedness or flatness of the distribution relative to a normal
distribution. A positive kurtosis reflects a peaked distribution (leptokurtic) and a negative kurtosis
a relatively flat one (platykurtic) (Figure 6.11b; Press et al., 1992).
In addition to these statistical terms, the 10th, 25th, 75th, and 90th percentiles are often calculated.
These are simply read from the sorted data set.
10th = x_(n×0.10)
25th = x_(n×0.25)
75th = x_(n×0.75)
90th = x_(n×0.90)
The chi-squared (χ²) test statistic is calculated as:

χ² = Σ (i = 1 to n) (O_i − E_i)² / E_i        (6-7)

where O_i = the observed frequency of values over a range, and E_i = the expected frequency of
values over that range. To determine the expected frequency, the normal distribution itself must be
evaluated. The expected frequency is based on the probability that y samples will occur between two
values. It is calculated by:
z = (x_i − x̄) / s        (6-8)

f(z) = e^(−0.5z²) / (2π)^0.5        (6-9)

p(z) = ∫ (−∞ to z) [e^(−0.5t²) / (2π)^0.5] dt        (6-10)
Note that Equation 6-9 cannot be directly integrated; Equation 6-10 must be estimated numerically or
evaluated from tabulated values. Below is a table of calculated probabilities for given z values (the
software itself uses double-precision values, not truncated at the fourth decimal
place). These values agree with the tables of McCuen (1989) to the fourth significant digit;
differences at the fifth place and beyond may only be in rounding.
(For negative z the column offset increases the magnitude of z; e.g. p(-2.53) is found in row -2.50, column 0.03.)

  z     0.00   0.01   0.02   0.03   0.04   0.05   0.06   0.07   0.08   0.09
-3.90  0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000
-3.80  0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000
-3.70  0.0001 0.0001 0.0001 0.0001 0.0001 0.0001 0.0001 0.0001 0.0000 0.0000
-3.60  0.0001 0.0001 0.0001 0.0001 0.0001 0.0001 0.0001 0.0001 0.0001 0.0001
-3.50  0.0002 0.0002 0.0002 0.0002 0.0002 0.0002 0.0002 0.0001 0.0001 0.0001
-3.40  0.0003 0.0003 0.0003 0.0003 0.0003 0.0002 0.0002 0.0002 0.0002 0.0002
-3.30  0.0005 0.0004 0.0004 0.0004 0.0004 0.0004 0.0004 0.0003 0.0003 0.0003
-3.20  0.0007 0.0006 0.0006 0.0006 0.0006 0.0005 0.0005 0.0005 0.0005 0.0005
-3.10  0.0009 0.0009 0.0009 0.0008 0.0008 0.0008 0.0008 0.0007 0.0007 0.0007
-3.00  0.0013 0.0013 0.0012 0.0012 0.0012 0.0011 0.0011 0.0010 0.0010 0.0010
-2.90  0.0018 0.0018 0.0017 0.0017 0.0016 0.0016 0.0015 0.0015 0.0014 0.0014
-2.80  0.0025 0.0024 0.0024 0.0023 0.0022 0.0022 0.0021 0.0020 0.0020 0.0019
-2.70  0.0034 0.0033 0.0032 0.0031 0.0030 0.0030 0.0029 0.0028 0.0027 0.0026
-2.60  0.0046 0.0045 0.0044 0.0042 0.0041 0.0040 0.0039 0.0038 0.0037 0.0035
-2.50  0.0062 0.0060 0.0058 0.0057 0.0055 0.0054 0.0052 0.0051 0.0049 0.0048
-2.40  0.0082 0.0080 0.0077 0.0075 0.0073 0.0071 0.0069 0.0067 0.0065 0.0064
-2.30  0.0107 0.0104 0.0102 0.0099 0.0096 0.0094 0.0091 0.0089 0.0086 0.0084
-2.20  0.0139 0.0135 0.0132 0.0129 0.0125 0.0122 0.0119 0.0116 0.0113 0.0110
-2.10  0.0179 0.0174 0.0170 0.0166 0.0162 0.0158 0.0154 0.0150 0.0146 0.0142
-2.00  0.0227 0.0222 0.0217 0.0212 0.0207 0.0202 0.0197 0.0192 0.0188 0.0183
-1.90  0.0287 0.0281 0.0274 0.0268 0.0262 0.0256 0.0250 0.0244 0.0238 0.0233
-1.80  0.0359 0.0351 0.0344 0.0336 0.0329 0.0322 0.0314 0.0307 0.0301 0.0294
-1.70  0.0446 0.0436 0.0427 0.0418 0.0409 0.0401 0.0392 0.0384 0.0375 0.0367
-1.60  0.0548 0.0537 0.0526 0.0516 0.0505 0.0495 0.0485 0.0475 0.0465 0.0455
-1.50  0.0668 0.0655 0.0643 0.0630 0.0618 0.0606 0.0594 0.0582 0.0571 0.0559
-1.40  0.0808 0.0793 0.0778 0.0764 0.0750 0.0736 0.0722 0.0708 0.0695 0.0681
-1.30  0.0968 0.0951 0.0935 0.0918 0.0902 0.0885 0.0869 0.0854 0.0838 0.0823
-1.20  0.1151 0.1132 0.1113 0.1094 0.1075 0.1057 0.1039 0.1021 0.1003 0.0986
-1.10  0.1357 0.1336 0.1314 0.1293 0.1272 0.1251 0.1231 0.1211 0.1191 0.1171
-1.00  0.1587 0.1563 0.1539 0.1516 0.1492 0.1469 0.1446 0.1424 0.1401 0.1379
-0.90  0.1841 0.1815 0.1789 0.1763 0.1737 0.1711 0.1686 0.1661 0.1636 0.1612
-0.80  0.2119 0.2091 0.2062 0.2033 0.2005 0.1977 0.1950 0.1922 0.1895 0.1868
-0.70  0.2421 0.2389 0.2359 0.2328 0.2297 0.2267 0.2237 0.2207 0.2178 0.2148
-0.60  0.2743 0.2710 0.2677 0.2644 0.2612 0.2579 0.2547 0.2515 0.2483 0.2452
-0.50  0.3086 0.3051 0.3016 0.2982 0.2947 0.2913 0.2878 0.2844 0.2811 0.2777
-0.40  0.3447 0.3410 0.3373 0.3337 0.3301 0.3265 0.3229 0.3193 0.3157 0.3122
-0.30  0.3822 0.3784 0.3746 0.3708 0.3670 0.3633 0.3595 0.3558 0.3521 0.3484
-0.20  0.4208 0.4169 0.4130 0.4091 0.4053 0.4014 0.3975 0.3937 0.3898 0.3860
-0.10  0.4603 0.4563 0.4523 0.4484 0.4444 0.4405 0.4365 0.4326 0.4287 0.4248
-0.00  0.5000 0.4961 0.4921 0.4881 0.4841 0.4802 0.4762 0.4722 0.4682 0.4642
 0.00  0.5000 0.5039 0.5079 0.5119 0.5159 0.5198 0.5238 0.5278 0.5318 0.5358
 0.10  0.5397 0.5437 0.5477 0.5516 0.5556 0.5595 0.5635 0.5674 0.5713 0.5752
 0.20  0.5792 0.5831 0.5870 0.5909 0.5947 0.5986 0.6025 0.6063 0.6102 0.6140
 0.30  0.6178 0.6216 0.6254 0.6292 0.6330 0.6367 0.6405 0.6442 0.6479 0.6516
 0.40  0.6553 0.6590 0.6627 0.6663 0.6699 0.6735 0.6771 0.6807 0.6843 0.6878
 0.50  0.6914 0.6949 0.6984 0.7018 0.7053 0.7087 0.7122 0.7156 0.7189 0.7223
 0.60  0.7257 0.7290 0.7323 0.7356 0.7388 0.7421 0.7453 0.7485 0.7517 0.7548
 0.70  0.7579 0.7611 0.7641 0.7672 0.7703 0.7733 0.7763 0.7793 0.7822 0.7852
 0.80  0.7881 0.7909 0.7938 0.7967 0.7995 0.8023 0.8050 0.8078 0.8105 0.8132
 0.90  0.8159 0.8185 0.8211 0.8237 0.8263 0.8289 0.8314 0.8339 0.8364 0.8388
 1.00  0.8413 0.8437 0.8461 0.8484 0.8508 0.8531 0.8554 0.8576 0.8599 0.8621
 1.10  0.8643 0.8664 0.8686 0.8707 0.8728 0.8749 0.8769 0.8789 0.8809 0.8829
 1.20  0.8849 0.8868 0.8887 0.8906 0.8925 0.8943 0.8961 0.8979 0.8997 0.9014
 1.30  0.9032 0.9049 0.9065 0.9082 0.9098 0.9115 0.9131 0.9146 0.9162 0.9177
 1.40  0.9192 0.9207 0.9222 0.9236 0.9250 0.9264 0.9278 0.9292 0.9305 0.9319
 1.50  0.9332 0.9345 0.9357 0.9370 0.9382 0.9394 0.9406 0.9418 0.9429 0.9441
 1.60  0.9452 0.9463 0.9474 0.9484 0.9495 0.9505 0.9515 0.9525 0.9535 0.9545
 1.70  0.9554 0.9564 0.9573 0.9582 0.9591 0.9599 0.9608 0.9616 0.9625 0.9633
 1.80  0.9641 0.9649 0.9656 0.9664 0.9671 0.9678 0.9686 0.9693 0.9699 0.9706
 1.90  0.9713 0.9719 0.9726 0.9732 0.9738 0.9744 0.9750 0.9756 0.9762 0.9767
 2.00  0.9773 0.9778 0.9783 0.9788 0.9793 0.9798 0.9803 0.9808 0.9812 0.9817
 2.10  0.9821 0.9826 0.9830 0.9834 0.9838 0.9842 0.9846 0.9850 0.9854 0.9858
 2.20  0.9861 0.9865 0.9868 0.9871 0.9875 0.9878 0.9881 0.9884 0.9887 0.9890
 2.30  0.9893 0.9896 0.9898 0.9901 0.9904 0.9906 0.9909 0.9911 0.9914 0.9916
 2.40  0.9918 0.9920 0.9923 0.9925 0.9927 0.9929 0.9931 0.9933 0.9935 0.9936
 2.50  0.9938 0.9940 0.9942 0.9943 0.9945 0.9946 0.9948 0.9949 0.9951 0.9952
 2.60  0.9954 0.9955 0.9956 0.9958 0.9959 0.9960 0.9961 0.9962 0.9963 0.9965
 2.70  0.9966 0.9967 0.9968 0.9969 0.9970 0.9970 0.9971 0.9972 0.9973 0.9974
 2.80  0.9975 0.9976 0.9976 0.9977 0.9978 0.9978 0.9979 0.9980 0.9980 0.9981
 2.90  0.9982 0.9982 0.9983 0.9983 0.9984 0.9984 0.9985 0.9985 0.9986 0.9986
 3.00  0.9987 0.9987 0.9988 0.9988 0.9988 0.9989 0.9989 0.9990 0.9990 0.9990
 3.10  0.9991 0.9991 0.9991 0.9992 0.9992 0.9992 0.9992 0.9993 0.9993 0.9993
 3.20  0.9993 0.9994 0.9994 0.9994 0.9994 0.9995 0.9995 0.9995 0.9995 0.9995
 3.30  0.9995 0.9996 0.9996 0.9996 0.9996 0.9996 0.9996 0.9997 0.9997 0.9997
 3.40  0.9997 0.9997 0.9997 0.9997 0.9997 0.9998 0.9998 0.9998 0.9998 0.9998
 3.50  0.9998 0.9998 0.9998 0.9998 0.9998 0.9998 0.9998 0.9999 0.9999 0.9999
 3.60  0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999
 3.70  0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 1.0000 1.0000
 3.80  1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
 3.90  1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000
Determining the probability that y of n samples occur in a given range is calculated by (McCuen,
1989):

P(y) = [n! / (y!(n − y)!)] p^y (1 − p)^(n−y)        (6-11)

where p is the probability that a single sample falls in the range. From this relationship, not only can we evaluate normality based on the χ² test, but this distribution
can be used to develop probability plots (NOTE: On probability paper, the probability axis is non-linear and non-logarithmic. Its scale can be determined as a function of z and p(z)).
To develop the probability plot, the data must be rank ordered (sorted). Two common methods are
presented by Weibull (p_w) and Hazen (p_h) (McCuen, 1989), and the expected value for a given rank
is calculated as:

p_w = i / (n + 1)        (6-12)

p_h = (2i − 1) / 2n        (6-13)

where n is the number of samples and i is the rank order of the given sample. These methods
generate slightly different results, but either method is valid. Which method is used is largely a
matter of user preference, and the user's impression of what works best for a particular data set.
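Equations 6-12 and 6-13 are simple enough to sketch directly in C (the function names are illustrative only):

```c
/* Plotting-position probabilities for the i-th ranked sample of n
 * (i = 1 is the smallest value). */
static double weibull_pp(int i, int n) { return (double)i / (n + 1); }     /* Eq. 6-12 */
static double hazen_pp(int i, int n)   { return (2.0 * i - 1.0) / (2.0 * n); } /* Eq. 6-13 */
```

For n = 9, the Weibull position of the smallest sample is 1/10 = 0.10; for n = 10, the Hazen position of the fifth-ranked sample is 9/20 = 0.45, illustrating the slight difference between the two methods.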
Bibliography (histo)
McCuen, R.H., 1989, Hydrologic Analysis and Design, Prentice-Hall, Englewood Cliffs, New
Jersey.
Press, W.H., S.A. Teukolsky, W.T. Vetterling, and B.P. Flannery, 1992, Numerical Recipes in C, The
Art of Scientific Computing, Second Edition, Cambridge University Press, New York, pp. 612-614.
CHAPTER 7
Distribution Comparison: Distcomp
The distcomp application is used to calculate and display the differences between two data sets, or
one data set and a series of other data sets.
The distcomp application is composed of three sections (Figure 7.1): the main menu-bar, the status
and log text area, and the drawing or graph area. The menu-bar is used to select all distcomp
commands, the log/status area is used by the program to report important messages or results, and
the drawing area is the display area for the histograms and probability plots.
FIGURE 7-1. The main menu-bar is on the top of the distcomp application window, with the
log/status area below it and the drawing area at the bottom.
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Open, View, Save, Save as, Save Preferences, Print Setup, Print, and Quit.
Open
Selecting File:Open generates a pop-up dialog which allows the user to select an existing data file.
This dialog functions exactly as the dialog in Figure 5.2 (plotgraph - Chapter 5). As with plotgraph
files, the default data file name extension is *.dat.
View
File:View pops up a simple screen editor with the current data file.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
reasonable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, distcomp determines how all the input
variables are currently defined and writes them to the file distcomp.prf.
WARNING: If distcomp.prf already exists, you will be warned that it is about to be overwritten. If you do not want the old version destroyed, you must move it to a
new file (e.g. the UNIX command mv distcomp.prf distcomp.old.prf would be
sufficient). When you press OK, the old version will be over-written! This
cannot be done from within the application. To rename the file you will have
to execute the UNIX mv command from a UNIX prompt in another window.
If distcomp.prf does not exist in the current directory, it is created. This is an ASCII file and can
be edited by the user. See Appendix C for details.
Print Setup
File:Print Setup works exactly as explained in Chapter 5.
Print
File:Print generates a Postscript file of the graph, and depending on how the print options are
defined in Print Setup, directs this file to the specified print queue, or to the specified file.
Quit
File:Quit terminates the program.
Data
When the appropriate file type is being read, selecting the Data:Modify menu option will pop up
the dialog shown in Figure 7.2. This dialog allows the user to select which Data Columns in the
data file will be evaluated (up to 20 columns can be selected; the number of toggles reflects the
number of columns in the data file). It also allows the user to specify the histogram Sizing Rule (used
for sizing histogram bars). The options are Division Width, Number of Divisions, or Equal
Percent. With the first two methods the divisions are equally spaced; with the third method
spacing is a function of the data distribution. For Division Width, the user must specify the
desired bar width (the default is 1/10 the data range). For Number of Divisions, the user specifies how
many equal divisions to divide the data range into (the default is 10). If the Equal Percent Divisions
rule is selected, the Number of Divisions text field is again used to enter how many divisions to
divide the data into. Instead of dividing the data by the range of the data, though, the data are divided
by the number of points in the file. For example, if the data file has 100 points and the number of
divisions is 20, the histogram will show the extents of groups of 5 sorted data points. The
Starting and Ending Locations, by default, are the minimum and maximum extents of the data file.
These may be redefined to more appropriate values. If the values have been reset, or are set to the
bounds of a previous data set, pressing the Maximize Data Range button will reset the Starting and
Ending Locations to the minimum and maximum extents of the current data set.
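The Equal Percent rule described above can be sketched as follows; with 100 points and 20 divisions, each division boundary falls every 5 sorted values (a hypothetical helper, not distcomp's internal code):

```c
#include <stdlib.h>

static int cmp_dbl(const void *a, const void *b)
{
    double x = *(const double *)a, y = *(const double *)b;
    return (x > y) - (x < y);
}

/* Equal Percent sizing: after sorting, each of the ndiv divisions covers
 * the same number of points, so the ndiv+1 boundaries fall at sorted
 * data values rather than at equal spacing. */
static void equal_percent_bounds(double *data, int n, int ndiv, double *bounds)
{
    qsort(data, n, sizeof(double), cmp_dbl);
    for (int d = 0; d <= ndiv; d++) {
        int idx = d * n / ndiv;         /* group size = n / ndiv points */
        if (idx >= n) idx = n - 1;
        bounds[d] = data[idx];
    }
}
```

Note how, unlike Division Width or Number of Divisions, the resulting bar widths vary with the data distribution: dense regions produce narrow divisions and sparse regions produce wide ones.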
Style
The Style menu option allows the user to specify various attributes that control the appearance of
the histogram graph. These are divided into three sub-menus: Plot Type, Y-Axis Type, and
Transform Type.
FIGURE 7-2. The Data:Modify dialog.
Plot Type
Style:Plot Type allows the user to specify the type of graph that will be plotted. There are several
different types of plots supported by distcomp. There are five basic options, some with additional
options. The available graph types are: Histograms, Box and Whisker Plots, Cumulative
Distributions, 1.0 - Cumulative Distributions, and Probability Plots.
Histograms
There are two types of histograms, Single and Stacked. The Single histogram is the default and will
plot all the data onto a single graph. When just one data column has been selected for plotting (see
Data above) a plot similar to that in Figure 7.3 will be drawn. If more than one data column has
been selected, the histogram division width is divided by the number of active data columns. The
histogram bars for each range from the different data columns are then plotted side by side (Figure
6.3b). This kind of graph can become very busy and difficult to read if more than a few data
columns are plotted together. Instead of using the Single option, the histograms can be Stacked
(Figure 6.4). This style will generally be easier to interpret. If only one data column has been
selected, Single or Stacked will generate the same plot.
Cumulative Distributions
A cumulative distribution is similar to the histogram, but it starts at 0.0% on the left (the data
minimum value) and increases to 100.0% on the right (the data maximum value). At any point in-between, the percent (or number) of data values less than the X-axis value is plotted (Figure 7.4).
FIGURE 7-3.
Again Single or Stacked plots can be used.
FIGURE 7-4.
FIGURE 7-5.
Probability Plots
If a Probability Plot option is desired, select Set, or one of the Exceedence or Rank Order options
(Set is just a menu short-cut). The Exceedence Type may be specified, and the Rank Order Method
used to determine the frequency of occurrence of a variable value can be specified.
The Exceedence Type only affects the labeling on the X-axis. An Exceedence plot indicates the
percentage of points which exceed a specified value. A Nonexceedence plot indicates the
percentage of points which do not exceed a specified value. The appearance of the graphs
otherwise is identical.
The Hazen and Weibull methods are two methods for determining the Rank Order of one data value
within the data set. For further details see the histo Mathematics section (Equations 6-12 and 6-13),
or refer to McCuen (1989).
It is common in nature that the distribution of a measured parameter has a log distribution (this is
set with the Style:Transform Type:Log menu option discussed below). If this is the case, a Normal
probability plot will show a curved line (Figure 6.7a). From the curved line one can say that the
data are not normally distributed, but little more. If, after log transforming the data, the line becomes
straight, the probability plot suggests that the data are log-normally distributed (Figure 6.7b).
P-P Plots
Figure 7.6a; Figure 7.6b; Figure 7.6c
Q-Q Plots:
Figure 7.7a; Figure 7.7b; Figure 7.7c
FIGURE 7-6. a), b), c).
FIGURE 7-7. a), b), c).
Y-Axis Type
Style:Y-Axis Type allows the user to specify how the frequency distribution is presented on the Y-axis. It can be specified by Count or by Percentage.
Transform Type
Style:Transform Type allows for either Normal or Log (Base 10) transforms (Normal implies the
data are unaltered). Transformed histograms are shown in Figure 6.3a (Normal) and Figure 6.8
(Log). Transformed probability plots are shown in Figures 6.7a and 6.7b.
NOTE: The transforms use log base 10, not the natural log.
Graph
Graph allows the user to specify various attributes controlling the appearance of the graph: the
graph Border, Fonts, Labels, Mesh, and Line Styles.
Border
Graph:Border is described in Chapter 5 in the Graph:Border section (Figure 5.9).
Fonts
Graph:Fonts is described in Chapter 5 in the Graph:Fonts section (Figures 5.10 and 5.11).
Labels
Graph:Labels is described in Chapter 5 in the Graph:Labels section (Figure 5.12).
Line Styles
This option is similar to that used in plotgraph, but instead of changing the attributes associated with
a line, the attributes changed are those of the histogram bars, the mean data value line, and the
standard deviation bars. These are not truly lines, but they are treated as such:
Line #1 = Mean Value Data Line
Line #2 = Standard Deviation Error-Bars
Line #3+= Histogram bars
This dialog is described in the Graph:Style section of Chapter 5 (Figure 5.14).
Mesh
Graph:Mesh is described in Chapter 5 in the Graph:Mesh section (Figure 5.13).
Plot
Plot is described in Chapter 5 in the Plot section.
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, Clear, and Print. View Log, Save,
and Save as are similar in operation to the menu options under File described above.
Help
Help works exactly as explained in the Help section of Chapter 5 (plotgraph, Figure 5.15).
Running from the Command Line
The command line is useful when files need to be processed quickly, and the operation can be completed in batch mode without user
interaction.
Syntax: distcomp [-dive #.#] [-divn #] [-divr #] [-divs #.#] [-divw #.#] [-dm #] [-dme #] [-dsd #]
[-esp #] [-exceed #] [-ft #] [-help] [-lc #] [-leg #] [-lgf ] [-lglp #] [-lgmw #]
[-lpbm #.#] [-lpc #] [-lpd #] [-lpf ] [-lph #] [-lplm #.#] [-lppsext ] [-lpo #]
[-lpq ] [-lpr] [-lprm #.#] [-lps #] [-lptm #.#] [-lsc #] [-lsfl #] [-lssz #.#]
[-lsty #] [-ltk #.#] [-lty #] [-md #.#] [-moy #.#] [-ms #] [-mx #.#] [-my #.#]
[-nt #] [-prf ] [-pt #] [-rfh #] [-ro #] [-se #.#] [-ss #.#] [-sttl ] [-tt #]
[-ttl ] [-xfmt ] [-xlabel ] [-xmax #.#] [-xmin #.#] [-xMt #.#] [-xmt #]
[-xto #.#] [-xy #.#] [-yfmt ] [-ylabel ] [-ymax #.#] [-ymin #.#] [-yMt #.#]
[-ymt #] [-ys #] [-yto #.#] [filename]
#   = integer
#.# = float
(no symbol) = character string
{}  = variable is an array. Values must be separated by a comma, with no spaces
      allowed. Do not use the { } symbols on the command line.
NOTES:
-divs
-divw
-dm
-dme = draw median (0 = false, 1 = true); default = 1
-dsd      default = 1
-esp      default = 0
-exceed   default = 0
-fnt1    default = Helvetica-Bold
-fnt2    default = Helvetica-Bold
-fnt3    default = Helvetica
-fnt4    default = Helvetica
-fnt5    default = Helvetica
-fnt6    default = Helvetica
-fnts1   default = 24.0
-fnts2   default = 15.0
-fnts3   default = 15.0
-fnts4   default = 12.0
-fnts5   default = 10.0
-fnts6   default = 12.0
-ft      default = 0
-help
-lc {}   default = variable
-lgf     default = log.dat
-lglp    default = 1 (1 = top right, 2 = bottom left, 3 = bottom right)
-lgmw
-lpbm
-lpc
-lpd
=
=
=
=
-lpf = print filename; default = "junk.ps"
-lph = print header page (0 = false, 1 = true); default = 0
-lplm   default = 1.5
-lpo    default = 0
-lppsext
-lpq
-lpr
-lprm
-lps
=
=
=
=
=
default = "*.ps"
default = "ps"
-lptm      default = 1.5
-lsfl {}   default = 0
-lsc {}
default = variable
-lssz {}
default = 9.0
default = 200
default = 1.5
default = 1
default = 0
default = 1.0
default = 0
95
-lsty {}   default = 0
-md = dash mesh (0 = false, 1 = true); default = 0
-mox = X mesh origin; default = 0.0
-moy = Y mesh origin; default = 0.0
-ms = use mesh (0 = false, 1 = true); default = 0
-mx = X mesh frequency; default = 1/10 DX
-my = Y mesh frequency; default = 1/10 DY
-nt = show normal curve(s) (0 = show, 1 = hide); default = 1
-prf   default = "distcomp.prf"
-pt    default = 0
-rfh = screen refresh (0 = on exposure, 1 = on update); default = 0
-ro = rank order method ((2i - 1) / 2n or i / (n + 1); see Equations 6-12 and 6-13); default = 1
-ltk {}   default = 1.0
-lty {}   default = 0
-se =
-ss =
-sttl =
-tt =
-ttl = Main title; default = Filename
-xfmt = Number of decimal places for X-axis; default = ".2f"
-xlabel = X-axis label; default = "X"
-xmax = Graph X-maximum; default = Data Maximum
-xmin = Graph X-minimum; default = Data Minimum
-xMt = X main tic frequency; default = 1/10 DX
-xmt = Number of minor X tics; default = 5
-xto = X axis label origin; default = 0.0
-xy = X-Y ratio; default = 1.5
-yfmt = Number of decimal places for Y-axis; default = ".2f"
-ylabel = Y-axis label; default = "Y"
-ymax = Graph Y-maximum; default = Data Maximum
-ymin = Graph Y-minimum; default = Data Minimum
-yMt = Y main tic frequency; default = 1/10 DY
-ymt = Number of minor Y tics; default = 5
-ys = Y-axis exaggeration relative to X-axis; default = Calculated
-yto = Y axis label origin; default = 0.0
CHAPTER 8
Experimental Semivariogram:
Vario
The vario application is used for calculating one- and two-dimensional experimental
semivariograms. The package is not limited to just the classical semivariogram, but will also
calculate covariances, madograms, rodograms, cross-semivariograms, etc. Analysis of the
calculated values through jackknifing is also an option. Three types of soft indicator data can be
used with hard data to calculate the spatial continuity. The application displays the measure of
spatial continuity (γ(h)) versus lag.
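For regularly spaced one-dimensional data, the classical experimental semivariogram reduces to a short loop; a C sketch of the idea (vario itself also handles two-dimensional data, irregular spacing, direction tolerances, and the other spatial measures):

```c
/* Classical experimental semivariogram for 1-D regularly spaced data:
 * gamma(h) = (1 / 2N(h)) * sum over pairs of (z_i - z_{i+h})^2,
 * where the lag h is expressed in sample spacings. */
static double semivariogram(const double *z, int n, int lag)
{
    double sum = 0.0;
    int npairs = 0;
    for (int i = 0; i + lag < n; i++) {
        double d = z[i] - z[i + lag];
        sum += d * d;
        npairs++;
    }
    return npairs ? sum / (2.0 * npairs) : 0.0;
}
```

For z = {0, 1, 2, 3} at lag 1, the three pairs each differ by 1, giving γ(1) = 3 / (2 × 3) = 0.5.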
The vario application is composed of three sections (Figure 8.1): the main menu-bar, the status and
log text area, and the drawing or graph area. The menu-bar is used to select all vario commands,
the log/status area is used by the program to report important messages or results, and the drawing
area is the display area for the graphs.
FIGURE 8-1. This is an example of the vario application window. The main menu-bar is at the top
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Open, View Data, View Results, Save, Save as, Save Preferences, Print Setup, Print, Quit,
and Quit Without Saving.
Open
Selecting File:Open generates a pop-up dialog which allows the user to select an existing data file.
This dialog operates exactly as the Open:File dialog in Chapter 5 (plotgraph Figure 5.2). As with
plotgraph files, the default data file extension is *.dat.
View
File:View pops up a simple screen editor with the last saved version of the data file being used.
View Results
File:View Results pops up a simple screen editor which allows the user to view the results of the
latest calculation.
Save
File:Save saves the results of the latest calculation to a file. If a save file has already been opened,
the data are simply saved. If a save file has not been selected yet, a pop-up dialog similar to that
used in File:Open (Figure 5.2) is created. The main difference between the Open and the Save
dialog is that to save a file, the file does not have to pre-exist. For a description of how the dialog
works, see the Open section above and substitute Save for Open wherever appropriate.
The data from the calculation will be saved in a *.gam file. This file is ready to be used as input to
the Variofit application. Simple editing of the header in this file allows it to be viewed in the
plotgraph application.
Save as
File:Save as is identical to File:Save described above, when a file has not been selected yet. This
option can be used to save the file for the first time, or save results to a new file.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
acceptable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, vario determines how all the input
variables are currently defined and writes them to the file vario.prf.
WARNING: If vario.prf already exists, you will be warned that it is about to be overwritten. If you do not want the old version destroyed, you must move it to a
new file (e.g. the UNIX command mv vario.prf vario.old.prf would be
sufficient). When you press OK the old version will be over-written! This
cannot currently be done from within the application. To rename the file you
will have to execute the UNIX mv command from a UNIX prompt in another
window.
If vario.prf does not exist in the current directory, it is created. This is an ASCII file and can be
edited by the user. See Appendix C for details.
Print Setup
File:Print Setup works exactly as explained in Chapter 5.
Print
File:Print generates a Postscript file of the calculated spatial measure, and depending on how the
print options are defined in Print Setup, directs this file to the specified print queue, or to the
specified file.
Quit
File:Quit terminates the program, but if additions have been made to the graph, the user will first be
queried to supply a file to save the changes in.
Experimental
The Experimental menu options allow the user to specify everything regarding what data in the data
file will be evaluated, what spatial equation will be used, and all the relevant search parameters
used to calculate a spatial continuity equation (e.g. an experimental semivariogram).
Data
The Experimental:Hard Data menu option allows the user to specify which columns will define the
X, Y, Z, and head and tail data (Figure 8.2). The tail column is only relevant if covariances are
being calculated (This is used with the Cross-Semivariogram, Covariance, Correlogram, and Soft
Indicator Covariance spatial equations described below). The head column is equivalent to an
individual data value column. Once the columns are defined, the minimum and maximum value in
each column, the total number of data points, and the maximum diagonal distance across the data
set are displayed. It is possible to read in two dimensional data by placing a 0 in the field for the
missing data column (Any other column can be used too, but you must remember to turn off the
Active Z Dimension Data toggle in the Data:Search Parameters dialog described below).
FIGURE 8-2. Data File Parameters pop-up dialog. For multiple column data files, these variables
define the columns for the X, Y and Z coordinates as well as the attribute.
Soft Data
The vario program allows the user to incorporate three types of soft data in calculations of
measures of spatial continuity. Soft data are defined as data which contain non-negligible
uncertainty on the parameter being studied. Vario uses three distinct types of soft data: imprecise
measurements (Type A soft data) which have been calibrated through misclassification
probabilities as defined by Alabert (1987), interval bounds on a data value (Type B soft data) as
defined by Journel (1986), and prior probability distributions (Type C soft data ) on a value as
described by Journel (1986). This user's manual does not provide a tutorial on these types of data,
and it is necessary that the user be familiar with the concepts of soft data before using this
package.
To enter soft data information, two dialogs are used. The first is created by the Experimental:Soft
Data menu option and is shown in Figure 8.3. The second (Figure 8.4) is generated by pressing the
Data Options button on the Experimental:Soft Data dialog (Figure 8.3). All soft data calculations
are based on an indicator transform of the data. It is necessary to enter the number of indicator
classes prior to making any soft data calculations. For both type A and type C soft data, it is
necessary to open a file which contains necessary information on the soft data (See the Setting Up
the Input Data File section).
The Uncertainty File Select button will display a listing of files in the directory similar to that
shown in the original file open command under the File dialog (Figure 5.2). The uncertainty files
generally have the suffix *.unc.
FIGURE 8-4. The Soft Data Options Dialog is used to control from which columns the upper
bound, lower bound and soft data indices will be read in the original data file. Also, the numbers of
each type of data within the data file are calculated and displayed. Interval data (Type B) are
discarded if the current threshold is within the interval bounds.
The Type A Data Flag and Threshold Number fields describe locations within the *.unc file. The
Type A Data Flag is an index which defines the level of soft data being used. The Threshold
Number is the threshold level (1 is lowest, n is highest) which is being calculated.
Type A soft data calculations can be weighted in several different ways. Each type A soft data
calculation is a linear combination of the covariance calculated from the hard data, the hard-soft
cross-covariance and the soft data auto-covariance. Each portion of the calculation is weighted
separately and the three weights add up to one. Straight Pair Weighting weights the gamma
calculation towards the most numerous type of data within the lag. Generally, soft data outnumber
hard data and this type of weighting will favor the soft information. P1-P2 Scaled Pair Weighting
takes into account the quality, as well as the quantity, of the soft information when calculating the
weights. If the imprecision level of the soft data is high, the scaled weighting option is perhaps
more prudent. The equations used to calculate the weights are discussed in the Vario math section.
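As a rough illustration of Straight Pair Weighting only (the exact equations are given in the Vario math section; this sketch simply assumes weights proportional to the pair counts in the lag, which is the idea stated above):

```c
/* Straight Pair Weighting sketch: the hard-hard, hard-soft, and
 * soft-soft components are weighted by their relative pair counts,
 * so the three weights sum to one. Function and argument names are
 * hypothetical, not vario's actual code. */
static void straight_pair_weights(int n_hh, int n_hs, int n_ss, double w[3])
{
    double total = (double)n_hh + n_hs + n_ss;
    w[0] = n_hh / total;   /* hard data covariance weight        */
    w[1] = n_hs / total;   /* hard-soft cross-covariance weight  */
    w[2] = n_ss / total;   /* soft data auto-covariance weight   */
}
```

With, say, 10 hard-hard, 30 hard-soft, and 60 soft-soft pairs in a lag, the soft auto-covariance receives weight 0.6, showing how the more numerous soft data dominate under this rule (which is exactly why the P1-P2 scaled option exists for imprecise soft data).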
The Data Options dialog (Figure 8.4) allows the user to set the column numbers for the soft data
parameters. These parameters are the lower bound column for the data value, the upper bound
column for the data value and the soft data index column. Type A soft data have the lower and
upper bounds set equal to each other as do hard data. The uncertainty file which was opened above,
contains the indicator thresholds and the misclassification probabilities p1 and p2. Type B and C
soft data (interval and prior pdf) have unequal upper and lower bounds. For type B soft data these
bounds define the interval in which the actual data value is believed to lie. For type C data, the
lower and upper bounds define an interval within which the prior probability is defined at every
indicator threshold. The previously opened uncertainty file contains the indicator thresholds and
the prior probabilities at each threshold. Summary information on the soft data is displayed.
It is possible to display each component of the spatial measure by clicking on one of the Soft Data
Print Options. The default option is the combined calculation.
Indicator Threshold
The Experimental:Indicator Threshold menu option is used to define the indicator classes (Figure
8.5). It is generally only necessary to set the upper threshold to something other than the default
maximum. By doing this, values greater than the upper threshold are set to zero and those less than
or equal to the upper threshold are set to one. Further discretization is possible by using the lower
threshold. All values between the upper and lower thresholds are set to one; all others are set to
zero.
NOTE:
If the default minimum and maximum are used, there will be an error. There is no
variance if the data set is converted completely to 1s.
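The classification above can be sketched in a few lines. This is an illustration only, not part of UNCERT; the function name and values are ours:

```python
import numpy as np

def indicator_transform(values, lower, upper):
    """Set values in (lower, upper] to 1 and all others to 0."""
    values = np.asarray(values, dtype=float)
    return ((values > lower) & (values <= upper)).astype(int)

# Upper threshold of 5.0 only (lower threshold left at the default minimum):
print(indicator_transform([1.2, 3.5, 4.8, 7.1, 9.9], -np.inf, 5.0))  # [1 1 1 0 0]
```

Setting only the upper threshold reproduces the first case described above; supplying a finite lower threshold reproduces the second.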
Search Parameters
There are two methods for calculating experimental semivariograms. Which you use depends on
the nature of the data set and the desired results. One is designed for irregularly spaced (non-
gridded) data, and the other is designed for gridded data. The method used for gridded data is very
fast, but cannot be used with irregularly spaced data, and can only evaluate one direction (zero
half-angles, zero bandwidths). The method for irregularly spaced data (this can be used for gridded
data too) is much slower, but half-angles and bandwidths (discussed below) can be evaluated.
NOTE: The Gridded method currently does not support Cross-Semivariogram,
Covariance, Correlogram, and Soft indicator Covariance spatial variability models.
The user is allowed two choices as to where the first lag should be calculated. Depressing the First
Lag = 1/2 Lag Distance button will make the first calculation from all pairs of points within the
distance 0.0 to 1/2 the lag distance value. If the First Lag = 1/2 Lag Distance button is not
depressed, the first lag will be calculated from 0.0 to the lag distance value. If the Z-dimension is to
be used in the calculation, the Activate Z Dimension button must be depressed. To define the search
parameters for each spatial equation, define the number of models/directions desired, then press the
Define Directions button. This will create the dialog shown in Figure 8.7.
FIGURE 8-7. The Search Parameters:Point:Define Directions dialog is used to define the spatial
parameters used in the calculation of spatial continuity for irregularly spaced data.
From the search directions dialog, the lag distance, maximum search distance, direction bandwidth,
plunge bandwidth, horizontal and vertical search directions, and horizontal and vertical half-angles
can be defined. The Plot toggle turns plotting on and off for each line; otherwise the graph can get
very busy, and it can become difficult to determine which line is associated with each
model/direction. The search directions, half-angles and bandwidths are shown
schematically in Figure 8.8.
FIGURE 8-8. Key elements in defining the
search for data pairs in the semivariogram
analysis of the point P1. The point within
one lag is P4, and the points within two
lags are P2, P3, and P5. Point P3, however,
is outside both the tolerance angle and the
bandwidth for this search direction angle
and is therefore not evaluated as a pair
with P1 when evaluating γ*(h). These
same concepts may also be applied in
three-dimensions.
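The pair search in Figure 8.8 can be sketched as follows. This is a simplified 2D illustration of the irregularly spaced method, not UNCERT's actual implementation; the function name is ours:

```python
import numpy as np

def find_pairs(xy, direction_deg, half_angle_deg, bandwidth, lag, n_lags):
    """Bin data pairs by lag for one search direction (2D sketch)."""
    d = np.radians(direction_deg)
    u = np.array([np.cos(d), np.sin(d)])           # unit vector of the search direction
    bins = [[] for _ in range(n_lags)]
    for i in range(len(xy)):
        for j in range(len(xy)):
            if i == j:
                continue
            h = xy[j] - xy[i]                      # separation vector, tail -> head
            dist = float(np.hypot(h[0], h[1]))
            if dist == 0.0:
                continue
            along = float(h @ u)                   # projection onto the direction
            if along <= 0:
                continue                           # pair lies in the opposite half-plane
            angle = np.degrees(np.arccos(np.clip(along / dist, -1.0, 1.0)))
            perp = abs(h[0] * u[1] - h[1] * u[0])  # offset from the direction line
            k = int(dist // lag)
            if angle <= half_angle_deg and perp <= bandwidth and k < n_lags:
                bins[k].append((i, j))
    return bins
```

A pair is kept only when it falls within the half-angle cone and the bandwidth corridor, exactly the test that excludes P3 in Figure 8.8.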
Gridded Data
The Experimental:Search Parameters:Gridded Data menu creates the dialog shown in Figure 8.9.
This dialog is used to define how many spatial equations will be calculated and to display the grid
dimensions of the data.
FIGURE 8-9. The Search Parameters:
Gridded dialog is used to define how many
search directions will be calculated.
To define the search parameters for each spatial equation, define the number of models/directions
desired, then press the Define Directions button. This will create the dialog shown in Figure 8.10.
FIGURE 8-10. The Search Parameters:
Gridded Data:Define Directions dialog is
used to define the spatial parameters used in
the calculation of spatial continuity for
gridded data.
From the search directions dialog, the search direction and lag spacing are defined by specifying the
X, Y, and Z step sizes (integer values in units of rows, columns, and layers). The maximum lag is
specified by defining the maximum number of steps. The Plot toggle turns plotting on and off for
each line; otherwise the graph can get very busy, and it can become difficult to determine which
line is associated with each model/direction.
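The speed of the gridded method comes from shifting the whole grid by the step offsets instead of searching for pairs. A sketch of the idea (ours, not UNCERT's code; non-negative steps assumed):

```python
import numpy as np

def gridded_semivariogram(grid, row_step, col_step, n_steps):
    """gamma(h) along one grid direction; steps are non-negative integers."""
    grid = np.asarray(grid, dtype=float)
    nr, nc = grid.shape
    gamma = []
    for k in range(1, n_steps + 1):
        dr, dc = k * row_step, k * col_step
        if dr >= nr or dc >= nc:
            break                              # lag runs off the grid
        head = grid[dr:nr, dc:nc]              # values shifted by k steps
        tail = grid[0:nr - dr, 0:nc - dc]      # values at the original nodes
        gamma.append(0.5 * np.mean((head - tail) ** 2))
    return gamma
```

Because every node pairs with the node a fixed offset away, there is no angle or bandwidth test, which is why the gridded method supports only exact directions (zero half-angles, zero bandwidths).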
Spatial Equations
The Experimental:Spatial Equation dialog (Figure 8.11) is used to define which spatial continuity
equation will be used by vario. These spatial measures are calculated by the algorithms obtained
from the geostatistical package GSLIB (Deutsch and Journel, 1992). See the Vario Mathematics
section of this chapter. For a full understanding of the strengths and limitations of the different
spatial measures, the reader is strongly referred to the original source by Deutsch and Journel
(1992).
FIGURE 8-11. The Measures of Spatial Variability dialog.
Eleven different measures of spatial continuity are available (Figure 8.11). The standard
Semivariogram is the default method. For the Semivariogram, General Relative Semivariogram,
Pairwise Relative Semivariogram, Semivariogram of Logarithms, Semirodogram, Semimadogram,
and Indicator Semivariogram spatial equations, the head (or value) data column in the
Experimental:Hard Data dialog (Figure 8.2) must be defined. For the Cross-Semivariogram,
Covariance, Correlogram, and Soft Indicator Covariance spatial equations the head and tail
variables must be defined. For the Indicator Semivariogram and the Soft Indicator Covariance
spatial equations, the minimum and maximum indicator cutoffs must be defined (Figure 8.5,
discussed below). For the Soft Indicator Covariance spatial equation, the parameters in the
Experimental:Soft Data dialogs must be defined.
The Calculate button calculates the specified spatial measure and plots the resulting values at the
appropriate lag value.
NOTE: Not all spatial equations can be used with the gridded data solution method. See
comment above.
Covariance View
For aesthetic reasons, covariance calculations can be plotted as standard semivariograms by the
relationship formulated by Isaaks and Srivastava (1988). The Experimental:Covariance View menu
option allows the results to be plotted in the normal Covariance form or as a Semivariogram. This
option is available for the Cross-Semivariogram, Covariance, and Soft Indicator Covariance spatial
equations only. This relationship does not require any assumption of ergodicity as is required in the
more common relationship between the semivariogram and covariance (see Vario Mathematics
section).
2D Semivariogram
Described thus far were tools for calculating one-dimensional experimental semivariograms (for
example, Figure 8.12, a set of three models based on the same data and search direction, but with
different half-angles). One-dimensional models are good for describing the spatial variation in a
particular direction, but it can be difficult and time consuming to determine the principal
anisotropies of the data set from them. This process is simplified by calculating two-dimensional
experimental semivariograms. There are some limitations with the method though (discussed
below), so these models should be used to get an overall feel for the spatial variability of the data,
but one-dimensional models should still be calculated along the principal anisotropic axes as input
for the kriging solutions.
FIGURE 8-12. This graph shows a single data set evaluated for a single direction using three half-angles.
Search Parameters
The Experimental:2D Semivariogram:Search Parameters dialog (Figure 8.13) is used to define the
Lag Distance and Maximum Search Distance used in calculating the 2D semivariogram. Since
many data sets are actually three-dimensional, the orientation of the 2D plane must also be
specified. These are rotations, in degrees, around the X, Y, and Z axes. The rotation directions are
defined in Figure 8.14.
FIGURE 8-13. The 2D Experimental:Search Parameters dialog is used to define the lag distances
for the spatial equation calculation and define the 2D plane (for a 3D data set) that will be
evaluated. The rotation directions are shown in Figure 8.14.
FIGURE 8-14. This diagram shows the axis rotation directions.
NOTE: These are not the same rotation directions as described for sisim (Chapter 15). This
discrepancy will be corrected in a future release.
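As the note warns, rotation conventions differ between programs. The sketch below shows one common convention (rotate about X, then Y, then Z, counterclockwise positive); it illustrates the idea of orienting the 2D plane, and is not necessarily the exact convention vario uses:

```python
import numpy as np

def rotation_matrix(ax_deg, ay_deg, az_deg):
    """Combined rotation about X, then Y, then Z (one possible convention)."""
    ax, ay, az = np.radians([ax_deg, ay_deg, az_deg])
    rx = np.array([[1, 0, 0],
                   [0, np.cos(ax), -np.sin(ax)],
                   [0, np.sin(ax),  np.cos(ax)]])
    ry = np.array([[ np.cos(ay), 0, np.sin(ay)],
                   [0, 1, 0],
                   [-np.sin(ay), 0, np.cos(ay)]])
    rz = np.array([[np.cos(az), -np.sin(az), 0],
                   [np.sin(az),  np.cos(az), 0],
                   [0, 0, 1]])
    return rz @ ry @ rx

# Rotate 3D data into the chosen plane, then work with the first two coordinates:
points = np.array([[1.0, 0.0, 0.0]])
rotated = points @ rotation_matrix(0.0, 0.0, 90.0).T
```

Whatever the convention, the effect is the same: the data are rotated so that the plane of interest becomes the working 2D coordinate system.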
When all the parameters are defined, press Calculate on this dialog. This calculation may take
some time, so be patient. The calculation is basically computing eighteen experimental
semivariogram models with a five degree half-angle. When the calculation is complete, the pop-up
dialog shown in Figure 8.15 will be displayed.
FIGURE 8-15. Once the base calculation has been made for the 2D semivariograms, the final
experimental semivariograms can be calculated for various half-angles. This dialog can also be
used to pass the results to 2D and 3D visualization programs.
WARNING:
Do not close this dialog until you have created all the 2D half-angle maps you
want. If you do, you need to do the previous calculation again. Once the base
calculation is made, alternative half-angles can be produced very quickly.
Once the base calculation is complete, the data set can be evaluated using a limited number of
half-angles (5, 15, 25, ..., 75, 85, and 90 degrees; bandwidths are not implemented). This limitation
results from how the base calculation was made, but these choices should be adequate for most
purposes. Once a half-angle has
been selected, press Calculate on THIS dialog. When calculated, eighteen experimental
semivariograms will be displayed in the graph area (Figures 8.16a and 8.16b). This is a very busy,
and not very useful, way to examine the results. There are two other methods of viewing the
results that are much more useful. The first step is to select a file name (a new data file is going to be
created). The next step is to press the Grid or 3D Post button. If you press 3D Post, the data set
will be passed to the program block (Chapter 12), and you will see a display similar to Figure 8.17.
In block you can tilt, rotate, and print the results (refer to Chapter 12 for more details). If you press
Grid, the data will be passed to grid (for details see Chapter 9). Once in grid, you can select the
Method:Calculate menu option, then View:Contour Map. It will ask you if you want to save the
results; say Save & Plot (this will save the results to a file called junk.srf). You will then see a plot
similar to Figures 8.18a and 8.18b (see contour, Chapter 10, for more details).
These steps can quickly be repeated for each half-angle desired. Using this method, the principal
anisotropies (if present) can be quickly determined.
FIGURE 8-16. These two graphs each show eighteen experimental semivariograms calculated
every 10 degrees from North with a) a 25 degree half-angle, and b) an 85 degree half-angle. Note
that with increasing distance the results become much noisier, but this scatter is subdued with the
larger half-angle (as one would suspect). Also note that both models have a range of about 1.4. This
is generally within the zone of minimal scatter, so an argument could be made for using an
isotropic model.
FIGURE 8-17. This is the same data shown in Figure 8.16a except with the data located correctly in
three dimensions. The variance of the data set is exceeded near the middle of the figure, but at
longer ranges substantial variation can be seen. The greatest variance has a NE-SW trend.
Calculate
The Experimental:2D Semivariogram:Calculate button serves the same purpose as the Calculate
button on the Experimental:2D Semivariogram:Search Parameters dialog.
Calculate
The Experimental:Calculate option calculates the specified spatial equation and plots the resulting
values based on the parameters definable under the Experimental menu option. This calculates
one-dimensional semivariogram models only.
FIGURE 8-18a. This is the same data shown in Figures 8.16a and 8.17 except with the data located
correctly on a two dimensional contour map. The variance of the data set is exceeded near the
middle of the figure, but at longer ranges substantial variation can be seen. The greatest variance
has a NE-SW trend.
Jackknifing
Jackknifing is a technique that can be used with very small data sets (generally fewer than 50 to 100
data points) to evaluate the uncertainty in the semivariogram. With small data sets there are often
too few pairs in each lag to generate meaningful results, and minor changes in the search angles,
half-angles, bandwidths, or lag spacing can significantly change the resultant semivariogram. In
the simple case, this technique removes one data point at a time from the data set and calculates a
semivariogram for each reduced set. For n data points, this method generates n + 1 experimental
semivariograms (including the semivariogram of the full data set). At each lag distance there will be
a scatter of calculated γ*(h) values. Based on these values, a mean γ*(h), and the variance and
standard deviation, can be calculated for each lag. By plotting each γ*(h) value for each
experimental semivariogram at each lag, along with 95% confidence error bars, much of the
uncertainty of the semivariogram can be described (Figure 8.19).
NOTE: This method can be used on any data set, but because it generates n + 1 experimental
semivariograms, it can be computationally very expensive on large data sets. Also, in
general, when an experimental semivariogram is well behaved (little variation when
only the lag is changed), the uncertainty will be relatively small and the jackknifing
method will add little information.
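The leave-one-out scheme described above can be sketched generically. This is our illustration, not vario's code; a real jackknifed semivariogram would delete the point's coordinates as well and recompute γ*(h) at every lag:

```python
import numpy as np

def jackknife(values, stat_fn):
    """Leave-one-out recomputation of a statistic, plus the full-data value.

    Returns n + 1 results: one per deleted point, then the full data set.
    """
    values = np.asarray(values, dtype=float)
    results = [stat_fn(np.delete(values, i)) for i in range(len(values))]
    results.append(stat_fn(values))
    return results

# Scatter of a simple statistic (variance here) across leave-one-out subsets:
spread = jackknife([2.0, 4.0, 4.0, 6.0, 9.0], np.var)
```

The spread of the n + 1 results at each lag is what supplies the mean, variance, and 95% confidence error bars plotted in Figure 8.19.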
FIGURE 8-18b. This is the same data shown in Figure 8.16b except with the data located correctly
on a two dimensional contour map. The variance of the data set is exceeded near the middle of the
figure, and because of the large half-angle the results are nearly isotropic.
Modify
The Jackknifing dialog is shown in Figure 8.20. The jackknifing option allows for recalculation of
the spatial measure after removing a number of data points. The number of points removed can be
set as a fixed number or as a percentage of the total data set. Display options for the jackknifed
solution can be set by the user.
Individual Display
The Individual Display option allows the user to look at a single jackknifed solution in the display
portion of the vario package. The dialog is shown in Figure 8.21.
Calculate
The Jackknife:Calculate option calculates and jackknifes the specified spatial equation and plots
the resulting values based on the parameters definable under the Jackknife menu option.
FIGURE 8-19. This graph shows a series of jackknifed experimental semivariograms and the
associated uncertainties (95% confidence level). Circles represent individual lag γ(h) points for
jackknifed (all but one data point used) experimental semivariograms. Squares represent individual
lag γ(h) points for the full experimental semivariogram. The error bars show the variance at each
lag for the mean lag distance and the mean γ(h) value. The gray horizontal dashed band describes
95% of the variance for all the jackknifed data sets. The central dashed line in this band is the
variance for the full data set.
Graph
Graph allows the user to specify various attributes of the graph's appearance: the graph Border,
Error-Bar Styles, Fonts, Labels, Legends, Mesh, and Line Styles.
Border
Graph:Border is described in Chapter 5 in the Graph:Border section (Figure 5.9).
Error-Bar Style
Error-Bar Style controls the display characteristics of the error bars. Error bars only exist when a
jackknifed calculation has been done. The Error-Bar Style dialog is shown in Figure 8.22. The
Error-Bar Style settings are very similar to the settings available in the Style:Set Line Attributes
dialog as described in Chapter 5 (Figure 5.14). The only difference in the dialogs is the option to set
the extent of the cross-width on the error-bars.
Fonts
Graph:Fonts is described in Chapter 5 in the Graph:Fonts section (Figures 5.10 and 5.11).
Labels
Graph:Labels is described in Chapter 5 in the Graph:Labels section (Figure 5.12).
Legends
Graph:Legends:Data Parameters creates the pop-up dialog shown in Figure 8.23. The dialog
controls whether or not (Compare Figures 8.1, 8.12, and 8.16) the semivariogram Data Parameters
are shown on the graph, and if they are displayed, in which corner, and for which semivariogram
model.
FIGURE 8-23. The Data Parameters dialog.
Mesh
Graph:Mesh is described in Chapter 5 in the Graph:Mesh section (Figure 5.13).
Style
Graph:Line Styles is described in Chapter 5 in the Graph:Style section (Figure 5.14).
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, Clear, and Print. View Log, Save,
and Save as are similar in operation to the menu options under File described above.
Plot
Plot is described in Chapter 5 in the Plot section.
Model
The model menu option allows the user to launch the variofit package from within the vario
package. The model dialog is displayed in Figure 8.24. The dialog allows the option of placing the
launched variofit package in any one of the four corners of the screen. The user may decide to just
launch the variofit package several times in order to save displays of a spatial measure while
recalculating. In this fashion, it is possible to simultaneously compare the graphics from several
calculations.
FIGURE 8-24. The Model dialog allows the user to launch the variofit package.
Help
Help works exactly as explained in Chapter 5 (plotgraph, Figure 5.15) Help section.
If the Search Parameters:Points:Define Directions dialog remains on the screen, the Calculate
button at the bottom of that dialog can be used. Using the Calculate button in the
Experimental:Search Parameters dialog allows the user to adjust calculation parameters and
recalculate the spatial
measure without having to close the dialog. Results of the calculation of the indicator
semivariogram are shown in Figure 8.25. The display parameters can be adjusted within the Border
dialog.
FIGURE 8-25. This graph shows the example indicator experimental semivariogram.
[-dip {#.#}] [-dir {#.#}] [-dsd #] [-esp #] [-fl #] [-fnt1 ] [-fnt2 ] [-fnt3 ]
[-fnt4 ] [-fnt5 ] [-fnt6 ] [-fnts1 #.#] [-fnts2 #.#] [-fnts3 #.#] [-fnts4 #.#]
[-fnts5 #.#] [-fnts6 #.#] [-gml {#}] [-gxi {#}] [-gyi {#}] [-gzi {#}] [-gnd #]
[-hbw #.#] [-help] [-hi #.#] [-hw #.#] [-il #] [-ind #] [-jack] [-jcl #.#] [-jebd #]
[-jebr #] [-jev #] [-jir #] [-jjr #] [-jrp #] [-jrpc #.#] [-jrt #] [-lag {#.#}] [-lbc #]
[-lc {#}] [-lgf ] [-lgpa #] [-lgpp #] [-li #.#] [-lpbm #.#] [-lpc #] [-lpd #] [-lpf ]
[-lph #] [-lplm #.#] [-lpo #] [-lppsext ] [-lpq ] [-lpr] [-lprm #.#] [-lps #]
[-lptm #.#] [-lty {#}] [-ltk {#.#}] [-md #] [-mg #.#] [-mox #.#] [-moy #.#] [-ms #]
[-mx #.#] [-my #.#] [-nsi #] [-out ] [-prf ] [-rfh #] [-run] [-set #] [-sfa #] [-sic #]
[-sfl {#}] [-sp #] [-spo #] [-ssz {#.#}] [-sttl ] [-sty {#}] [-swo #] [-ttl ] [-ubc #]
[-unf ] [-vbw {#.#}] [-vc #] [-vw {#.#}] [-xc #] [-xfmt ] [-xlabel ]
[-xmax #.#] [-xmin #.#] [-xMt #.#] [-xmt #] [-xy #.#] [-yc #] [-yfmt ]
[-ylabel ] [-ymax #.#] [-ymin #.#] [-yMt #.#] [-ymt #] [-ys #.#] [filename]
Meaning of flag symbols:
#    = integer
#.#  = float
     = character string
{}   = variable is an array. Values must be separated by a "," and no spaces are
       allowed. Do not use the { } symbols on the command line.
NOTES:
default = junk.exp.dat
default = 0
-3rx    = default = 0.0
-3rx    = default = 0.0
-3rx    = default = 0.0
-calct  = default = 0
-ch      = default = 4
-ct      = default = 4
-dip {}  = default = 0.0
-dir {}  = default = 0.0
-dsd     = default = 1
-esp     = default = 0
-fl      = default = 0
-fnt1    = default = Helvetica-Bold
-fnt2    = default = Helvetica-Bold
-fnt3    = default = Helvetica
-fnt4    = default = Helvetica
-fnt5    = default = Helvetica
-fnt6    = default = Helvetica
-fnts1   = default = 24.0
-fnts2   = default = 15.0
-fnts3   = default = 15.0
-fnts4   = default = 12.0
-fnts5   = default = 10.0
-fnts6   = default = 12.0
-gml     = default = 1
-gnd     = default = 1
-gxi     = default = 1
-gyi     = default = 1
-gzi
-hbw {}  = default = 1/2th max diag
-help
-hi      = default = data max.
-hw {}   = default = 90.0
-il      = default = 1
-ind
-jack
-jcl
-jebd
-jebr    = default = 1
-jev     = default = 1
-jir     = default = 1
-jjr     = default = 1
-jrp     = default = 1
-jrpc    = default = 10%
-jrt     = default = 0
-lag {}  = lag spacing
-lbc     = lower bound column (soft data)
-lc {}   = line color
           0 = Black
           1 = White
           2 = Red
           3 = Green
           4 = Blue
           5 = Magenta
           6 = Yellow
           7 = Cyan
-lgf     = default = log.dat
-lgpa    = (0 = false, 1 = true), default = 1
-lgpp    = default = 1
-li
-lpbm
-lpc
-lpd
-lpf     = print filename, default = "junk.ps"
-lph     = print header page (0 = false, 1 = true), default = 0
-lplm    = default = 1.5
-lpo     = default = 0
-lppsext = default = "*.ps"
-lpq     = default = "ps"
-lpr
-lprm
-lps
-lptm    = default = 1.5
-lsfl {} = default = 0
-lsc {}  = line color (codes as for -lc), default = variable
-lssz {} = default = 9.0
-lsty {} = default = 0
-md      = dash mesh (0 = false, 1 = true), default = 0
-mg {}   = mag lag
-mox     = X mesh origin
-moy     = Y mesh origin
-ms      = use mesh (0 = false, 1 = true)
-mx      = X mesh frequency, default = 1/10 DX
-my      = Y mesh frequency, default = 1/10 DY
-nsi     = number of soft indicators, default = 8
-out     = output *.gam filename, default = junk.gam
-prf     = preference file name, default = "vario.prf"
-rfh     = screen refresh (0 = on exposure, 1 = on update), default = 0
-run
-set     = default = 0
-ltk {}  = default = 1.0
-lty {}  = default = 0
-sfa     = default = 2
-sic     = default = 6
-sp      = default = 1
-spo     = default = 0
-sttl    = Secondary title, default = (blank)
-swo     = soft weighting option (0 = straight pair weighting,
           1 = p1-p2 scaled pair weighting), default = 0
-ttl     = Main title, default = Filename
-ubc     = upper bound column (soft data), default = 5
-unf     = soft data uncertainty definition file, default = Undefined
-vbw {}  = vertical band width, default = 1/50th max lag
-vw {}   = vertical 1/2 angle, default = 90.0
-xc      = X data input column, default = 1
-xfmt    = number of decimal places for X-axis, default = ".2f"
-xlabel  = X-axis label, default = "X"
-xmax    = graph X-maximum, default = Data Maximum
-xmin    = graph X-minimum, default = Data Minimum
-xMt     = X main tic frequency, default = 1/10 DX
-xmt     = number of minor X tics, default = 5
-xto     = X axis label origin, default = 0.0
-xy      = X-Y ratio, default = 1.5
-yc      = Y data input column, default = 2
-yfmt    = number of decimal places for Y-axis, default = ".2f"
-ylabel  = Y-axis label, default = "Y"
-ymax    = graph Y-maximum, default = Data Maximum
-ymin    = graph Y-minimum, default = Data Minimum
-yMt     = Y main tic frequency, default = 1/10 DY
-ymt     = number of minor Y tics, default = 5
-ys      = default = Calculated
-yto     = default = 0.0
-zc      = default = 3
Bound 1    Bound 2    Index    Comments
23         104                 Hard Data
27         279                 Soft Data A
39         340                 Soft Data B
55         412        -2       Soft Data C
44         85         -1       No Conditioning Data
Type A soft data are entered in a manner similar to hard data in that a single value is assigned to
both ends of the interval. However, there is uncertainty in assigning this location to class 4. As was
seen in the discussion above, this uncertainty is quantified by the values p1 and p2. In the case of
type A data, the index value is a flag telling the simulation software where the values of p1 and p2
for each threshold are located. This index corresponds to the index in the uncertainty file. These
values of p1 and p2 will be entered in the uncertainty file as shown above. If p1 and p2 do not vary
spatially, every location conditioned with Type A soft data will have the same index. If p1 and p2 do
vary spatially, different locations will be characterized by different values of p1 and p2 and each set
of p1 and p2 values will have a different index. The index numbers can range from 2 to infinity.
Type B soft data are entered with the lower and upper bounds of the interval. Through some
technique, it is possible to determine that the location has a class value of 3, 4 or 5.
Type C soft data are entered in a manner similar to those of type B. The bounds define an interval,
but in the case of type C data the shape of the distribution between the interval bounds is known.
The index, from -2 to -infinity, is a flag telling the simulation software where to locate the
distribution within the uncertainty file that belongs within the interval bounds for this location.
Uninformed locations are given the maximum bounds of the observed data. For example, at each
location in the domain where there is no well, nor any soft data measurement, the simulated class
must be within the extremes of the classes observed throughout the site. Assigning the uninformed
locations to be within the maximum and minimum values observed during the site investigation
relies on the assumption that the maximum and minimum of the attribute within the domain have
been sampled. This may not be the case, and it may be reasonable to set the bounds on the
uninformed locations to define a greater interval than the observed interval.
Uncertainty File
The format of this file is given below by example:

    2 1
    2
    0.876 0.123
    0.921 0.147
    0.806 0.095
    3
    0.798 0.078
    0.932 0.134
    0.902 0.201
    -2
    0.145
    0.433
    0.780

The first line contains the number of sets of soft data type A (imprecise data) calibration sets and
the number of type C (prior probability) soft data cumulative probability distributions. For this
example file, there are two type A data sets and one type C prior cdf. The second line contains the
index for the first set of type A calibration probabilities. Type A indices are equal to or greater than
2. In this example file there are three indicator thresholds, and the next three lines hold the p1 and
p2 values for each threshold. Lines 6 through 9 are the index and the p1 and p2 values for the
second set of type A calibration probabilities. Line 10 holds the index for the only type C prior cdf.
Type C indices are equal to or less than -2. The last three lines of this example file are the values of
the prior cdf at each of the indicator thresholds.
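Assuming the layout described above, parsing the file can be sketched as follows. The function name is ours, not a vario utility, and the exact *.unc format should be checked against the software:

```python
def read_uncertainty_file(lines, n_thresholds):
    """Parse the *.unc layout sketched above (a guess at the exact format)."""
    it = iter(lines)
    n_a, n_c = (int(v) for v in next(it).split())
    type_a, type_c = {}, {}
    for _ in range(n_a):                 # each set: index, then a p1 p2 row per threshold
        idx = int(next(it))
        type_a[idx] = [tuple(float(v) for v in next(it).split())
                       for _ in range(n_thresholds)]
    for _ in range(n_c):                 # each cdf: index, then one value per threshold
        idx = int(next(it))
        type_c[idx] = [float(next(it)) for _ in range(n_thresholds)]
    return type_a, type_c
```

The type A indices (2, 3, ...) and type C indices (-2, -3, ...) returned here are the flags that conditioning-data records use to point into this file.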
Vario Mathematics
The following section presents the mathematics behind the calculations performed within the vario
package. The equations used to calculate the different measures of spatial continuity are presented
first. These equations are taken from the GSLIB Software Library (Deutsch and Journel, 1992).
The following description is paraphrased from Deutsch and Journel (1992).
Semivariogram
This is the traditional semivariogram measure. The gamma value is one half the average squared
difference between two variables separated by a vector h:

    \gamma(h) = \frac{1}{2N(h)} \sum_{i=1}^{N(h)} (x_i - y_i)^2    (8-1)

N(h) is the number of pairs of data points. x_i is the value at the start or tail of the pair and y_i is
the value at the end or head of the pair (Figure 8.26). Calculations with the semivariogram should
be limited to cases where the head and tail refer to the same variable (attribute). For different
variables, the cross-semivariogram should be employed.
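Given the tail and head values of the pairs found in one lag, equation 8-1 reduces to a few lines (an illustration, with made-up numbers):

```python
import numpy as np

def semivariogram_gamma(tails, heads):
    """Equation 8-1: half the mean squared difference over the N(h) pairs."""
    tails = np.asarray(tails, dtype=float)
    heads = np.asarray(heads, dtype=float)
    return np.sum((tails - heads) ** 2) / (2.0 * len(tails))

# Three pairs separated by the same vector h:
print(semivariogram_gamma([1.0, 2.0, 4.0], [2.0, 4.0, 5.0]))  # (1 + 4 + 1) / 6 = 1.0
```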
FIGURE 8-26. The concept of the tail and the head data points in a pair. The points are separated
by the vector h.
Cross-Semivariogram
A measure of cross variability defined as half the average product of h-increments corresponding
to two different variables (attributes):

    \gamma_{ZY}(h) = \frac{1}{2N(h)} \sum_{i=1}^{N(h)} (z_i - z'_i)(y_i - y'_i)    (8-2)

z_i and z'_i are the tail and head values of attribute z, respectively. Similarly, y_i and y'_i are the
tail and head values of the second attribute. The head and tail for both attributes are separated by
the vector h.
Covariance
This spatial measure is the covariance measure used in traditional statistics. Written in spatial
notation, this calculation is referred to as the non-ergodic covariance. The covariance measure
does not explicitly assume that the mean of the head variable and the mean of the tail variable are
equal:

    C(h) = \frac{1}{N(h)} \sum_{i=1}^{N(h)} x_i y_i - m_{-h} m_{+h}    (8-3)

where the means of the head and tail variables, denoted m_{+h} and m_{-h} respectively, are
calculated as:

    m_{+h} = \frac{1}{N(h)} \sum_{i=1}^{N(h)} y_i    (8-4)

    m_{-h} = \frac{1}{N(h)} \sum_{i=1}^{N(h)} x_i    (8-5)

If x and y refer to different variables, the covariance calculation determines the cross-covariance.
This calculation is used to determine the cross-covariance between hard and soft data.
Correlogram
The correlogram is the covariance calculation standardized by the respective tail and head standard
deviations:

    \rho(h) = \frac{C(h)}{\sigma_{-h} \, \sigma_{+h}}    (8-6)

where \sigma_{-h} and \sigma_{+h} refer to the standard deviations of the tail and head values
respectively. The standard deviations are calculated by:

    \sigma^2_{-h} = \frac{1}{N(h)} \sum_{i=1}^{N(h)} x_i^2 - m^2_{-h}    (8-7)

    \sigma^2_{+h} = \frac{1}{N(h)} \sum_{i=1}^{N(h)} y_i^2 - m^2_{+h}    (8-8)
When x and y refer to two different variables, this calculation becomes the cross-correlogram.

General Relative Semivariogram
The traditional semivariogram scaled by the squared mean of the data used for each lag:

    \gamma_{GR}(h) = \frac{\gamma(h)}{\left[(m_{-h} + m_{+h})/2\right]^2}    (8-9)

Pairwise Relative Semivariogram
A semivariogram in which each pair is scaled by the squared mean of that pair:

    \gamma_{PR}(h) = \frac{1}{2N(h)} \sum_{i=1}^{N(h)} \frac{(x_i - y_i)^2}{\left[(x_i + y_i)/2\right]^2}    (8-10)
Note:
Both the general relative and the pairwise relative semivariograms have been shown
to be resistant to data sparsity and outliers when applied to positively skewed data
sets. Because of the denominators in the calculations, the general and pairwise
relative semivariograms should be used only with positive variables.
Semivariograms of Logarithms
The traditional semivariogram calculated on the natural logarithms of the original variables:

    \gamma_L(h) = \frac{1}{2N(h)} \sum_{i=1}^{N(h)} \left[\ln(x_i) - \ln(y_i)\right]^2    (8-11)
Semirodogram
A measure of spatial variability similar to the traditional semivariogram, but using the square root
of the absolute difference between variables separated by a vector h:

    \gamma_R(h) = \frac{1}{2N(h)} \sum_{i=1}^{N(h)} \sqrt{|x_i - y_i|}    (8-12)

Note:
Rodograms and madograms are useful for determining the large-scale spatial
structure but should not be used for modeling the nugget value of spatial continuity.
Semimadogram
A measure of spatial variability similar to the traditional semivariogram, but using the absolute
difference of the two variables separated by a vector h:

    \gamma_M(h) = \frac{1}{2N(h)} \sum_{i=1}^{N(h)} |x_i - y_i|    (8-13)
Indicator Semivariogram
The semivariogram is constructed on an indicator variable. The indicator classes are constructed
within the program and an indicator threshold must be supplied. The indicator classification is
done as:
    ind_i = \begin{cases} 1 & \text{if } x_i \le cut_k \\ 0 & \text{otherwise} \end{cases}    (8-14)
Indicator Covariance
Indicator covariance is calculated by doing an indicator transform on the data and then using the
covariance equation. The advantage over the indicator semivariogram is that the assumption of
ergodicity need not be met when using covariance.
sandstone aquifer is log-normal with a given mean and variance). Table 8.2 summarizes the three
types of soft data. These three types of soft data are discussed below in terms of Bayesian statistics.
Type of Data         Format                                    Uncertainty Measure
Hard Data            single value z(x)                         no uncertainty
Soft Data, Type A    imprecise single value z(x)               quality index
Soft Data, Type B    probability interval (zmin(x), zmax(x))   interval width
Soft Data, Type C    probability distribution                  probability distribution

TABLE 8.2. Types of hard and soft conditioning data.
Bayesian Statistics
Bayesian statistics can be used to evaluate the three types of soft data. Bayes' Theorem states:

    P(A|B) = \frac{P(B|A) \, P(A)}{P(B)}    (8-15)

where A and B are discrete events. For continuous random variables X and Y, Bayes' Theorem is
expressed in terms of probability density functions (pdfs):

    f(x|y) = \frac{f(y|x) \, f_X(x)}{f_Y(y)}    (8-16)

where f_X(x) and f_Y(y) are the marginal pdfs of random variables X and Y respectively, and
f(x|y) is the pdf of random variable X given that Y = y, and vice versa for f(y|x) (x and y are the
specific values of X and Y in this instance) (Alabert, 1987).
For a random variable Z corresponding to an unknown attribute z at a given location, the prior
marginal pdf on z is fz(z). This pdf is derived from any knowledge of the variable in the area and is
a prior pdf as no experiment has yet been performed at this location to gain more knowledge of the
attribute z. This pdf, fz(z), summarizes the uncertainty on z at this location. An experiment
performed to gain knowledge of variable z at a given location will update the pdf on z:
f(z \mid e) = \frac{f(e \mid z) \, f_z(z)}{f_E(e)}    (8-17)
f(z|e) is the posterior distribution; it is a measure of the uncertainty on z after the experiment E.
f(e|z) is a likelihood function; it measures the likelihood of the outcome of the experiment, e, given
z, and thus quantifies the informative quality of the experiment E (Alabert, 1987).
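The update in Eqs. (8-15) and (8-17) can be illustrated with a discrete two-class sketch; the lithology classes, prior, and likelihood numbers below are all hypothetical:

```python
def bayes_posterior(prior, likelihood):
    # Posterior P(z | e) from a prior P(z) and a likelihood P(e | z);
    # the denominator is the marginal probability of the evidence e.
    joint = [p * l for p, l in zip(prior, likelihood)]
    evidence = sum(joint)
    return [j / evidence for j in joint]

# Hypothetical example: prior belief is 70% sand / 30% shale, and the
# observed outcome is far more likely over sand (0.9) than shale (0.2).
posterior = bayes_posterior([0.7, 0.3], [0.9, 0.2])
```

After the experiment, nearly all of the probability mass shifts to the class the outcome favors, which is exactly how an informative experiment updates the pdf on z.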
lithology is to be determined from seismic velocity measurements (soft data), the accuracy of this
method can be determined by examining the results of the experiment (interpreting lithology from
velocity) at locations where hard data are available (the wells). From the calibration samples, the
probabilities of misclassifying the attribute via the soft data are determined. These probabilities are
capable of fully characterizing the indicator likelihood function for each indicator cutoff, as will be
discussed in the Estimating the Z-cdf With Soft Data section. This calibration is essential,
otherwise it would be necessary to rely entirely on models of uncertainty which could be woefully
inadequate. Alabert (1987) discusses several models that could be used as likelihood functions
when it is impossible to quantify the precision of the experiment.
Hard Data
For hard data, each discrete class will contain a 0 or 1. This complete vector of 0s and 1s is the
discretized version of the step posterior cdf corresponding to the precise information z(x) (Alabert,
1987).
TABLE 8.3. Discretized posterior cdf values, at cutoffs z1 through zNc, for a hard datum and for
Type A, Type B, and Type C soft data.
As shown in Table 8.3, the type A soft data are encoded with a 0 or 1 at each discrete class. The
resulting vector of 0s and 1s contains imprecise information. For example, each indicator 1 has
a non-negligible probability that the true corresponding indicator is actually a 0. Thus the
imprecision of the type A indicators must be quantified. This quantification is determined by
covariances between imprecise indicators and covariances between imprecise indicators and hard
indicators. Under some assumptions, those covariances are shown to be related to hard indicator
covariances through a scaling factor which depends only on the misclassification probabilities p1
and p2 (Alabert, 1987, p. 31):
p_1 = P[\hat{Z}(x) \le z_c \mid Z(x) \le z_c]    (8-18)

p_2 = P[\hat{Z}(x) \le z_c \mid Z(x) > z_c]    (8-19)
where Z(x) is a binary function defined for a cutoff zc. The probabilities p1 and p2 are easily
estimated for each cutoff if calibration samples are available to determine the precision of the
experiment generating the soft data. If Ẑ(x) is a good measure of Z(x), the misclassification
probabilities for a cutoff zk can be summarized as:

                Ẑ(x) ≤ zk       Ẑ(x) > zk
Z(x) ≤ zk       p1              1.0 - p1
Z(x) > zk       p2              1.0 - p2
There are two cases where H0 is rejected and two cases where it is accepted:

                H0 is true          H0 is false
Accept H0       Correct decision    Type II error
                (case 1a)           (case 2a)
Reject H0       Type I error        Correct decision
                (case 1b)           (case 2b)
Thus, p1 is the probability of accepting the null hypothesis based on the soft data estimate when that
is the correct course of action. p2 is the probability of committing a type II error: accepting the null
hypothesis based on the soft data when that is an incorrect choice.
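Given co-located calibration pairs of hard values and soft estimates, p1 and p2 of Eqs. (8-18) and (8-19) can be estimated by simple counting; a sketch (function name and data hypothetical):

```python
def misclassification_probs(hard, soft, cutoff):
    # p1 = P[Zhat(x) <= zc | Z(x) <= zc]  (correct acceptance, Eq. 8-18)
    # p2 = P[Zhat(x) <= zc | Z(x) >  zc]  (type II error,      Eq. 8-19)
    below = [s for h, s in zip(hard, soft) if h <= cutoff]
    above = [s for h, s in zip(hard, soft) if h > cutoff]
    p1 = sum(1 for s in below if s <= cutoff) / len(below)
    p2 = sum(1 for s in above if s <= cutoff) / len(above)
    return p1, p2
```

The counts partition the calibration samples by the true (hard) class, then ask how often the soft estimate falls on the "accept" side of the cutoff within each class.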
coding of Type C soft data is similar to that developed by Journel (1986) in the original version of
the soft kriging method.
[Equations (8-20) through (8-22), which relate the imprecise indicator covariances to C(h), were not
recoverable from the source.]
Assuming the error on the soft indicators is stationary, p1 and p2 are independent of x and can be
written p1(zc) and p2(zc). These probabilities fully characterize the indicator likelihood function at
cutoff zc defined as:
L_{z_c}(\hat{i}, i) = P[\hat{i}(x, z_c) \mid i(x, z_c)]    (8-23)
Alabert (1987) makes the point that knowledge of the Nc indicator likelihood functions
is not equivalent to knowledge of the full likelihood function f(Ẑ | z);
however, determination of the full likelihood function is not practical, nor is it necessary to fully
account for the quality of the indicator information. Therefore, in practice, the likelihood function
is estimated through estimates of p1(zc) and p2(zc). A graphical method of determining p1 and p2 is
shown in Figure 8.28. The misclassification probabilities are calculated from the equations at the
top of the figure. Regions A and D are inclusive of any points lying on the vertical Zc. Region A is
also inclusive of any values lying directly on the horizontal Zc. The case of Type A soft data
without calibration samples will not be discussed here.
FIGURE 8-28. Graphical method of determining p1 and p2 for cutoff zc (after Alabert, 1987).
Two relations will be developed: the first is between hard indicator covariances and hard-soft
indicator cross-covariances and the second is between hard indicator covariances and soft indicator
covariances. Alabert (1987) derives a simple relationship between the hard indicator covariance
CI(h,zc) and the hard-soft indicator cross-covariance CIÎ(h,zc). Assuming stationarity of I and
Î, and that p1(zc) does not equal p2(zc):
C_I(h, z_c) = \frac{C_{I\hat{I}}(h, z_c)}{p_1(z_c) - p_2(z_c)}    (8-24)
An estimate of the hard indicator covariance can be derived from the hard-soft indicator covariance:
C_I^*(h, z_c) = \omega \, C_I^*(h, z_c)\Big|_{N_h} + (1 - \omega) \, \frac{C_{I\hat{I}}^*(h, z_c)}{p_1^*(z_c) - p_2^*(z_c)}\Bigg|_{N_h, N_s}    (8-25)
The subscript Nh refers to that portion of the estimate derived from the available hard indicators;
Nh, Ns denotes the estimate derived from the experimental hard-soft indicator cross-covariance. The
weight, ω, can be adjusted to account for both the number of pairs involved in each covariance
estimate and the quality of the soft indicators (Alabert, 1987).
The second case considers estimating the hard data covariance from the covariance between soft
data locations. When considering two soft indicators at two different locations, î(x,zc) and
î(y,zc), Alabert (1987) shows that the covariance between the two is a scaled version of the
hard data covariance between them:
C_{\hat{I}}(x, y, z_c) = [p_1(z_c) - p_2(z_c)]^2 \, C_I(x, y, z_c)    (8-26)

C_I(h, z_c) = \frac{C_{\hat{I}}(h, z_c)}{[p_1(z_c) - p_2(z_c)]^2}    (8-27)
Note that the quality of the soft information at both locations is taken into account. If one of the
two soft data values provides no information on its corresponding hard attribute (p1(zc)
approximately equals p2(zc)), then CÎ(x,y,zc) = 0. This model will break down when the errors have
a strong spatial correlation (Alabert, 1987). So, assuming stationarity of I and Î, and if p1(zc) does
not equal p2(zc), CI(h,zc) can be estimated:
C_I^*(h, z_c) = \omega_1 \, C_I^*(h, z_c)\Big|_{N_h} + \omega_2 \, \frac{C_{I\hat{I}}^*(h, z_c)}{p_1^*(z_c) - p_2^*(z_c)}\Bigg|_{N_h, N_s} + \omega_3 \, \frac{C_{\hat{I}}^*(h, z_c)}{[p_1^*(z_c) - p_2^*(z_c)]^2}\Bigg|_{N_s}    (8-28)
where Ns denotes the estimate of covariance derived from the experimental soft indicator
covariance. The weights should again be chosen to reflect the quality of the soft data and the
number of pairs involved in each of the experimental covariances. The three weights must sum to
1.0. The two weighting options within the software are described below.
The Straight Pair Weighting Option calculates the omega weights strictly by the quantity of hard
and soft data pairs within the lag spacing.
\omega_1 = \frac{N_h}{N_{total}}    (8-29)

\omega_2 = \frac{N_s}{N_{total}}    (8-30)

\omega_3 = 1.0 - (\omega_1 + \omega_2)    (8-31)
Where Nh and Ns denote the number of hard data and soft data pairs within the lag spacing. Ntotal
equals the total number of pairs within the lag spacing. The use of soft data to estimate the
covariance, or semivariogram, allows the maximum number of pairs available for estimation to be
Nh² + Nh·Ns + Ns² rather than just Nh² if only hard data were available.
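The Straight Pair Weighting option of Eqs. (8-29) through (8-31) can be sketched directly from the pair counts (function name assumed for illustration):

```python
def straight_pair_weights(n_hard, n_soft, n_total):
    # Eqs. (8-29) to (8-31): weights proportional to the hard and soft
    # pair counts within the lag spacing; the three weights sum to 1.0.
    w1 = n_hard / n_total
    w2 = n_soft / n_total
    w3 = 1.0 - (w1 + w2)
    return w1, w2, w3
```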
For a full theoretical development of these relationships the reader is referred to Alabert (1987).
Alabert (1987) performed experimental checks for the relationships presented in this section and
found that even small amounts of soft data can improve the estimation of the spatial correlation
relative to using only hard data.
as a 1.0. The pcdf value for each threshold is then used to determine how that data location is
weighted in the semivariogram calculations.
Kulkarni (1984) devised a semivariogram equation which took into account different weights of
points at different locations.
\gamma(h) = \frac{\sum_{j=1}^{N(h)} K_j(h)\,[i(x_j) - i(x_j + h)]^2}{2 \sum_{j=1}^{N(h)} K_j(h)}    (8-32)
Where the quantity K is the weight of the data pair calculated as the product of the pcdf value for
each point at the current indicator threshold. The number of pairs of data that make up the gamma
calculation at each lag is no longer necessarily an integer value since it is the sum of the K weights.
The previous equation can be written in terms of a covariance calculation:
C(h) = \frac{\sum_{j=1}^{N(h)} K_j(h)\, i(x_j)\, i(x_j + h)}{\sum_{j=1}^{N(h)} K_j(h)} - m_{-h}\, m_{+h}    (8-33)

m_{-h} = \frac{\sum_{j=1}^{N(h)} K_j(h)\, i(x_j)}{\sum_{j=1}^{N(h)} K_j(h)}    (8-34)

m_{+h} = \frac{\sum_{j=1}^{N(h)} K_j(h)\, i(x_j + h)}{\sum_{j=1}^{N(h)} K_j(h)}    (8-35)
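The weighted semivariogram calculation can be sketched for one lag as follows; each pair carries a weight K, the product of the two points' pcdf values at the current threshold (function name assumed, and the squared-difference numerator is the standard semivariogram form):

```python
def weighted_gamma(pairs):
    # Eq. (8-32): one lag's semivariogram value, where each
    # (i_x, i_xh, k) pair contributes its squared difference scaled by
    # its weight k, normalized by the sum of the weights rather than
    # by an integer pair count.
    ksum = sum(k for _, _, k in pairs)
    return sum(k * (a - b) ** 2 for a, b, k in pairs) / (2.0 * ksum)
```

With all weights equal to 1.0 this reduces to the ordinary experimental semivariogram for the lag.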
Jackknifing
A problem with using experimental semivariograms is that a method for directly measuring their
uncertainty, error, or confidence limits is not available. This is because for each lag, there is only a
single calculable mean. It may be the mean value of numerous calculations at the given lag, but by
trying to measure the variance of the deviation around the mean, one is in effect calculating the
variance of the data set variance at that lag. As a result, as the range increases, typically so does the
variance, and the results are basically useless. The calculated variance is generally equal to or
greater than the mean value for the lag (i.e. at a given lag, the mean value with 95%
confidence generally falls between a value less than zero (effectively zero) and more than twice the
calculated value; this is not useful information).
To sidestep this problem, a process called jackknifing (cross-validation) is used (Shafer and
Varljen, 1990; Davis, 1987). Jackknifing is a procedure where one (or more) data point is
removed from the data set, and then the experimental semivariogram is calculated. By repeating
this procedure for every point in the data set, a series of n (n = number of samples) experimental
semivariograms is calculated. For each lag distance there are now n mean γ(h) values. From these
values it is then possible to approximately determine, for example, the 95% confidence limits for
the mean γ(h) value for a particular lag. When these are plotted, the error bars define the possible
range of the modeled semivariogram. There is a problem with this method: each mean value
calculated is correlated with the other mean values calculated at that lag (the same data, except for
one point, are being used), therefore the variance calculations are not strictly correct (Davis, 1987).
As will be explained, this technique is not being used to prove a particular semivariogram model is
correct, which it cannot do (Davis, 1987), but to guide the modeler in collecting further data or
identifying a likely range of reasonable model semivariograms.
Two other concepts should be considered when jackknifing. First, because data points are being
removed from the data set to calculate the experimental semivariogram, the variance, and therefore
the calculated sill will generally increase slightly. With more data the population is better defined,
and the variance is lower. Secondly, when a single experimental semivariogram based on all the
data is calculated, the results may appear to be easily modeled. The problem with a single
experimental semivariogram is that it is difficult to determine if it represents the true nature of the
site, or if the modeler was fortunate in selecting lags. By jackknifing the data the error-bars let the
modeler determine how much confidence can be attributed to the modeled semivariogram.
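Deriving approximate limits at one lag from the n leave-one-out γ(h) estimates can be sketched as below. As noted above, the estimates are correlated, so the limits are a guide rather than strict confidence bounds; the normal-theory form used here is an assumption:

```python
from math import sqrt

def jackknife_limits(gammas, z=1.96):
    # Approximate limits on the mean gamma(h) at one lag from the n
    # jackknifed estimates (z = 1.96 for a 95% confidence level).
    n = len(gammas)
    mean = sum(gammas) / n
    var = sum((g - mean) ** 2 for g in gammas) / (n - 1)
    half = z * sqrt(var / n)
    return mean - half, mean + half
```

Repeating the calculation at every lag gives the error bars plotted over the experimental semivariogram.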
Bibliography (vario)
Alabert, F., 1987, Stochastic Imaging of Spatial Distributions Using Hard and Soft Information,
M.S. Thesis, Stanford University, Stanford, California.
Davis, B.M., 1987, Uses and Abuses of Cross-Validation in Geostatistics, Mathematical Geology,
Vol. 19, No. 3, pp 241-248.
Deutsch, C.V. and A.G. Journel, 1992, GSLIB: Geostatistical Software Library and Users Guide,
Oxford University Press, New York, 340 pp.
Englund, E. and A. Sparks, 1988, GEO-EAS, U.S. Environmental Protection Agency,
Environmental Monitoring Systems Laboratory, Las Vegas, Nevada, EPA/600/4-88/033.
Isaaks, E., and R.M. Srivastava, 1988, Spatial Continuity Measures for Probabilistic and
Deterministic Geostatistics, Mathematical Geology, Vol. 20, No. 4, pp. 313-341.
Journel, A., 1986, Constrained Interpolation and Qualitative Information, Mathematical Geology,
Vol. 18, No. 3, pp. 269-286.
Shafer, J.M. and M.D. Varljen, 1990, Approximation of Confidence Limits on Sample
Semivariograms From Single Realizations of Spatially Correlated Random Fields, Water
Resources Research, Vol. 26, No. 8, pp 1787-1802.
CHAPTER 9
Model Semivariogram:
Variofit
The variofit application is used to fit model semivariograms to experimental and jackknifed
experimental semivariograms (generated by vario). This can be done manually or automatically
using least-squares regression or Latin-Hypercube sampling techniques. Ergodic variations of the
model semivariogram from simulation series may also be evaluated.
The variofit application is composed of three sections: the main menu-bar, the status and log text
area, and the drawing or graph area. The menu-bar is used to select all variofit commands, the log/
status area is used by the program to report important messages or results, and the drawing area is
the display area for the semivariogram graph.
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Open, View Data, View Results, Save, Save as, Save Preferences, Print Setup, Print, Quit,
and Quit Without Saving.
FIGURE 9-1. This is an example of the variofit application window. The main menu-bar is on the
top of the application window, with the log status window, and the drawing area below.
Open
Selecting File:Open generates a pop-up dialog which allows the user to select an existing data file.
This dialog operates exactly as the File:Open dialog in Chapter 5 (plotgraph, Figure 5.2).
However, unlike plotgraph, the default data file extension is *.gam (see vario, Chapter 8, for the
data file format).
View Data
File:View Data pops up a simple screen editor showing the file that the current experimental
semivariogram or jackknifed experimental semivariogram is based upon.
View Results
File:View Results pops up a simple screen editor showing the file containing the last saved results
of a model fit calculation. See Output Data File Format Section for data file format.
Save
File:Save saves the results of the latest calculation to a file. If a save file has already been opened,
the data are simply saved. If a save file has not been selected yet, a pop-up dialog similar to that
used in File:Open (Figure 5.2) is created. The main difference between the Open and the Save
dialog is that to save a file, the file does not have to pre-exist. For a description of how the dialog
works, see the Open section above and substitute Save for Open wherever appropriate.
The graph lines will be saved using one of the two file formats specified in the File Output
section (single format, and multiple jackknifed format).
Save as
File:Save as is identical to File:Save described above, when a file has not been selected yet. This
option can be used to save the file for the first time, or save results to a new file.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
reasonable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, variofit determines how all the input
variables are currently defined and writes them to the file variofit.prf.
WARNING:
If variofit.prf already exists, you will be warned that it is about to be overwritten. If you do not want the old version destroyed you must move it to a
new file (e.g. the UNIX command mv variofit.prf variofit.old.prf would be
sufficient). When you press OK the old version will be over-written! Moving
the old file cannot be done currently from within the application. To rename
the file you will have to execute the UNIX mv command from a UNIX prompt in
another window. If variofit.prf does not exist in the current directory, it is
created. This is an ASCII file and can be edited by the user. See Appendix C
for details.
Print Setup
File:Print Setup works exactly as explained in Chapter 5.
Print
File:Print generates a PostScript file of the calculated spatial measure, and depending on how the
print options are defined in Print Setup, directs this file to the specified print queue or to the
specified file.
Quit
File:Quit terminates the program, but if additions have been made to the graph, the user will first be
queried to supply a file to save the changes in.
Regression
The Regression menu option is used to fit a single model semivariogram to an experimental
semivariogram. Manual and automatic fitting routines are available. For a description of the
mathematics used for defining, or for automatically calculating, a model semivariogram, see the
Mathematics Section below.
Calculation Method
There are two options for model calculation, Automatic and Manual. The Automatic method uses a
least-squares regression and is useful for getting an approximate model fit. For many
solutions this automated fit may give reasonable results, but rarely is the automated least-squares fit
the best fit for the model semivariogram; the algorithm still needs work, and it sometimes gives
very poor results. Fitting a model semivariogram is a bit of an art, and a geostatistician may prefer
a slightly different model. The automated solution is, though, good for making a first
approximation. The Manual solution method is used when the user wishes to improve upon the
automatic fit, or define the model directly. Here the range, sill, and nugget values can be adjusted
as desired, and the model semivariogram can be displayed with the experimental semivariogram.
Automatic
To automatically fit a model semivariogram to an experimental semivariogram, several steps have
to be followed:
1) Open an experimental semivariogram data file (File:Open).
2)
3)
4)
5)
6)
While the regression is being calculated, the range increment for nests 4, 3, and 2 will be displayed
in the status log window along with the largest error calculated for all increments of the first nest
range. Once the regression is complete, the best-fit model semivariogram will be drawn in the
graphics area and the optimum range, sill, and nugget values will be updated within the pop-up
regression dialog.
NOTE: Any user input values for Range, Sill, or Nugget will be overwritten when an
automatic solution is calculated.
WARNING:
There are still bugs in this algorithm and the optimized solution may be an
extremely poor fit!
WARNING:
The automatic solver uses an iterative approach (there are more unknowns than
equations) to optimize the model solution. Because of this it is not
recommended that this solver be used to fit three or four nested structures.
Using the default value for Division Steps (100) a two-nested model takes
approximately 100 times as long to solve as a single-nested model (i.e. 30 sec.
vs. 0.3 sec). For the same data set, a three-nested model would take 50
minutes, and a four-nested model several days!
Manual
To manually fit a model semivariogram, two options are available. Parameter values may be
entered directly, or a fitting tool with slider bars can be used (discussed below). To fit a model by
directly entering values, follow these steps:
1)
2)
3)
4)
5)
6)
7)
The described model semivariogram will be displayed in the graphics area, superimposed over the
experimental semivariogram (Figures 9.2 and 9.3). To refine the model fit further, repeat steps 4)
through 7).
FIGURE 9-2. This is an example experimental semivariogram file imported from vario (Chapter 8).
To fit a model manually using the slider fitting tool, select the Manual Calculation Method and
select the Regression:Slider Fitting menu item. This will produce the dialog shown in Figure 9.4.
Parameters
Selecting Regression:Parameters generates the pop-up dialog shown in Figure 9.5. This dialog
describes everything necessary to fit a model semivariogram to an experimental semivariogram.
Up to four nested structures can be modeled. For each nested structure (at least one is required) the
Range and Sill values may be entered, and the model type defined (Linear, Linear with Sill,
Spherical, Gaussian, Exponential, and Logarithmic are valid model types). No Model indicates
that the defined model nest and any higher nest will not be evaluated (The first column on the left
defines the first nest structure, the fourth column on the right defines the fourth nest structure). The
Nugget may also be defined.
Because the graph can get very cluttered when there are several experimental semivariograms
displayed, the Visible Models button was added. When pressed, the pop-up dialog shown in Figure
9.6 is created. With this dialog, different experimental semivariograms can be turned on and off.
When an automatic solution is desired several other parameters need to be defined. The automated
method uses an iterative solver because there are more unknowns than equations. Because of this,
some limits must be placed on the iteration. By default, the maximum possible range of the data set
(Starting Range) is considered to be the maximum diagonal distance across the data set, and the
FIGURE 9-3. This graph shows the same experimental semivariogram as in Figure 8.4, but a
two-nested spherical model semivariogram has been fit to the data. The figure was printed in
landscape format.
minimum possible range is considered 0.0 (Ending Range). In the iterative solution, this possible
range of Range values is divided into 100 (default value for Calculation Steps - Divisions)
subdivisions. For the first iteration, a best-fit model will be calculated assuming the maximum
range. For the next iteration, the range will be set to the maximum range - (maximum range /
100.0). This will be repeated until the minimum range (0.0) is used. For each iteration, the
least-squares error for each model semivariogram will be saved. The model semivariogram with
the smallest squared error is considered the best fit.

FIGURE 9-5. Regression pop-up dialog. This dialog is used for defining the parameters necessary
for fitting a model semivariogram.

In many situations, it may be reasonable to modify these values. For example, by examining the
experimental semivariogram in Figure 9.2, it is clear the range is less than 65 meters. If the data
set's maximum possible range were 500 meters, a great deal of computational effort would be
wasted examining ranges greater than 70 meters. A
similar argument can be made for the Ending Range. The number of Divisions defines the fineness
of the steps used to determine the range. If the range is fairly narrow fewer divisions may be
reasonable; if a very wide range is possible more divisions may be necessary.
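The iteration described above amounts to a grid search over candidate ranges. A sketch, where `model(lag, rng)` stands in for any user-supplied model semivariogram function (the signature is assumed for illustration):

```python
def best_range(lags, gammas, model, start_range, end_range, divisions=100):
    # Step the candidate range from start_range down to end_range in
    # `divisions` equal increments; keep the candidate whose model curve
    # gives the smallest sum of squared errors against the data.
    step = (start_range - end_range) / divisions
    best_rng, best_err = None, float("inf")
    for i in range(divisions + 1):
        rng = start_range - i * step
        err = sum((model(l, rng) - g) ** 2 for l, g in zip(lags, gammas))
        if err < best_err:
            best_rng, best_err = rng, err
    return best_rng
```

Narrowing the start/end limits, or reducing the divisions, shrinks the search exactly as described in the text.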
NOTE:
The program supports only models with up to four structures. It is felt that only in
an extremely unusual case would even four structures be justified, much less five or
more.
NOTE:
The range for each model structure must be between the Starting and Ending Range
parameters. Do not over constrain the solution.
NOTE:
For multi-nested structures, the number of Calculation Steps- Divisions is the main
control on how long the automatic solution takes. Reducing the number of divisions
can reduce the solution time significantly!
Slider Fitting
The Slider Fitting tool shown in Figure 9.4 can be used in the Manual calculation mode. Using
this tool, the Nugget, Cnest, and Rangenest values can be fit to the experimental semivariogram.
When the slider bars are moved, the current parameter value will be used to redraw the model
semivariogram on the graphics display. By moving the slider bars (with a little practice), and
selecting the appropriate nest, the model semivariogram can be moved and adjusted to overlay the
experimental semivariogram.
Solution Method
When an automatic solution is selected, a least-squares method must be defined. This method
defines how the least-squares regression algorithm will weight and consider errors between the fit
model and the experimental semivariogram. The options are displayed in a pop-up dialog (Figure
9.7) and explained below:

FIGURE 9-7. Model Fitting Solution Method dialog.
Least-Squares - This method fits a best-fit model curve to the data. No adjustments are made
for the nugget, variance, or the relative number of pairs per lag.
Nugget LS - This method determines the nugget by determining the y-intercept of a line drawn
        through the first two γ(h) lag values. Once the nugget is determined the standard
        least-squares approach used above is applied.
Variance LS - This method defines the sill as equaling the variance of the data set. This forces
        the model curve to honor the data set variance; this is also commonly done by
        geostatisticians when manually fitting model semivariograms. Once the sill is
        defined the standard least-squares approach used above is applied.
Nugget & Variance LS - This method combines the Nugget LS and Variance LS methods
described above.
Weighted LS - This method is a normal least-squares regression, except that it weights the error
        for each lag based on the number of sample pairs used to generate that lag's
        γ(h) value (i.e., this method presumes that it is more important to honor a
        data point representing 500 data pairs than one representing 3 data pairs). The previous
        four methods do not consider the number of data pairs, therefore they weight all lag values
        equally.
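The weighted objective can be sketched as follows (a hypothetical helper, not the program's internal routine):

```python
def weighted_sse(lags, gammas, npairs, model):
    # Weighted least-squares objective: each lag's squared error is
    # scaled by the fraction of the total pairs it represents, so a lag
    # backed by 500 pairs counts far more than one backed by 3.
    total = sum(npairs)
    return sum((n / total) * (model(l) - g) ** 2
               for l, g, n in zip(lags, gammas, npairs))
```

Setting every pair count equal recovers the plain (unweighted) least-squares objective up to a constant factor.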
Calculate
Selecting the Regression:Calculate menu option will cause the program to calculate and plot a
single model semivariogram based on how the parameters are set in the Regression:Modify dialog
(Figure 9.5). When the calculation is complete the model parameters and Mean Square Error
(MSE) are printed to the log/status window. A word of caution: if nested structures are being
calculated, particularly three or four, this procedure may take a very long time if the Automatic
calculation method is used. Before you use this many nests, examine the experimental
semivariogram carefully; rarely are this many nested structures justified.
NOTE: This Regression is not equivalent to the regression option described in Chapter 5 for
plotgraph.
Jackknife
When a series of jackknifed semivariogram models is loaded (See vario, Chapter 8), various
information can be displayed about the group of experimental semivariograms.
Modify
Jackknife:Modify creates the pop-up dialog shown in Figure 9.8. This dialog allows the user to
specify what information is displayed about the jackknifed experimental semivariograms. Draw
Intermediate Results determines whether the experimental semivariograms based on the number of
data points minus one are displayed. Draw Unjackknifed Solution determines whether the
experimental semivariogram based on all of the data points is displayed. Draw Error-Bars
determines whether the vertical error-bars representing the confidence limits of the γ(h)
value for each lag are drawn. Draw Error-Bands determines whether the
horizontal error-bars representing the confidence limits of the lag distance are drawn. Draw Range
of Jackknifed Variances: because each jackknifed experimental semivariogram is based on a
slightly different set of data points the variance for each solution is also different; this option
determines whether this variation is drawn. Finally, the confidence level for the error-bars can be
specified; 95% is the default.
FIGURE 9-8. Jackknife Display pop-up dialog. When a series of jackknifed experimental
semivariograms is loaded, this dialog allows the user to specify how pertinent information about
uncertainty is displayed and whether the jackknifed experimental and the full experimental
semivariograms are displayed.
Individual Display
Jackknife:Individual Display creates the dialog shown in Figure 9.10 and allows the user to display
each jackknifed experimental semivariogram individually (Figure 9.9). When the dialog is
destroyed, all of the jackknifed semivariograms and other relevant information are redisplayed. The
jackknifed experimental semivariograms can be displayed by ID number (ID number set by order
of occurrence in the data file), or they can be stepped through by pressing the Next and Previous
buttons.
Latin-Hypercube
Latin-Hypercube sampling is used in this package to select a limited but representative
group of model semivariograms which honor the confidence limits of a series of jackknifed
experimental semivariograms (see the Mathematics Section for the theory behind this approach).
This approach is not meant for use on single experimental semivariograms or best-estimate kriging.
The model semivariograms output from this technique are used as input to the SISIM3D stochastic
conditional simulation program (Chapter 14).
FIGURE 9-9. The graph shows a single jackknifed experimental semivariogram and the variance
(dashed line) of the associated data set (full data set less one point). The error-bars show the
variance for all the jackknifed solutions.
FIGURE 9-10. Jackknifed Individual Display pop-up dialog. This dialog allows the user to
individually display each jackknifed experimental semivariogram and the full experimental
semivariogram.
Parameters
Latin-Hypercube:Modify generates the pop-up dialog shown in Figure 9.11. This dialog is used to
specify the parameters which control the latin-hypercube sampling process.
Unlike the Regression:Parameters dialog however, only two nested structures can be modeled.
Because of the lack of data inherent when this process is used, modeling even two structures is
probably suspect. For most cases there is probably little justification for using more than one
structure. As with the Regression:Parameters dialog No Model, Linear, Linear with Sill,
Spherical, Exponential, Gaussian, and Logarithmic models can be used for each structure. Instead
of specifying single Range, Sill, and Nugget values though, the range of values must be specified,
because this is a randomized model fitting procedure.
For each variable (Range, Sill, and Nugget), a Minimum and Maximum value must be specified
(the Minimum may equal the Maximum, in which case a constant is implied). This specifies the
range of allowable or reasonable values. These entries are preset with default values:
FIGURE 9-11. Latin-Hypercube Fitting pop-up dialog. This dialog is used to define the parameters
which constrain how model semivariograms will be selected, using a Latin-Hypercube Monte-Carlo
sampling technique, to fit a series of jackknifed experimental semivariograms.
Range: The Minimum Range is set by the longest lag whose confidence interval falls
completely below the confidence interval of the jackknifed experimental
semivariogram variances (no shorter lag may fall above the variance either), or,
failing this, to the minimum confidence interval range of the shortest lag
(realistically, it might be best to set this value to 0.0 in this case, since the possibility
of a pure nugget effect cannot be disregarded) (Figure 9.12). The Maximum Range is
set by the shortest lag whose confidence interval lies completely above the
variance confidence interval. In many models this never occurs, and the Maximum
Range is set to the maximum extent of the maximum lag confidence interval (Figure
9.12, where the Maximum Range is set to 436 feet).
Variance: The Minimum Variance is set to the lower bound of the jackknifed experimental
semivariogram confidence interval, and the Maximum Variance is set to the upper
bound (Figure 9.12, Minimum = 0.2613, Maximum = 0.2841).
Nugget: The Minimum Nugget is set to 0.0. The Maximum Nugget is set to the upper
bound of the shortest lag confidence interval, or the upper bound of the jackknifed
experimental semivariogram confidence interval, whichever is least (Figure 9.12,
Minimum = 0.0, Maximum = 0.2762).
FIGURE 9-12. This graph shows a series of jackknifed experimental semivariograms and the
associated uncertainties (95% confidence level). Circles represent individual lag γ(h) points for
jackknifed (all but one data point used) experimental semivariograms. Squares represent
individual lag γ(h) points for full (all data points used) experimental semivariograms. The error
bars show the variance at each lag for the mean lag distance and the mean γ(h) value. The gray
horizontal dashed band describes the 95% confidence interval of the variance for all the jackknifed data sets. The
central dashed line in this band is the variance for the full data set.
If the Minimum and Maximum entries appear inappropriate, the user is free to alter their values.
There are many justifications for doing so.
Each interval (Range, Variance, and Nugget) can be divided into one to ten Divisions; by default
these Divisions are of equal width. If non-equal Division intervals are desired, press the Define
button for the appropriate parameter. When this is done, one of the pop-up dialogs shown in
Figures 9.13a, 9.13b, and 9.13c will be created. In this dialog the desired absolute division
intervals can be assigned. If the Range Increments have been adjusted and equal increments are
desired, press the Set Divisions Equal button. By default the Range and Nugget are divided into
four Divisions and the Variance one division. In this case, when the Latin-Hypercube sampled
semivariograms are calculated (Select the Latin-Hypercube:Calculate main menu-bar option), 16
model semivariograms will be calculated (4 Range x 1 Variance x 4 Nugget divisions x Number of
Solutions per Division). An example is shown in Figure 9.14a. For each model semivariogram, the
model Range, Variance, and Nugget is randomly selected from within each Range Interval (Equally
Spaced selections from within the Range Interval are Possible in addition to the Random Range
Selection Option).
NOTE: As long as no parameters are changed in this dialog, the randomly selected model
semivariograms will always be exactly the same, because the same initial
Randomize Seed is used. This allows the user to repeat previous work. To generate
a new series of random model semivariograms, select a new Randomize Seed (any
integer value).
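The division-and-sample scheme described above can be sketched as follows. This is an illustrative reconstruction, not UNCERT code; the function name, the equal-width divisions, and the use of Python's `random.Random` are assumptions. A fixed seed reproduces the same 16 models, mirroring the Randomize Seed behavior:

```python
import random

def latin_hypercube_models(range_bounds, variance_bounds, nugget_bounds,
                           n_range=4, n_var=1, n_nug=4, seed=1):
    """Draw one (range, variance, nugget) triple per cell of the
    division grid, sampling randomly within each division interval."""
    rng = random.Random(seed)  # fixed seed -> repeatable model set

    def divisions(lo, hi, n):
        # equal-width divisions; variofit also allows user-defined widths
        w = (hi - lo) / n
        return [(lo + i * w, lo + (i + 1) * w) for i in range(n)]

    models = []
    for r_lo, r_hi in divisions(*range_bounds, n_range):
        for v_lo, v_hi in divisions(*variance_bounds, n_var):
            for g_lo, g_hi in divisions(*nugget_bounds, n_nug):
                models.append((rng.uniform(r_lo, r_hi),
                               rng.uniform(v_lo, v_hi),
                               rng.uniform(g_lo, g_hi)))
    return models

# bounds taken from the Figure 9.12 example
models = latin_hypercube_models((10.0, 230.0), (0.2613, 0.2841), (0.0, 0.2762))
print(len(models))  # 4 x 1 x 4 divisions x 1 solution per division = 16
```

Changing `seed` yields a different set of 16 models, just as changing the Randomize Seed does in the dialog.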
FIGURE 9-13 a,b,c. Sampling Division Range pop-up dialogs. These dialogs are examples of the dialogs
used for specifying the range increments for Range, Variance, and Nugget for the
Latin-Hypercube sampling model fitting algorithm.
Just because 16 model semivariograms have been calculated does not mean that they all honor the
confidence intervals of the jackknifed experimental semivariogram. As shown in Figure 9.14a,
three model semivariograms (dashed lines) do not honor the confidence intervals of the shortest
lag! These models honor the parameter values specified (Range, Variance, and Nugget), but they
do not honor the jackknifed experimental semivariogram and should be disregarded.
FIGURE 9-14 a,b. These graphs show a series of Latin-Hypercube sampled model semivariograms fit
to the jackknifed experimental semivariograms in Figure 9.11. In a) all the calculated models are
shown; note, though, that three model curves are dashed. These three models do not honor the
variance error-bars and are therefore probably unreasonable solutions. b) shows the same results
except that the inappropriate model solutions are removed.
NOTE: Apparently invalid model semivariograms are drawn because, in some cases, the
models are reasonable but the decision algorithm in the software is overly sensitive.
Carefully examine each model before discarding it.
Several options allow the software to check for and eliminate invalid models. Note that if a model is
eliminated, a new model is not calculated to replace it. If Check Solutions Within Range Area is
active, the software will attempt to determine which models, if any, do not honor the experimental
data. If Reject Invalid Solutions is also selected, invalid models will not be drawn (Figure 9.14b);
otherwise they will be drawn with dashed lines (Figure 9.14a).
Individual Display
Because the graph can get very cluttered when many model semivariograms are displayed at once,
the Latin-Hypercube:Individual Display option is available (pop-up dialog in Figure 9.15). It
allows the user to examine each model semivariogram individually (Figure 9.16). This is also
convenient for examining models which are marked invalid. If the Solution # for the model is
known (unlikely), it can be entered directly, or the user can step through the models individually by
using the Next and Previous buttons. For each model, the user can mark it as Valid, Invalid
(Discard), or yet to be determined (?).
FIGURE 9-15. Latin-Hypercube Individual Display
pop-up dialog. This dialog allows the user to
individually display each Latin-Hypercube sampled
model semivariogram.
FIGURE 9-16. This graph shows an individual Latin-Hypercube sampled model semivariogram.
Save
Latin-Hypercube:Save saves the numerical model results of the Latin-Hypercube sampling to a data file
readable by sisim (Chapter 14) (see the Output Data File Format section for file format details). If
the output file has already been named, the old file is overwritten and the new data is saved. If no
file has been named, the user is prompted for a filename (*.lhc extension) with a pop-up dialog
similar to that shown in Figure 5.2.
Save as
Latin-Hypercube:Save as creates a pop-up dialog similar to that shown in Figure 5.2 requesting an
output file name for the Latin-Hypercube sampled model semivariograms (default extension =
*.lhc). Once the file name is selected, the data is saved.
Calculate
The Latin-Hypercube:Calculate menu-bar option calculates the Latin-Hypercube sampled model
semivariograms based on the parameters defined in the Latin-Hypercube:Modify dialog.
Series
Series allows the user to load a series of experimental semivariograms. These will usually be
calculated from a series of indicator simulations (see Chapter 14, sisim). This is a useful tool for
viewing the ergodic fluctuations of the simulation results versus the original input/model
semivariogram used to calculate them. In theory there should be variation in the simulation grid
experimental semivariograms, but in sum they should reflect the initial model semivariogram from
which they were all calculated. This tool can be used to evaluate how well this assumption is honored.
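The per-lag statistics this tool displays (mean γ(h) and extents across the series) can be sketched as follows. The function name and the three-semivariogram series are hypothetical; real series would be read from sisim output files:

```python
def lag_statistics(series):
    """series: list of semivariograms, each a list of gamma(h) values
    aligned on the same lags. Returns (mean, min, max) per lag."""
    stats = []
    for lag_vals in zip(*series):
        stats.append((sum(lag_vals) / len(lag_vals),
                      min(lag_vals), max(lag_vals)))
    return stats

# three hypothetical simulation semivariograms sharing three lags
series = [[0.10, 0.20, 0.26],
          [0.12, 0.18, 0.28],
          [0.08, 0.22, 0.27]]
print(lag_statistics(series)[0])  # mean and extents of gamma at the first lag
```

The mean curve should track the input model semivariogram; the extents show the ergodic fluctuation around it.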
Display
Series:Display creates the pop-up dialog shown in Figure 9.17. This dialog is used to control what
information about the experimental semivariograms is displayed on the graph. Under the default
configuration, each semivariogram is drawn, along with the mean γ(h) value for every lag and the
maximum extents of the sampled γ(h) values (Figure 9.18a). Various details can be turned on or
off. In Figure 9.18b, only the mean lag γ(h) values and the γ(h) extents are displayed. In Figure 9.18c,
the 95% confidence intervals of the lag γ(h) values are added. In Figure 9.18d, only the γ(h)
extents and an individual model are displayed.
Parameters
Series:Parameters creates the pop-up dialog shown in Figure 9.19. This dialog is used to control
which experimental semivariograms are loaded into variofit. The files loaded need to use the
following naming convention:
prefix.#.suffix
An example series might be:
sample.1.gam
sample.2.gam
sample.3.gam
sample.4.gam
If there are models missing between the first and last selected, a warning message will be printed in
the log/status window. A missing file will not affect the calculated statistics; the statistics are
based only on the models successfully loaded.
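A loader following the prefix.#.suffix convention, with the warn-and-continue behavior on missing files, might look like this hypothetical sketch (not the variofit implementation):

```python
import os

def series_filenames(prefix, suffix, start, end):
    """Build prefix.#.suffix names for the series; report gaps instead
    of failing, mirroring the warning-and-continue behavior described."""
    found, missing = [], []
    for i in range(start, end + 1):
        name = "%s.%d.%s" % (prefix, i, suffix)
        (found if os.path.exists(name) else missing).append(name)
    return found, missing

found, missing = series_filenames("sample", "gam", 1, 4)
for name in missing:
    print("warning: missing series file:", name)
```

Statistics would then be computed from `found` only, so gaps do not corrupt the results.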
Graph
Graph allows the user to specify various attributes of the graph's appearance: the graph
Border, Error-bar Styles, Fonts, Labels, Legends, Mesh, and Line Styles.
FIGURE 9-18 a,b. These graphs show some of the possible display options for viewing series
experimental semivariograms. Graph a) shows the scatter of all the simulation experimental
semivariograms (o), the mean γ(h) value for each lag (n), and the extents (- - -) of the sample γ(h)
lag values. Graph b) shows the same data set without the individual experimental semivariograms.
FIGURE 9-18 c,d. These graphs show some of the possible display options for viewing series
experimental semivariograms. Graph c) shows the same information as b), but the 95% confidence
interval for the range of γ(h) for each lag is shown. Graph d) shows an individual experimental
semivariogram from the series set in a), b), and c).
Border
Graph:Border is described in Chapter 5 in the Graph:Border section (Figure 5.9).
Error-Bar Style
Error-Bar Style controls the display characteristics of the error bars. Error bars only exist when a
jackknifed calculation has been done. The Error-Bar Style dialog is shown in Figure 8.22. The
Error-Bar Style settings are very similar to the settings available in the Style: Set Line Attributes
dialog as described in Chapter 5 (Figure 5.14). The only difference in the dialogs is the option to
set the extent of the cross-width on the error-bars.
Fonts
Graph:Fonts is described in Chapter 5 in the Graph:Fonts section (Figures 5.10 and 5.11).
Labels
Graph:Labels is described in Chapter 5 in the Graph:Labels section (Figure 5.12).
Legends
Data Parameters
Graph:Legends:Data Parameters is described in Chapter 8 in the Graph:Legends:Data
Parameters section (Figure 8.23).
Model Parameters
Graph:Legends:Model Parameters dialog is shown in Figure 9.20. It is very similar to the
Graph:Legends:Data Parameters dialog (Figure 8.23), except this option controls if and where the
model semivariogram parameters will be displayed on the graph. An example is shown in Figure
9.1.
FIGURE 9-20. Model Parameter Display pop-up dialog. This dialog allows the user to place a legend of
calculated model semivariogram parameters on the graph in a specified corner.
Mesh
Graph:Mesh is described in Chapter 5 in the Graph:Mesh section (Figure 5.13).
Style
Graph:Line Styles is described in Chapter 5 in the Graph:Style section (Figure 5.14).
Log
The Log menu option allows the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (the log window is also a
simple text editor). The options include View Log, Save, Save as, Clear, and Print. View Log,
Clear, Save, and Save as are similar in operation to the menu options under File described above.
Plot
Plot:Now and Plot:Refresh are described in Chapter 5 in the Plot:Now and Plot:Refresh sections.
UNCERT Users Manual
Help
Help works exactly as explained in Chapter 5 (plotgraph, Figure 5.15) Help section.
FIGURE 9-21 a,b,c. These graphs show a series of model semivariograms fit to the data presented in
Figure 8.2. a) shows an automated-fit single-nested spherical model. b) shows an automated-fit
two-nested spherical model. c) again shows a two-nested spherical model where the automated fit
has been manually adjusted to improve the fit.
γ(h) = 1.48[1.5(h/44.0) − 0.5(h³/(44.0)³)] + 1.07

This model is a reasonable approximation based on the constraints used, but it is not a good fit. The
calculated nugget of 1.07 is much larger than one would expect. Extending the slope of the first
few lags suggests a value closer to 0.4! The model also, in zones, tends to consistently over- or
under-estimate the experimental semivariogram lag values. A better model selection may be a
two-nested spherical structure (select a Spherical model for nest 2). For nests one and two, select the
Spherical model type. When the model is recalculated (press Regression:Parameters), the model
shown in Figure 9.21b is a better fit.
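The quick nugget check described above (extending the slope of the first few lags back to h = 0) can be sketched as follows; the helper and the two lag points are hypothetical, not values from Figure 9.21:

```python
def nugget_from_first_lags(h1, g1, h2, g2):
    """Extrapolate the line through the two shortest lag points back to
    h = 0; the y-intercept is a quick visual estimate of the nugget."""
    m = (g2 - g1) / (h2 - h1)   # slope between the first two lags
    return g1 - m * h1          # C_o = y - m*x (cf. equation 9-27)

# hypothetical first two lag points of an experimental semivariogram
print(nugget_from_first_lags(4.0, 0.60, 12.0, 1.00))
```

If the intercept is far below the fitted nugget, as in the example above, the single-nested fit is suspect.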
γ(h) = 1.15[1.5(h/11.0) − 0.5(h³/(11.0)³)] + 1.08[1.5(h/55.0) − 0.5(h³/(55.0)³)] + 0.31
This model honors the early portion of the experimental semivariogram more accurately than
the single-nested model and is therefore a better fit. It does still tend to overestimate the variance over
the 15 to 45 meter lag distance. Using the automatic fitting algorithm, however, this is about the
best that can be done.
Note that this automatic model fitting process is not completely automatic. The modeler is
responsible for selecting the appropriate model structure; the algorithm only fits a curve based upon
set constraints. If the constraints are poorly defined, the model fit will also be poor.
because the lag values do not have to be honored; only the range of uncertainty has to be honored.
This process is also highly automated (there is no manual mode); the modeler is free only to
set some constraints and accept or reject calculated model semivariograms.
As an example, load (File:Open) the experimental jackknifed semivariogram file well.jack.gam.
This series of experimental semivariograms describes the uncertainty in the spatial distribution of
materials based on the wells shown in Figures 1.3 and 1.4. The data set (well.jack.gam) was
calculated by vario (Chapter 8 example). Once the file is loaded, the jackknifed experimental
semivariograms and their uncertainty will be displayed (Figure 9.22a).
a).
b).
FIGURE 9-22. These graphs show a) a series of jackknifed experimental semivariograms with 95%
confidence interval information, and b) a series of 16 Latin-Hypercube sampled model
semivariograms fit to the data. Note that the three dashed models in b) do not honor the constraints
of the jackknifed experimental semivariograms and should be discarded.
The Latin-Hypercube technique allows the modeler to select a limited number of model
semivariograms which cover the range of solutions that honor the confidence intervals of the
jackknifed experimental semivariograms and the modeler's opinion of reasonable values. After
loading the file, the modeler must specify how to sample the jackknifed experimental
semivariograms to generate model semivariograms. For this data set, there is definitely not enough
data to suggest a two-nested structure. Model this data set with only one nest. Select a Spherical
model for nest one. A spherical model is selected because it has given reasonable results for similar
conditions; based on the data available, it would be difficult to argue that one model is better than
another. Because the selection criteria for range need work, reset the Minimum Range to 10.0
(the data imply a pure-nugget effect is possible, though a 0.0 range is felt to be highly unlikely (expert
opinion)), and the Maximum Range to 230.0 (the confidence interval of the third lag suggests the range could
be greater than 230 feet, but it is unlikely to be much longer). Setting these limits sets reasonable
bounds for the Range variable. Once these parameters are set, select Latin-Hypercube:Calculate.
Sixteen model semivariograms will be generated and displayed (4 Range Divisions * 1 Variance
Division * 4 Nugget Divisions * 1 Solution per Division). Every time Latin-Hypercube:Calculate
is pressed using these parameters, these exact 16 model semivariograms will be calculated. To
generate a different set of randomly sampled model semivariograms, change the Randomize Seed
value to another integer value (set it to 10; it was 1, the default value). If this is done, a new series of
model semivariograms will be calculated (Figure 9.22b). Note that in this series, two models are
drawn with dashed red lines. This indicates that these models do not honor the uncertainty limits of
the experimental semivariograms and should not be used for future calculations. To save the
valid models, select the Latin-Hypercube:Save menu option and supply an appropriate file name.
This file can be used as input for the sisim module (Chapter 14).
NOTE: For different levels of uncertainty, the confidence intervals can be reduced or
expanded in the Jackknife:Modify dialog.
sex ] [-sfl {#}] [-sp ] [-sr #.#] [-sse #] [-sss #] [-ssz {#.#}] [-st #] [-sttl ] [sty {#}] [-sv #] [-svm #] [-ttl ] [-xfmt ] [-xlabel ] [-xmax #.#] [-xmin #.#]
[-xMt #.#] [-xmt #] [-xto #.#] [-xy #.#] [-yfmt ] [-ylabel ] [-ymax #.#] [ymin #.#] [-yMt #.#] [-ymt #] [-ys #.#] [-yto #.#] [filename]
Meaning of flag symbols:
#    = integer
#.#  = float
<>   = character string
{}   = variable is an array. Values must be separated by a "," and no spaces are
       allowed. Do not use the { } symbols on the command line.
NOTES:
       default = 1
       default = 100
       default = 0.0
       default = 0
-fnt1     default = Helvetica-Bold
-fnt2     default = Helvetica-Bold
-fnt3     default = Helvetica
-fnt4     default = Helvetica
-fnt5     default = Helvetica
-fnt6     default = Helvetica
-fnts1    default = 24.0
-fnts2    default = 15.0
-fnts3    default = 15.0
-fnts4    default = 12.0
-fnts5    default = 10.0
-fnts6    default = 12.0
-help
-jcl      default = 90.0%
-jebd     default = 1
-jebr     default = 1
-jev      default = 1
-jir      default = 1
-jjr      default = 1
-lc {}    = line color    default = variable
            0 = Black
            1 = White
            2 = Red
            3 = Green
            4 = Blue
            5 = Magenta
            6 = Yellow
            7 = Cyan
-lgf      default = log.dat
-lgmp     default = 1
-lgms     default = 1
-lgpp     default = 1
            3 = bottom right
-lhck
-lhmax1
-lhmax2
-lhmax3
-lhmin1
-lhmin2
-lhmin3
-lhmt1
-lhmt2
-lhout
-lhrj
-lhseed   default = 1.0
-lhseg1   default = 4
-lhseg2   default = 1
-lhseg3   default = 4
-lhsel    default = 1
-lhspd2   default = 1
-lpbm     default = 1.5
-lpc      default = 1
-lpd      default = 0
-lpf      = print filename    default = "junk.ps"
-lph      default = 0
-lplm     default = 1.5
-lpo      default = 0
-lppsext  default = "*.ps"
-lpq
-lpr
-lprm
-lps      default = "ps"
-lptm     default = 1.5
-lsfl {}  default = 0
-lsc {}   default = variable
-lssz {}  default = 9.0
-lsty {}  default = 0
-ltk {}   default = 1.0
-lty {}   = line type    default = 0
           -1 = no line
            0 = solid
            1 = dashed
            2 = double dashed
-m1       = Model types nest #1    default = 3
            0 = no model (option not available for nest #1)
            1 = linear (NOT INSTALLED)
            2 = linear with sill (NOT INSTALLED)
            3 = spherical
            4 = exponential
            5 = logarithmic
            6 = power (NOT INSTALLED)
            7 = Gaussian
-m2       = Model types nest #2
-m3       = Model types nest #3
-m4       = Model types nest #4
-md
-mox      = X mesh origin    default = 0.0
-moy      = Y mesh origin    default = 0.0
-ms       = use mesh    default = 0
            0 = false
            1 = true
-mx       = X mesh frequency        default = 1/10 DX
-my       = Y mesh frequency        default = 1/10 DY
-ng       = nugget                  default = 0.0
-out      = output *.out filename   default = junk.out
-prf      = preference file name    default = variofit.prf
-r1       = range for nest #1       default = 0.0
-r2       = range for nest #2       default = 0.0
-r3       = range for nest #3       default = 0.0
-r4       = range for nest #4       default = 0.0
-rfh      = screen refresh          default = 0
            0 = on exposure
            1 = on update
-run      default = 0
-s1       = C for nest #1    default = 0.0
-s2       = C for nest #2    default = 0.0
-s3       = C for nest #3    default = 0.0
-s4       = C for nest #4    default = 0.0
-scl      = series confidence level    default = 95.0
-sdc      = draw series lag confidence bars    default = 1
            0 = false
            1 = true
-sdm      default = 1
-sdx      default = 1
-sex      = series extension    default = .gam
-sp       = series prefix
-sr       = search range    default = Max. diagonal
-sse      = ending series    default = 1
-sss      = starting series    default = 1
-st       = automatic solution type fitting algorithm    default = 4
            0 = least-squares
            1 = nugget least-squares
            2 = variance least-squares
            3 = nugget & variance least-squares
            4 = weighted (by pairs) least-squares
-sttl     = secondary title
-sv       = solver method    default = 0
            0 = automatic
            1 = manual
-svm      default = 1
-ttl      = Main title    default = Filename
-xfmt     = Number of decimal places for X-axis    default = ".2f"
-xlabel   = X-axis label    default = "X"
-xmax     = Graph X-maximum    default = Data Maximum
-xmin     = Graph X-minimum    default = Data Minimum
-xMt      = X main tic frequency    default = 1/10 DX
-xmt      = Number of minor X tics    default = 5
-xto      = X axis label origin    default = 0.0
-xy       = X-Y ratio    default = 1.5
-yfmt     = Number of decimal places for Y-axis    default = ".2f"
-ylabel   = Y-axis label    default = "Y"
-ymax     = Graph Y-maximum
-ymin     = Graph Y-minimum
-yMt      = Y main tic frequency
-ymt      = Number of minor Y tics
-ys       = Y-axis exaggeration relative to X-axis
-yto      = Y axis label origin
Single Models
The single model format name is misleading. Multiple models may be stored with this format, but
each model is saved with its own set of model parameters used to create it (various search
parameters including the lag, search direction, bandwidths, etc.). The format is outlined below.
SEMIVARIOGRAM #[Model Number][/][Number of Models in File]
for (i = 1; i <= # of Models)
[Lags in Model]
for (j = 1; j <= # of Lags)
[Minimum Lag] [Maximum Lag] [Average Lag] [γ(h)] [Pairs]
end for
[Data Set Variance]
[Data Set Mean]
[Calculation Type: 0 = Point, 1 = Gridded]
if (calculation type == Point)
[Lag]
[Search Direction]
[Plunge]
[Horizontal Bandwidth]
[Vertical Bandwidth]
else
[X Step increment]
[Y Step increment]
[Z Step increment]
endif
[Spatial Equation: 0 = Semivariogram, 1 = Cross-Semivariogram, ... (See -set
flag in vario (Chapter 8))]
if (spatial equation == Indicator or Soft-Indicator Covariance)
[Low Cutoff]
[High Cutoff]
endif
[Data Set File Name]
if (calculation type == Point)
[X Column]
[Y Column]
[Z Column]
[Head Column]
if (spatial equation == Covariance, Cross-Semivariogram,
Correlogram, or Soft Indicator Covariance)
[Tail Column]
endif
endif
end for
If there is only one model, the data from Data Set Variance down can be ignored. For an example
file, see data file water.3.gam.
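A minimal reader for the layout above might look like the following sketch. It parses only the header, the lag count, and the five-column lag rows; the function name and the demo lines are hypothetical, not taken from water.3.gam:

```python
def read_single_model(lines):
    """Minimal reader for the 'single model' layout: a SEMIVARIOGRAM
    header, a lag count, then one 5-column row per lag
    (min lag, max lag, average lag, gamma(h), pairs)."""
    it = iter(lines)
    header = next(it)               # e.g. "SEMIVARIOGRAM #1/1"
    n_lags = int(next(it))
    lags = []
    for _ in range(n_lags):
        lo, hi, avg, gamma, pairs = next(it).split()
        lags.append((float(lo), float(hi), float(avg),
                     float(gamma), int(pairs)))
    return header, lags

# hypothetical two-lag example in the same column order as the format
demo = ["SEMIVARIOGRAM #1/1", "2",
        "0.0 10.0 5.2 0.31 48",
        "10.0 20.0 14.8 0.55 61"]
print(read_single_model(demo)[1][0])
```

A full reader would continue with the variance, mean, calculation type, and the conditional blocks shown in the outline.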
Jackknifed Models
The jackknife model format stores all the jackknifed experimental semivariograms and details
about the variance over each lag interval. The format is outlined below.
JACKKNIFED SEMIVARIOGRAM
[Data Set Mean] [Data Set Variance]
[# of Lags]
for (i = 1; i <= # of Lags)
[Lag #] [Mean of Lag] [Lag Standard Deviation] [γ(h) Standard Deviation]
end for
[# of Experimental Semivariograms]
for (i = 1; i <= # of Semivariograms)
[# of Lags for this Semivariogram]
for (j = 1; j <= # of Lags)
[Minimum Lag] [Maximum Lag] [Average Lag] [γ(h)] [Pairs]
end for
end for
Single Models
The output for a single model semivariogram is a commented header describing the
parameters for each nest of the model semivariogram, followed by three line descriptions (using
MULTIPLE LINE format; plotgraph, Chapter 8) specifying the data set variance, the experimental
semivariogram, and the model semivariogram. The header for the model semivariogram shown in
Figure 9.3 is:
!-------------------------------------
!Model Segment 1: Spherical
!   Range = 9.600000
!   Sill = 1.109714
!
!Model Segment 2: Spherical
!   Range = 49.600000
!   Sill = 1.133128
!
!Nugget = 0.307158
!
!Variance = 2.550000
!Mean = 3.041230
!
!Sum squared error = 0.169223
!-------------------------------------
MULTIPLE LINES
3
2 1 5 1.0 -1 5 5.0 1 Variance
0.00000  2.55000
80.00000  2.55000
0 -1 2 1.0 0 2 5.0 1 Variance
200 0 4 3.0 -1 4 5.0 1 Variance
0.00000  0.30716
0.40000  0.39018
.
.
.
Basically, this file can be used as a reference to look up model parameters, and it is a valid graph file for
plotgraph (Chapter 8).
Variofit Mathematics
Experimental Semivariogram Calculation
See Chapter 8 (vario) Mathematics Section on semivariogram calculations.
errormin = Σ_{i=1..n} [γ*(h_i) − γ_model(h_i)]²   (9-1)

This process is not perfect, but it does yield reasonable results and it can give the experienced user
a good starting equation when fitting complicated nested models.
In the case of a Spherical Model with a nugget and one sill, the model semivariogram is defined as
(Journel and Huijbregts, 1978):

γ(h) = C[1.5(h/a) − 0.5(h³/a³)] + C_o   ; for h <= a   (9-2)

γ(h) = C + C_o   ; for h > a   (9-3)

γ(h) = C_o   ; for h = 0   (9-4)

where

C    = point variance.
C_o  = nugget or random variance.
a    = range.
h    = sample distance.

For h <= a, the error at each lag and the bracketed shape term f_i are:

error = γ*(h) − (C[1.5(h/a) − 0.5(h³/a³)] + C_o)   (9-6)

f_i = 1.5(h/a) − 0.5(h³/a³)   (9-7)
Squaring the error, substituting f_i from equation (9-7), and summing over the n sample pairs
(i = 1 to n):

error² = Σ [γ*(h_i) − (C·f_i + C_o)]²   (9-8)

error² = Σ γ*(h_i)² − 2C Σ γ*(h_i)f_i − 2C_o Σ γ*(h_i) + C² Σ f_i² + 2C·C_o Σ f_i + n·C_o²   (9-9)

Equation (9-9) is minimized by setting its first derivatives with respect to C and C_o equal to
zero:

∂S/∂C = −2 Σ γ*(h_i)f_i + 2C Σ f_i² + 2C_o Σ f_i = 0

therefore

C Σ f_i² + C_o Σ f_i = Σ γ*(h_i)f_i   (9-10)

and

∂S/∂C_o = −2 Σ γ*(h_i) + 2C Σ f_i + 2n·C_o = 0

therefore

C Σ f_i + n·C_o = Σ γ*(h_i)   (9-11)

Using a matrix form, the two unknowns, C and C_o, can be calculated by:

[ Σ f_i²   Σ f_i ] [ C   ]   [ Σ γ*(h_i)·f_i ]
[ Σ f_i    n     ] [ C_o ] = [ Σ γ*(h_i)     ]   (9-12)

(all sums run from i = 1 to n)
where

c = A⁻¹ h

is the equation to be solved. Once the combination of C and C_o which minimizes the error is
determined, the sum of squared errors is calculated, and a new range (a) is selected. This is done
iteratively until the range (a) which yields the minimum error is found. Generally, it is sufficient to
evaluate a to plus or minus 1/100th of the maximum distance between data points (instead of
using the maximum distance across the data set, another distance may be more appropriate and
yield more stable estimates). The form of the solution matrices in equation (9-12) is the same
regardless of the type of model that is fit to the data; the only difference is that the term f_i is
evaluated using the appropriate model. These are defined as follows (Journel and Huijbregts,
1978):
Exponential Model:
f_i = 1 − e^(−r/a)   ; r = h_i   (9-13a)

Gaussian Model:
f_i = 1 − e^(−r²/a²)   ; r = h_i   (9-13b)

Logarithmic Model:
f_i = log(r)   ; r = h_i   (9-13c)

Linear Model:
f_i = mr   ; r = h_i   (9-13d)

Linear with Sill Model:
f_i = mr   ; r = h_i, for h_i <= a
f_i = C    ; for h_i > a   (9-13e)

Spherical Model:
f_i = C[1.5(r/a) − 0.5(r³/a³)]   ; r = h_i, for h_i <= a
f_i = C   ; for h_i > a   (9-13f)

where for the linear models, m is a slope term. NOTE: In the future, Power and Hole Effect models
will be included in the software.

Power Model:
f_i = C_o + ph   (9-13g)
When describing the experimental semivariogram, a nugget and a single point variance (i.e.,
one sill) term are often inadequate. Models that fit such data have what is called a nested structure, and
may be described by several point variance terms (i.e., C_0, C_1, C_2, ...). So, for a two-nested
semivariogram:

γ(h) = C_o + C_1·d_i + C_2·e_i   (9-14)

where

C_0       = nugget.
C_1       = point variance of the first nest with range a_1.
C_2       = point variance of the second nest with range a_2, where a_2 > a_1.
d_i, e_i  = an experimental model term, as f is used in equations 9-13a-g. Note that these models
            may be of different types.
Under this arrangement, three terms, C_0, C_1, and C_2, must be determined. The solution matrix is:

[ Σ e_i²     Σ e_i·d_i  Σ e_i ] [ C_2 ]   [ Σ γ*(h_i)·e_i ]
[ Σ e_i·d_i  Σ d_i²     Σ d_i ] [ C_1 ] = [ Σ γ*(h_i)·d_i ]
[ Σ e_i      Σ d_i      n     ] [ C_o ]   [ Σ γ*(h_i)     ]   (9-15)
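The pattern of equations 9-12 and 9-15 generalizes to any number of nests: the matrix is built from pairwise sums of the model-term columns, with a column of ones appended for the nugget. A hypothetical helper (not UNCERT code) illustrating the construction:

```python
def nested_normal_equations(terms, gammas):
    """Build the (k+1) x (k+1) normal-equation system of equations
    9-12/9-15/9-17/9-25. terms is a list of k model-term columns
    (e_i, d_i, ...); a column of ones is appended for the nugget C_o."""
    cols = list(terms) + [[1.0] * len(gammas)]
    # A[r][c] = sum over i of cols[r][i] * cols[c][i]
    A = [[sum(x * y for x, y in zip(c1, c2)) for c2 in cols] for c1 in cols]
    # right-hand side: sum of gamma*(h_i) times each column
    b = [sum(g * x for g, x in zip(gammas, c)) for c in cols]
    return A, b

# one-nest example: with a single f column this reproduces equation 9-12
f = [0.2, 0.5, 0.9, 1.0]
A, b = nested_normal_equations([f], [0.4, 0.7, 1.1, 1.2])
print(A[1][1])  # lower-right entry is n, as in equation 9-12
```

Solving A·c = b (e.g. by Gaussian elimination) then yields the C terms for the chosen set of ranges.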
For a model with three nests, four terms, C_0, C_1, C_2, and C_3, must be determined. The format is
similar:

γ(h) = C_o + C_1·d_i + C_2·e_i + C_3·f_i   (9-16)

where

C_0            = nugget.
C_1            = point variance of the first nest with range a_1.
C_2            = point variance of the second nest with range a_2, where a_2 > a_1.
C_3            = point variance of the third nest with range a_3, where a_3 > a_2 > a_1.
d_i, e_i, f_i  = an experimental model term, as f is used in equations 9-13a-g. Note that these models
                 may be of different types.
and the solution matrix is:

[ Σ f_i²     Σ f_i·e_i  Σ f_i·d_i  Σ f_i ] [ C_3 ]   [ Σ γ*(h_i)·f_i ]
[ Σ f_i·e_i  Σ e_i²     Σ e_i·d_i  Σ e_i ] [ C_2 ] = [ Σ γ*(h_i)·e_i ]
[ Σ f_i·d_i  Σ e_i·d_i  Σ d_i²     Σ d_i ] [ C_1 ]   [ Σ γ*(h_i)·d_i ]
[ Σ f_i      Σ e_i      Σ d_i      n     ] [ C_o ]   [ Σ γ*(h_i)     ]   (9-17)
To demonstrate how these matrices were developed, the final case, where four nests are
used, will be derived. In this case there are nine unknowns: the five point variance
terms C_0, C_1, C_2, C_3, and C_4, and four ranges. As in the previous cases, the point variance
terms are determined explicitly, but the four range terms are determined through an iterative
process. For the system where:

γ(h) = C_o + C_1·d_i + C_2·e_i + C_3·f_i + C_4·g_i   (9-18)

where

C_0                 = nugget.
C_1                 = point variance of the first nest with range a_1.
C_2                 = point variance of the second nest with range a_2, where a_2 > a_1.
C_3                 = point variance of the third nest with range a_3, where a_3 > a_2 > a_1.
C_4                 = point variance of the fourth nest with range a_4, where a_4 > a_3 > a_2 > a_1.
d_i, e_i, f_i, g_i  = an experimental model term, as f is used in equations 9-13a-g. These models
                      may be of different types.

the squared error to be minimized is:

S = Σ_{i=1..n} [γ*(h_i) − (C_o + C_1·d_i + C_2·e_i + C_3·f_i + C_4·g_i)]²   (9-19)
Expanding the error terms, setting

∂S/∂C_o = ∂S/∂C_1 = ∂S/∂C_2 = ∂S/∂C_3 = ∂S/∂C_4 = 0,

and rearranging (equations have been divided by 2.0) yields the normal equations:

C_4 Σ g_i² + C_3 Σ g_i·f_i + C_2 Σ g_i·e_i + C_1 Σ g_i·d_i + C_o Σ g_i = Σ γ*(h_i)·g_i   (9-20)

C_4 Σ g_i·f_i + C_3 Σ f_i² + C_2 Σ f_i·e_i + C_1 Σ f_i·d_i + C_o Σ f_i = Σ γ*(h_i)·f_i   (9-21)

C_4 Σ g_i·e_i + C_3 Σ f_i·e_i + C_2 Σ e_i² + C_1 Σ e_i·d_i + C_o Σ e_i = Σ γ*(h_i)·e_i   (9-22)

C_4 Σ g_i·d_i + C_3 Σ f_i·d_i + C_2 Σ e_i·d_i + C_1 Σ d_i² + C_o Σ d_i = Σ γ*(h_i)·d_i   (9-23)

C_4 Σ g_i + C_3 Σ f_i + C_2 Σ e_i + C_1 Σ d_i + C_o·n = Σ γ*(h_i)   (9-24)

(all sums run from i = 1 to n)
In matrix form:

[ Σ g_i²     Σ g_i·f_i  Σ g_i·e_i  Σ g_i·d_i  Σ g_i ] [ C_4 ]   [ Σ γ*(h_i)·g_i ]
[ Σ g_i·f_i  Σ f_i²     Σ f_i·e_i  Σ f_i·d_i  Σ f_i ] [ C_3 ]   [ Σ γ*(h_i)·f_i ]
[ Σ g_i·e_i  Σ f_i·e_i  Σ e_i²     Σ e_i·d_i  Σ e_i ] [ C_2 ] = [ Σ γ*(h_i)·e_i ]
[ Σ g_i·d_i  Σ f_i·d_i  Σ e_i·d_i  Σ d_i²     Σ d_i ] [ C_1 ]   [ Σ γ*(h_i)·d_i ]
[ Σ g_i      Σ f_i      Σ e_i      Σ d_i      n     ] [ C_o ]   [ Σ γ*(h_i)     ]   (9-25)

The approach described above explains the process for calculating the least-squares best-fit curve
for a given set of range values, a_1, a_2, a_3, and a_4, where the nugget (C_0) and variance terms (C_1, C_2,
C_3, and C_4) are unknown.
Another possible constraint is to assume that the nugget (C0) equals the y-intercept of a line drawn through the shortest two lag points in the experimental semivariogram. Here the nugget equals:

C0 = y - mx

(9-27)

where x and y are the lag and semivariogram coordinates of the first lag point, and m is the slope of the line between the shortest two lag points. This isn't necessarily a good assumption, but on a practical basis it can improve the model solution in some cases. Another alternative is to weight each lag point by the number of data pairs that went into defining that point, the theory being that a lag point represented by many data pairs should be honored more closely than a lag point based on only a few sample pairs.
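For illustration, the normal-equation fit above can be reproduced with a generic least-squares solver. This is a minimal NumPy sketch, not UNCERT's implementation; the optional `pairs` argument applies the pair-count weighting just described:

```python
import numpy as np

def fit_sills(gamma, d, e, f, g, pairs=None):
    """Least-squares fit of C4..C0 in equations (9-20)-(9-25), given
    the unit-sill model values d_i, e_i, f_i, g_i at each lag h_i.
    pairs (optional): number of data pairs per lag point, used as
    least-squares weights."""
    gamma = np.asarray(gamma, float)
    # Design-matrix columns correspond to C4, C3, C2, C1, C0.
    A = np.column_stack([g, f, e, d, np.ones_like(gamma)])
    if pairs is not None:
        w = np.sqrt(np.asarray(pairs, float))
        A = A * w[:, None]
        gamma = gamma * w
    coeffs, *_ = np.linalg.lstsq(A, gamma, rcond=None)
    return coeffs  # [C4, C3, C2, C1, C0]
```

Solving the weighted problem this way is equivalent to solving the weighted form of the normal equations directly.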
Latin-Hypercube Sampling
Once the statistical distribution of experimental semivariograms has been calculated, semivariograms can be fit through the zone defined by the error-bars. The objective is not to make a single best estimate of the character of the subsurface (i.e. a single semivariogram); rather, it is to select model semivariograms representative of the range of possible conditions at the site. This range of semivariograms is used with the original data to conduct indicator kriging and stochastic simulation to generate multiple interpretations of the subsurface. One approach is to use Monte-Carlo techniques and randomly select 100 model semivariograms that fall within the range of reasonable solutions. This might appear reasonable, but expert opinion of conditions at the site may indicate that models generated with nuggets approximately equal to the sill, or with ranges near zero, are unreasonable or unrealistic, even though the jackknifed experimental semivariogram indicates such semivariogram models are possible interpretations of the site.
An alternative approach to random selection of a large number of possible semivariogram models is to use Latin-hypercube sampling. This reduces the number of simulations required to ensure that the flavor of all alternatives is addressed (McKay et al., 1979).
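The idea can be sketched as follows: each parameter range is split into one stratum per sample, one value is drawn from each stratum, and the strata are randomly paired across parameters. The parameter names and bounds below are illustrative, not values from UNCERT:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Latin-hypercube sample of semivariogram parameters.
    bounds: dict of parameter name -> (low, high). One draw per
    stratum guarantees the full range of each parameter is
    covered (McKay et al., 1979)."""
    rng = np.random.default_rng(rng)
    out = {}
    for name, (lo, hi) in bounds.items():
        # one uniform draw inside each of n_samples equal strata
        strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
        # random permutation decouples the pairing across parameters
        out[name] = lo + (hi - lo) * rng.permutation(strata)
    return out

models = latin_hypercube(10, {"nugget": (0.0, 0.2),
                              "sill": (0.8, 1.2),
                              "range": (50.0, 300.0)})
```

Each of the 10 sampled models then gets one value from every stratum of every parameter, rather than relying on chance as pure Monte-Carlo selection does.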
This approach can yield a daunting number of simulations, many of which will bear little
resemblance to one another if the data set is small. Such a situation results in the obvious
conclusion that some data sets provide so little information about a site that more data should be
collected before further assessment is undertaken. If the data are more abundant, the range of
possible models will be constrained, and the simulated models may represent a modest range of
possible subsurface interpretations. If the jackknifed semivariogram has small error-bars, the entire
process of using a variety of semivariograms for simulation of one site can be omitted because the
process is not likely to indicate a larger uncertainty associated with the interpretation of such well
characterized sites.
Recall that the objective of this approach is not to make a single best estimate of the subsurface
interpretation, but to evaluate the possible range of subsurface character based on available data.
From a purely mathematical approach this may be computationally intractable; however, incorporation of expert opinion into the process makes it possible to limit the reasonable alternatives.
Bibliography (variofit)
Burden, R.L. and J.D. Faires, 1985, Numerical Analysis, Third Edition, Prindle, Weber, and
Schmidt, Boston, pp 342-353.
Clark, I., 1979, Practical Geostatistics, Elsevier Applied Science, London and New York.
Journel, A.G. and Ch. J. Huijbregts, 1978, Mining Geostatistics, Academic Press, London.
McKay M.D., R.J. Beckman and W.J. Conover. 1979. A Comparison of Three Methods for Selecting Values of Input Variables in the Analysis of Output from a Computer Code,
Technometrics. Vol. 21, No. 2, pp 239-245.
Wingle, W.L., and E.P. Poeter, 1993, Evaluating Uncertainty Associated with Semivariograms
Applied to Site Characterization. Ground Water, Vol. 31, No. 5, pp 725-734.
CHAPTER 10
Grid
The Grid module interpolates parameter values at locations where there are no physical data. This is done using various interpolation algorithms (inverse-distance, kriging, trend-surface analysis) based on irregularly spaced data. Sometimes it is of interest to estimate what is occurring between data locations. For other applications, for convenience, or for clarity, irregularly spaced data must be interpolated onto a regular grid to be useful. For example, contour, surface, and block require that the data being viewed be gridded in a rectangular pattern. These programs then allow the user to view the interpolated estimate of the field data. Grid is used to interpolate values at convenient locations based on field data.
Within grid there are several gridding algorithms: inverse-distance, simple and ordinary kriging, and trend-surface analysis. Inverse-distance is a relatively simple method which estimates the value at a location based on the distance and value of surrounding points. Kriging does much the same thing as inverse-distance, except kriging also considers spatial statistics describing how the field data vary directionally. Kriging is often referred to as the best unbiased estimator of the value at a given location. Trend-surface analysis is basically a least-squares regression technique which assumes a data value is a function of a regional trend and minor local variations. The calculated trend-surface attempts to model the regional component.
The grid application is composed of two sections (Figure 10.1): the main menu-bar and the log/status area. The menu-bar is used to select all grid commands, and the log/status area contains relevant data about the status of the program or the state of on-going calculations.
FIGURE 10-1. This is an example of the grid application window. The main menu-bar is on the
top of the application window, and the status/discussion window in the lower portion.
contour, surface, or block (3D grid). Help gives the user a selection of pop-up help topics. Each
menu item is fully described below with all the available options.
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Open, View, Save, Save as, Save Preferences, Quit, and Quit Without Saving.
Open
Selecting File:Open generates a pop-up dialog. This dialog functions exactly as the dialog in
Figure 5.2 (Plotgraph - Chapter 5) and allows the user to select an existing data file. As for
plotgraph files, the default data extension is *.dat.
View
File:View:Data pops up a simple screen editor with the last opened data file. File:View:Results
pops up the screen editor with the calculated grid results.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick reasonable default values for each parameter or input variable. For this reason preference files were created (see Appendix C). These allow the user to define a unique set of defaults applicable to the particular project. When File:Save Preferences is selected, grid determines how all the input variables are currently defined and writes them to the file grid.prf. The next time grid is run, these default values will be used.
WARNING:
If grid.prf already exists, you will be warned that it is about to be overwritten. If you do not want the old version destroyed you must move it to a
new file (e.g. the UNIX command mv grid.prf grid.old.prf would be sufficient).
When you press OK the old version will be over-written! The renaming cannot
be done currently from within the application. To rename the file you will have
to execute the UNIX mv command from a UNIX prompt in another window. If
grid.prf does not exist in the current directory, it is created. This is an ASCII
file and can be edited by the user. See Appendix C for details.
Quit
File:Quit terminates the program, but if a grid has been calculated and not yet saved, the user will
first be queried to supply a file to save the changes in.
Method
The Method menu-bar options allow the user to select a gridding algorithm. The active method will be highlighted in red on the pull-down menu and is selected by choosing from the pull-down menu. The theory for each method is described in the Grid Mathematics section later in the chapter, and the use of each method follows below.
NOTE: Several options (Simple 2D Kriging (GSLIB), Cokriging (GSLIB), and Indicator
Kriging (GSLIB)) are grayed out. These methods currently are not installed and
will not be discussed further.
Inverse-Distance
Method:Inverse-Distance creates the pop-up dialog shown in Figure 10.2. This dialog is used to control the inverse-distance gridding algorithm.
The parameters defining the search and weighting behavior are the search radius, tolerance, power, nearest points, search type, and the grid file name. The search radius specifies the maximum distance a data point can be from the point being gridded and still have an influence on its estimated value; the default value is the maximum diagonal distance across the data set (i.e. all points may influence any grid location). The tolerance is used to handle a problem with the inverse-distance algorithm: if the point being gridded is at the same location as a data point, the distance between the two locations is zero, and if this is not specially treated, a division-by-zero error occurs. The tolerance is used to avoid this problem (see the Mathematics Section for further details).
UNCERT Users Manual
Power is a weighting function for the distance between the gridded location and the surrounding data points. If power equals 1.0, the inverse-distance algorithm reduces to a normal linear interpolation. Power set equal to 2.0 is a commonly used weight (see the Mathematics Section for details). The nearest points refer to the maximum number of nearby points that will be used to estimate a grid location; 10 is a commonly used number. Depending on the data set, more or fewer points may be called for.
NOTE: In large data sets, you should not use all of the points in the data set as nearest
neighbors. Though it seems reasonable to maximize the use of all the available data,
doing so tends to pull the estimate of the grid location toward the mean of the data
set despite local trends.
Currently, the software only supports a normal search, not quadrant or octant searches. In some data sets it is possible to get artificial benching on slopes, and peaks, valleys, or pits tend to become flat; these alternative search methods can reduce these problems (see the Mathematics Section).
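To make the search and weighting parameters concrete, here is a simplified 2D sketch of the inverse-distance estimate. This is an illustration only, not grid's source code, but it shows the roles of the power, radius, tolerance, and nearest-points parameters described above:

```python
import numpy as np

def inverse_distance(xy_data, values, xy_grid, power=2.0,
                     radius=np.inf, tol=1e-6, nearest=10):
    """Estimate grid-node values from scattered 2D data.
    power: distance-weighting exponent (2.0 is typical).
    radius: maximum search distance.
    tol: if a grid node lies within tol of a data point, that
         data value is used directly (avoids division by zero)."""
    xy_data = np.asarray(xy_data, float)
    xy_grid = np.asarray(xy_grid, float)
    values = np.asarray(values, float)
    est = np.full(len(xy_grid), np.nan)
    for k, p in enumerate(xy_grid):
        dist = np.hypot(*(xy_data - p).T)
        if dist.min() < tol:                  # coincident data point
            est[k] = values[dist.argmin()]
            continue
        idx = np.argsort(dist)[:nearest]      # nearest neighbors
        idx = idx[dist[idx] <= radius]        # enforce search radius
        if idx.size == 0:
            continue                          # no data within range
        w = 1.0 / dist[idx] ** power
        est[k] = np.sum(w * values[idx]) / np.sum(w)
    return est
```

A node equidistant from two data points receives their average; a node coincident with a data point receives that point's value exactly.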
The grid file is the file where the calculated results will be saved. The file will be formatted for contour (Chapter 11), surface (Chapter 12), and block (Chapter 13) using a GRID CENTERED GRID format (Figure 10.3; see the Setting up a Standard Input Grid File section, Chapter 11). The filename can be typed directly in the text field, or the Select button next to the text entry field may be used; it generates a pop-up dialog similar to that in Figure 5.2. The default file extension for 2D solutions is *.srf. The filename may also be selected using the File:Save as menu option (this option, however, will also save the file, not just rename the destination). After the grid is calculated, the results can be saved to the grid file by pressing the Save button or by selecting the File:Save menu-bar option.
To calculate the inverse-distance grid, 1) press the Calculate button at the bottom of this pop-up dialog, 2) press the Calculate button at the bottom of the 2D or 3D pop-up dialog, or 3) select either the Method:Calculate or the Grid:Calculate menu-bar options. When
the calculation is started, the pop-up dialog shown in Figure 10.4 will be displayed. This dialog
will be displayed as long as the computer is calculating the grid. When the computation is complete, it will be removed.

FIGURE 10-3. There are two grid formats used in grid; which is used is determined by which gridding algorithm is used. GRID CENTERED GRIDs are used by inverse-distance and trend-surface analysis. NODE CENTERED GRIDs are used by kriging.

If you want to abort the calculation, press the Stop button. As soon
as the current column calculation is complete (this may take several seconds on large grids or large data sets), the calculation will be stopped. Note: if the calculation is aborted, no usable information is retained. While the grid is being calculated, the current column being calculated will be displayed in the log/status window, and Grid Complete will be printed when the calculation is done.
FIGURE 10-4. Inverse-Distance Calculating... pop-up
dialog. This is displayed while the inverse-distance grid
is being calculated. Similar dialogs are display for each
of the other gridding algorithms. The dialog will
disappear when the calculation is complete. To stop a
grid calculation in progress press the Stop button.
WARNING:
When the grid is calculated, it is not automatically saved. If you want to save
the calculation, press the Save button on this dialog or select the File:Save
menu-bar option. This grid calculation will be lost the next time any grid using
any method is calculated!
FIGURE 10-5. Ordinary Kriging Analysis pop-up dialog. This dialog is used to set parameters
NOTE: Unlike other modules in grid, the output file format is NODE CENTERED GRID
(Figure 10.3), not GRID CENTERED GRID. These formats are discussed in the
Setting up a Standard Input Grid File section of Chapter 11 (contour).
WARNING:
This module should not be used by those unfamiliar with kriging and
semivariogram analysis.
For the kriging algorithm there are many variables that must be set by the user; some can be ignored for some data sets, and for some the default values are acceptable (there may be better values) in most cases. These parameters are described below. For a complete description of these variables, the reader should refer to Deutsch and Journel (1992).
The first item to decide on is whether Simple or Ordinary Kriging will be used. If Simple kriging is used, you must enter the data set's mean value (this can be determined in histo). The data set mean is assumed, but if a different value is desired, press the Simple Mean button; the pop-up dialog shown in Figure 10.6 can be used to set a new value. Note, it is important to check and set these values correctly if zonal kriging is used (discussed below), because the mean for the data set may not be relevant for each zone.
FIGURE 10-6. Simple Mean pop-up dialog. This dialog is used to define the sample mean for all data points within the region.
The output file names and the Debug Level can also be specified. Be careful not to save results over
previously existing files. A 0 Debug Level gives very little information. A 3 Debug Level can give
many megabytes of debugging information.
To set the remaining information, there are several buttons for creating input dialogs for defining
search, semivariogram, drift, and zonal parameters.
Finally, to calculate the grid, press the Calculate button at the bottom of the Ordinary Kriging Analysis dialog (Figure 10.5) or select the Method:Calculate menu option.
WARNING:
After the grid is calculated, the results are not saved to file until the Save buttons are pressed. Using the File:Quit menu option will warn you that you have unsaved work. Using the File:Quit Without Saving menu option will not; it will quit without saving the results!
Model Semivariograms
To define the spatial correlation of the data, the Model Semivariogram(s) must be defined. Pressing the Model Semivariogram button will create the dialog shown in Figure 10.7. This dialog allows the user to specify the Range, C (Sill), Nugget, Semivariogram Model Type, Anisotropies, and Model Orientations for up to four nested structures. The Semivariogram Angles and Anisotropies are calculated as described in Figure 10.8. The Range, C, Nugget, and Models can be determined using vario (Chapter 8) and variofit (Chapter 9).
If zonal kriging is used, complete semivariogram model definitions must be defined for each
zone.
It is also possible to define different semivariogram models for each axis, though the sills must be identical in all directions at the maximum range. This allows up to three semivariogram models to be defined for each zone (Principal (X), Y-Axis, and Z-Axis). This takes more work to model, but in some cases different directions clearly fit different models (e.g. horizontal is a two-nested spherical, but vertical is exponential). This option can be selected by turning the Semivariogram Anisotropies Off. The normal and default position is On.
FIGURE 10-7. Semivariogram Model Definition pop-up dialog. This dialog is used to set the
Search parameters
The Search Parameters pop-up dialog (Figure 10.9) is used to define the search parameters for locating the data points nearest to the grid point/block being estimated (the nearest neighbors). The Search Radius defines the maximum distance, from the grid location being estimated, within which data points will be chosen. If a search ellipsoid is used (when the semivariogram models suggest the data are anisotropic), the Search Angles (1,2,3) and the Anisotropies must be defined; to determine what these values should be, refer to Figure 10.8. If there is no anisotropy to the data, the search and anisotropy values may be ignored. If the data are fairly regularly distributed, the Normal Search is appropriate. If the data are clustered, or data values lie along lines (wells, contour lines, geophysical survey lines), it is often better to use an Octant Search. If an Octant Search is used you must also select the Maximum number of points per Octant (the number of points that will be used from each octant if they are available). Associated with this are the Minimum and Maximum Nearest Points used. If fewer than the Minimum number of points are available, the grid location will not be estimated. The Minimum and Maximum Trimming Limits are used to filter out data points where no appropriate value is available; only data points with values between the Trimming Limits will be used. Block Discretization in the X, Y, and Z directions is used to differentiate between point and block kriging. If all the values are 1, point kriging is used; otherwise block kriging is used.
If zonal kriging is used, the Search Radius, Search Angles (1,2,3) and the Anisotropies must be
defined for each zone.
Drift Parameters
The Drift Parameters pop-up dialog (Figure 10.10) is used to define data regarding drift. Some data sets show an obvious trend, or a trend becomes apparent during semivariogram analysis. If the trend can be accurately estimated, it should be removed, the residuals kriged, and finally the trend added back in. Where a trend is present, this should give better results. The program can krige with No Drift, it can internally Estimate Trend (Drift), or it can Read Drift From a File. If the program is to estimate the drift internally, the user must specify the nature of the drift in the X, Y, and Z axes. This is done by pressing the Set Drift Orders button, which creates the dialog shown in Figure 10.11.
FIGURE 10-9. Search Parameters pop-up dialog. This dialog is used to define the data search
For each axis, the drift can be Linear, Quadratic, or Cross-Quadratic. The drift can also be read
from a file, if the user prefers to use another algorithm.
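The detrend/krige/restore sequence can be sketched generically. This is an illustration, not grid's code: the kriging step is abstracted as a caller-supplied `interpolate` function, and only a linear (order 1) drift is shown:

```python
import numpy as np

def estimate_with_drift(xy, values, xy_grid, interpolate):
    """Remove a linear trend, interpolate the residuals, then add
    the trend back (the Estimate Trend workflow described above).
    interpolate(xy, residuals, xy_grid) stands in for any residual
    interpolator, e.g. a kriging routine."""
    xy = np.asarray(xy, float)
    values = np.asarray(values, float)
    xy_grid = np.asarray(xy_grid, float)

    def design(pts):  # linear drift terms: 1, x, y
        return np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1]])

    # fit the trend by least squares and compute residuals
    coeff, *_ = np.linalg.lstsq(design(xy), values, rcond=None)
    residuals = values - design(xy) @ coeff
    # interpolate residuals, then restore the trend at the grid nodes
    return interpolate(xy, residuals, xy_grid) + design(xy_grid) @ coeff
```

Higher drift orders (Quadratic, Cross-Quadratic) would simply add columns to the design matrix.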
FIGURE 10-10. Drift Parameters pop-up dialog. This dialog is used to define the drift parameters
Zonation Parameters
The Zonation Parameters pop-up dialog (Figure 10.12) is used to set up the model zonation. Under most circumstances a single zone will be used, but sometimes more are needed. The maximum number of zones allowed is ten. The reason one would consider using multiple zones is when there are two or more areas within the region which should be described by more than one model semivariogram; in other words, the assumption of stationarity is not valid. An example of how this is used is shown in Figures 10.13a and 10.13b. For the site, the upper zone has a short range of less than 60 m, while the lower region has a range of over 300 m. When both zones are modeled together, the average range is about 150 m, and the results are shown in Figure 10.13a. When the zones are treated separately, results closer to actual site conditions are attained, as shown in Figure 10.13b (note the long continuous zone in the lower portion of the grid that is not present in Figure 10.13a).

FIGURE 10-11. This dialog is used to set the drift surface order terms used by the GSLIB kriging algorithm.
FIGURE 10-12. Grid Zonation Parameters pop-up dialog.
The Model Semivariograms button creates the same pop-up dialog as shown in Figure 10.7 and described above. The Select Zone Definition Files pop-up dialog is shown in Figure 10.14. This dialog is used to select and view the mask files for each zone (the file format is discussed in the Setting up the Input File section below).

NOTE: There must be a zone mask for each zone, every cell in the grid must belong to one and only one zone, and the zone mask files must have exactly the same number of layers, rows, and columns as the grid being calculated.
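The zone-mask requirement (every cell in exactly one zone, shapes matching the grid) is easy to pre-check outside the program. A minimal sketch, assuming each mask is stored as an array of 0/1 flags with the grid's shape:

```python
import numpy as np

def check_zone_masks(masks):
    """masks: list of 0/1 arrays, one per zone, each with the same
    shape as the grid. Raise ValueError if the shapes differ or if
    any cell does not belong to exactly one zone."""
    total = np.zeros_like(np.asarray(masks[0], int))
    for m in masks:
        m = np.asarray(m, int)
        if m.shape != total.shape:
            raise ValueError("zone mask shape does not match grid")
        total += m  # count zone memberships per cell
    if np.any(total != 1):
        raise ValueError("each cell must belong to exactly one zone")
```

Running such a check before a long kriging run avoids discovering an overlapping or incomplete zonation only after the calculation fails.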
The Relationship Between Zones button creates the dialog shown in Figure 10.15. This dialog allows the user to specify how data points in one zone will be used when they are in the search neighborhood of a point being kriged in another zone. A Sharp transition means that points from the specified zone will be ignored (Figure 10.16a). A Gradational transition means that points will be shared across the specified zones (Figure 10.16b). Note, at the transition from one zone to another, the transition can still be abrupt. This can happen in neighboring cells because the model semivariograms are different and the search neighborhoods, and therefore the points used in the estimation, can be substantially different. This problem will tend to disappear with increasing model ranges and sample data densities. The Fuzzy transition is not yet supported. Note that this relationship matrix can be loaded or saved using the dialog in Figure 10.12 (the file format is discussed in the Setting up the Input File section below).

FIGURE 10-13 a,b. These two figures demonstrate how using multiple zones can affect the grid solution. In a) a single semivariogram model was used with a range of about 150 m. In b) a short range (50 m) was used in the upper portion of the cross-section, and 300 m was used in the lower section. In b) the continuous nature of the lower portion of the actual site and the discontinuous nature of the upper section is well captured. The vertical features on the figures are the well locations.
FIGURE 10-14. Select Zone Definition Files pop-up dialog, used to select and view the zone definition files.

FIGURE 10-15. Relationship between Zones pop-up dialog.
Trend-Surface Analysis
Method:Trend-Surface Analysis creates the pop-up dialog shown in Figure 10.17. This dialog is
used to control the trend-surface gridding algorithm.
For the trend-surface algorithm there is only one parameter, the surface order. The remaining
variables, ANOVA File, Grid File, and Residual File, simply define where the results will be saved.
Valid entries for Order are one to five; higher-order surfaces are not supported because the solution matrix size becomes prohibitive.
NOTE: There must be at least:
2D: ((order + 1) * (order + 2)) / 2
3D: ((order + 1) * (order + 2) * (order + 3)) / 6
data points in the data set. This is the number of unknowns in the solution equation.
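The minimum counts in the note above are simply the number of polynomial coefficients and can be computed directly; a small helper (not part of grid) for illustration:

```python
def min_points(order, dims=2):
    """Minimum number of data points (= number of polynomial
    coefficients) for a trend surface of the given order."""
    if dims == 2:
        return (order + 1) * (order + 2) // 2
    return (order + 1) * (order + 2) * (order + 3) // 6

# e.g. a 2D cubic surface (order 3) needs at least 10 data points
```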
FIGURE 10-16 a,b. These two figures show the difference between a) sharp zonal transitions, and
b) gradational transitions (middle and bottom zone, middle and top transition is sharp).
Filenames for the ANOVA, Grid, or Residual files may be entered in the appropriate text field, or by
pressing the appropriate Select button a pop-up dialog (similar to Figure 5.2) will be created
allowing the output file to be selected. The default data file extensions are:
ANOVA: *.anova
Grid: *.srf (2D) or *.bck (3D)
Residual: *.res.dat
To save the data after a calculation is made, press each of the appropriate Save buttons.
WARNING:
None of the data is automatically saved. If you want to save information you
must do so manually before the next calculation is made using any grid
method!
The ANOVA file will store statistical information about the trend-surface (the same data printed to the log/status window). The Grid file will store the gridded regional trend-surface. The Residual file will store six columns of data: the X, Y, Z location and observed value for each data point, the residual (actual - estimate), and the trend-surface estimate at each data location.
To calculate the trend-surface grid, 1) press the Calculate button at the bottom of this pop-up dialog, 2) press the Calculate button at the bottom of the 2D or 3D pop-up dialog, or 3) select either the Method:Calculate or the Grid:Calculate menu-bar options. When
the calculation is started, the pop-up dialog shown in Figure 10.4 will be displayed. This dialog
will be displayed as long as the computer is calculating the grid (This is a fairly fast calculation).
When the computation is complete Trend-Surface Grid Complete will be printed in the log-status
window.
WARNING:
When the grid is calculated, it is not automatically saved. If you want to save
the calculation, press the Save button on this dialog or select the File:Save
menu-bar option. This grid calculation will be lost the next time any grid using
any method is calculated!
When the trend-surface is calculated, the trend-surface parameters and several statistical parameters are printed to the log/status window. The statistical variables calculated are used to evaluate the goodness of fit of the calculated surface to the field data. These variables are summarized here; further details may be found in the Mathematics section of this chapter, or in Davis (1986). The statistical analysis is referred to as the Analysis of Variance (ANOVA). The terms of interest are:
SST = Total Sum of Squares.
SSR = Sum of Squares due to Regression.
MSR = Mean Square due to Regression.
SSE = Sum of Squares of Error (residuals).
MSE = Mean Square Error.
R2 = Squared Multiple Correlation Coefficient.
R = Multiple Correlation Coefficient.
Fstatistic = A test statistic used to define the worth of the regression.
The two main variables of concern are the R2 and the Fstatistic. The R2 term, when multiplied by 100%, defines the amount of data variation explained by the regression trend-surface (Davis, 1986).
NOTE: If the number of data observation points equals the number of coefficients in the
trend-surface equation, the surface will pass through all of the points and R2 will
equal 1.0 (100%); i.e. all the variation will be explained.
The Fstatistic is used to evaluate whether all of the coefficients as a group are significantly different than zero, and thus whether the surface is considered meaningful. All of these terms and their use are defined and explained in the Mathematics section. Suffice it to say here that the best trend-surface will maximize both R2 and the F-statistic.
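These quantities follow the standard ANOVA decomposition (SST = SSR + SSE for a least-squares fit with a constant term). A minimal sketch of the computation; the degrees-of-freedom convention used here (n_coeff = number of trend coefficients excluding the constant) is an assumption, so consult Davis (1986) for the exact definitions:

```python
import numpy as np

def trend_anova(observed, estimated, n_coeff):
    """Goodness-of-fit statistics for a trend surface.
    n_coeff: number of trend-surface coefficients excluding the
    constant term (assumed convention; see Davis, 1986)."""
    obs = np.asarray(observed, float)
    est = np.asarray(estimated, float)
    n = len(obs)
    sst = np.sum((obs - obs.mean()) ** 2)  # total sum of squares
    ssr = np.sum((est - obs.mean()) ** 2)  # regression sum of squares
    sse = np.sum((obs - est) ** 2)         # residual sum of squares
    msr = ssr / n_coeff
    mse = sse / (n - n_coeff - 1)
    return {"SST": sst, "SSR": ssr, "SSE": sse,
            "MSR": msr, "MSE": mse,
            "R2": ssr / sst, "F": msr / mse}
```

R2 near 1.0 and a large F relative to the critical value indicate a meaningful surface.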
Calculate
Once the gridding method/algorithm has been selected, and the grid dimensions (see below) have
been specified, selecting Method:Calculate will start the grid calculation. The status of the
calculation will be shown in the log/status window (Figure 10.1).
NOTE: When the calculation is complete, the results are not automatically saved. To save
the results, select File:Save or File:Save as.
Grid
The Grid sub-menu options control the setup and calculation of the 2D or 3D grid. The sub-menu
options include 2D-Grid, 3D-Grid, and Calculate.
2D-Grid
The Grid:2D-Grid menu option defines the gridding parameters needed for estimating two-dimensional data, typically data in the X-Y, X-Z, or Y-Z plane. When selected, the pop-up dialog shown in Figure 10.18 is displayed. This dialog defines the column location of the data within the data file, and the extents and the density of the calculated grid.
FIGURE 10-18. Grid:2D Grid pop-up
dialog. This dialog allows the user to
set gridding parameters needed for
estimating a two-dimensional grid.
This dialog allows the user to define the data-file columns specific to the problem, and the gridding extents of the desired map.
NOTE: If this option is used, the saved grid file will be saved as if the data came from the X-Y plane, even if they really came from the X-Z or Y-Z plane.
By default, the program assumes the X data is in column 1, the Y data in column 2, and the Z (elevation) data in column 3. If your data file is prepared differently, or you have multiple Z data columns, you will have to redefine the column definitions.
The Grid Dimensions area is where the density and the extent of the grid are specified. The number of Rows and Columns desired is a matter of user preference; as a rule of thumb, a denser grid (more rows and columns) will generate a smoother, but not necessarily more accurate, surface. The X and Y Minimum and Maximum values are by default set to the extents of the X and Y data. The grid extents are user definable, whether to focus on an area of interest within the gridded data or simply to shift the values to whole numbers (e.g. 0.19 is close to 0.0) so the coordinate labels on contour maps are not too busy.
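If approximately square cells are desired, the number of rows can be derived from the extents and the chosen number of columns; a small helper (not part of grid) showing the arithmetic:

```python
def square_cells(x_min, x_max, y_min, y_max, columns):
    """Number of rows giving approximately square cells for the
    given extents and number of columns."""
    cell = (x_max - x_min) / columns          # cell width in X
    return max(1, round((y_max - y_min) / cell))
```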
3D-Grid
The Grid:3D-Grid menu option is essentially the same as the 2D-Grid menu option described
above, but is used to define the grid for three-dimensional data. In addition to the fields shown in
the 2D dialog (Figure 10.18), the 3D dialog (Figure 10.19) has entries for the Z Column, the
number of Layers in the grid, and the Z Minimum and Z Maximum extents of the model grid.
FIGURE 10-19. Grid:3D Grid pop-up
dialog. This dialog allows the user to
set gridding parameters needed for
estimating a three-dimensional grid.
This dialog allows the user to define the data-file columns specific to the problem, and the gridding extents of the desired map.
Calculate
Based on which gridding option has been selected, 2D or 3D (highlighted in red on the Grid pull-down menu), selecting the Grid:Calculate option calculates the requested grid. As the grid is being generated, the status of the solution is displayed in the log window shown in Figure 10.1. When the calculation is done, the message Grid Complete will be displayed in the log/status window.
NOTE: Grid:Calculate only calculates the grid. Nothing is saved to the hard disk. If File:Quit Without Saving is used to terminate the program after the calculation, the calculated grid is lost. The user must specifically save the file or quit using the File:Quit menu option. The simple quit option checks to see if a grid has been calculated and not saved; if it has not been saved, the user will be queried for a file name.
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, Clear, and Print. View Log, Save,
and Save as are similar in operation to the menu options under File described above.
View
View allows the user to display the calculated grid as a Contour Map, Surface Map, or a Block Map.
When these menu items are selected, the program saves the calculated grid. If no file has been saved before, the user is prompted as to whether the data should be saved to a file (Figure 10.20). If the Save & Plot button is pressed, the grid will be saved to the last selected plot file. If no file has been selected previously, the program will request a file name (a dialog similar to Figure 5.2 appears). Once the file has been saved, grid passes the data file to the program
contour (Chapter 11), surface (Chapter 12), or block (Chapter 13) respectively. Once the
visualization package is open, it is separate and no longer dependent on grid. To terminate the
program you must quit it separately. If the Cancel button is pressed instead, the calculated grid is
not saved and the visualization package is not called.
FIGURE 10-20. Save & Plot or Cancel pop-up dialog. This dialog appears when the user requests to view the gridded data using contour, surface, or block and the calculated grid has not been saved. To view the calculated grid, it must first be saved to a plot file. To save the calculated grid, press Save & Plot. To stop, press the Cancel button.
NOTE: It is good practice to close block, contour, or surface when you are done, because the next time you view a file using grid's View menu item it will start a new instance of the program (i.e. you can have multiple versions of block, contour, or surface running simultaneously); each program uses additional memory, and if you are not careful you can tax the limits of the computer's RAM.
Help
Help works exactly as explained in the Help section of Chapter 5 (plotgraph, Figure 5.15).
UNCERT Users Manual
Grid
Inverse-Distance Example
To grid the data shown in Figure 10.21, and create the contour map shown in Figure 10.22 follow
the steps below:
FIGURE 10-21. This map shows the distribution of modeled contaminant concentration values
which need to be interpolated to a regular grid. The overlying mesh shows one possible grid
configuration. NOTE: This map was generated by contour (Chapter 11).
FIGURE 10-22. This map shows a contour map of the gridded data. The crosses (+) show the
location of the actual data points. The map was gridded using a 50(X) by 25(Z) grid. NOTE: This
map was generated using contour (Chapter 11).
First, select the inverse-distance gridding method from the menu-bar. This does two things: 1) it
sets the method used when the grid is calculated, and 2) it creates the pop-up dialog shown in
Figure 10.2, allowing parameters of the gridding method to be defined. For this example the
default Solution Parameters and the Search Type are reasonable.
Next, select the Grid:2D Grid menu-bar option. The pop-up dialog shown in Figure 10.18 will be
created. The Y Column, however, needs to be redefined. Since this data set is a cross-section (X-Z
plane) we need to map the Z data to the Y-axis on the grid (the program always grids using the X and Y
axes). In this case the Y Column should specify the Z data column in the file; enter 3 in the Y
Column field. The Value Column now specifies data column 3; this needs to be changed to column
4. When you hit <RETURN> in each field, or press the Apply dialog button, the Minimum and
Maximum File Data values should appear as:
X Column:        0.00     to   350.0
Y Column:        12.5     to   139.0
Value Column:   -0.0175   to   100.0
The Grid Dimensions are reasonable, but if you wanted the grid cells to be approximately square,
you could change the number of Columns (X) to 50 (grid cell 7.14 x 5.23). Exit the dialog by
pressing the Done dialog button.
To calculate the grid, select Grid:Calculate from the menu-bar, or press either Calculate button on
the Inverse-Distance dialog or the 2D Grid dialog. This starts the calculation process. In the text
area of the main window the status of the solution is displayed. As each column is calculated it is
printed. When the calculation is completed, the message Grid Complete is displayed. Note that
the program has only calculated the grid; no results have been saved to a file! To save the results,
use the File:Save or File:Save as menu-bar options, or the Save button in the Inverse-Distance
dialog.
To view the results, select from the menu-bar View:Contour Map or View:Surface Map. If the
results have been saved, contour (Chapter 11) or surface (Chapter 12) is called and the gridded
results of the data set are mapped. If the results haven't been saved you will be requested to save
the results. If no file has been previously saved, you will be queried for a file name; otherwise the
results will be saved to the file used last (i.e. the previous results will be overwritten).
FIGURE 10-23 a,b. These maps show the first order trend-surface (a.) and the residual map (b.) for
the contaminant concentration data set.

This process can be repeated for each trend-surface order, two through five. The resultant trend-surface
and residual maps are shown in Figures 10-24a and 10-24b, Figures 10-25a and 10-25b,
Figures 10-26a and 10-26b, and Figures 10-27a and 10-27b.

FIGURE 10-24 a,b. These maps show the second order trend-surface (a.) and the residual map (b.).

FIGURE 10-25 a,b. These maps show the third order trend-surface (a.) and the residual map (b.).

FIGURE 10-26 a,b. These maps show the fourth order trend-surface (a.) and the residual map (b.).

FIGURE 10-27 a,b. These maps show the fifth order trend-surface (a.) and the residual map (b.).
Running from the Command Line

Command-line flags allow the user to accomplish almost anything that is possible from within
the X-windows application from the command line. This feature can be useful when the user does
not have an X-windows/Motif terminal available, or when many graphs need to be processed
quickly and the operation can be completed in batch mode without user interaction.
Syntax:grid [-anovae ] [-anovaf ] [-cl #] [-dft #] [-dfte ] [-dftf ] [-dfti #] [-dfto #]
[-do {#}] [-este ] [-estf ] [-gd #] [-help] [-kdbge ] [-kdbgf ] [-kdbgl #]
[-kst #] [-kt #] [-lgf ] [-ly #] [-m1 {#}] [-m2 {#}] [-m3 {#}] [-m4 {#}] [-mode ]
[-modf { }] [-ng {#.#}] [-nm {#}] [-np #] [-npmax #] [-npmin #] [-nz #] [-octmax #]
[-out ] [-pr #.#] [-prf ] [-r1 {#.#}] [-r2 {#.#}] [-r3 {#.#}] [-r4 {#.#}] [-rese ]
[-resf ] [-run] [-rw #] [-s1 {#.#}] [-s2 {#.#}] [-s3 {#.#}] [-s4 {#.#}] [-sang1 {#.#}]
[-sang2 {#.#}] [-sang3 {#.#}] [-sn #] [-so #] [-sr #.#] [-st #] [-syaniso {#.#}]
[-szaniso {#.#}] [-tl #.#] [-tmmax #.#] [-tmmin #.#] [-va11 {#.#}] [-va12 {#.#}]
[-va13 {#.#}] [-va14 {#.#}] [-va21 {#.#}] [-va22 {#.#}] [-va23 {#.#}] [-va24 {#.#}]
[-va31 {#.#}] [-va32 {#.#}] [-va33 {#.#}] [-va34 {#.#}] [-vc #] [-xbdes #] [-xc #]
[-xmax #.#] [-xmin #.#] [-ya1 {#.#}] [-ya2 {#.#}] [-ya3 {#.#}] [-ya4 {#.#}] [-ybdes #]
[-yc #] [-ymax #.#] [-ymin #.#] [-za1 {#.#}] [-za2 {#.#}] [-za3 {#.#}] [-za4 {#.#}]
[-zbdes #] [-zc #] [-ze ] [-zf { }] [-zmax #.#] [-zmin #.#] [-zre ] [-zrf { }]
[filename]
Meaning of flag symbols:
#      =  integer
#.#    =  float
(none) =  character string
{ }    =  variable is an array. Values must be separated by a "," and no spaces are
          allowed. Do not use the { } symbols on the command line.
-anovae        default = *.anova
-anovaf        default = junk.anova
-cl            default = 25
-dft           default = 0
-dfte          default = *.dft
-dftf          default = junk.dft
-dfti          default = 5
-dfto          default = 4
-do {}         default = 0
-este          default = *.est
-estf          default = junk.est
-gd            default = 0
-help
-kdbge         default = *.dbg
-kdbgf         default = junk.dbg
-kdbgl         default = 0
-kst           default = 0
-kt            kriging method (0 = simple, 1 = ordinary); default = 1
-lgf           default = 0
-ly            default = 0
-m1 {}         default = 0
-m2 {}         default = 0
-m3 {}         default = 0
-m4 {}         default = 0
-mode          default = *.mod
-modf {}       default = (none)
-ng {}         default = 0.0
-nm            default = 1
-np            default = 10
-npmax         default = 16
-npmin         default = 4
-nz            default = 1
-octmax        default = 4
-out           default = (none)
-pr            default = 2.0
-prf           default = grid.prf
-r1 {}         default = 0.0
-r2 {}         default = 0.0
-r3 {}         default = 0.0
-r4 {}         default = 0.0
-s1 {}         default = 0.0
-s2 {}         default = 0.0
-s3 {}         default = 0.0
-s4 {}         default = 0.0
-rese          default = *.res.dat
-resf          default = junk.res.dat
-run
-rw            default = 25
-sang1 {}      default = 0.0
-sang2 {}      default = 0.0
-sang3 {}      default = 0.0
-sn            default = 0
-so            default = 1
-sr            default = max. diagonal
-st            default = 0
-syaniso {}    default = 1.0
-szaniso {}    default = 1.0
-tl            default = 1/10 grid spacing
-tmmax         default = 1.0e30
-tmmin         default = -1.0e30
-va11 {}       default = 0.0
-va12 {}       default = 0.0
-va13 {}       default = 0.0
-va14 {}       default = 0.0
-va21 {}       default = 0.0
-va22 {}       default = 0.0
-va23 {}       default = 0.0
-va24 {}       default = 0.0
-va31 {}       default = 0.0
-va32 {}       default = 0.0
-va33 {}       default = 0.0
-va34 {}       default = 0.0
-vc            default = 4
-xbdes         default = 1
-xc            default = 1
-xmax          default = maximum data X
-xmin          default = minimum data X
-ya1 {}        default = 1.0
-ya2 {}        default = 1.0
-ya3 {}        default = 1.0
-ya4 {}        default = 1.0
-ybdes         default = 1
-yc            default = 2
-ymax          default = maximum data Y
-ymin          default = minimum data Y
-za1 {}        default = 1.0
-za2 {}        default = 1.0
-za3 {}        default = 1.0
-za4 {}        default = 1.0
-zbdes         default = 1
-zc            default = 3
-ze            default = *.zon
-zf {}         default = (none)
-zmax          default = maximum data Z
-zmin          default = minimum data Z
-zre           default = *.zrl
-zrf {}        default = (none)
Example:

grid -cl 50 -rw 50 -xc 1 -yc 2 -vc 4 -sn 0 -xmin 0.0 -ymin 0.0 water.dat
cells are in which zone, use the NODE CENTERED GRID format described in Chapter 11
(contour). If the grid cell is within the zone, this is indicated with a 1; otherwise a 0 is used.
The file describing the inter-zone relationships has the following format:
[Number of Zones]
for (i = 1; i <= Number of Zones - 1; i++)
    for (j = 1; j <= Number of Zones - 1; j++)
        [Inter-Relationship]   (0 = Sharp, 1 = Gradational, 2 = Fuzzy)
    end for
end for
for (i = 1; i <= Number of Zones - 1; i++)
    for (j = 1; j <= Number of Zones - 1; j++)
        [Fuzzy Boundary Thickness]
    end for
end for
For an example, see data file grad.zrl.
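As an illustration of the file layout described above, the following sketch reads such a file. It is not UNCERT's reader: the function name `read_zrl` is hypothetical, and it assumes the values are whitespace-separated.

```python
# Hypothetical reader for the inter-zone relationship (*.zrl) format
# described above. The loop bounds follow the format specification:
# (Number of Zones - 1) x (Number of Zones - 1) entries per block.
def read_zrl(text):
    tokens = iter(text.split())
    nzones = int(next(tokens))
    k = nzones - 1
    # Block 1: inter-relationship codes (0 = Sharp, 1 = Gradational, 2 = Fuzzy).
    relationship = [[int(next(tokens)) for _ in range(k)] for _ in range(k)]
    # Block 2: fuzzy boundary thicknesses.
    thickness = [[float(next(tokens)) for _ in range(k)] for _ in range(k)]
    return nzones, relationship, thickness

# Two zones: one relationship entry (1 = gradational) and one thickness.
print(read_zrl("2  1  5.0"))  # (2, [[1]], [[5.0]])
```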
Output Files
The output file formats depend on whether a 2D or a 3D grid were generated. 2D files are
formatted as specified for the programs contour and surface (Chapters 11 and 12). 3D files are
formatted for the program block (Chapter 13). See the relevant chapters for a description of the
data file formats.
Grid Mathematics
Inverse-Distance Mathematics
Inverse-distance techniques are often used to estimate the value of a parameter at locations where
no specific data exists. This is done on a regular grid overlaying sparse, randomly located data
(This can be done in either 1, 2, or 3 dimensions). The technique estimates values at a point by
weighting the influence of nearby data the most, and more distant data the least. This can be
described mathematically by (Burrough, 1986):
g_i = \frac{\sum_{i=1}^{n} x_i / d_i^{\,p}}{\sum_{i=1}^{n} 1 / d_i^{\,p}}    (10-1)
where gi is the estimated value at the grid location, di is the distance between the grid location and
the sample data, and p is the power to which the distance is raised. The basis of this technique is
that nearby data are most similar to the actual field conditions at the grid location. Depending on
the site conditions the distance may be weighted in different ways. If p = 1, this is a simple linear
interpolation between points. Many people have found that p = 2 produces better results. In this
case, close points are heavily weighted, and more distant points are lightly weighted (points are
weighted by 1 / d2). At other sites, p has been set to other powers and yielded reasonable results.
Inverse-distance is a simple and effective way to estimate parameter values at grid locations of
unknown value. Beside the directness and simplicity, however, inverse-distance techniques have a
number of shortcomings. Some of these are:
If a data point is coincident with a grid location (di = 0), a division by zero occurs unless it is
treated specially. Options:
if di = 0, gi = xi.
if di < a minimum distance to the grid location, gi = xi.
The pattern and density of data collection can inappropriately weight results.
When residual errors are calculated, other contouring techniques such as kriging and
minimum curvature produce far better results (Kirk, 1991).
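As an illustration of Equation 10-1 and the coincident-point handling above, a minimal sketch (not UNCERT's implementation; the function name `idw_estimate` is hypothetical) might read:

```python
import math

def idw_estimate(points, gx, gy, p=2.0):
    """Inverse-distance estimate at grid location (gx, gy).

    points is a list of (x, y, value) samples; p is the distance power.
    """
    num = den = 0.0
    for x, y, v in points:
        d = math.hypot(x - gx, y - gy)
        if d == 0.0:
            return v  # coincident point: g_i = x_i, avoiding division by zero
        w = 1.0 / d ** p
        num += w * v
        den += w
    return num / den

# Midway between two samples the weights are equal, so the estimate is
# the simple average of the two values.
print(idw_estimate([(0.0, 0.0, 10.0), (2.0, 0.0, 20.0)], 1.0, 0.0))  # 15.0
```

Larger values of p concentrate the weight on the nearest samples; p = 2 gives the 1/d² weighting discussed above.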
Kriging Mathematics
Trend-Surface Analysis Mathematics
Trend-surface analysis is different from kriging and inverse-distance estimation techniques, which
try to estimate local features; trend-surface analysis is a mathematical method used to separate
regional from local fluctuations (Davis, 1973). What is defined as local and regional is
also often subjective and a function of scale, and the regional trend may vary with scale. The use
of trend analysis allows an observed data point to be divided into these two components. A trend can
be defined by three components (Davis, 1973):
It is based on geographic coordinates; i.e. the distribution of material properties can be
considered to be a function of location.
The trend is a linear function. That is, it has the form:

V = b_1 X + b_2 Y + \cdots

where V is the data value at the described location, the b's are coefficients, and X and Y
are combinations of geographic location.
The optimum trend, or linear function, must minimize the squared deviations from the
trend.
An example is shown in Figure 10.28. For a set of data, a line or surface may be fit exactly through
each point, but a trend (a mathematical equation) can be defined which approximates the data well
regionally with only minor local variations.
FIGURE 10-28. These graphs show the concept of trend surfaces (in 2D). Pane a) shows
the actual field data and the actual surface; pane b) shows a first order trend (a line or plane); c) is
a second order (parabolic) trend; and d) is a third order (cubic) trend. The shading represents
positive and negative deviations about the trend (Davis, 1984).
Trend-surface analysis is basically a linear regression technique, but it is applied in two and three dimensions instead of just fitting a line. A first order linear trend surface equation has the form:
V = b_0 + b_1 X + b_2 Y    (10-2)
That is, an observation with a value V can be described as a linear function of a constant value (b0)
related to the data set mean, an east-west component (b1), and a north-south component (b2)
(Davis, 1973). To solve for these three unknowns, three normal equations are available (Davis, 1973):
\sum_{i=1}^{n} V_i = b_0\, n + b_1 \sum_{i=1}^{n} X_i + b_2 \sum_{i=1}^{n} Y_i    (10-3)
\sum_{i=1}^{n} X_i V_i = b_0 \sum_{i=1}^{n} X_i + b_1 \sum_{i=1}^{n} X_i^2 + b_2 \sum_{i=1}^{n} X_i Y_i    (10-4)

\sum_{i=1}^{n} Y_i V_i = b_0 \sum_{i=1}^{n} Y_i + b_1 \sum_{i=1}^{n} X_i Y_i + b_2 \sum_{i=1}^{n} Y_i^2    (10-5)
where n is the number of data points. Solving these equations simultaneously will yield a best fit, defined by least-squares regression, for a two-dimensional, first-order (a plane) trend surface.
This can be rewritten in matrix format:
\begin{bmatrix}
n & \sum_{i=1}^{n} X_i & \sum_{i=1}^{n} Y_i \\
\sum_{i=1}^{n} X_i & \sum_{i=1}^{n} X_i^2 & \sum_{i=1}^{n} X_i Y_i \\
\sum_{i=1}^{n} Y_i & \sum_{i=1}^{n} X_i Y_i & \sum_{i=1}^{n} Y_i^2
\end{bmatrix}
\begin{bmatrix} b_0 \\ b_1 \\ b_2 \end{bmatrix}
=
\begin{bmatrix} \sum_{i=1}^{n} V_i \\ \sum_{i=1}^{n} X_i V_i \\ \sum_{i=1}^{n} Y_i V_i \end{bmatrix}    (10-6)
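The steps above can be sketched numerically: assemble the sums of the normal equations and solve the 3x3 system. This is a minimal illustration, not UNCERT's code; the function name `fit_plane` and the sample data are hypothetical.

```python
def fit_plane(data):
    """Fit the first-order trend surface V = b0 + b1*X + b2*Y by least squares.

    data is a list of (X, Y, V) observations. Returns (b0, b1, b2).
    Assumes a non-degenerate system (no pivoting is performed).
    """
    n = len(data)
    sx = sum(x for x, _, _ in data)
    sy = sum(y for _, y, _ in data)
    sxx = sum(x * x for x, _, _ in data)
    syy = sum(y * y for _, y, _ in data)
    sxy = sum(x * y for x, y, _ in data)
    sv = sum(v for _, _, v in data)
    sxv = sum(x * v for x, _, v in data)
    syv = sum(y * v for _, y, v in data)
    # Augmented matrix of the three normal equations.
    A = [[n, sx, sy, sv],
         [sx, sxx, sxy, sxv],
         [sy, sxy, syy, syv]]
    # Gauss-Jordan elimination on the 3x4 augmented system.
    for i in range(3):
        pivot = A[i][i]
        A[i] = [a / pivot for a in A[i]]
        for j in range(3):
            if j != i:
                factor = A[j][i]
                A[j] = [a - factor * b for a, b in zip(A[j], A[i])]
    return A[0][3], A[1][3], A[2][3]

# Data sampled from V = 1 + 2X + 3Y is recovered (to rounding) exactly.
print(fit_plane([(0, 0, 1), (1, 0, 3), (0, 1, 4), (1, 1, 6), (2, 1, 8)]))
```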
This approach can also be applied to higher order surfaces and to three dimensions. In three dimensions, the equivalent equation would be:

V = b_0 + b_1 X + b_2 Y + b_3 Z    (10-7)
\begin{bmatrix}
n & \sum X_i & \sum Y_i & \sum Z_i \\
\sum X_i & \sum X_i^2 & \sum X_i Y_i & \sum X_i Z_i \\
\sum Y_i & \sum X_i Y_i & \sum Y_i^2 & \sum Y_i Z_i \\
\sum Z_i & \sum X_i Z_i & \sum Y_i Z_i & \sum Z_i^2
\end{bmatrix}
\begin{bmatrix} b_0 \\ b_1 \\ b_2 \\ b_3 \end{bmatrix}
=
\begin{bmatrix} \sum V_i \\ \sum X_i V_i \\ \sum Y_i V_i \\ \sum Z_i V_i \end{bmatrix}    (10-8)

(all sums run over i = 1, ..., n). A second order trend surface in two dimensions has the form:

V = b_0 + b_1 X + b_2 Y + b_3 X^2 + b_4 XY + b_5 Y^2    (10-9)
V = b_0 + b_1 X + b_2 Y + b_3 X^2 + b_4 XY + b_5 Y^2 + b_6 X^3 + b_7 X^2 Y + b_8 XY^2 + b_9 Y^3    (10-10)

V = b_0 + b_1 X + b_2 Y + b_3 X^2 + b_4 XY + b_5 Y^2 + b_6 X^3 + b_7 X^2 Y + b_8 XY^2 + b_9 Y^3 + b_{10} X^4 + b_{11} X^3 Y + b_{12} X^2 Y^2 + b_{13} XY^3 + b_{14} Y^4    (10-11)
where SST is the total sum of squares, SSR the regression sum of squares, SSE the residual sum of squares, and MST, MSR, and MSE the corresponding mean squares:

SST = \sum_{i=1}^{n} V_i^2 - \frac{\left(\sum_{i=1}^{n} V_i\right)^2}{n} \qquad
SSR = \sum_{i=1}^{n} \hat{V}_i^2 - \frac{\left(\sum_{i=1}^{n} V_i\right)^2}{n}    (10-16, 10-17)

SSE = SST - SSR \qquad MST = \frac{SST}{n-1}    (10-18, 10-19)

MSR = \frac{SSR}{m}    (10-20, 10-21)

MSE = \frac{SSE}{n-m-1}    (10-22, 10-23)
R^2 = \frac{SSR}{SST}    (10-24)

R = \sqrt{R^2}    (10-25)

F_{statistic} = \frac{MSR}{MSE}    (10-26)
where n equals the number of data observations, and m equals the number of coefficients in the
trend-surface equation. R, R2, and the Fstatistic are terms used to evaluate the goodness of fit. R is
referred to as the coefficient of multiple correlation, and R2 x 100% reflects the percentage of the data
variation explained by the regional trend-surface (Davis, 1986). The Fstatistic is used with the F-test
to determine if the group of trend-surface coefficients is significantly different from zero; i.e. whether the
regression effect is significantly different from the random variation of the data. In formal
statistical terms, the F-test for significance of fit tests the hypothesis (H0) against the alternative (H1):
H_0: b_1 = b_2 = b_3 = \cdots = b_m = 0
H_1: b_1, b_2, b_3, \ldots, b_m \text{ not all equal to } 0    (10-27)
The hypothesis tested is that the partial regression coefficients equal zero, i.e. there is no
regression. If the computed F value exceeds the table value of F (Tables 10.1a, 10.1b, and 10.1c),
the null hypothesis is rejected and the alternative is accepted, i.e. the coefficients in the
regression are significant and the regression is worthwhile.
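The goodness-of-fit quantities of Equations 10-24 through 10-26 can be sketched as follows. This is an illustration only, not UNCERT's code; the function name `fit_statistics` and the sample numbers are hypothetical, and the sums of squares are computed in the equivalent deviation-from-the-mean form.

```python
import math

def fit_statistics(observed, predicted, m):
    """R^2, R, and the F-statistic for a fitted trend surface.

    observed: measured values V_i; predicted: trend-surface estimates;
    m: number of trend-surface coefficients excluding b0 (regression df).
    """
    n = len(observed)
    mean = sum(observed) / n
    sst = sum((v - mean) ** 2 for v in observed)                  # total SS
    sse = sum((v - p) ** 2 for v, p in zip(observed, predicted))  # residual SS
    ssr = sst - sse                                               # regression SS
    r2 = ssr / sst
    f = (ssr / m) / (sse / (n - m - 1))                           # MSR / MSE
    return r2, math.sqrt(r2), f

# A close (but imperfect) fit gives R^2 a little below 1 and a large F.
r2, r, f = fit_statistics([1.0, 2.0, 3.0, 4.2, 5.1],
                          [1.1, 1.9, 3.0, 4.0, 5.2], 2)
print(round(r2, 3), round(r, 3))
```

The computed F would then be compared against the tabulated critical value for (m, n - m - 1) degrees of freedom.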
In addition to the problems of getting a good fit for the trend surface, there are a number of pitfalls to the technique (Davis, 1986):
1). There must be adequate data control. The number of observations should be much
greater than the number of coefficients.
2). The spacing of the observation points is important. It can affect the size and
resolution of features seen. Clustering can cause problems or bias. The distribution
can affect the shape of the surface.
3). There are problems near boundaries. The surface can blow up in the corners. For
this reason it is important to have a buffer around the area of concern. Near the edges the
problem changes from one of interpolating between data to one of extrapolating beyond
the data observations.
TABLE 10-1 a, b, c. Critical values of the F distribution at significance levels of 0.05 (a), 0.025 (b),
and 0.01 (c), tabulated by degrees of freedom for the numerator (v1) and for the denominator
(v2 = 1-30, 40, 60, 120, and INF). Source: From Table 22, The Penguin-Honeywell Book of Tables,
Copyright F.W. Kellaway (ed.) and Honeywell Controls Ltd. (E.D.P. Division), 1968.
Bibliography (grid)
Burrough, P.A., 1986, Principles of Geographical Information Systems for Land Resource
Assessment, Monographs on Soil and Resources Survey No. 12, Oxford Science Publications Clarendon Press, Oxford.
Davis, John C., 1986 (Second Edition), Statistics and Data Analysis in Geology, John Wiley &
Sons, New York, pp 405-425.
Deutsch, C.V., and A.G. Journel, 1992, GSLIB: Geostatistical Software Library and User's Guide,
Oxford University Press, New York.
Kirk, K.G., 1991, Residual Analysis for Evaluating the Robustness of Inverse Distance, Kriging,
and Minimum Tension Gridding Algorithms, GeoTech/GeoChautauqua 91 Conference
Proceedings, Lakewood, Colorado, pg. 15 (presentation).
Wingle, W.L., 1992, Examining Common Problems Associated with Various Contouring Methods,
Particularly Inverse-Distance Methods, Using Shaded Relief Surfaces, Geotech 92
Conference Proceedings, 1992, Lakewood, Colorado.
CHAPTER 11
Contour
The contour application contours and performs gradient analysis for two- and three-dimensional,
regularly gridded data. However, only two-dimensional views are possible. Three-dimensional data
sets can be viewed along X-Y, X-Z, and Y-Z planes. Profile lines along the contoured surface can
also be plotted.
NOTE: Contour can read both grid centered and node centered meshes, but node centered
meshes are converted to grid centered meshes inside the application. This is
necessary because of several internal algorithms. This has the effect of averaging
the grid values slightly.
The contour application is composed of two sections (Figure 11.1); the main menu-bar and the
drawing or plot area. The menu-bar is used to select all contour commands and the drawing area is
the display area for the contour maps.
FIGURE 11-1. This is an example of the contour application window. The main menu-bar is on the
top of the application window, with the drawing or plot area below.
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Open, View, Save Preferences, Print Setup, Print, and Quit.
Open
Selecting File:Open generates a pop-up dialog which allows the user to select an existing data file.
This dialog operates as the File:Open dialog in Chapter 5 (plotgraph Figure 5.2). However, the
standard file extension for contour is *.srf (SURFER - *.grd).
View
File:View pops up a simple screen editor with the last saved version of the data file being used.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
reasonable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, contour determines how all the
input variables are currently defined and writes them to the file contour.prf.
WARNING:
If contour.prf already exists, you will be warned that it is about to be over-written.
If you do not want the old version destroyed, you must move it to a
new file (e.g. the UNIX command mv contour.prf contour.old.prf would be
sufficient). When you press OK the old version will be over-written! This
cannot currently be done from within the application. To rename the file you will
have to execute the UNIX mv command from a UNIX prompt in another
window.
If contour.prf does not exist in the current directory, it is created. This is an ASCII file and can
be edited by the user. See Appendix C for details.
Print Setup
File:Print Setup works exactly as explained in Chapter 5.
Print
File:Print generates a Postscript file of the calculated spatial measure, and depending on how the
print options are defined in Print Setup, directs this file to the specified print queue or to the
specified file.
Quit
File:Quit terminates the program.
Blanking
The Blanking menu option allows polygonal areas of the map to be blanked out: within these areas contour
lines will be hidden or blanked out (Figure 11.2). This same method can be used to locate surface
features such as buildings or roads (Figure 11.3).
FIGURE 11-2. In addition to viewing data in plan view (Figure 11.3) and orienting the map
vertically on the page, this contour map shows a cross-sectional view of contaminant concentration
levels (conc3.srf) printed in a landscape format. Here the blanked out areas represent areas of no
data (bedrock, and above ground surface).
This is a simple blanking method. More sophisticated contouring packages are available which
force the contour lines to meet the edge of a blanked-out area at right angles. For ground water
flow, this method is more accurate, but it is not supported by contour.
Selecting Blanking:Modify pops up the blanking input dialog (Figure 11.4). The default is no
blanking. To use blanking, set the blank zones toggle to true, and select a blank filename (the
default extension is *.blk). Once the file has been selected, the file can be viewed using the view file
button option. If blanking is no longer desired, set the blank zones toggle to false.
NOTE: The polygons in the blanking file are not drawn on the contour map until, either the
Apply or the Done dialog button is pressed.
It is also possible to just draw lines on the contoured surface, or draw lines and blank out different
zones. By default the Fill Closed Blank Zones option is True. In this state, all closed loops in the
blanking file will be blanked out. If the loop is not closed, only a line will be drawn. If the Fill
Closed Blank Zones option is set to False, closed loops will also be drawn as lines (Figure 11.5).
FIGURE 11-3. This contour map shows several features of contour (file dig.grd). The map was
printed with a portrait orientation using blanking and data posting options. Contour lines of
variable weight are available, a grid may be overlain, surficial features such as building outlines
(rectangular whited-out areas) may be positioned, and the location of important points (assay
values, well IDs, etc.) can be identified.
FIGURE 11-4. Blanking Data pop-up dialog. This dialog allows the user to determine if an area of
the map should be blanked out and if line work should be drawn, and what file should be used to
define the areas.
FIGURE 11-5. In addition to drawing contours and blanking areas of the map (Figure 11.2), it is
possible to draw line work on the map. In this case, the gray shading represents the thickness of
the aquifer, and the line work shows the position of the river and the channel levees.
Contour
The contour menu option allows the user to specify parameters concerning the spacing and
appearance of the contour lines, and if the data set is three-dimensional, which plane of the data
grid will be viewed.
Active Plane
When a three-dimensional grid file is loaded, this option becomes available. Contour:Active
Plane allows the user to specify which plane of the grid will be displayed and contoured. The
pop-up dialog shown in Figure 11.6 controls these options. The contour map can be cut along the X-Y,
X-Z, or Y-Z plane. Once the plane orientation has been selected, the Layer, Column, or active Row
may be selected. Pressing the Previous or Next buttons will step contour through the selected
planes. To determine which plane you want to select, refer to Figure 11.7.
Parameters
When the contour map is initially drawn, contour will select minimum and maximum contour levels and a contour interval. The minimum contour level will be set to the minimum grid value in the grid file (NOTE: this is not necessarily the same as the minimum value in the field data set; the grid file is composed of interpolated values). The contour interval will be set so that there are nine contour lines equally spaced between the minimum and maximum grid value (See Appendix C, Preference Files, or the Running from the Command Line section below on how to override default values). These values are often not exactly what the user needs. This menu selection allows the user to specify details about the contour interval and range, and the appearance of the contour lines themselves.
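The default levels described above can be sketched in a few lines of Python (an illustration only, not UNCERT code; the function name and signature are hypothetical). Nine equally spaced interior lines imply an interval of one-tenth of the grid value range, which matches the -cint default of dz / 10.0:

```python
def default_contour_levels(z_min, z_max, n_lines=9):
    """Default contour levels as described above: n_lines equally spaced
    contour lines strictly between the grid minimum and maximum."""
    interval = (z_max - z_min) / (n_lines + 1)   # dz / 10 for nine lines
    levels = [z_min + i * interval for i in range(1, n_lines + 1)]
    return interval, levels

interval, levels = default_contour_levels(0.0, 100.0)
# interval = 10.0; levels run 10.0, 20.0, ..., 90.0
```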
Selecting Contour:Modify shows the pop-up dialog shown in Figure 11.8. The minimum and maximum grid file values are shown; these are the default minimum and maximum contour levels (i.e. contour lines (representing equal Z values) will only be drawn if they fall between these two values). In some cases it is desirable not to contour the entire data range. For example, the purpose of the map might be to examine subtle differences in relatively flat portions of the map; by
contouring the entire map with the desired contour interval, steep zones can become overly cluttered, and if a larger contour interval is selected, the desired detail in the flat areas is lost. Note that the base contour interval is the minimum contour level. The character of the contour lines themselves is also user definable. There are three methods with which contour lines can be drawn through the grid: Linear, Akima Spline, and Cubic Spline. With the linear method, straight lines are drawn between grid block borders (Figure 11.9). Using either of the splining techniques will smooth the contour lines. Determining which method to use is a matter of personal preference. Some argue that a linear plot best represents the data, and others find the splined contour lines visually more pleasing. There are also two types of contour lines, main and minor. By default the main contour line is twice the thickness of the minor. On the computer screen, these values represent the thickness of the line in pixels (one is the smallest valid value for the screen; one will be assumed for smaller entries). For Postscript output, the values represent 1/144ths of an inch. Frequency is used to specify how often a main contour line is drawn. The base contour
line is a main contour line, and contour lines are drawn from minimum to maximum grid values.
Note, a frequency of five means that for every five contour lines, there will be one main and four
minor lines. By pressing the appropriate toggle, either or both the main and minor contour lines
can be dashed. By default, the main contour lines have a Spacing of approximately three inches (on
different size monitors, this will vary). The default label Format is general (See Appendix A).
Contour labels can be removed by setting the Label Contours toggle to false.
FIGURE 11-8. Contour pop-up dialog. This dialog controls the appearance of the contour lines.
The extents of the contour lines (minimum and maximum) and the contour interval are user
definable, as are the contour line thickness and frequency. Either the main or the minor contour
lines may be dashed or solid. In addition to drawing the map with contour lines, the grid cells
themselves can be color coded by elevation.
Sometimes when new files are loaded, the contour interval used for the previous map, or the contour values set in the preference file, are inappropriate; pressing the Auto-Reset Contour Range button will recalculate reasonable values. These values will probably not be ideal, but they give a starting point.
By default, the contour map is drawn with a color (or gray) shaded background and white (black on Postscript output) contour lines. This makes an attractive map on the screen which quickly identifies differing Z values and trends. Drawing this background does, however, slow the screen refresh down, and when printing to a gray scale printer, the higher levels print dark gray to black, making the contour lines themselves difficult to read. This option can be turned on or off using the Color Contour Grid Blocks toggle.
NOTE: When the color contour grid blocks option is turned off, the contour lines are drawn
using the specified color palette. See the Palette section below on how to specify
different color palettes.
Gradient
Rather than viewing just the elevation contour lines, it is sometimes useful to examine the gradient or slope of the contour lines or field data. Examining the gradient can be useful for identifying areas where there might be slope stability problems, ground water flow direction, or problem areas in the grid itself (the grid is interpolated from field data, and different algorithms can generate significant problems that bear no relation to the field data (Wingle, 1992)). In this package the gradient can be indicated by color coded blocks (no directional information; Figure 11.10a), by color coded arrows (Figure 11.10b), or by variable length arrows (Figure 11.10c).
The gradient at a particular point is not explicitly known, and therefore has to be extrapolated from the gridded data. To calculate a gradient vector three points are required (three points define a plane), but because the grid file uses rectangular cells, four are used. This unfortunately over-defines the plane, and the gradient represents an averaged gradient over the cell. A complete description of how the gradient is calculated is presented in the Contour Mathematics section below.
Gradient:Modify pops up the dialog shown in Figure 11.11. This dialog allows the user to activate the gradient option and specify values for the appropriate parameters. In the dialog, there is a color scale reference. The color scale range can be extended or narrowed by modifying the Gradient Color Range Minimum and Maximum values. These values are, by default, set to the minimum and maximum gradients found for the grid file. This is a linear scale, and sometimes a few spots with very steep gradients dominate the color scale. By reducing the maximum gradient for the color scale (often significantly), low gradient zones are better defined, with minimal detail loss in the steep zones. If these values have been changed, or specified on the command line or in a preference file, the extents of the gradients are not recalculated when a new file is loaded. Pressing the Maximize Gradient Range button will recalculate and respecify the gradient range and color palette.
Normally when plotting the contour maps, gradients are not plotted. To plot gradients, set the Draw Gradient Vectors toggle to true (setting this toggle to false prevents gradients from being plotted). When activated, the default settings plot the gradients using uniform length, color coded arrows (the arrows point down gradient). The arrows may also be scaled in length to indicate relative gradients; set the Scale Gradient Vectors toggle to true. The arrow tip size can be adjusted using the Arrow Tip Length text field. The arrow tip size should be reduced on very fine grids; otherwise the tips tend to overlap. If arrows make the map appear too busy, or directional information is not required, the Gradient Indicator Type can be switched from Arrow Vector to Block Fill. Block Fill fills an entire cell with one color representing the average cell gradient (Figure 11.10a).
FIGURE 11-10. Gradient display options: a) color coded block fill (no directional information), b) uniform length, color coded arrows, c) variable length arrows.
Posting
In many cases it is useful to post the location and values of the field data (e.g. to check whether the calculated grid values match the field data). It is also convenient to mark the location of wells and their IDs, or other such information. When posting a file, a "+" will mark the data point position, and a label may be associated with it. The label will be placed above and to the right of the symbol.
Posting:Modify pops up the dialog (Figure 11.12) which controls data posting. By default nothing
is posted. To post the data locations, set the Post Data toggle to true. To post the data labels, both
Post Data and Post Labels must be set to true. Once Post Data is set to true, a file can be selected
(the default extension is *.lbl) by pressing the Post Filename button. Note that this does not load
the file; it just selects the file name. If labels will be associated with the data location, specify the
Label Column. Valid labels are any character string with no spaces; the first space marks the end of
the label (See Setting up a Posting File below for more details). Once all the needed information is
selected, press the dialog Apply or Done button. This will load the label file and post the data as
requested.
Graph
Graph allows the user to specify various attributes about the appearance of the graph. These include the graph Border, Error-bar Styles, Fonts, Labels, Legends, Mesh, and Line Styles.
Border
Graph:Border is described in Chapter 5 in the Graph:Border section (Figure 5.9).
Fonts
Graph:Fonts is largely described in Chapter 5 in the Graph:Fonts section (Figures 5.10 and 5.11),
but the font selection for contour is slightly different (Figure 11.13). Different fonts may be
selected for the Main Title, the Secondary Title, the Axes Labels, the Division Labels, the Contour
Labels, the X-Section Labels (End labels on profile/cross-section line), the Annotations (Posted
Labels), and the mouse Position labels.
FIGURE 11-13. Font pop-up dialog. This dialog is used to define the X-windows and Postscript text fonts and font sizes for different portions of the contour map.
Labels
Graph:Labels is described in Chapter 5 in the Graph:Labels section (Figure 5.12).
Mesh
Graph:Mesh is described in Chapter 5 in the Graph:Mesh section (Figure 5.13).
Scale
Graph:Scale controls the scale bar at the bottom of the contour map. By default it is on and the units are in feet. The width of the scale-bar is divided into three width units, where each width unit is the X Major Tic Frequency wide (See the Graph:Border section). Eventually the scale-bar width will be specified from this menu, but that is not currently supported.
The scale-bar can be turned on or off by pressing the Use Scale-Bar toggle shown in Figure 11.14. The units label can be changed in the Units text field. The Scale Width option is not currently supported.
FIGURE 11-14. Scale pop-up dialog. This dialog allows the user to specify whether the map scale at the bottom of the map is drawn.
Palette
Color in contour is used to aid the user in interpreting the contour and gradient information. Different color palettes are more appropriate for different circumstances and different users, or better at displaying specific information. Several color palettes are available in contour.
Set Palette
There are six built-in palettes plus a user-defined option. Using the Palette:Set Palette menu the user can select Gray, Hue, Hue (Looped), Spectrum, Spectrum (Looped), White, or User Defined. The active selection is highlighted in red. The looped palettes refer to palettes where the same color represents both the minimum and maximum values.
Color Legend
To specify the range of the color palette, or view the numerical scale associated with the color palette, select Palette:Color Legend. The dialog shown in Figure 11.15 will be displayed. The color scale limits are set to the minimum and maximum values of the grid file by default. These limits can be changed by respecifying the Minimum and Maximum Color Scale Range. If the values have been reset, or specified from the command line or a preference file, and a new file is loaded, the color scale will probably not be correct. Pressing Maximize Scale Range will reset the color scale to the extents of the new data file. By default, values above and below the Minimum and Maximum Color Scale Range will also be identified with the minimum and maximum color code. By setting the Cutoff Color Outside Range toggle to true, values outside the range will not have any color coding. This can be useful for emphasizing only a specific contour range on the map (Figure 11.16). User palettes may also be loaded, and any palette may be saved to a file (See the Setting up a Palette File section below). These files, by default, have a *.pal extension. To make the user palette active, Palette:Set Palette:User Defined must be selected.
FIGURE 11-15. Color Scale Legend
NOTE: It is common to print color output to a black and white Postscript printer. This works because the color is dithered to a gray scale equivalent. Be careful with this though, because distinctly different colors (e.g. red and blue) can be dithered in such a way that they are difficult to differentiate.
FIGURE 11-16. This map is a fifth order trend surface residual map (See Chapter 9). The shaded areas are areas with a negative residual. The blockiness of the shaded borders is due to the shade fill method. The grid cells are what is shaded, not the contour intervals themselves.
TRICK & WARNING: When printing, if the background of the map is color contoured
(colored blocks) and you are printing to a black and white printer, some regions will
be dithered to black. The contour lines are also drawn black. As a result, in some
areas on the map, the contour lines will be very difficult to read. To avoid this
problem, just before printing, select the gray color palette (Palette:Set
Palette:Gray), then lower the Minimum Color Scale Range value. For a map with a
minimum value of 0.0, a maximum value of 100.0, setting the Minimum Color
Scale Range value to -33.0 would yield good results. Lowering the Minimum Color
Scale Range value by one-third the data range is a good rule of thumb.
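The rule of thumb above amounts to the following one-line computation (a hypothetical helper name, for illustration only):

```python
def gray_print_minimum(z_min, z_max):
    """Rule of thumb above: before gray-scale printing, lower the
    Minimum Color Scale Range by one-third of the data range."""
    return z_min - (z_max - z_min) / 3.0

# For the 0.0 to 100.0 map above this suggests roughly -33.
```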
Plot
Plot:Now and Plot:Refresh are described in Chapter 5 in the Plot:Now and Plot:Refresh sections.
Profile
It is often of interest to see a profile or cross-sectional view of an arbitrary line across a contour
map. Selecting the Profile:View menu-bar option generates a pop-up dialog similar to that shown in
Figure 11.17. Once the profile dialog is generated, the middle mouse button can be used to select
profile lines on the contour map. The features of this dialog are discussed below.
FIGURE 11-17. Section Profile pop-up dialog. This dialog displays and controls section profile lines on the contour map.
File
File is similar to File for the main contour program. This menu-bar option controls file handling,
printing, and terminating the profiling session.
Save
File:Save saves the X-Y coordinates of the latest profile line to a MULTIPLE LINE formatted
plotgraph file (See Chapter 5 for file format). The default file extension is *.dat. If a save file has
already been opened, the data are simply saved. If a save file has not been selected yet, a pop-up
dialog similar to that used in File:Open (Figure 5.2) is created. The main difference between the
Open and the Save dialog is that to save a file, the file does not have to pre-exist. For a description
of how the dialog works, see the Open section above and substitute Save for Open wherever
appropriate.
Save as
File:Save as is used to save the profile line X-Y coordinates to a user specified file. A pop-up
dialog similar to that used in File:Open (Figure 5.2) is created.
Print Setup
File:Print Setup works exactly as explained in Chapter 5.
NOTE: This Print Setup dialog is shared by both the main contour program and the line profile graphing tool. Changes made in one will affect the other.
Print
File:Print generates a Postscript file of the profile graph and, depending on how the print options are defined in Print Setup, directs this file to the specified print queue or to the specified file.
Quit
File:Quit terminates the line profiling session.
Line
The Line menu-bar option controls how profile lines will be created and drawn on the contour map. A new line is drawn each time the selection is repeated.
Random Line
Using Line:Random Line, a new profile line is created between where the mouse button (middle) is
first pressed, and where it is released. The X-axis of the graph is set to the length of the profile line.
FIGURE 11-18. This dialog allows the user to specify the map coordinate end points (A and A') for a profile line.
Graph
Graph allows the user to specify various attributes about the appearance of the graph. These include the graph Border, Error-bar Styles, Fonts, Labels, Legends, Mesh, and Line Styles.
Border
Graph:Border is described in Chapter 5 in the Graph:Border section (Figure 5.9). Note that the values are not necessarily the same as those in the same dialog in the main portion of contour. This same dialog is used by both contour and contour:profile, but the entries are separate.
Labels
Graph:Labels is described in Chapter 5 in the Graph:Labels section (Figure 5.12). Note that the values are not necessarily the same as those in the same dialog in the main portion of contour. This same dialog is used by both contour and contour:profile, but the entries are separate.
Section
Labels:Section is used to enter the Starting Label (A) and the Ending Label (A') for the profile line. These values are entered using the pop-up dialog shown in Figure 11.19.
Mesh
Graph:Mesh is described in Chapter 5 in the Graph:Mesh section (Figure 5.13). Note that the values are not necessarily the same as those in the same dialog in the main portion of contour. This same dialog is used by both contour and contour:profile, but the entries are separate.
FIGURE 11-19. This dialog allows the user to specify the profile section line end labels.
General Comments
In addition to the features of the profiler, several comments are needed about the appearance of the graph, plotting the profile section line on the contour map, and removing the profile line from the contour map.
Graph Appearance
The Section Profile graph is scaled on the X-axis to the length of the profile line, and on the Y-axis between the minimum and maximum contour levels. The X-axis division labels are positioned at every 0.2 * profile line length units. The Y-axis division labels, by default, are placed at every major contour interval. The start of the profile line is placed to the left (A), and the end is to the right (A'). Font sizes can be controlled using the Fonts:Modify option on the main contour menu-bar.
Help
Help works exactly as explained in the Chapter 5 (plotgraph, Figure 5.15) Help section.
FIGURE 11-20. This map is a zoomed-in region of the maps shown in Figures 11.1 and 11.2. Also shown is a section profile line (A-A') near the line graphed in Figure 11.17.
Zoom
When a map is plotted on the screen, the extents of the X and Y axes can be changed in Border:Modify, or more simply, if less precisely, using the mouse. To zoom into an area on the map: 1) picture the rectangular region of interest, 2) move the mouse pointer to one corner, 3) press and hold down the right mouse button, 4) drag the mouse pointer to the opposite corner of interest (a rectangle will be drawn as the mouse is moved), and 5) when the area of interest is enclosed in the temporary rectangle, release the mouse button (you may have to nudge the mouse pointer slightly after releasing the button). The new region will then be redrawn and scaled to the screen (Figure 11.20). To zoom out, you can only return to the full map; press and release the right mouse button anywhere in the map window.
NOTE:
This feature only works for the contour map. It is not possible to zoom into portions
of the section profile.
then those passed on the command line were defined. These variables could have been set using the
menus, or using a preference file (Appendix C). Every time contour runs, it searches the current
working directory for the file contour.prf. If it exists, contour reads the file and sets the variables as
specified. This is the third way to open a file, because one of the arguments in the preference file is
the name of the grid file. This could be done by typing the following (Figure 11.2):
> contour -prf conc3.prf conc3.srf
Note, this selects a non-default preference file (the default preference file is contour.prf). Once this file has been loaded and a contour map is displayed, a gradient arrow vector map might be desired. To change the map type, select Gradient:Modify from the menu bar, press the Draw Gradient Vectors toggle, and finally the dialog's Apply or Done button (Figure 11.10b). If instead of vector arrows, filled blocks are desired (Figure 11.10a), when in the Gradient:Modify dialog, also press the Block Fill menu option under Gradient Indicator Type. If at any time just a regular contour map is desired (no gradient information), set the Draw Gradient Vectors toggle back to false.
The command-line options use the following notation:

#      = integer
#.#    = float
" "    = character string
{ }    = the variable is an array. Values must be separated by a "," and no spaces are allowed. Do not use the { } symbols on the command line.
-arrw                                                      default = 6.0
-bf                                                        default = 1
-bl                                                        default = 0
-blf     = blanking file                                   default =
-cb      = color contour grid blocks (0 = false, 1 = true) default = 0
-cfmt                                                      default = g
-cint                                                      default = dz / 10.0
-cmax                                                      default = z maximum
-cmin                                                      default = z minimum
-colcut                                                    default = 0
-cplt                                                      default = 1
-csmax                                                     default = z maximum
-csmin                                                     default = z minimum
-csmo                                                      default =
-cspc                                                      default = 3
-ctyp                                                      default = 2
-esp                                                       default = 0
-fnt1                                                      default = Helvetica-Bold
-fnt2                                                      default = Helvetica-Bold
-fnt3                                                      default = Helvetica
-fnt4                                                      default = Helvetica
-fnt5                                                      default = Helvetica
-fnt6                                                      default = Helvetica
-fnt7                                                      default = Helvetica
-fnt8                                                      default = Helvetica
-fnts1                                                     default = 24.0
-fnts2                                                     default = 15.0
-fnts3                                                     default = 15.0
-fnts4                                                     default = 12.0
-fnts5                                                     default = 10.0
-fnts6                                                     default = 15.0
-fnts7                                                     default = 10.0
-fnts8                                                     default = 12.0
-gdmax                                                     default = steepest gradient
-gdmin                                                     default = smallest gradient
-gs                                                        default = 0
-gt                                                        default = 0
-gv                                                        default = 1
-help
-lc                                                        default = 4
-lgf                                                       default = log.dat
-lf                                                        default = 5
-lMd                                                       default = 0
-lmd     (0 = false, 1 = true)                             default = 0
-lMt                                                       default = 2.0
-lmt                                                       default = 1.0
-lpbm                                                      default = 1.5
-lpc                                                       default = 1
-lpd                                                       default = 0
-lpf     = print filename                                  default = "junk.ps"
-lph     = print header page (0 = false, 1 = true)         default = 0
-lplm                                                      default = 1.5
-lpo                                                       default = 0
-lppsext                                                   default = "*.ps"
-lpq                                                       default = "ps"
-lpr
-lprm                                                      default = 1.0
-lps                                                       default = 0
-lptm                                                      default = 1.5
-md                                                        default = 0
-mox                                                       default = 0.0
-moy                                                       default = 0.0
-ms                                                        default = 0
-mx      = X mesh frequency                                default = 1/10 DX
-my      = Y mesh frequency                                default = 1/10 DY
-pA                                                        default = A
-pAA                                                       default = A'
-pal                                                       default =
-pbx                                                       default = 0.0
-pby                                                       default = 0.0
-pbx                                                       default = 0.0
-pby                                                       default = 0.0
-pl                                                        default = 0
-prf                                                       default = contour.prf
-ptd                                                       default = 0
-ptf                                                       default =
-ptl                                                       default = 0
-rfh     = screen refresh (0 = on exposure, 1 = on update) default = 0
-scl     = show map scale                                  default = 1
-sttl    = Secondary title                                 default =
-ttl     = Main title                                      default = Filename
-units   = map scale unit label                            default = feet
-vp      (0 = X-Y, 1 = X-Z, 2 = Y-Z)                       default = 0
-vpc                                                       default = 1
-vpl                                                       default = 1
-vpr                                                       default = 1
-xc                                                        default = 1
-xfmt                                                      default = ".2f"
-xlabel                                                    default = "X"
-xmax                                                      default = Data Maximum
-xmin                                                      default = Data Minimum
-xMt                                                       default = 1/10 DX
-xmt                                                       default = 5
-xsec                                                      default = 0
-xto                                                       default = 0.0
-xy                                                        default = 1.5
-yc                                                        default = 2
-yfmt                                                      default = ".2f"
-ylabel                                                    default = "Y"
-ymax                                                      default = Data Maximum
-ymin                                                      default = Data Minimum
-yMt                                                       default = 1/10 DY
-ymt                                                       default = 5
-ys                                                        default = Calculated
-yto                                                       default = 0.0
Grid Centered
The grid centered UNCERT format specifies the number of columns (n), rows (m), and layers (o) in the grid, the X-Y-Z map origin, the width of the map in the X, Y, and Z directions, and a value for every grid location in the matrix (NO dummy or NOT KNOWN values are allowed). The format is as follows:
GRID CENTERED GRID
# col   # row   # lay   width (X)   width (Y)   height (Z)   X-origin   Y-origin   Z-origin
[int]   [int]   [int]   [real]      [real]      [real]       [real]     [real]     [real]
row-1:col-1:lay-1   row-1:col-2:lay-1   ......   row-1:col-n:lay-1   [real]
row-2:col-1:lay-1   row-2:col-2:lay-1   ......   row-2:col-n:lay-1   [real]
:
row-m:col-1:lay-1   row-m:col-2:lay-1   ......   row-m:col-n:lay-1   [real]
row-1:col-1:lay-2   row-1:col-2:lay-2   ......   row-1:col-n:lay-2   [real]
row-2:col-1:lay-2   row-2:col-2:lay-2   ......   row-2:col-n:lay-2   [real]
:
row-m:col-1:lay-2   row-m:col-2:lay-2   ......   row-m:col-n:lay-2   [real]
:
row-1:col-1:lay-o   row-1:col-2:lay-o   ......   row-1:col-n:lay-o   [real]
row-2:col-1:lay-o   row-2:col-2:lay-o   ......   row-2:col-n:lay-o   [real]
:
row-m:col-1:lay-o   row-m:col-2:lay-o   ......   row-m:col-n:lay-o   [real]
When defining the layers, start at the bottom layer (top of file) and work up to the top layer (bottom
of file).
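The layout above can be illustrated with a short Python sketch that builds the lines of a minimal grid centered file (an illustration only, not UNCERT code; the function name is hypothetical):

```python
def grid_centered_lines(values, origin, size):
    """Build the lines of a GRID CENTERED GRID file as laid out above.
    `values[lay][row][col]` holds one value per cell, with the bottom
    layer first (top of file); `origin` is (x0, y0, z0) and `size` is
    (width_x, width_y, height_z)."""
    n_lay, n_row, n_col = len(values), len(values[0]), len(values[0][0])
    lines = ["GRID CENTERED GRID",
             "%d %d %d %g %g %g %g %g %g"
             % (n_col, n_row, n_lay, size[0], size[1], size[2],
                origin[0], origin[1], origin[2])]
    for layer in values:          # bottom layer first, top layer last
        for row in layer:
            lines.append(" ".join("%g" % v for v in row))
    return lines

# A single-layer, 2 x 2 grid:
lines = grid_centered_lines([[[1.0, 2.0], [3.0, 4.0]]],
                            (0.0, 0.0, 0.0), (10.0, 10.0, 1.0))
```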
Node Centered
The node centered format is identical to the GRID CENTERED GRID format except for the first header line. The first header line is:

NODE CENTERED GRID
SURFER
SURFER Files must be created using the ASCII option under OUTPUT when gridding the file.
Otherwise, a binary grid file is created and this program will not read it correctly.
GSLIB
For GSLIB files there are two requirements. One, the grid value information must be in column
one, and two, grid flie must have an associated parameter (*.par) file. This is where row, column,
layer, and dimension data is read from. The filename of the parameter file must have the same
prefix as the grid file, and it must have a *.par extentsion. A series number bewteen the prefix and
extension can be given in the grid file name. Valid parameter filenames for the grid file water..12out
would be:
water.12.par
water.par
If a correct parameter does not exist, the grid file is not loaded.
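The parameter filename rule above can be sketched as follows (an illustration only; the function name is hypothetical and this is not the actual lookup code):

```python
import os

def gslib_parameter_candidates(grid_filename):
    """Candidate *.par names for a GSLIB grid file: the grid file's
    full prefix first, then the prefix with any trailing series
    number removed (e.g. water.12 -> water)."""
    prefix = os.path.splitext(grid_filename)[0]   # strip grid extension
    candidates = [prefix + ".par"]
    base, dot, series = prefix.rpartition(".")
    if dot and series.isdigit():                  # trailing series number
        candidates.append(base + ".par")
    return candidates
```

For the grid file water.12.out this yields the two valid names listed above, water.12.par and water.par.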
Setting up a Blanking File

X-coordinate   Y-coordinate   Line/Polygon-ID
X-coordinate   Y-coordinate   Line/Polygon-ID
X-coordinate   Y-coordinate   Line/Polygon-ID
X-coordinate   Y-coordinate   Line/Polygon-ID
X-coordinate   Y-coordinate   Line/Polygon-ID
To build the data file, the polygon IDs must start at 1 and be incremented positively. The X-Y data pairs must follow a border (clockwise or counter-clockwise) because the polygon border will be drawn between consecutive points. Note: the area interior to the polygon will be the area blanked out. This is also an unformatted file; all numbers must be separated by a space. As an example, the blanking file building.blk used in Figure 11.3 is shown below:
0.40   0.19   1      START polygon #1
0.90   0.69   1
1.40   0.19   1
0.40   0.19   1      END polygon #1
4.25   3.25   2      START polygon #2
4.25   3.45   2
4.00   3.45   2
4.00   3.65   2
4.25   3.65   2
4.25   3.90   2
5.00   3.90   2
5.00   3.25   2
4.25   3.25   2      END polygon #2
NOTE: A polygon is distinguishable from a line because the first and last points are the same, i.e. the polygon is closed.
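The closure test in the note above can be expressed directly (a hypothetical helper for illustration, not part of contour):

```python
def is_closed_polygon(points):
    """Per the note above, an entry is a polygon (and its interior is
    blanked) only when the first and last points coincide; otherwise
    it is treated as a line."""
    return len(points) >= 4 and points[0] == points[-1]

# Polygon #1 from building.blk, a closed triangle:
triangle = [(0.40, 0.19), (0.90, 0.69), (1.40, 0.19), (0.40, 0.19)]
```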
Setting up a Posting File

X-coordinate   Y-coordinate   ...   Label   ...
X-coordinate   Y-coordinate   ...   Label   ...
X-coordinate   Y-coordinate   ...   Label   ...
X-coordinate   Y-coordinate   ...   Label   ...
X-coordinate   Y-coordinate   ...   Label   ...
As an example, the posting file label.lbl used in Figure 11.3 is shown below:

! This is a test label set for use with gridded data sets created
! from the file water.dat
!
!X      Y      Z     Label
!----------------------------------------------------------------
2.93   2.38   0.0   1.0
4.01   3.51   0.0   2.0
4.15   3.73   0.0   3.0
3.74   1.45   0.0   4.0
1.29   4.05   0.0   5.0
5.20   2.68   0.0   12345abcde
4.60   3.77   0.0   B
4.03   4.02   0.0   CC-456
0.30   3.20   0.0   12.0
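A record in this file can be parsed as in the following sketch (an illustration only; the function name is hypothetical). It follows the rules stated above: comment lines begin with "!", the Label Column is selected by the user, and a label ends at the first space:

```python
def parse_posting_line(line, label_column=4):
    """Parse one posting-file record. Comment lines start with '!'.
    `label_column` is 1-based, as in the Label Column field of the
    Posting dialog."""
    line = line.strip()
    if not line or line.startswith("!"):
        return None
    fields = line.split()
    x, y = float(fields[0]), float(fields[1])
    label = fields[label_column - 1] if len(fields) >= label_column else None
    return x, y, label

record = parse_posting_line("4.60   3.77   0.0   B")
# record is (4.6, 3.77, "B")
```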
Setting up a Palette File
A palette file lists one color per line, as red, green, and blue values between 0 and 255:

0     0     255
0     255   255
0     255   0
255   255   0
255   0     0
255   255   0
255   255   255
0     0     0
The first entry in the data file will be assigned to the minimum color palette value (See
Palette:Color Legend section above), and the 175th entry will be assigned to the maximum color
palette value.
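A user palette can be generated programmatically; the sketch below writes a simple gray ramp in the one-triplet-per-line layout assumed above (illustration only; the function name is hypothetical and the exact *.pal layout should be checked against a palette saved by contour):

```python
def gray_palette_lines(n):
    """Build n palette lines, each 'R G B' with values 0-255,
    ramping from black (first entry, minimum) to white (last
    entry, maximum)."""
    lines = []
    for i in range(n):
        v = round(255 * i / (n - 1))
        lines.append("%d %d %d" % (v, v, v))
    return lines

palette = gray_palette_lines(3)
```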
Contour Mathematics
Slope Gradient Determination
The calculation of the cell gradient is simple, but only an approximation. Because four points (one point at each corner) define a cell and only three points are needed to define a plane, the solution is over-defined, and there is no guarantee that these four points lie on the same plane. One alternative solution would be to divide each rectangular cell into two triangles and calculate the gradient for each. This, though mathematically correct, was not done because of the computational expense, the lack of an appropriate means for deciding how to define the triangles, and questions about visualization of gradient arrows within a cell. Instead, all four points were used in a single calculation. Two vectors were calculated, one in the +X direction and one in the +Y direction; with two vectors, a plane can be defined. Given the cell grid values:
p1   p2
p3   p4

where p_i represents an individual grid point. The X and Y vectors are calculated:

v_x = ( dx, 0, ((p1z - p2z) + (p3z - p4z)) / 2 )        (11-1)

v_y = ( 0, dy, ((p1z - p3z) + (p2z - p4z)) / 2 )        (11-2)
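Equations 11-1 and 11-2 translate directly into code (a sketch for illustration, not the actual contour source; the function name is hypothetical):

```python
def cell_gradient_vectors(p1z, p2z, p3z, p4z, dx, dy):
    """Equations 11-1 and 11-2: average the Z change across each pair
    of opposing cell edges to get one vector in the +X direction and
    one in the +Y direction; together they define the averaged cell
    plane."""
    vx = (dx, 0.0, ((p1z - p2z) + (p3z - p4z)) / 2.0)
    vy = (0.0, dy, ((p1z - p3z) + (p2z - p4z)) / 2.0)
    return vx, vy

# Unit cell with corner values p1=2, p2=1, p3=3, p4=2:
vx, vy = cell_gradient_vectors(2.0, 1.0, 3.0, 2.0, 1.0, 1.0)
```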
Bibliography (contour)
Wessel, P. and W.H.F. Smith, 1991, GMT-SYSTEM Software, The School of Ocean and Earth
Science and Technology, University of Hawaii, and the Scripps Institution of Oceanography,
University of California at San Diego.
Wingle, W.L., 1992, Examining Common Problems Associated with Various Contouring Methods,
Particularly Inverse-Distance Methods, Using Shaded Relief Surfaces, Geotech 92
Conference Proceedings, 1992, Lakewood, Colorado, pp 362-376.
CHAPTER 12
Surface
Surface is a 2-1/2 dimensional visualization program for viewing regularly gridded data as a color contoured, gradient, or shaded relief surface. It is included in the UNCERT software as a tool to view the values in a two-dimensional array as a three-dimensional surface. This is referred to as a 2-1/2 dimensional surface because for each X-Y grid location, there is only one Z value. In a true 3D model (see block, Chapter 13) each X-Y location may have multiple Z values. This package is used to view gridded surface data generated from grid (Chapter 10), or for examining layers or cross-sections from sisim (Chapter 14), MODFLOW (McDonald and Harbaugh, 1984) (Chapter 15), and MT3D (Chapter 16) output files. Surface may also be used to display any regularly gridded data from other sources, DEMs (Digital Elevation Models) being a common example.
NOTE: surface can read both grid centered and node centered meshes, but node centered
meshes are converted to grid centered meshes inside the application. This is
necessary because of several internal algorithms. This has the effect of averaging
the grid values slightly.
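The manual does not state the exact conversion surface uses, but the slight averaging the note describes is what a standard four-node average produces; the following sketch (hypothetical helper name) illustrates the idea:

```python
def nodes_to_cells(node_values):
    """Convert a node centered mesh to a grid centered one by
    averaging the four nodes around each cell. Each cell value is
    a mean of its corner nodes, which smooths the data slightly,
    as the note above warns. `node_values[row][col]` holds one
    value per node."""
    n_row, n_col = len(node_values), len(node_values[0])
    return [[(node_values[r][c] + node_values[r][c + 1] +
              node_values[r + 1][c] + node_values[r + 1][c + 1]) / 4.0
             for c in range(n_col - 1)]
            for r in range(n_row - 1)]

# A 2 x 2 node mesh collapses to a single cell:
cells = nodes_to_cells([[0.0, 2.0], [2.0, 4.0]])
```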
In order to make UNCERT run as smoothly as possible, surface was designed so that many of the
various data formats created by other UNCERT modules, and some other commercial software, are
compatible with the program, thus reducing continuous modifications often required to
accommodate third party software.
The surface application is composed of five sections (Figure 12.1); the main menu-bar, the slider
bar bulletin board, the surface orientation block (upper right), the log/status area, and the drawing
or graphical area. The menu-bar is used to select all surface commands and the drawing area is the
display area for the rendered surfaces. Four slider bars are present to shift the color palette, to tilt
and rotate the surface, and to move the light source (These sliders are discussed in more detail in
the Palette and View:Parameters menu sections below). In the orientation block, the left arrow
points to map north, and the right arrow points towards the light source (This is only appropriate
when a shaded relief view type is selected (discussed in the View:View Type menu section)). The log/status area is used to display messages to the user about the program's status, and is available for the user to type in notes.
FIGURE 12-1. This is an example of the surface application window. The main menu-bar with
slider bars and the log/status window is on the top-left of the application window, on the top-right
are the north arrow (left) and the direction to light source arrow (right), and the map view area is
below.
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Open, View, Save, Save as, Save Preferences, Print Setup, Print, Quit, and Quit Without
Saving.
Open
Selecting File:Open generates a pop-up dialog which allows the user to select an existing data file.
This dialog operates exactly as the Open:File dialog in Chapter 5 (plotgraph Figure 5.2). However,
unlike plotgraph the default data file extension is *.srf.
View
File:View pops up a simple screen editor with the last saved version of the file being graphed.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
reasonable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, surface determines how all the input
variables are currently defined and writes them to the file surface.prf.
WARNING:
If surface.prf already exists, you will be warned that it is about to be overwritten. If you do not want the old version destroyed, you must move it to a new file (e.g. the UNIX command mv surface.prf surface.old.prf would be sufficient). When you press OK the old version will be overwritten! This cannot currently be done from within the application; to rename the file you will have to execute the UNIX mv command from a UNIX prompt in another window.
If surface.prf does not exist in the current directory, it is created. This is an ASCII file and can
be edited by the user. See Appendix C for details.
Print Setup
File:Print Setup works exactly as explained in Chapter 5.
Print
File:Print generates a Postscript file of the rendered surface and, depending on how the print options are defined in Print Setup, directs this file to the specified print queue or to the specified file.
Quit
File:Quit terminates the program, but if additions have been made to the graph, the user will first be
queried to supply a file to save the changes in.
View
The View options are used to control the rendering technique for the surface, the view orientation,
scaling, light source parameters, and hidden line/grid removal options.
Active Plane
When a three-dimensional grid file is loaded, this option becomes available. View:Active Planes
allows the user to specify which plane of the grid will be displayed and contoured. The pop-up
dialog shown in Figure 11.6 (Chapter 11, contour) controls these options. The contour map can be
cut along the X-Y, X-Z, or Y-Z Plane. Once the Plane orientation has been selected the Layer,
Column, or active Row may be selected. Pressing the Previous or Next buttons will step surface
through the selected planes. To determine which plane you want to select, refer to Figure 11.7
(Chapter 11, contour).
View Type
There are four surface rendering techniques, which are set using View:View Type:[Gradient] [Grid] [Filled Grid] [Shaded Relief]. Gradient generates a filled surface where the color coding is based on the slope, or gradient, of the surface at each location (Figure 12.2). Grid generates a fishnet or wire-mesh surface (Figures 12.3a and 12.3b), with the colors representing elevation, or the magnitude of the z variable. When using this view type, hidden-line removal (lines, or sections of lines, obscured by a closer surface are not drawn, Figure 12.3a) can be turned on or off (Figure 12.3b); to set the use of hidden lines see the View:Parameters section on hidden line removal. For the remaining view types, hidden line removal refers to whether the wire grid is drawn. For these view types, the grid blocks are filled with a solid color based on elevation or z magnitude (Filled Grid, Figure 12.4), or on the obliqueness of the surface to the light source (Shaded Relief, Figure 12.5). On shaded relief maps, surfaces struck directly by the light source are white, surfaces facing 180° from the light source are black, and intermediately oriented surfaces are colored with an appropriate shade of gray.
FIGURE 12-2. This map surface shows a relief surface, where the color coding refers to the
steepness, or gradient, of the surface in the area. White is steep, black is flat. This plot was printed
with a portrait format.
FIGURE 12-3 a. The map surface shown here was drawn using the View:View Type:Grid option with Hidden Line Removal (View:Parameters).
Parameters
In addition to defining the color coding of the surface map, it is useful to control the map's Exaggeration, view Angle Above Horizon, view Angle From North, Zoom, Sun Angle or light source direction, light source Brightness, Hidden Line Removal, and Drop Box. These parameters are defined in a pop-up dialog displayed by selecting View:Parameters from the menu-bar.
FIGURE 12-3 b. The map surface shown here was drawn using the View:View Type:Grid option without Hidden Line Removal (View:Parameters). Note, because hidden lines are not removed, areas where a valley would be hidden from view are very cluttered. The surface is being overdrawn, and this creates a confused display.
FIGURE 12-4. The map surface shown here was drawn using the View:View Type:Filled Grid option.
FIGURE 12-5. The map surface shown here was drawn using the View:View Type:Shaded Relief option.
Exaggeration controls the vertical exaggeration of the map surface; it is a multiplicative factor, so legal values are greater than 0.0. If the surface appears flat, the difference in z values is small compared to the horizontal distances covered by the map, and in this case the exaggeration should be increased. If the surface is tall and thin (in the extreme case, a vertical line extending off the top and bottom of the screen), the exaggeration should be decreased.
The Angle Above the Horizon and the Angle From North refer to the viewer's position relative to the surface. In surface the viewer is always looking down on the map surface, and the Angle Above the Horizon describes the viewer's viewpoint as an angle above the horizontal; 90° is looking straight down on the surface (plan-view map), and 0° is looking horizontally across the surface (valid values are 0° to 90°). The Angle From North defines the direction the viewer is looking; i.e., to look north across the map surface, the correct Angle From North would be 0° (valid values are 0° to 360°).
Zoom allows the user to zoom closer to (enlarge view), or further from (shrink map) the center of the map surface. The default is 1.0, with legal values greater than 0.0. Values greater than 1.0 will zoom closer to the surface; values less than 1.0 will zoom further from the surface.
The Sun Angle and the Brightness control the light source. The Sun Angle describes the direction from the surface map to the light source; it is assumed the light source is on the horizon. Legal values are between 0° and 360°. A common direction is 315°, which puts the sun to the northwest. Even though in the northern hemisphere the sun is never in this position, when surface maps are viewed in plan view (Angle From North = 0°, and Angle Above the Horizon = 90°) the eye interprets the shading correctly, so that mountains appear raised and valleys appear depressed. If the light source comes from the southeast (135°), the relief appears reversed. The Brightness makes the brights brighter and the darks darker; valid values are greater than 1.0. This is particularly useful in areas of low relief, where modifying the brightness can accentuate minor surface features.
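The shading rule can be sketched in Python (an illustrative sketch; surface itself is written in C). The gray-level formula follows the description above and the equations in the Surface Mathematics section; the brightness formula is an assumption, since the manual states only that brightness stretches the contrast:

```python
import math

def shade(light, normal, brightness=1.0):
    """Gray level for a facet: 0.0 = white (facing the light source),
    1.0 = black (facing 180 degrees away from it)."""
    dot = sum(l * n for l, n in zip(light, normal))
    mag = math.sqrt(sum(l * l for l in light)) * math.sqrt(sum(n * n for n in normal))
    gray = (1.0 - dot / mag) / 2.0
    # Assumed brightness behavior: stretch the contrast about mid-gray, then clamp.
    gray = 0.5 + (gray - 0.5) * brightness
    return min(1.0, max(0.0, gray))

# Sun on the horizon at 315 degrees (northwest), with x = east, y = north:
az = math.radians(315.0)
sun = (math.sin(az), math.cos(az), 0.0)
print(round(shade(sun, sun), 6))                      # 0.0 (white)
print(round(shade(sun, (-sun[0], -sun[1], 0.0)), 6))  # 1.0 (black)
```

A facet whose normal is perpendicular to the light vector falls at mid-gray (0.5), and a brightness above 1.0 pushes values toward the white and black ends of the scale.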
Hidden Line Removal is used in combination with View:View Type, and behaves slightly differently depending on which view type is selected. When using the Grid view type, hidden lines (not visible because a closer surface obstructs their view) can be removed (lines, or sections of lines, that are not visible will not be drawn, Figure 12.3a) or viewed (Figure 12.3b). For the remaining view types, hidden line removal refers to whether a wire mesh is drawn over the map; the grid blocks are filled with a solid color based on elevation, gradient, or the obliqueness of the surface to the light source. On low-relief surfaces, drawing the mesh can help define the surface texture; however, the mesh can hide the color of the grid blocks and thus still obscure the surface texture.
NOTE: There are reasons both to use and not to use hidden line removal with the Grid view type. The map surface renders slightly faster when hidden line removal is not used, and if there is not much relief, this setting may be appropriate. If there is significant relief, the map can become cluttered (Figure 12.3b) as areas are overdrawn. At the cost of slightly slowing the drawing rate, removing hidden lines eliminates this problem (Figure 12.3a).
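The manual does not name the hidden-line algorithm it uses; a classical method for fishnet surfaces of this kind is the floating-horizon algorithm, sketched hypothetically below. Profiles are processed front to back, and a running upper and lower horizon is kept for each screen column; a point is drawn only if it clears the silhouette seen so far:

```python
def visible_points(rows):
    """Floating-horizon sketch: rows hold screen heights for successive
    surface profiles, ordered front to back, one value per screen
    column.  A point is drawn only if it rises above the highest
    horizon seen so far in its column, or dips below the lowest."""
    ncols = len(rows[0])
    upper = [float("-inf")] * ncols
    lower = [float("inf")] * ncols
    drawn = []
    for r, row in enumerate(rows):
        for c, y in enumerate(row):
            if y > upper[c] or y < lower[c]:
                drawn.append((r, c))
            upper[c] = max(upper[c], y)
            lower[c] = min(lower[c], y)
    return drawn

# The back point level with the front profile is hidden; back points
# above or below the front silhouette remain visible.
print(visible_points([[5, 5, 5], [5, 4, 6]]))
# [(0, 0), (0, 1), (0, 2), (1, 1), (1, 2)]
```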
WARNING:
The hidden line removal algorithm (when using View:View Type:Grid) has some problems at angles near 0°, 90°, 180°, and 270°. When lines are drawn near vertical on the screen, the algorithm breaks down and the wire-mesh rendering appears to be missing portions of line segments.
The Draw Drop Box option allows the user to add a drop box to the base of the rendered surface showing the X and Y coordinates. This feature can be toggled on (Figure 12.2) or off (Figure 12.4) by pressing the Draw Drop Box toggle button.
Gradient
Rather than viewing just a contoured or shaded relief surface, it is sometimes useful to examine the gradient, or slope, of that surface (Figure 12.2). Examining the gradient can be useful for identifying areas where there might be slope stability problems, for inferring ground-water flow directions, or for finding problem areas in the grid itself (the grid is interpolated from field data, and different algorithms can generate significant artifacts that bear no relation to the field data (Wingle, 1992)). Currently, in this package, the gradient can only be indicated by color-coded blocks with no directional information (Figure 12.2). The color-coded arrow and variable-length arrow options of Chapter 11 (contour) are currently not available.
NOTE: This menu option works slightly differently than the Gradient option in the contour module (Chapter 11). In surface, to activate the gradient option, Gradient must be selected under the View:View Type menu-bar option.
The gradient at a particular point is not explicitly known and therefore has to be estimated from the gridded data. To calculate a gradient vector, three points are required (three points define a plane), but because the grid file uses rectangular cells, four are used. This over-determines the plane, so the computed gradient represents an average gradient over the cell. A complete description of how the gradient is calculated is presented in the Mathematics section of Chapter 11.
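One common way to form such a cell-averaged gradient (an assumption for illustration; Chapter 11 gives surface's exact method) is to average the finite differences along opposite cell edges:

```python
import math

def cell_gradient(corners, dx, dy):
    """Average slope magnitude over one rectangular cell.
    corners = (z_sw, z_se, z_nw, z_ne); dx, dy are the cell dimensions.
    The two edge slopes in each direction are averaged, so the result
    is the mean gradient of the over-determined plane."""
    z_sw, z_se, z_nw, z_ne = corners
    dz_dx = ((z_se - z_sw) + (z_ne - z_nw)) / (2.0 * dx)
    dz_dy = ((z_nw - z_sw) + (z_ne - z_se)) / (2.0 * dy)
    return math.hypot(dz_dx, dz_dy)

# A planar cell rising 1 m eastward across a 10 m cell: slope = 0.1
print(cell_gradient((0.0, 1.0, 0.0, 1.0), 10.0, 10.0))   # 0.1
```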
Gradient:Modify pops up the dialog shown in Figure 12.7, which allows the user to specify values for the appropriate parameters. The dialog includes a color scale reference. The color scale range can be extended or narrowed by modifying the Gradient Color Range Minimum and Maximum values; these are, by default, set to the minimum and maximum gradients found in the grid file. The scale is linear, and sometimes a few spots with very steep gradients dominate the color scale. By reducing the maximum gradient for the color scale (often significantly), low-gradient zones are better defined, with minimal loss of detail in the steep zones. If these values have been changed, or specified on the command line or in a preference file, the extents of the gradients are not recalculated when a new file is loaded. Pressing the Maximize Gradient Range button will recalculate and respecify the gradient range and color palette.
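The effect of narrowing the color range can be sketched as a clamped linear mapping onto the palette (an illustrative sketch; the palette size of 256 is an assumption):

```python
def palette_index(value, lo, hi, ncolors=256):
    """Map a gradient value onto a linear palette, clamping values
    outside [lo, hi] to the end colors (assumed behavior of the
    Gradient Color Range Minimum/Maximum settings)."""
    t = (value - lo) / (hi - lo)
    t = min(1.0, max(0.0, t))
    return min(ncolors - 1, int(t * ncolors))

# With the full range 0..2.0 a 0.1 slope uses only the bottom of the
# palette; lowering the maximum to 0.2 spreads low slopes over more colors.
print(palette_index(0.1, 0.0, 2.0))   # 12
print(palette_index(0.1, 0.0, 0.2))   # 128
print(palette_index(1.5, 0.0, 0.2))   # 255 (clamped to the top color)
```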
When the gradient view type is activated, the default settings plot the gradients using color-coded filled blocks (non-directional information). The options allowing the use of arrows are currently not supported (use contour (Chapter 11) if directional (arrow) information is needed).
Palette
Palette:Set Palette and Palette:Color Legend are described in Chapter 11 in the Palette:Set Palette
and Palette:Color Legend sections (Figure 11.15). NOTE: The White palette option is not
available.
Fonts
Fonts:Modify is largely described in Chapter 5 in the Fonts:Modify section (Figures 5.10 and 5.11),
but the font selection for surface is slightly different (Figure 12.8). Different fonts may be selected
for the Main Title, Annotations, the North and Sun Direction Arrow labels, and the Axes Division
Numbers.
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, Clear, and Print. View Log,
Clear, Save, and Save as are similar in operation to the menu options under File described above.
FIGURE 12-8. This dialog is used to define the Xwindows and Postscript text fonts and font sizes for different portions of the map.
Plot
Plot:Now and Plot:Refresh are described in Chapter 5 in the Plot:Now and Plot:Refresh sections.
Help
Help works exactly as explained in Chapter 5 (plotgraph, Figure 5.15) Help section.
Running from the Command Line
For example:
> surface dig.grd
will open the graph file shown in Figure 12.4, and
> surface -vt 2 -pl 1 dig.grd
will open the file and plot it with a gray palette using shaded relief (Figure 12.5). NOTE: in both Figures 12.4 and 12.5, variables other than those passed on the command line could have been defined. These variables could have been set using the menus, or using a preference file (Appendix C). A preference file is used to define user-preferred variable default values. Every time surface runs, it searches the current working directory for the file surface.prf; if it exists, surface reads the file and sets the variables as specified. This is a third way to open a file, because one of the arguments in the preference file is the name of the surface file.
Each command-line argument takes an integer, a float, or a character string. { } indicates the variable is an array: values must be separated by a "," and no spaces are allowed (do not use the { } symbols on the command line).

-db        draw drop box (0 = false, 1 = true)        default = 1
-ex        exaggeration                               default = 1.0
-fnt1      main title font                            default = Helvetica-Bold
-fnt2      secondary font                             default = Helvetica-Bold
-fnt3      arrow label font                           default = Helvetica
-fnt4      axes division font                         default = Helvetica
-fnts1     main title font size                       default = 15.0
-fnts2     secondary title font size                  default = 12.0
-fnts3     arrow label size                           default = 12.0
-fnts4     axes division font size                    default = 10.0
-gdmax     maximum gradient                           default = steepest gradient
-gdmin     minimum gradient                           default = smallest gradient
-help      give this help menu
-hl        hide lines (0 = false, 1 = true)
-hz                                                   default = 25.0
-lgf                                                  default = log.dat
-lpbm                                                 default = 1.5
-lpc                                                  default = 1
-lpd                                                  default = 0
-lpf       print filename                             default = "junk.ps"
-lph                                                  default = 0
-lplm                                                 default = 1.5
-lpo                                                  default = 0
-lppsext                                              default = "*.ps"
-lpq                                                  default = "ps"
-lpr
-lprm
-lps
-lptm                                                 default = 1.5
-pal                                                  default =
-pl                                                   default = 0
-prf                                                  default = surface.prf
-rfh                                                  default = 0
-rt        viewing direction                          default = 45.0
-sn        direction to sun (light source)            default = 315.0
-vp        view plane (0 = X-Y, 1 = X-Z, 2 = Y-Z)     default = 0
-vpc                                                  default = 1
-vpl                                                  default = 1
-vpr                                                  default = 1
-vt        view type                                  default = 0
             0 = fill grid
             1 = gradient
             2 = wire mesh
             3 = overlay (NOT AVAILABLE)
             4 = shaded relief
-zm        zoom                                       default = 1.0
Surface Mathematics
In order to make the software package useful, sophisticated computer graphics algorithms are necessary to convert field and model data into images on the computer monitor or the printed page that make sense to the user. In this section some of the algorithms used in this software package are discussed.
[x' y' z'] = [x y z] R + T                                           (12-1)

where <x', y', z'> are the translated coordinates, <x, y, z> are the real-world coordinates, θ is the rotation angle (about the Z-axis) from North, φ is the rotation angle from the horizon (about the X-axis), and R is a rotation matrix given by (Foley et al., 1990):

FIGURE 12-9. These axes display the required transformation of world coordinate data in a right-handed coordinate system.

R_x-axis =
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & \cos\phi & \sin\phi & 0 \\
0 & -\sin\phi & \cos\phi & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}                                                        (12-2a)

R_z-axis =
\begin{bmatrix}
\cos\theta & \sin\theta & 0 & 0 \\
-\sin\theta & \cos\theta & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}                                                        (12-2b)

R = R_z-axis R_x-axis                                                (12-2c)

and

R =
\begin{bmatrix}
\cos\theta & \sin\theta\cos\phi & \sin\theta\sin\phi \\
-\sin\theta & \cos\theta\cos\phi & \cos\theta\sin\phi \\
0 & -\sin\phi & \cos\phi
\end{bmatrix}                                                        (12-2d)

The translation is expressed in homogeneous form as

\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
d_x & d_y & d_z & 1
\end{bmatrix}                                                        (12-3a)

and

T = [d_x  d_y  d_z]                                                  (12-3b)

Combining the rotation and translation gives the single homogeneous matrix

\begin{bmatrix}
r_{11} & r_{12} & r_{13} & 0 \\
r_{21} & r_{22} & r_{23} & 0 \\
r_{31} & r_{32} & r_{33} & 0 \\
t_1 & t_2 & t_3 & 1
\end{bmatrix}                                                        (12-4)

When T and R are substituted into equation 12-1, they are defined as the portions of matrix 12-4 as follows (Foley et al., 1990):

R =
\begin{bmatrix}
r_{11} & r_{12} & r_{13} \\
r_{21} & r_{22} & r_{23} \\
r_{31} & r_{32} & r_{33}
\end{bmatrix}
and T = [t_1  t_2  t_3]                                              (12-5)

By applying equation 12-1, all real-world points in the data set can be translated into screen coordinates. The screen coordinates x', y', and z' reduce to:

x' = x cos θ - y sin θ                                               (12-6a)
y' = (x sin θ + y cos θ) cos φ - z sin φ                             (12-6b)
z' = (x sin θ + y cos θ) sin φ + z cos φ                             (12-6c)

This transformation is called a parallel transformation, which is described in the next section.
Parallel Transformation
By translating world data coordinates into screen coordinates (with the z-axis pointing out of the display), the relationship of which objects are in front of others can be quickly determined: those with the highest z' value are the closest. Without the transformation, instead of only two comparisons (x1 = x2 and y1 = y2), four additional divisions are necessary; this is computationally expensive and should be avoided (Foley et al., 1990).
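The transformation can be sketched in Python (an illustrative sketch of equations 12-1 and 12-6, using the row-vector convention and the sign choices of equation 12-2d; surface itself is written in C):

```python
import math

def world_to_screen(p, theta_deg, phi_deg, t=(0.0, 0.0, 0.0)):
    """Rotate a world point by theta about the Z-axis, then by phi
    about the X-axis (equations 12-6a..c), then translate.  With the
    screen z-axis pointing out of the display, the larger z' of two
    points is the nearer one."""
    th, ph = math.radians(theta_deg), math.radians(phi_deg)
    x, y, z = p
    xs = x * math.cos(th) - y * math.sin(th)
    ys = (x * math.sin(th) + y * math.cos(th)) * math.cos(ph) - z * math.sin(ph)
    zs = (x * math.sin(th) + y * math.cos(th)) * math.sin(ph) + z * math.cos(ph)
    return (xs + t[0], ys + t[1], zs + t[2])

# With theta = phi = 0 the transform is the identity:
print([round(v, 6) for v in world_to_screen((1.0, 2.0, 3.0), 0.0, 0.0)])
# [1.0, 2.0, 3.0]
```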
Shaded Relief
The gray level of each facet on a shaded relief map depends on the angle α between the light-source vector and the surface normal:

cos α = (l · n) / (|l| |n|)                                          (12-7)

gray % = (1 - cos α) / 2                                             (12-8)

where l is the light source vector, n is the vector normal to the surface, 0% gray = white, and 100% gray = black.
Slope-Gradient Determination
As discussed in Chapter 11 (contour), this is an approximation method (see the Slope-Gradient Determination section of Chapter 11 for details) for determining the slope or gradient of each cell within the grid.
Bibliography (surface)
Foley, J.D., A. Van Dam, S.K. Feiner, and J.F. Hughes, 1990, Computer Graphics, Principles and
Practice, Addison-Wesley, Reading, Massachusetts.
Gómez-Hernández, J.J. and R.M. Srivastava, 1990, ISIM3D: An ANSI-C Three-Dimensional Multiple Indicator Conditional Simulation Program, Computers & Geosciences, Vol. 16, No. 4, pp. 395-440.
Lillesand, T.M. and R.W. Kiefer, 1987, Remote Sensing and Image Interpretation, Second Edition, John Wiley and Sons, New York.
McDonald, M.G., and A.W. Harbaugh, 1984, A Modular Three-Dimensional Finite-Difference Ground-Water Flow Model, U.S. Geological Survey OFR 83-875.
CHAPTER 13
Block
Block is a three-dimensional visualization program for viewing regularly gridded data or spatial data points and lines (both cannot be viewed at the same time). It is included in the UNCERT software as a tool to view the values in a three-dimensional array as three-dimensional blocks. The package is used to view gridded block data generated by grid (Chapter 10), or to examine output from sisim (Chapter 14), modmain (Chapter 15), and mt3dmain (Chapter 16).
NOTE: block can read both grid centered and node centered meshes, but grid centered
meshes are converted to node centered meshes inside the application. This is
necessary because of several internal algorithms. This has the effect of averaging
the grid values slightly.
The block application is composed of five sections (Figure 13.1): the main menu-bar, the slider bar area, the orientation block (upper right), the log/status area, and the drawing or graphical area. The menu-bar is used to select all block commands, and the drawing area is the display area for the rendered blocks. Three slider bars are also present: to shift the color palette, and to tilt and rotate the block (these sliders are discussed in more detail in the Palette and View:Parameters menu sections below). The log/status area is used by the program to report important messages or results. In the orientation block, the arrow points to map north.
FIGURE 13-1. This is an example of the block application window. The main menu-bar with slider bars is on the top-left of the application window, on the top-right is the north arrow, and the map view area is below.
Fonts controls the type and size of fonts used within the Labels dialog. Log is used by the program to report important messages or results. Plot plots the graph. Help gives the user a selection of pop-up help topics. Each menu item is fully described below with all the available options.
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Open, View, Save, Save as, Save Preferences, Print Setup, Print, Quit, and Quit Without
Saving.
Open
Selecting File:Open generates a pop-up dialog which allows the user to select an existing data file.
This dialog operates exactly as the Open:File dialog in Chapter 5 (plotgraph Figure 5.2). However,
unlike plotgraph the default data file extension is *.bck.
Open SU
If block has been compiled with the -DSU compile option (See Makefile in source directory), the
File:Open SU menu option will be active. This option allows Seismic UNIX (Cohen and
Stockwell, 1994) block files to be loaded. Selecting File:Open SU generates a pop-up dialog
shown in Figure 13.2, which allows the user to select a SU data file and specify the number of
receiver lines and receivers per line. The default data file extension is *.su.
FIGURE 13-2. Open SU File pop-up dialog. This dialog allows the user to specify the number of receiver lines, the number of receivers per line, and the name of the Seismic UNIX grid file.
View
File:View pops up a simple screen editor with the last saved version of the file being graphed.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
reasonable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, block determines how all the input
variables are currently defined and writes them to the file block.prf.
WARNING:
If block.prf already exists, you will be warned that it is about to be overwritten. If you do not want the old version destroyed, you must move it to a new file (e.g. the UNIX command mv block.prf block.old.prf would be sufficient). When you press OK the old version will be overwritten! This cannot currently be done from within the application; to rename the file you will have to execute the UNIX mv command from a UNIX prompt in another window.
If block.prf does not exist in the current directory, it is created. This is an ASCII file and can be
edited by the user. See Appendix C for details.
Print Setup
File:Print Setup works exactly as explained in Chapter 5.
Print
File:Print generates a Postscript file of the calculated spatial measure, and depending on how the
print options are defined in Print Setup, directs this file to the specified print queue, or to the
specified file.
Quit
File:Quit terminates the program, but if additions have been made to the graph, the user will first be
queried to supply a file to save the changes in.
View
The View options are used to control the rendering technique for the block, the view orientation, scaling, light source parameters, and hidden line/grid removal options. In addition to defining the color coding of the block map, it is useful to control the map's Exaggeration, view Angle Above Horizon, view Angle From North, Zoom, Hide Block Outline, and Quick Rotate. These parameters are defined in the pop-up dialog shown in Figure 13.3. To display this dialog, select View:Parameters from the menu-bar.
Exaggeration controls the vertical exaggeration of the map block. Exaggeration is a multiplicative factor, therefore legal values are greater than 0.0; 1.0 gives the true relief, values between 0.0 and 1.0 reduce the exaggeration, and values greater than 1.0 increase the exaggeration. With many data sets it is useful to substantially increase or decrease this value. If the block appears flat, the difference in z values is small compared to the horizontal distances covered by the map; in this case the exaggeration should be increased. If the block is tall and thin (in the extreme case, a vertical line extending off the top and bottom of the screen), the exaggeration should be decreased.
The Angle Above the Horizon and the Angle From North refer to the viewer's position relative to the block. In block the viewer is always looking down on the map block, and the Angle Above the Horizon describes the viewer's viewpoint as an angle above the horizontal; 90° is looking straight down on the block (plan-view map), and 0° is looking horizontally across the block (valid values are 0° to 90°). The Angle From North defines the direction the viewer is looking; i.e., to look north across the map block, the correct Angle From North would be 0° (valid values are 0° to 360°).
Zoom allows the user to zoom closer to (enlarge view), or further from (shrink map) the center of the map block. The default is 1.0, with legal values greater than 0.0. Values greater than 1.0 will zoom closer to the block; values less than 1.0 will zoom further from the block.
Hide Block Outline is used to define whether the entire map block will be outlined. If it is turned
on, a box outline is drawn (Figure 13.4).
FIGURE 13-4. This block map shows several features of block. This map was drawn with a limited
data range; the volume above the ground surface, the basalt anticline underlying the aquifer, and the
inter-unit clays and muds were not drawn. This left only the gravel (blue), sandy-gravel (orange),
and sand (green) to be drawn. The figure was printed with a landscape format using the color
postscript printing option. A border box also surrounds the model.
Quick Rotate is useful when rotating large data sets. Because large block models can take some time to draw, rotating them to the desired position using the slider bars can be cumbersome. By setting Quick Rotate to true, only the outline of the model is drawn, and the desired orientation can be set quickly. Once set, toggling Quick Rotate back to false allows the entire map to be drawn.
Data Range
This option allows the user to perform two tasks based on the value of each block grid location.
One, it allows the user to specify a range of block values that will be plotted or drawn, and two, it
performs volume calculations.
A useful technique for seeing into a 3D body is to draw only the information that is of concern; this
was done in Figure 13.4 and in Figure 13.5. In Figure 13.4, the volume (of air) above the ground
surface, a basalt anticline beneath the unconfined aquifer, and the low permeability clays and muds
were removed. This allows the user to quickly identify the generally good connectivity of the high
permeability units (blue = gravel, orange = sandy-gravel, green = sand). It also suggests that there
are thin clay layers in the unit, which at least locally will impede vertical flow.
FIGURE 13-5. The block map shows the results when the data range is limited. Here areas of low
concentration were not drawn. The entire map area has also been outlined with a border box.
Data Range:Modify pops up the dialog (Figure 13.6) which allows the user to specify the range of values to be displayed. By default the entire data range is displayed; the data range is shown at the bottom of the dialog. To specify one continuous range, the Minimum and Maximum range values can be entered in the two text fields, or set by moving the Range Sliders. The range sliders are convenient when an approximate value will work and a precise value is not required. Sometimes it is also useful to display values not within the specified range: by turning off the View in Range toggle, blocks are drawn which have values between the minimum data value and the minimum range value, and between the maximum range value and the maximum data value. Note, if the full data range is set to be displayed and this toggle is turned off, no blocks will be drawn.
FIGURE 13-6. Data Range pop-up dialog. This dialog allows the user to limit which blocks will be plotted based on block cell value. The minimum and maximum range values can be set directly by entering desired values into the text fields, or by positioning slider bars. By setting the View in Range toggle to false, the cell blocks with values in range will not be drawn, and the cells outside the range will be drawn instead. Volume calculations can also be controlled from this dialog.
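The Data Range logic described above can be sketched as follows (an illustrative sketch treating the grid as a flat list of cell values; block itself is written in C):

```python
def drawn_cells(values, lo, hi, view_in_range=True):
    """Indices of the grid cells that would be drawn: cells with values
    inside [lo, hi] when View in Range is on, and cells outside the
    range when it is off."""
    keep = []
    for i, v in enumerate(values):
        inside = lo <= v <= hi
        if inside == view_in_range:
            keep.append(i)
    return keep

cells = [0.5, 2.0, 7.5, 9.0]
print(drawn_cells(cells, 1.0, 8.0))          # [1, 2]
print(drawn_cells(cells, 1.0, 8.0, False))   # [0, 3]
# The full data range with the toggle off draws nothing, as noted above:
print(drawn_cells(cells, 0.5, 9.0, False))   # []
```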
Data Range:Modify also allows the user to calculate volumes based on the value of each grid cell. The method used is fairly simple: each grid value is assigned the full volume of its grid cell (there is no interpolation), and the volumes are then summed according to the volume group each cell belongs to. The first group holds all values less than the Data Display Minimum. Groups are then sized by the Volume Calculations Step Interval until the Data Display Range is exceeded. For example, if the Data Display Minimum equals 10.0, the Step Interval is 20.0, and the Data Display Maximum equals 40.0, the volume groupings would be:

   less than 10.0
   10.0 - 30.0
   30.0 - 50.0
When the dialog Calculate button is pressed the volume of grid cells in each data range will be
calculated and displayed in the log/status window.
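The grouping rule in the example above can be sketched as follows (an illustrative sketch; the uniform cell volume is an assumption):

```python
def volume_groups(cell_values, cell_volume, lo, step, hi):
    """Sum cell volumes into value groups: the first group is everything
    below lo; then intervals of width step until hi is exceeded.
    Each cell contributes its full volume (no interpolation)."""
    bounds = [lo]
    while bounds[-1] < hi:
        bounds.append(bounds[-1] + step)
    totals = [0.0] * len(bounds)            # group 0 holds values < lo
    for v in cell_values:
        g = sum(1 for b in bounds if v >= b)
        totals[min(g, len(totals) - 1)] += cell_volume
    return bounds, totals

# Minimum = 10.0, step = 20.0, maximum = 40.0, unit cell volumes:
bounds, totals = volume_groups([5.0, 12.0, 35.0, 45.0], 1.0, 10.0, 20.0, 40.0)
print(bounds)   # [10.0, 30.0, 50.0] -> groups: <10, 10-30, 30-50
print(totals)   # [1.0, 1.0, 2.0]
```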
Bench
In addition to hiding sections of the 3D block by value, it is also often convenient to cut into the 3D block by pulling off full or partial layers, rows, or columns (Figure 13.8). Selecting Bench:Modify pops up the dialog shown in Figure 13.7, which allows the user to perform this task. Note, cuts are always made from the corner of the 3D block closest to the viewer. There are two main features to this dialog. There is a slider bar for each plane; planes can be stripped off Vertically, West-East, and North-South. The cut control for each plane can also be set: if a plane is defined as Cut-Active, only the specified number of blocks can be removed from that particular plane.
FIGURE 13-7. Bench Cuts pop-up dialog. This dialog allows the user to specify bench cuts into
the block model, and slice off portions of any plane. Note, cuts or slices are always from the corner
of the model visually closest to the viewer.
FIGURE 13-8. This block map shows a 3-plane bench cut with the entire model block outlined.
NOTE: If any plane is set to a depth of zero, and the Active-Cut toggle for that plane is set to
true, no bench will be cut.
When a bench cut is desired, the depth of the cut must be specified for all planes whose Active-Cut toggle is set to true. The cut depth specifies, for the given plane, the maximum number of block levels that can be removed.
If instead of making a bench cut it is more appropriate to remove an entire layer or group of layers (rows, or columns) (Figure 13.9), there are two methods. The first is to maximize the cut depth for the two planes not of concern, then set the cut depth on the desired plane to the desired depth. The alternative is to set the Active-Cut toggles for the two planes not of concern to false, then set the cut depth on the desired plane to the desired depth. Either method accomplishes the same task.
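The bench-cut rule can be sketched as follows; this is an illustrative sketch in which a cell is removed only if it lies within the cut depth on every Active-Cut plane, measured from the corner nearest the viewer (inferred from the two slicing methods described above):

```python
def bench_cells(nl, nr, nc, cut_l, cut_r, cut_c, active=(True, True, True)):
    """Cells kept after a bench cut in an nl x nr x nc block model.
    A cell is removed when it is within the cut depth on every
    Active-Cut plane; an inactive plane imposes no limit, which is
    the same as maximizing its cut depth.  The viewer's corner is
    taken here as (layer 0, row 0, column 0)."""
    depth = [cut_l if active[0] else nl,
             cut_r if active[1] else nr,
             cut_c if active[2] else nc]
    kept = []
    for l in range(nl):
        for r in range(nr):
            for c in range(nc):
                if not (l < depth[0] and r < depth[1] and c < depth[2]):
                    kept.append((l, r, c))
    return kept

# 2x2x2 model, one-block bench cut on all three planes:
print(len(bench_cells(2, 2, 2, 1, 1, 1)))                        # 7
# Slice method: deactivate two planes to strip a whole layer.
print(len(bench_cells(2, 2, 2, 1, 0, 0, (True, False, False))))  # 4
```

Note that maximizing the two unused cut depths (`bench_cells(2, 2, 2, 1, 2, 2)`) gives the same result as deactivating them, matching the two equivalent methods described above.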
FIGURE 13-9. This block map shows a 1-plane slice cut with the entire model block outlined.
A final option on the Bench:Modify dialog is Automatic Replot. This option automatically redraws the block model whenever a bench cut depth is modified using a slider bar. On small data sets, this allows the user to peel away layers, columns, or rows in real time. On large data sets, do not use this option; the refresh rate for the map is too slow.
Blank
Using the Data Display Range and Bench Cut options discussed above, it often not possible to hide
all the grid cells desired. Using the Blank:Modify pop-up dialog shown in Figure 13.10 it is
possible to turn off, or not draw any cell in the grid. To do this the user needs to make another grid
file with the same dimensions as the grid being displayed. This file though is made of only 0s
(dont draw grid cell) and 1s (draw grid cell). To use this technique, Blank Zones must be set to
True, and a valid Blank Filename must be specified. An example is shown in Figure 13.11. The
file is loaded when Apply or Done is pressed.
FIGURE 13-10. Blanking Data pop-up dialog. This dialog is used to load a blanking grid file. The blanking file defines which grid cells will not be drawn, and which can be drawn.
Post
Often it is important to view raw field data in three-dimensions before any analysis is done (i.e. we
havent gridded the data yet). Block can be used to post data points and lines in three-dimensions
using the Post:Modify menu option. This options creates the pop-up dialog shown in Figure 13.12.
UNCERT Users Manual
Block
FIGURE 13-11. This map surface shows a regularly gridded map, where the portions of the grid
With this dialog the appropriate columns from the data file can be read for the X, Y, Z, and Value
Columns. The extents of the graphed region can be specified (X, Y, and Z Minimum and Maximum).
The plot characteristics for each line can also be described. If only points are being plotted, several
options are available: 1) each point can be located with a Plot Cross (+), 2) the relative magnitude
of each point can be plotted (Plot Magnitude Length), and 3) if there are negative values in the data
set, the magnitude lines can be defined to start or end at 0.0. When the magnitude is plotted, the
Maximum Length can be set (this is in pixels on the screen and in points, 1/72 inch, for Postscript output).
The Symbol Size for the Plot Cross or individually defined symbols can also be specified (in the same
units). If points and/or lines are to be plotted,
there must be a column in the data file specifying which line a point belongs to. This is similarly
true for plotting individual symbols or colors. These options can be activated by setting the Plot
File ID'd Lines, Plot File Symbols, or Specify File Colors toggles to True. For each, an ID Column,
Symbol Column, or Color Column data file column number must be specified. The file format is
discussed in the Setting Up Input File section later in the chapter. Figure 13.13 shows an example
data set with digitized contour lines and well values color coded to material type.
Palette
Palette:Set Palette and Palette:Color Legend are described in Chapter 11 in the Palette:Set Palette
and Palette:Color Legend sections (Figure 11.15). There are two differences in block, though. One,
the White palette option is not available; and two, each grid cell is described by three colors. To give
the grid cells a three-dimensional appearance, the color palette is shifted slightly (10 color shades
out of 175) for each face drawn. This is shown in the Palette:Color Legend dialog in Figure 13.14.
Fonts
Fonts:Modify is largely described in Chapter 5 in the Graph:Fonts section (Figures 5.10 and 5.11),
but the font selection for block is slightly different (Figure 13.15). Different fonts may be
selected for the Main Title, Annotations, and the North Arrow label.
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, Clear, and Print. View Log,
Clear, Save, and Save as are similar in operation to the menu options under File described above.
FIGURE 13-13. This map surface shows spatially located 3D point data.
Plot
Plot:Now and Plot:Refresh are described in Chapter 5 in the Plot:Now and Plot:Refresh sections.
Help
Help works exactly as explained in the Chapter 5 (plotgraph, Figure 5.15) Help section.
FIGURE 13-14. Color Scale Legend pop-up dialog. This dialog shows the current
color scale mapping; it allows the user to
specify the color scale range and, if desired,
load a user-defined color palette.
FIGURE 13-15. Font pop-up
dialog. This dialog is used to
define the X-windows and
Postscript text fonts and font sizes
for different portions of the map.
FIGURE 13-16. This map surface shows a regularly gridded map, where the color coding refers to
[-ar #] [-bl #] [-blf ] [-bv #] [-colcut #] [-csmax #.#] [-csmin #.#] [-ewa #]
[-ewb #] [-ex #.#] [-fnt1 ] [-fnt2 ] [-fnt3 ] [-fnts1 #.#] [-fnts2 #.#]
[-fnts3 #.#] [-hb #] [-help] [-hz #.#] [-lcc #] [-lgf ] [-lic #] [-lpbm #.#] [-lpc #]
[-lpd #] [-lpf ] [-lph #] [-lplm #.#] [-lpo #] [-lppsext ] [-lpq ] [-lpr]
[-lprm #.#] [-lps #] [-lptm #.#] [-lsc #] [-nsa #] [-nsb #] [-pal ] [-pc #] [-phc #]
[-phl #.#] [-php #] [-pl #] [-pll #] [-ppp # [-prf ] [-ps #] [-qr #] [-rfh #] [-rt #.#]
[-ss #.#] [-suf ] [-sul #] [-sur #] [-va #] [-vb #] [-vmax #.#] [-vmin #.#]
[-vsi #.#] [-xc #] [-xmax #.#] [-xmin #.#] [-yc #] [-ymax #.#] [-ymin #.#] [-zc #]
[-zmax #.#] [-zmin #.#] [-zm #.#] [filename]
Meaning of flag symbols:
  #    = integer
  #.#  = float
       = character string
  {}   = variable is an array. Values must be separated by a "," and no spaces are
         allowed. Do not use the { } symbols on the command line.
NOTES:
  -ar                                                   default = 0
  -bf                                                   default = 1
  -blf      = blanking file                             default = (none)
  -bv       = show blocks within view range limit       default = 1
              (0 = false, 1 = true)
  -cmax                                                 default = z maximum
  -cmin                                                 default = z minimum
  -colcut                                               default = 0
  -cplt                                                 default = 1
  -csmax                                                default = z maximum
  -csmin                                                default = z minimum
  -ewa                                                  default = 1
  -ewb                                                  default = 0
  -ex                                                   default = 1.0
  -fnt1                                                 default = Helvetica-Bold
  -fnt2                                                 default = Helvetica
  -fnt3                                                 default = Helvetica
  -fnts1                                                default = 24.0
  -fnts2                                                default = 15.0
  -fnts3                                                default = 15.0
  -hb                                                   default = 1
  -help
  -hz                                                   default = 25.0
  -lcc                                                  default = 1
  -lgf                                                  default = log.dat
  -lic                                                  default = 1
  -lpbm                                                 default = 1.5
  -lpc      (1 = color)                                 default = 1
  -lpd                                                  default = 0
  -lpf      = print filename                            default = "junk.ps"
  -lph      = print header page (0 = false, 1 = true)   default = 0
  -lplm                                                 default = 1.5
  -lpo                                                  default = 0
  -lppsext                                              default = "*.ps"
  -lpq                                                  default = "ps"
  -lpr
  -lprm                                                 default = 1.0
  -lps                                                  default = 0
  -lptm                                                 default = 1.5
  -lsc                                                  default = 1
  -nsa                                                  default = 1
  -nsb                                                  default = 0
  -pal      (1 = gray, 2 = spectrum,                    default = (none)
              3 = spectrum (looped), 4 = hue,
              5 = hue (looped), 6 = white,
              7 = user defined)
  -pc                                                   default = 0
  -phc                                                  default = 1
  -phl      = use mesh
  -php
  -pl       = post points with lines                    default = 0
              (0 = false, 1 = true)
  -pll      (0 = false, 1 = true)                       default = 0
  -ppp                                                  default = 1
  -prf                                                  default = block.prf
  -ps                                                   default = 0
  -qr       = quick rotate                              default = 0
  -rfh      = screen refresh                            default = 0
              (0 = on exposure, 1 = on update)
  -rt       = viewing direction                         default = 45.0
  -ss       = post points symbol size                   default = 5.0
  -suf      = read Seismic Unix file                    default = 0
              (0 = false, 1 = true)
  -sul                                                  default = 0
  -sur                                                  default = 0
  -va                                                   default = 1
  -vb                                                   default = 0
  -vc                                                   default = 4
  -vmax                                                 default = data maximum
  -vmin                                                 default = data minimum
  -vsi                                                  default = maximum - minimum
  -xc                                                   default = 1
  -xmax                                                 default = data X maximum
  -xmin                                                 default = data X minimum
  -yc                                                   default = 2
  -ymax                                                 default = data Y maximum
  -ymin                                                 default = data Y minimum
  -zc                                                   default = 3
  -zm                                                   default = 1.0
  -zmax                                                 default = data Z maximum
  -zmin                                                 default = data Z minimum
Comment lines are not allowed within the data area defining the block cell
values of the grid file; however, they can appear above or below this section.
They can appear anywhere within the non-gridded data file format.
Equal Dimensions
There are three file formats that assume equal grid block dimensions: NODE CENTERED GRID,
GRID CENTERED GRID, and EQUALLY_SPACED. The first two formats
are discussed in Chapter 11 (contour); the last is discussed below.
NOTE: The EQUALLY_SPACED format is being phased out. No other UNCERT file
generates this format. It only remains to serve older users and pre-existing files.
Equally Spaced
If ∆x = constant, ∆y = constant, and ∆z = constant, follow these instructions. The UNCERT equal
dimensioned block grid file format specifies the number of rows (n), columns (m), and layers (o) in
the grid, that the file is EQUALLY_SPACED (spelled exactly as shown, in upper case), and the
width of the map in the X, Y, and Z directions. When specifying the grid, the bottom layer is
specified first, upward to the top layer last. The row and column values are defined west to east on
a single line, then north to south on successive lines. The format is as follows:
# columns # rows #layers
[integer][integer][integer]
EQUALLY_SPACED
map width (X) map width (Y) map height (Z)
[real][real][real]
row-1:col-1:lay-1 row-1:col-2:lay-1 ...... row-1:col-n:lay-1
[real]
row-2:col-1:lay-1 row-2:col-2:lay-1 ...... row-2:col-n:lay-1
[real]
:
:
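The header and value layout above can be sketched as a small writer. The function name and the nested-list argument layout are illustrative assumptions; the header lines follow the format specification.

```python
# A sketch of writing the EQUALLY_SPACED block grid format described
# above.  The grid is ordered bottom layer first; within a layer, rows
# run north to south and values on a line run west to east.

def write_equally_spaced(path, values, width_x, width_y, height_z):
    """values[layer][row][col]; all cells share the same dimensions."""
    layers = len(values)
    rows = len(values[0])
    cols = len(values[0][0])
    with open(path, "w") as f:
        f.write(f"{cols} {rows} {layers}\n")   # column count first
        f.write("EQUALLY_SPACED\n")            # keyword, upper case
        f.write(f"{width_x} {width_y} {height_z}\n")
        for layer in values:                   # bottom layer first
            for row in layer:                  # north to south
                f.write(" ".join(str(v) for v in row) + "\n")
```

Reading such a file back is the mirror image: parse the three header lines, then consume cols x rows x layers values.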
17  8  7  7 11 10  7 24 19 13 13 19 24  7 10 11  7  7  8 17
14  9  8  8  7  6  5 13 10  9  9 10 13  5  6  7  8  8  9 14
11 15  7  9  8  6  5  4  5  6  6  5  4  5  6  8  9  7 15 11
29 31 18 10  9  7  6  6  7  7  7  7  6  6  7  9 10 18 31 29
34 36 29 17 12  8  7  7  8  9  9  8  7  7  8 12 17 29 36 34
Non-Equal Dimensions
If ∆x, ∆y, or ∆z is not constant,
follow these instructions. The UNCERT non-equal dimensioned block grid file format specifies the
number of rows (n), columns (m), and layers (o) in the grid, that the file has
IRREGULAR_GRID_SPACING (spelled exactly as shown, in upper case), and the cell dimensions
in the X, Y, and Z directions. When specifying the grid, the bottom layer is specified first, upward
to the top layer last. The row and column values are defined west to east on a single line, then north
to south on successive lines. The format is as follows:
# columns # rows # layers
[integer][integer][integer]
IRREGULAR_GRID_SPACING
col-1-width col-2-width ...... col-n-width
[real][real][real]
row-1-width row-2-width ...... row-m-width
[real][real][real]
lay-1-height lay-2-height ...... lay-o-height
[real][real][real]
row-1:col-1:lay-1 row-1:col-2:lay-1 ...... row-1:col-n:lay-1
[real]
row-2:col-1:lay-1 row-2:col-2:lay-1 ...... row-2:col-n:lay-1
[real]
:
:
row-m:col-1:lay-o row-m:col-2:lay-o ...... row-m:col-n:lay-o
[real]
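A minimal reader for this format can be sketched as follows. The function name is an assumption, and the reader tolerates trailing ": comment" annotations on the spacing lines, as in the example file; error handling is left out for brevity.

```python
# Sketch of a reader for the IRREGULAR_GRID_SPACING format above.  It
# returns the column widths, row widths, layer heights, and the cell
# values as values[layer][row][col].

def read_irregular_grid(path):
    with open(path) as f:
        cols, rows, layers = (int(t) for t in f.readline().split()[:3])
        keyword = f.readline().strip()
        assert keyword == "IRREGULAR_GRID_SPACING", keyword

        def floats(line, n):
            # drop any trailing ": comment" text before parsing
            return [float(t) for t in line.split(":")[0].split()[:n]]

        col_w = floats(f.readline(), cols)
        row_w = floats(f.readline(), rows)
        lay_h = floats(f.readline(), layers)
        it = iter(float(t) for t in f.read().split())
        values = [[[next(it) for _ in range(cols)]   # west to east
                   for _ in range(rows)]             # north to south
                  for _ in range(layers)]            # bottom layer first
    return col_w, row_w, lay_h, values
```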
The example data file, well.bck (Figure 13.5), is shown below:

10 10 2
IRREGULAR_GRID_SPACING
20 10 5 2.5 1.25 0.625 0.625 1.25 2.5 5    : column spacing
20 10 5 2.5 1.25 0.625 0.625 1.25 2.5 5    : row spacing
2.5 7.5                                    : layer spacing
1  3  8 11 14 23 28 31 22  8
2  5  7 12 16 20 34 40 31 15
4  8 10 15 26 31 32 34 21 10
7 10 16 22 34 45 36 29 13  5
9 14 21 30 55 63 70 55 26  7
6 11 18 25 51 96 98 82 38  9
5 10 15 20 40 90 99 79 21  0
4  9 13 18 38 91 97 67 17  0
3  7 10 15 22 86 87 23  7  0
1  3  6  8 15 43 53 13  4  0
1  3  8 11 17 35 40 45 29  8
2  5  7 18 22 45 55 59 31 15
4  8 10 15 26 31 32 34 21 10
7 10 16 22 34 45 36 29 13  5
9 14 21 30 55 63 70 55 26  7
6 11 18 25 51 96 98 82 38  9
5 10 15 20 40 90 99 79 21  0
4  9 13 18 38 91 97 67 17  0
3  7 10 15 22 86 87 23  7  0
1  3  6  8 15 43 53 13  4  0
Non-Gridded Files
Sometimes it is important to display raw field data in three dimensions. The file format for this data
is the same as described in Chapter 5 (plotgraph) for the basic and the GEO-EAS formats. There
are several requirements for this file, though. There must be at least one column for the X, Y, and Z
coordinates of each point. On a practical basis there also needs to be a point value column. Four
columns are sufficient if only points are to be drawn or all points are in a single line. If the points
are to be connected by different lines, there needs to be a column identifying the line each point
belongs to. Note, points in a single line must be consecutive: points with the same line ID,
separated by another line, are drawn as separate lines! If point symbols are to be used, another
column is required. The same is true if points or line segments are to be colored. The symbol code
is:
-1= No Symbol
0 = Circle
1 = Cross
2 = Diamond
3 = Square
4 = X
The color code is:
0 = Black
1 = White
2 = Red
3 = Green
4 = Blue
5 = Magenta
6 = Yellow
7 = Cyan
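A small sketch of writing such a file, using the symbol and color codes above, may make the column requirements concrete. The column order chosen here (X, Y, Z, value, line ID, symbol, color) is an assumption for illustration only; the actual columns are identified in the Post:Modify dialog.

```python
# Sketch: write a non-gridded post file in the basic column format.
import os
import tempfile

CROSS, SQUARE = 1, 3   # symbol codes from the table above
RED, BLUE = 2, 4       # color codes from the table above

def write_post_file(path, points):
    """points: iterable of (x, y, z, value, line_id, symbol, color)."""
    with open(path, "w") as f:
        for p in points:
            f.write(" ".join(str(v) for v in p) + "\n")

pts = [
    (0.0, 0.0, 10.0, 1.2, 1, CROSS, RED),    # line 1: points on the same
    (5.0, 0.0, 10.5, 1.4, 1, CROSS, RED),    # line must be consecutive
    (0.0, 5.0,  9.8, 2.1, 2, SQUARE, BLUE),  # line 2
]
write_post_file(os.path.join(tempfile.gettempdir(), "wells.dat"), pts)
```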
Blanking Files
Blanking files can be in any three-dimensional block format, but they must match the grid
dimensions of the grid file already open. The only difference between a blanking file and the other
formats is that the grid values are either 1s or 0s. If the value is 1, the grid cell will be plotted. If
the value is 0, the grid cell will not be plotted.
Block Mathematics
In order to make the software package useful, sophisticated computer graphics algorithms are
necessary to convert field and model data into images on the computer monitor or the printed page
that make sense to the user. The mathematics for block are identical to those used in surface. See
Chapter 12 for details about rotations and transformations, hidden block and line removal, parallel
transformation, and back-to-front drawing.
Bibliography (block)
Cohen, J.K., and J.W. Stockwell, 1994, The SU Users Manual, Center for Wave Phenomena,
Colorado School of Mines, Golden, Colorado.
Gómez-Hernández, J.J., and R.M. Srivastava, 1990, ISIM3D: An ANSI-C Three-Dimensional
Multiple Indicator Conditional Simulation Program, Computers & Geosciences,
Vol. 16, No. 4, pp. 395-440.
McDonald, M.G., and A.W. Harbaugh, 1984, A Modular Three-Dimensional Finite-Difference
Flow Model, U.S. Geological Survey OFR 83-875.
Stochastic Simulation:
Sisim & Sisim3d
CHAPTER 14
Sisim is a graphical user interface (GUI) for sisim3d, an indicator kriging and conditional stochastic
simulation program for discrete data (non-continuous data: e.g., clay, sand, gravel) developed at
Stanford University by Gómez-Hernández and Srivastava (ISIM3D, 1990) and modified at the
Colorado School of Mines by McKenna (1994) to utilize soft data (discussed in the Chapter 8
Mathematics section). Up to eight indicators can be modeled in a single simulation. In its basic
form sisim3d can be awkward to use, particularly when many simulations are required based on
varying semivariogram models. This interface assists the user in handling data files and input
parameters, coordinating multiple simulations, tasking jobs to other computers, calculating
simulation statistics, and visualizing results.
This chapter goes into the details of using sisim as an interface for sisim3d. The documentation
supplied with sisim3d is limited, and this chapter will try to clarify some of the points.
The sisim application is composed of two sections: the main menu-bar and the log/status text area.
The menu-bar is used to select all sisim commands, and the log/status area is used by the program
to report important messages and results. The log/status area may also be used to personally enter
important comments or notes; it is a simple text editor.
NOTE: Sisim and sisim3d are different programs. Sisim is merely a GUI for sisim3d, and if
desired, sisim3d can be run independently. The information created by each
program is passed between the programs mainly using data files. Sisim also passes some
information to sisim3d using command line arguments (sisim executes sisim3d).
The two programs also communicate during run-time using UNIX network
protocols; this communication only contains information about the status of the programs (%
completion, etc.) and nothing about the data required for the simulation.
Packages is used to open, save, or create the data files needed for sisim3d. Simulator defines the number of simulations, seed
values, and output file names. Network is used to specify which computers in the network will
participate in solving the models. Run creates required files, executes sisim3d on appropriate
computers, and monitors model completions. Statistics is used after models have been calculated
to evaluate basic statistics about individual simulations or series of simulations. View is used to
view individual model simulations or statistical compilations. Log is used to save, print, or view
any message information printed in the log/status window. Help gives the user a selection of pop-up help topics. Each menu item is fully described below with all the available options.
FIGURE 14-1. This is an example of the sisim application window. The main menu-bar is at the
top of the application window, and the log/status window is in the lower portion.
Project
The Project menu options control project file handling, and exiting the program. The options
include Open Project, View Project, Save Project, Save as, Save Preferences, Quit, and Quit
Without Saving. A project file contains the names of all the files required to run a set of
simulations, the number of simulations to be run, the names of the result files, and the names of the
computers used to calculate the models.
Open Project
Project:Open Project generates a pop-up dialog which allows the user to select an existing project
file. The dialog functions as the File:Open dialog in Figure 5.2 (plotgraph - Chapter 5). The
default project file name extension, though, is *.prf.
View Project
Project:View Project pops up a simple screen editor with the last saved version of the opened or
saved project file.
Save Project
Project:Save Project saves the names of the appropriate files and computers, and various simulation
parameters to the named project file (See Sisim Output Data File section for file format). If no
project file name exists, the user will be queried for a file name. The dialog functions as the
File:Open dialog in Figure 5.2, except that the file name does not have to pre-exist. For a
description of how to use the dialog, see the File:Save section in Chapter 5.
Save Project as
Project:Save as is used to save the project to a new file. A pop-up dialog similar to that used in
File:Open (Figure 5.2) is created. This option is the same as Project:Save Project except that a
project file name must be selected.
Quit
Project:Quit terminates the program, but if additions have been made to the project or any project sub-files, the user will first be queried to supply a file in which to save the changes.
Packages
To run sisim3d, several data files, or packages, are required. Some of these can be created with the
sisim interface, but all must be loaded or defined so that sisim can correctly task out and run the
simulations (sisim executes sisim3d). There are five required files: the 1)
configuration, 2) data, 3) geometry, 4) semivariogram, and 5) soft data uncertainty files. These are
each described below. Note, unless all these files are defined, no simulations can be run.
Configuration
The configuration file specifies several limiting characteristics about the simulation calculation.
Features present in the data set can be temporarily turned off for testing or debugging.
Open Configuration
Packages:Configuration:Open Configuration generates a pop-up dialog which allows the user to select
an existing configuration file. The dialog functions as the File:Open dialog in Figure 5.2
(plotgraph - Chapter 5). The default configuration file name extension, though, is *.set.
View Configuration
Packages:Configuration:View Configuration pops up a simple screen editor with the last saved version
of the opened or saved configuration file.
Save
Packages:Configuration:Save saves the current configuration parameter specifications to the
configuration file (see the Sisim Input Data File section for the file format). If no configuration file name
exists, the user will be queried for a file name. The dialog functions as the File:Open dialog in
Figure 5.2, except that the file name does not have to pre-exist. For a description of how to use the
dialog, see the File:Save section in Chapter 5.
Save as
Packages:Configuration:Save as is used to save the configuration to a new file. A pop-up dialog similar
to that used in File:Open (Figure 5.2) is created. This option is the same as
Packages:Configuration:Save except that a configuration file name must be selected.
Modify
Packages:Configuration:Modify generates the pop-up dialog shown in Figure 14.2. This dialog
allows the user to define several things about how data is treated in the simulation. These generally
affect the speed with which the simulation will run. The options can also be set so that the
simulation will ignore certain data; this saves having to build new data sets. The Grid Dimension
can be either 2D or 3D. If Hard Data Only is used and the data set includes soft data, the soft data will be
ignored. No Type B Data specifies that there is no type B data in the data set. If this is set, it will improve the efficiency of some
calculations. Coarse Simulation Only allows only the first pass of the simulation to be run.
Sisim3d normally uses two passes to create a simulation grid: one pass makes a coarse grid, and
the second pass makes a finer grid using the results from the first pass. Often it is worth running only the
coarse grid on the first simulation. This allows the user to find basic logic errors without
having to wait for a full simulation to complete.
FIGURE 14-2. This dialog is used to set run-time parameters for sisim3d that may simplify the
solution.
Data
The input file containing the X, Y, Z, and indicator data must be read in before any other operations
can be done (Project files can be opened, because they open a data file). This is because other
portions of the program depend on the extents of the data set.
NOTE: There is currently no way within sisim to edit the data file.
Geometry
The geometry file in sisim3d specifies details about the model grid, and about the search parameters for
finding data points near the location being evaluated. The following options allow the user to load
and edit an existing geometry file or create a new file.
Open Geometry
Packages:Geometry:Open Geometry generates a pop-up dialog which allows the user to select an
existing geometry file. The dialog functions as the File:Open dialog in Figure 5.2 (plotgraph - Chapter 5). The default geometry file name extension, though, is *.geom.
View Geometry
Packages:Geometry:View Geometry pops up a simple screen editor with the last saved version of
the opened or saved geometry file.
Save
Packages:Geometry:Save saves the current geometry parameter specifications to the geometry file
(see the Sisim Input Data File section for the file format). If no geometry file name exists, the user will be
queried for a file name. The dialog functions as the File:Open dialog in Figure 5.2, except that the
file name does not have to pre-exist. For a description of how to use the dialog, see the File:Save
section in Chapter 5.
Save as
Packages:Geometry:Save as is used to save the geometry to a new file. A pop-up dialog similar to
that used in File:Open (Figure 5.2) is created. This option is the same as Packages:Geometry:Save
except that a geometry file name must be selected.
Modify
Packages:Geometry:Modify generates the pop-up dialog shown in Figure 14.3. This dialog is used
to specify all of the parameters needed for the model geometry, and the search parameters required
to locate appropriate data points when evaluating a location. Parameters that need to be defined
are: (1) the X, Y, and Z coordinates of the model grid Origin; (2) the size of each node (Delta X, Y, Z),
coarse and fine;
NOTE: The dimensions of the coarse grid must be an integer multiple of the fine grid. The
coarse grid is used on the first pass; the fine grid uses the coarse grid when
calculating the fine and final simulation grid.
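The integer-multiple requirement in the NOTE above can be checked mechanically. This sketch (the function name is illustrative) tests each node dimension with a small floating-point tolerance:

```python
# Check that each coarse-grid node dimension is an integer multiple of
# the corresponding fine-grid dimension, as sisim3d requires.

def coarse_is_multiple(coarse, fine, tol=1e-9):
    """coarse, fine: (dx, dy, dz) node sizes."""
    for c, f in zip(coarse, fine):
        ratio = c / f
        if abs(ratio - round(ratio)) > tol or round(ratio) < 1:
            return False
    return True

print(coarse_is_multiple((10.0, 10.0, 4.0), (2.5, 2.5, 2.0)))  # True
print(coarse_is_multiple((10.0, 10.0, 3.0), (2.5, 2.5, 2.0)))  # False
```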
(3) the number of Nodes in each direction; (4) which nodes will be evaluated during this simulation
(From-To -- this option is used to debug sub-regions of the model); (5) the shape and size of the
search ellipsoid (data points within the search ellipsoid around the node being evaluated will be
used); and (6) details about the search direction rotation. Details about the extents of the data set
are also provided.
The Direction Cosines and the Rotation Flag define the orientation of the search ellipsoid, and are
important to define when the semivariogram models are not isotropic. Under isotropic conditions
the Direction Cosine matrix is an identity matrix (1s on the diagonal, 0s elsewhere), and the No
rotation Rotation Flag is used. For two-dimensional and some three-dimensional data sets it may
be adequate to do a rotation about a single axis; with more complicated models, though, it may be
necessary to perform a General rotation about all three axes. The values for the Direction Cosine
matrix can be calculated by pressing the Calculate Direction Cosine button. This will create the
FIGURE 14-3. Geometry Setup pop-up dialog. This dialog is used to set parameters for the
sisim3d model simulation grid, the debug level, and search parameters for finding neighboring data
points of interest.
pop-up dialog shown in Figure 14.4. Rotation angles should be entered in degrees. You can also
enter the direction cosine values directly. Depending on the Rotation Flag, they are based on the
following matrices (Foley et al., 1990):
No rotation:

    1  0  0
    0  1  0
    0  0  1
(14-1)

Rotation about the X-axis:

    1     0        0
    0  cos(α)  -sin(α)
    0  sin(α)   cos(α)
(14-2)

Rotation about the Y-axis:

     cos(β)  0  sin(β)
       0     1    0
    -sin(β)  0  cos(β)
(14-3)

Rotation about the Z-axis:

    cos(γ)  -sin(γ)  0
    sin(γ)   cos(γ)  0
      0        0     1
(14-4)

General rotation:

    cos(β)cos(γ)                       cos(β)sin(γ)                       -sin(β)
    sin(α)sin(β)cos(γ) - cos(α)sin(γ)  sin(α)sin(β)sin(γ) + cos(α)cos(γ)   sin(α)cos(β)
    cos(α)sin(β)cos(γ) + sin(α)sin(γ)  cos(α)sin(β)sin(γ) - sin(α)cos(γ)   cos(α)cos(β)
(14-5)

where α, β, and γ are the rotation angles about the X, Y, and Z axes, respectively.
NOTE: More complicated rotation matrices add significantly to the model solution time.
FIGURE 14-4. Rotation Calculator pop-up dialog. This dialog is used to calculate
the direction cosines.
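The rotation machinery behind the Rotation Calculator can be sketched as follows. The function names are illustrative, the sign conventions are the standard right-handed ones, and angles are entered in degrees as in the dialog; check the composition conventions against your own data before relying on them.

```python
# Sketch: the three single-axis rotation matrices (14-2 to 14-4) and a
# general direction-cosine matrix (14-5) built from angles about X, Y, Z.
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(m1, m2):
    return [[sum(m1[i][k] * m2[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def direction_cosines(ax_deg, ay_deg, az_deg):
    """General rotation matrix from degree angles about X, Y, and Z."""
    a, b, g = (math.radians(d) for d in (ax_deg, ay_deg, az_deg))
    ca, sa = math.cos(a), math.sin(a)
    cb, sb = math.cos(b), math.sin(b)
    cg, sg = math.cos(g), math.sin(g)
    return [[cb * cg,                cb * sg,                -sb],
            [sa * sb * cg - ca * sg, sa * sb * sg + ca * cg, sa * cb],
            [ca * sb * cg + sa * sg, ca * sb * sg - sa * cg, ca * cb]]

print(direction_cosines(0, 0, 0))  # the "No rotation" identity case
```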
Semivariogram
With sisim there are two methods of defining the model semivariograms: Single and Latin-Hypercube Solutions. Single is the most common approach. Latin-Hypercube Solutions are used
only when the threshold or indicator semivariograms have been calculated using the jackknifing
option in vario (Chapter 8) and the latin-hypercube sampling option in variofit (Chapter 9). On a
practical basis, this option is only applied when there is very little hard and soft data.
NOTE & WARNING: Normally, the term indicator semivariogram is used to define a
semivariogram that was based on one cutoff between two indicators. For the
remainder of this chapter, these models will be referred to as threshold
semivariograms. The use of the term indicator semivariogram will be reserved for
semivariogram models that were based on a single indicator vs. all other indicators.
Functionally, the threshold semivariogram only uses a high cutoff and an indicator
semivariogram uses a high and a low cutoff.
When using indicator semivariograms there will be the same number of indicators
as semivariogram models. When using threshold semivariograms there will be one
fewer semivariogram model than the number of indicators.
Single
Model semivariograms are needed for each threshold/indicator in the indicator
model, and there is one fewer threshold than indicators. This section allows the user to open,
save, and edit the sisim3d semivariogram file. Note that one file contains all of the threshold model
semivariogram definitions.
Open Semivariogram
Packages:Semivariogram:Single:Open Semivariogram generates a pop-up dialog which allows the user to
select an existing semivariogram file. The dialog functions as the File:Open dialog in Figure 5.2
(plotgraph - Chapter 5). The default semivariogram file name extension, though, is *.var.
View Semivariogram
Packages:Semivariogram:Single:View Semivariogram pops up a simple screen editor with the last
saved version of the opened or saved semivariogram file.
Save
Packages:Semivariogram:Single:Save saves the current semivariogram parameter specifications to
the semivariogram file (See Sisim Input Data File section for file format). If no semivariogram file
name exists, the user will be queried for a file name. The dialog functions as the File:Open dialog
in Figure 5.2, except that the file name does not have to pre-exist. For a description of how to use
the dialog, see the File:Save section in Chapter 5.
Save as
Packages:Semivariogram:Single:Save as is used to save the semivariogram to a new file. A pop-up
dialog similar to that used in File:Open (Figure 5.2) is created. This option is the same as
Project:Save Project except that a project file name must be selected.
Modify
Packages:Semivariogram:Single:Modify first generates the pop-up dialog shown in Figure 14.5.
This dialog allows the user to define the Solution Type (Threshold, the most commonly used, or
Indicator) and the Number of Thresholds/Indicators that will be used in the simulation, and therefore
the number of required semivariogram models. It also allows the user to specify which Threshold
Semivariogram is to be edited. Under the Cumulative Distribution Function, the Threshold cutoff
(values less than this threshold and greater than lesser thresholds will be evaluated with this
semivariogram model) and the Prior cdf (cumulative distribution function) must be defined. The
Prior cdf represents the decimal percent of the data set with values less than the cutoff (histo
(Chapter 6) can be used to calculate these values). The semivariogram model definition allows up
to four nested structures (sisim3d allows more, and the data file can be edited independently, but
more than four structures is felt to be unrealistic, except in the most unusual circumstances).
For each structure, the Range, Sill, Semivariogram Model, and X, Y, and Z Anisotropies must be
defined. The model Nugget must also be defined, and if a Power model is used, C Maximum must
be defined. The model specifications are aligned vertically from left to right (1, 2, 3, 4). Note, if
higher order nests are not used, be sure to mark None for the Semivariogram Model type.

When calculating the anisotropies, assume the major semivariogram model axis is the X-axis, and
the Y or Z-axis is the minor axis. The X anisotropy will then be 1.0, and the Y and Z anisotropies
will be the X range divided by the Y or Z range (with this approach, anisotropies will be
greater than or equal to 1.0). Using this approach, the model semivariogram must also be rotated
accordingly. The Direction Cosine matrix should be defined using equations 14-1, 14-2, 14-3, 14-4, and 14-5 or the dialog shown in Figure 14.4.
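The anisotropy convention just described can be sketched in a few lines. Interpreting the factors as ratios of the semivariogram ranges is this sketch's reading of the text, and the function name is illustrative:

```python
# Anisotropy factors under the convention above: the major semivariogram
# axis is X (factor 1.0), and the Y and Z factors are the X range divided
# by the Y and Z ranges, so every factor comes out >= 1.0.

def anisotropy_factors(range_x, range_y, range_z):
    assert range_x >= range_y and range_x >= range_z, "X must be the major axis"
    return 1.0, range_x / range_y, range_x / range_z

print(anisotropy_factors(100.0, 50.0, 10.0))  # (1.0, 2.0, 10.0)
```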
Latin-Hypercube Solutions
When there is not enough data to adequately define a model semivariogram, latin-hypercube
sampling can be used to define a range of reasonable model semivariograms (See variofit
Mathematics section, Chapter 9). Simulations can then be run on each model semivariogram for
each threshold.
NOTE: If latin-hypercube semivariogram models are used for one threshold, they must be
used for all thresholds. A latin-hypercube series of model semivariograms, however,
can be made up of a single model semivariogram (i.e., for that threshold, there is
negligible uncertainty).
Because of the complexity of this data set, it is best to let variofit (Chapter 9) build the data file.
Refer to Chapter 9 for the data file format if interested.
Open Semivariograms
Packages:Semivariogram:Latin-Hypercube Solutions:Open Semivariogram generates a pop-up
dialog which allows the user to select an existing latin-hypercube semivariograms file. The dialog
functions as the File:Open dialog in Figure 5.2 (plotgraph - Chapter 5). The default latin-hypercube file
name extension, though, is *.lhc.
FIGURE 14-5. Define Semivariogram Models pop-up dialog. This dialog is used to define the
View Semivariograms
Packages:Semivariogram:Latin-Hypercube Solutions:View Semivariogram pops up a simple screen
editor with the last saved version of the opened or saved latin-hypercube semivariograms file.
Uncertainty
The uncertainty file describes the probability distributions for the soft data. Even if no soft data is
used, this file is still required, though it is then very simple.
NOTE: There is currently no way within sisim to create or edit the uncertainty file.
Simulator
Simulator:Modify generates the pop-up dialog shown in Figure 14.6. This dialog is used to specify
parameters related to the number of model simulations calculated, the destination file names, and the
random number used to start each simulation. The starting Random Number Seed is used for
defining the path through the model grid and for making random picks from the cdf to determine a
node value. The seed is incremented by a constant amount (Seed Increment) for each simulation.
FIGURE 14-6. Simulator Options
pop-up dialog. This dialog is used
to define the random number seed
generator and increment, the
number of simulations to create,
and the simulation file name
prefixes.
WARNING:
If a Starting Simulation Number other than 1 is used, the Random Number Seed will be
incremented accordingly. This allows simulations to be rerun individually without having the user
calculate the appropriate seed. After the simulations are run, the simulation results will be saved to
files based on the Output File Name selected here. If example.junk is specified for 5
simulations, starting at simulation 5, the output files would be named:
Coarse Grid:
example.junk.cor.5.sim
example.junk.cor.6.sim
example.junk.cor.7.sim
example.junk.cor.8.sim
example.junk.cor.9.sim
Fine Grid:
example.junk.5.sim
example.junk.6.sim
example.junk.7.sim
example.junk.8.sim
example.junk.9.sim
These files are saved in NODE CENTERED GRID format (See Chapter 11, Data File Format
section).
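The naming and seed conventions described above can be sketched as follows. This is a minimal illustration written for this manual, not code from sisim itself; the function names are invented.

```python
def simulation_files(output_name, start, count):
    """Build the coarse- and fine-grid output file names described above."""
    nums = range(start, start + count)
    coarse = ["%s.cor.%d.sim" % (output_name, n) for n in nums]
    fine = ["%s.%d.sim" % (output_name, n) for n in nums]
    return coarse, fine

def seed_for(start_seed, increment, sim_number):
    """Seed for a given simulation: incremented by a constant amount per simulation,
    so simulation N can be rerun individually with a reproducible seed."""
    return start_seed + (sim_number - 1) * increment

coarse, fine = simulation_files("example.junk", 5, 5)
# coarse begins with example.junk.cor.5.sim; fine begins with example.junk.5.sim
```

Starting at simulation 5 rather than 1 simply advances the seed by four increments, which is why individual reruns reproduce the original results.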
Network
Sisim is designed to work in a stand-alone or networked environment. Because of the number of
simulations computed, even though each one is fairly fast, it can take a long time to complete all of
the simulations. Where resources permit, it may be better to distribute the processes over a number
of different machines (parallelize). This uses the computational resources of more computers,
but it gets the job done faster. Depending on the site, these jobs can be timed to run during low-use
periods to reduce the impact on other users (they are automatically set to run at the lowest
priority). At many locations, though, jobs will have to be run only on the machine currently being
used (there is no network, or you don't have the rights, privileges, or priority to use other computers).
NOTE & WARNING: To run programs over the network, there must be a .netrc file in the
user's login directory on each computer used. This file is a list of computer names,
with the user name and the user password for each computer. It allows the
user to remotely run processes on other computers in the list. Also note, this file has
the user password spelled out; it is not encrypted! Make sure that file protections are
set for this file so that only the user can read it! Also note, anyone with root
privilege can read this file regardless of the file protections!
Mode
Network:Mode allows the user to select between Single computer mode (no network connection
required) and Multiple networked computer mode.
Select Computers
Network:Select Computers generates the pop-up dialog shown in Figure 14.7. This option is only
valid when sisim is in the Multiple mode. It shows a list of the computers available for use (the
list comes from the loaded *.net file; it is the user's responsibility to ensure this file is correct). By
default no computers are used. Select the computers which are available. All computers on the list
can be marked, but there are several things to keep in mind:
1). Each selected computer is a client. The computer running sisim and managing the
other computers is the server. The computer running sisim can also be selected to run
sisim3d processes, in which case it is also a client.
2). If only a few simulations are required, you might want to use only the fastest
machines.
3). If other people are using a given computer, it should be avoided if possible.
When using other computers, consider your job priority versus that of other users!
Run
Run:Now will start the simulation process. The dialog in Figure 14.8 will be displayed while any
simulation is still being executed. To stop the simulations, press the Stop Remaining Simulations
button. When all the simulations are complete, the dialog will disappear. During the simulation
process, a number of things occur; these are listed below:
As needed, appropriate files are created as input for each sisim3d simulation. For
example, when latin-hypercube semivariogram models are used, that file format is
incompatible with sisim3d, and a single semivariogram model type (*.var) file needs to
be created. These files are created, passed as command line arguments, and destroyed
when the simulation is completed. Output file names are also generated and passed to
sisim3d.
Sisim makes UNIX system calls and remote shell commands to run sisim3d on the
appropriate computers.
Sisim receives status messages from the client sisim3d programs on their completion
status. To a certain extent sisim maintains a link with each client computer to make sure it
is still available. If a client computer stops responding, it will be removed from the
computer list, and its processes will be redirected to another computer.
The status of simulation executions will be printed to the log/status window (Figure
14.1).
Statistics
After one simulation is complete, or after all the simulations are complete, it may be useful to
examine the simulation results statistically; i.e., what is the distribution of indicators for an entire
simulation, or what is the probability that a particular cell will be occupied by a particular indicator?
Model Summary
Model Summary is used to determine the distribution frequency, and the frequency variance and
standard deviation, for each indicator in a single simulation or in all of the simulations.
Individual Simulation
Statistics:Model Summary:Individual Simulation will generate the pop-up dialog in Figure 14.9.
This dialog allows the user to specify which simulation series is of concern and which
simulation in the series will be evaluated. The statistics, when calculated (press View Statistics),
will be displayed in a pop-up dialog similar to Figure 14.10. The statistics can also be echoed to the
log/status window if the Print Statistics to Log/Status Window toggle is set. In this dialog, for each
indicator, the cell count and frequency of occurrence are displayed. The cumulative count and
frequency, mean, median, and mode are also displayed.
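For a single simulation, the per-indicator count and frequency amount to the following sketch (illustrative only; the real dialog also reports cumulative statistics, mean, median, and mode):

```python
from collections import Counter

def indicator_summary(cells):
    """Count cells per indicator and convert counts to frequencies of occurrence.
    cells is a simulation grid flattened to a list of indicator values."""
    counts = Counter(cells)
    total = float(len(cells))
    return {ind: (n, n / total) for ind, n in sorted(counts.items())}

# A toy 3x3 simulation flattened to a list: indicators 1 = clay, 2 = sand, 3 = gravel.
summary = indicator_summary([1, 1, 2, 1, 3, 2, 1, 1, 2])
```

Here clay occupies 5 of 9 cells, so its frequency of occurrence is 5/9.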
FIGURE 14-9. Model Statistics Summary: Individual Simulation pop-up dialog. This dialog is
used to specify which simulation is of interest and calculate the statistics.
All Simulations
Statistics:Model Summary:All Simulations will generate the pop-up dialog in Figure 14.11. This
dialog allows the user to specify which simulation series is of concern, the output file name
prefix, and which simulations (from-to) in the series will be evaluated. The statistics, when
calculated (press Calculate Map & View Statistics), will be displayed in a pop-up dialog similar to
Figure 14.12. The statistics can also be echoed to the log/status window if the Print Statistics to
Log/Status Window toggle is set. In this dialog, for each indicator, the frequency of occurrence,
variance, and standard deviation are displayed. The cumulative frequency, mean, median, and mode
are also displayed. Several maps are also calculated and created along with the statistics.
FIGURE 14-10. Statistical Summary of Single Simulation pop-up dialog.
The Probability Map Files specify the calculated probability that the given indicator will be present
in each cell (Figures 14.13a, 14.13b, and 14.13c). The Certainty Map File indicates the maximum probability
of occurrence of any indicator at every cell location (Figure 14.14). This map highlights zones of
good and poor data control. It, however, cannot be used to identify which indicator is present. The
final map created is the Best-Guess Map. This map determines which indicator is the most
probable at each cell location, and assigns the appropriate indicator value (Figure 14.15). If enough
simulations were run, this map should appear nearly identical to an indicator kriged map which
always selected the best-guess (0.50) cdf indicator.
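The three map types can be sketched from a stack of realizations as follows (a hypothetical, cell-by-cell illustration over equally weighted realizations, not sisim's implementation):

```python
def summary_maps(realizations, indicators):
    """Per-cell probability map for each indicator, plus the certainty map
    (maximum probability of any indicator) and the best-guess map (most
    probable indicator) described in the text."""
    n = float(len(realizations))
    ncells = len(realizations[0])
    # Probability map: fraction of realizations showing each indicator in each cell.
    prob = {ind: [sum(r[c] == ind for r in realizations) / n for c in range(ncells)]
            for ind in indicators}
    # Certainty map: how strongly any one indicator dominates each cell.
    certainty = [max(prob[ind][c] for ind in indicators) for c in range(ncells)]
    # Best-guess map: the most probable indicator at each cell.
    best_guess = [max(indicators, key=lambda ind: prob[ind][c]) for c in range(ncells)]
    return prob, certainty, best_guess

# Four realizations of a 3-cell model with indicators 1 and 2.
prob, certainty, best = summary_maps([[1, 2, 1], [1, 2, 2], [1, 1, 1], [2, 2, 1]], [1, 2])
```

A cell where three of four realizations agree gets a certainty of 0.75; a cell pinned by hard data would show 1.0.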
FIGURE 14-11. Model Statistics Summary: All Simulations pop-up dialog. This dialog is used to
specify which simulations are of interest, calculate their statistics, and use block to display the
results.
View
View:Map generates the pop-up dialog shown in Figure 14.16. This dialog allows the user to view
individual simulation results. The simulation series name and the simulation number must be
specified. Pressing the Block Map button will pass the desired simulation to block (Chapter 13).
Individual simulation examples are shown in Figures 14.17a and 14.17b.
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, and Print. View Log, Save, and
Save as are similar in operation to the menu options under File described above.
Help
Help lists topics about the program for which there is help. When an item is selected, a pop-up dialog
with a scrolled text area is generated which is similar to Figure 5.15 with the desired information.
NOTE: Only one help window may be open at a time. Help files are editable ASCII data
files; for further information see Appendix D.
Command line arguments may be an integer, a float, or a character string.
{} = the variable is an array. Values must be separated by a comma, and no spaces are
allowed. Do not use the { } symbols on the command line.
default = sisim3d.cou
default = sisim3d.dat
default = sisim3d.dbg
default = sisim3d.geom
-set	default = sisim3d.set
-sim	default = 1
-unc	default = sisim3d.unc
-var	default = sisim3d.var
default = sisim3d.out
default = -1
default = 0
Project Files
The project file tells sisim which files need to be loaded for input, what the file names will be for
output, the seed, the seed increment, the starting simulation number, the number of simulations
that will be run, and the list of computers that will be used for the model calculations. The format
is specified below:
configuration file name
hard conditioning data file name
geometry file name
semivariogram file name
uncertainty file name
latin-hypercube semivariogram file name (NA if not used)
debug file name
simulation output file name prefix
computer list file name
seed
seed increment
starting simulation number
number of simulations
list of computers used (one name per line)
An example file is three.prj.
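A minimal reader for this line-oriented format might look like the following sketch (illustrative only; the field names are taken from the list above and no validation is done):

```python
def read_project(path):
    """Parse a sisim project file: 13 fixed lines, then one computer name per line."""
    keys = ["configuration", "hard_data", "geometry", "semivariogram", "uncertainty",
            "lhc_semivariogram", "debug", "output_prefix", "computer_list",
            "seed", "seed_increment", "start_simulation", "num_simulations"]
    with open(path) as f:
        lines = [ln.strip() for ln in f if ln.strip()]
    project = dict(zip(keys, lines[:13]))
    # The last four fixed fields are numeric.
    for k in ("seed", "seed_increment", "start_simulation", "num_simulations"):
        project[k] = int(project[k])
    # Everything after the fixed fields is the computer list, one name per line.
    project["computers"] = lines[13:]
    return project
```

The latin-hypercube semivariogram field holds the literal string NA when that option is not used.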
record 5: from.x, from.y, from.z, x,y and z indices of the beginning of the simulation
sub-area (adimensional)
record 6: to.x, to.y, to.z, x,y,z indices of the end of the simulation sub-area
(adimensional)
record 7: radius.x, radius.y, radius.z, search ellipsoid radii. Special care should be taken
in setting these radii since the speed of the search will be dependent on the
number of cells contained within the ellipsoid. Also the memory required is
proportional to the product of the three radii. (in units of length)
record 8: coarse_radius.x, coarse_radius.y, coarse_radius.z, search ellipsoid radii for the
coarse simulation. By setting a coarse grid spacing large enough, a large search
radius can be used over the coarse grid without containing an inordinate
number of cells. (in units of length)
record 9, 10, and 11: direction cosines of the rectangular system of search axes. Same for
all structures and all variograms. (adimensional)
record 12: rotation, flag indicating a particular case of the rotation matrix. This record is
the result of the procrastination of the author, who could have written a small
routine to find out with respect to which axis the search ellipsoid is
rotated. In any case, this value is required to optimize the storage of the
information associated with the search.
rotation = 0, (no rotation; identity matrix)
rotation = 1, (rotation around the x axis)
rotation = 2, (rotation around the y axis)
rotation = 3, (rotation around the z axis)
rotation = 4, (general rotation)
record 13: orig_max_per_octant_1, new_max_per_octant_1, octant_percent,
orig_max_per_octant_2, and new_max_per_octant_2, the maximum number of
original hard data points and other conditioning points per octant to be retained
for kriging, and the percent completion at which to switch from the _1 flags to
the _2 flags.
record 14: kriging flags, the 1st one is for the first part of the simulation (up to krig_percent
completion), the 2nd one is for the remainder of the simulation. 0 = SK, 1 = OK.
krig_percent, the percentage completion at which to switch from flag1 to flag2.
record 15: Maximum and minimum weight to be applied to the global prior cdf in the
estimation of the local posterior cdfs and the maximum number of points found
in the search neighborhood for which the prior global cdf will be used as the
local posterior cdf. Standard simulation practice would set all three variables
to zero.
record 16: ok flag, dbg. The ok flag specifies the number of data points necessary within
the search neighborhood to use ordinary kriging (assuming record 14 is set to
1). The debugging flag is generally set to 0; if set to 3 or higher, a _large_
amount of information is output, so this option should be used only if the
simulation subarea is very small. (adimensional)
An example file is three.geom.
record 1: nind, and prior cdf flag, the number of thresholds (equal to the number of
discretization classes minus one) (adimensional) and the flag indicating which
variable, soft or hard, is used as the true cdf in the cokriging. A 0 means the
prior cdf differences at each threshold are given as hard cdf - soft cdf; a 1
means soft cdf - hard cdf. 0 assumes the hard data cdf is the best estimate
of the true cdf, and 1 assumes the soft data cdf is closer to the true estimate.
NOTE:
the different variograms for each indicator variable are input in order starting
with the indicator corresponding to the smaller threshold.
record 2: threshold value associated with first indicator variable and the differences in
the hard and soft cdf at this threshold calculated according to the flag in record
1.
record 3: prior cdf, and difference between the primary and secondary variable cdf. The
prior cdf is used in case simple kriging estimation is required, or if OK is
used and no data points are found in the search neighborhood.
record 4: nugget of the indicator variogram
record 5: cmax, an upper bound of the maximum value that the variogram can reach
within the limits of the search neighborhood. To be used with power
variograms only, although a value must be input for all variogram models
record 6: num_struct, number of nested structures in the variogram
record 7: type of each nested structure
1: spherical
2: exponential
3: gaussian
4: power
record 8: sill, of each structure (scaling coefficient for the power model)
record 9: range of each structure (power for the power model) (in units of length)
NOTE:
it is assumed that all the nested structures have the same axes of anisotropy,
although they can have different anisotropy ratios. The anisotropy ratio is the
value that, multiplied by the range in that direction, gives the range input in
record 9.
record 10: anis.x, anisotropy ratio of each structure in the x direction after rotation of the
cartesian axes (adimensional)
record 11: anis.y, anisotropy ratio of each structure in the y direction after rotation of the
cartesian axes
record 12: anis.z, anisotropy ratio of each structure in the z direction after rotation of the
cartesian axes
record 13,14,15: direction cosines of the rectangular system of anisotropy axes.
Records 2 to 15 are repeated for each of the remaining indicator variables.
An example file is three.var.
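The four structure types in record 7 correspond to standard variogram models. The following sketch evaluates one isotropic nested structure using the conventional textbook formulas (these are standard definitions, not code from sisim3d; note that in this exponential form the practical range is roughly three times rng):

```python
import math

def gamma(model, h, sill, rng):
    """Variogram value at lag h for one nested structure, nugget excluded.
    model: 1 = spherical, 2 = exponential, 3 = gaussian, 4 = power.
    For the power model, sill is the scaling coefficient and rng is the power."""
    if model == 1:  # spherical: reaches the sill exactly at h == rng
        r = min(h / rng, 1.0)
        return sill * (1.5 * r - 0.5 * r ** 3)
    if model == 2:  # exponential: approaches the sill asymptotically
        return sill * (1.0 - math.exp(-h / rng))
    if model == 3:  # gaussian: parabolic behavior near the origin
        return sill * (1.0 - math.exp(-(h / rng) ** 2))
    if model == 4:  # power: unbounded growth, hence the cmax cap in record 5
        return sill * h ** rng
    raise ValueError("unknown model type %d" % model)
```

A full nested model sums the nugget and each structure's contribution at the anisotropy-corrected lag.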
.netrc
The .netrc file is a UNIX system file which allows a user application to run processes on remote
computers. This file must be in the user's login directory on each computer used. It is a list
of computer names, with the user name and the user password for each computer. The file format
is:
machine <computer name> login <user name> password <user password>
One entry is needed for every computer. An example file might look like:
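The missing example might look like the following; the host names, user name, and password here are entirely invented:

```
machine geo1.example.edu login jdoe password NotARealPassword
machine geo2.example.edu login jdoe password NotARealPassword
```

One such line is needed for every computer listed in the *.net file.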
This file has the user password spelled out; it is not encrypted! Make sure that
file protections are set for this file so that only the user can read it! Also note,
anyone with root privilege can read this file regardless of the file protections!
Debug
Depending on the debug level set, sisim3d will list out more or less detail about the simulation
calculations. The output is in a free format.
Sisim Mathematics
Kriging and Indicator Kriging
Kriging is a statistical estimation technique used to assign property values at locations where no
data exist (where data exist, they are exactly honored). The theory of kriging will not be discussed
here, but the refinements of the theory used in indicator and Bayesian kriging will be discussed.
Once the semivariograms have been developed, the sample data can be indicator kriged or Bayesian
kriged at each cutoff. The process of determining the weight of sample values at the point being
estimated is identical to that used in ordinary kriging, whether blocks or points are being evaluated:
F(gc) = Σ wi i(xi)    (14-6)

Z*(x) = Σ bi xi    (14-7)

where wi and bi are weights, gc is the global distribution, F(gc) and Z*(x) are kriged estimates, and
the summations are from 1 to the number of data points (n). Note that these are basically the same
equations, except that in equation 14-6 each weight multiplies an indicator value (0 or 1) rather
than a data value. To determine the indicator value at the prescribed point, a cumulative
distribution function (cdf) is developed. In Figure 14.18, a simple example is shown for defining
the cdf for an individual block. In this case, five samples are equally distant from the block (and
within the range of influence), and therefore the weights are equal (w1 = w2 = w3 = w4 = w5 =
0.20). Cutoffs were set at 0.02, 0.10, 0.13, and 0.26. Only one point is less than or equal to the
first cutoff (0.02), so there is a 20% probability the value at the point is less than 0.02, a 40%
probability of its being less than 0.10, a 60% probability of its being less than 0.13, and an 80%
probability of its being less than 0.26.
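The cdf construction in this example can be reproduced directly. The sketch below uses the five equal weights and four cutoffs from the text; the sample values themselves are hypothetical, chosen only to be consistent with the stated cdf:

```python
def indicator_cdf(values, weights, cutoffs):
    """Kriged cdf: at each cutoff, sum the weights of the samples at or below it."""
    return [sum(w for v, w in zip(values, weights) if v <= c) for c in cutoffs]

# Hypothetical hydraulic conductivity samples, each with weight 0.20.
samples = [0.01, 0.09, 0.12, 0.24, 11.0]
weights = [0.20] * 5
cdf = indicator_cdf(samples, weights, [0.02, 0.10, 0.13, 0.26])
# cdf is approximately [0.20, 0.40, 0.60, 0.80]: a 20% chance of being <= 0.02, and so on.
```

The single large sample (the outlier) only shifts the last step of the cdf, which is exactly why indicator kriging resists outliers while an ordinary-kriged mean does not.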
FIGURE 14-18. Using indicator kriging, with the block equidistant from five sample locations, the
best indicator estimate (50% probability), determined from the cdf, for the block hydraulic
conductivity (K) is 0.12 cm/sec. If ordinary kriging were used, the best estimate would be
2.21 cm/sec. This shows how indicator kriging does not allow high-value outlier data to
overwhelm more plentiful low-value data. The cdf may also be used to approximately indicate the
probability the block estimate is below a certain value; here the cdf suggests that there is a 75%
probability the grade of the block is below 0.24 cm/sec. Use of stochastic simulation would yield
a small percentage of realizations with a large K for the block. If enough blocks have the potential
to be defined as high K, some realizations will exhibit a continuous zone of high K. The number of
realizations which exhibit this condition, divided by the total number of realizations, is the
probability that such a condition exists.
From this point several tacks may be taken in evaluating the indicator data based on the cdf: 1)
maps can be made defining the best estimate of parameter values (the value defined equals the
value at the 50% probability), 2) maps can define the probability that the value of a parameter is
above or below some specified level, 3) maps can define the parameter value above or below a
specified probability, or 4) realization maps can be made where the values are determined by
randomly selecting the indicator for each location from the cdf; this last option is a stochastic
simulation.
Stochastic Simulation
There is a distinct difference between ordinary kriging (and most other estimation methods) and
Bayesian kriging with conditional simulation. Most techniques tend to average or smooth the data
to achieve a best estimate of conditions between measured points. Conditional simulation provides
a means of representing the variability observed in nature, while still honoring the field data
(Figure 14.19). Conditional simulation does not produce a best estimate of reality; rather, it yields
equiprobable models with characteristics similar to those observed in reality.
FIGURE 14-19. Ordinary kriging (and most other estimation methods) tends to average or smooth
the data to achieve a best guess of reality. Bayesian kriging with conditional simulation provides a
means for modeling the variability observed in nature, while still honoring the field data.
Conditional simulation does not produce a best estimate of reality, but it yields equiprobable
models with characteristics similar to reality. Unfortunately, when only hard data is used, the range
of values for the conditional simulation is limited to the range of the hard data. Because it is not
reasonable to expect the exploration program to identify the full data range, by incorporating soft
data, these bounds can be exceeded. When multiple simulations are made their average values will
approximate the smoothed, best fit curve.
The process of stochastic simulation, described by Gómez-Hernández and Srivastava (1990), takes
advantage of cdfs determined by indicator kriging, and Monte-Carlo techniques. To generate an
individual realization, or a stochastic simulation, a search grid is selected (Figure 14.20). Starting
with the first indicator range (e.g. clay), grid blocks at hard data locations are defined as 1 (clay)
or another indicator type (e.g. 2 = sand, 3 = gravel; in kriging calculations these values are
treated as 0 if another indicator is being evaluated, or as 1 if it is the indicator currently under
consideration). At soft data locations, blocks are defined with the aid of a random number
generator. A random number between 0.0 and 1.0 is generated. If the value is less than the
probability the property exists, the location is defined as the given indicator type; if the random
number is greater than the probability the indicator exists, then the block is defined with an
alternate indicator type (for example, if the location has a 70% probability of being clay, 20% sand,
and 10% gravel, and a random number of 0.87 is generated, the location is defined as sand, 2).
Because many realizations are created, at this location clay will be present about 70% of the time,
sand 20%, and gravel 10%. When all the hard and soft data are entered, a random starting location
within the model grid is selected, and the location is kriged based on the indicator cdf and a new
random number. The cdf at this point is based only on the hard and soft data, where the soft data
are treated as hard, or exact (each soft location has been defined and is now known). If the random
number is less than the probability the indicator value exists, a 1 is assigned (i.e. clay is present at
the grid location); otherwise a 0 is defined (another indicator is present). Next, another random
grid location for which an indicator has not been defined is considered, and its indicator value is
determined based on the hard and soft data, and the previously kriged indicators at other locations
(now considered hard data values for the remainder of the simulation). This process of selecting
random grid locations and kriging them, based on the hard, soft, and previously kriged data, is
continued until all grid locations are defined and a map of 1s (clay) and 0s (not clay) is
created. The next indicator range is then selected (sand) and all the locations still containing 0s
are re-kriged (here the cdf is based only on the possibility that the parameter value is sand or
gravel). This re-kriging process is repeated until all the indicator ranges have been evaluated and
the map is composed of all 1s.
FIGURE 14-20. This illustration shows the stepwise manner in which a grid is kriged using
Bayesian Kriging in conjunction with stochastic simulation. Grid cells containing hard data are
defined, cells containing soft data are evaluated, and finally cells with no information are kriged.
To krige the cells with no data, a random starting location is selected, and as a cell is kriged, the
next undefined cell is randomly selected. This process is continued until all grid cells have been
visited.
To create another realization, the process is repeated. Soft data locations are re-evaluated, cdfs are
calculated, a new random path through the grid is selected, and each grid location is re-kriged and
simulated. Alternative realizations can be created following this process until the desired number
of simulations is created.
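The realization loop described above can be sketched as follows. This is a deliberately simplified, hypothetical illustration: soft cells and undefined cells are both classified by a single Monte-Carlo draw against a fixed cdf, whereas real sisim3d re-kriges the cdf at every step from the hard, soft, and previously simulated cells.

```python
import random

def draw_indicator(cdf, u):
    """Pick an indicator from a cumulative distribution with one uniform draw.
    cdf is a list of (indicator, cumulative probability) pairs in threshold order."""
    for indicator, p in cdf:
        if u < p:
            return indicator
    return cdf[-1][0]

# The soft-data example from the text: 70% clay (1), 20% sand (2), 10% gravel (3).
soft_cdf = [(1, 0.70), (2, 0.90), (3, 1.00)]
# A draw of 0.87 falls between 0.70 and 0.90, so the cell is classified as sand.
assert draw_indicator(soft_cdf, 0.87) == 2

def simulate(grid_size, hard, cdf, rng):
    """One toy realization on a 1-D grid: hard data exactly honored, remaining
    cells visited along a random path and drawn from the cdf."""
    grid = dict(hard)                       # hard data locations are never changed
    path = [c for c in range(grid_size) if c not in grid]
    rng.shuffle(path)                       # random path through the grid
    for cell in path:
        grid[cell] = draw_indicator(cdf, rng.random())
    return [grid[c] for c in range(grid_size)]

realization = simulate(10, {0: 1, 9: 3}, soft_cdf, random.Random(42))
```

Repeating the call with a different seed yields another equiprobable realization, which is the essence of the multiple-realization workflow above.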
Each realization honors the statistics of the original data, and has equal probability of existing.
These realizations can be used as maps of parameters for modeling of the site.
Bibliography (sisim)
Foley, J.D., A. Van Dam, S.K. Feiner, and J.F. Hughes, 1990, Computer Graphics, Principles and
Practice, Addison-Wesley, Reading, Massachusetts.
Gómez-Hernández, J.J. and R.M. Srivastava, 1990, ISIM3D: An ANSI-C Three Dimensional
Multiple Indicator Conditional Simulation Program, Computers & Geosciences, Vol. 16, No. 4,
pp. 395-440.
McKenna, S.A., 1994, Utilization of Soft Data for Uncertainty Reduction in Groundwater Flow and
Transport Modeling, Ph.D. Dissertation, Colorado School of Mines.
CHAPTER 15
MODFLOW Interface: Modmain
Modmain is a graphical user interface for MODFLOW, the MODular three-dimensional, finite
difference FLOW model developed by the United States Geological Survey (McDonald and
Harbaugh, 1984). MODFLOW is a program designed to model ground water flow and heads
(pressure and elevation) in confined and unconfined aquifer systems. In its basic form,
MODFLOW can be difficult, or awkward to use. The modmain program module is designed to
simplify data entry, model editing, and analysis of results.
This chapter goes into the details of using modmain as an interface for MODFLOW. It does not,
however, explain the theory or use of MODFLOW; that is better left to the MODFLOW users
manual (McDonald and Harbaugh, 1984).
WARNING:
To use modmain and MODFLOW effectively, you must have the MODFLOW
users manual, and it should be readily available whenever you are building
data sets with modmain. In the current release, modmain does not error check
data file formats; this can lead to incorrect numbers for any variable, and it can
cause segmentation faults which will terminate modmain. To find the
problem, the data files may have to be examined line by line to determine
where the problem is. This can only be done if the MODFLOW users manual
is available!
The modmain application is composed of two sections: the main menu-bar, and the status and log
text area. The menu-bar is used to select all modmain commands, and the log/status area is used by
the program to report important messages or results. In addition to the main window and
supporting pop-up dialog windows, a graphical editor is available for creating and modifying
two-dimensional arrays. Modmain also uses other UNCERT modules (grid, contour, surface, and
block) for visualizing model output.
saving, naming project files), and allows the user to quit the application. Project is not a feature of
MODFLOW, but it allows a complete set of MODFLOW data files to be handled as a group; this
option controls the loading and saving of these project files. This menu-bar option also allows
the user to quit the application. Packages allows the user to individually load, modify, or save each
MODFLOW package data file. Run executes MODFLOW using the currently defined data files.
View allows the user to view the standard text output file or view the model results using grid
(Chapter 10), contour (Chapter 11), surface (Chapter 12), or block (Chapter 13). Simulator and
Network are not currently installed, but will allow MODFLOW to be run simultaneously on
different computers over the network using different data files describing material distributions.
Log allows the user to save to a file all information printed to the log/status window. Help gives
the user a selection of pop-up help topics. Each menu item is fully described below with all the
available options.
FIGURE 15-1. This is an example of the modmain application window. The main menu-bar is on
the top of the application window, and the log/status window in the lower portion.
Project
The Project sub-menu options control project file handling, and exiting the application. The
options include Open Project, View Project, Save Project, Save Preferences, Quit, and Quit Without
Saving.
Open Project
Selecting Project:Open Project generates a pop-up dialog which allows the user to select an
existing project file. This dialog functions as the File:Open dialog in Figure 5.2 (plotgraph -
Chapter 5). The default project file name extension, though, is *.prj.
View Project
Project:View Project pops up a simple screen editor with the last opened or saved version of the
project file.
Save Project
Project:Save saves the names of the MODFLOW files currently being used. If a save file has
already been opened, the data are simply saved. If a save file has not been selected yet, a pop-up
dialog similar to that used in File:Open (Figure 5.2) is created. The main difference between the
Open and the Save dialog is that to save a file, the file does not have to pre-exist. For a description
of how the dialog works, see the File:Save section in Chapter 5.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
reasonable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, modmain determines how all the
input variables are currently defined and writes them to the file modmain.prf.
WARNING:
If modmain.prf already exists, you will be warned that it is about to be overwritten.
If you do not want the old version destroyed, you must move it to a
new file (e.g. the UNIX command mv modmain.prf modmain.old.prf would be
sufficient). When you press OK, the old version will be overwritten! This
cannot currently be done from within the application. To rename the file, you
will have to execute the UNIX mv command from a UNIX prompt in another
window.
If modmain.prf does not exist in the current directory, it is created. This is an ASCII file and can
be edited by the user. See Appendix C for details.
Quit
Project:Quit terminates the program, but if changes have been made to any MODFLOW package,
the user will first be queried to supply appropriate filenames for the modified files. Also, if
packages have been added, deleted, or substituted with a new file, the project file will also have to
be saved.
Packages
To use MODFLOW, there are a number of different packages that can be used: Basic, Block
Centered Flow, Drain, Evapotranspiration, General Head, MT3D, Output Control, PCG, Recharge,
River, SIP, SSOR, Well, and Utility. The only ones required are the Basic and Block Centered Flow
packages, with either the PCG2, SIP, or SSOR solver. MODFLOW will run without error with
only the Basic module, however, but no meaningful information will be calculated. When using
the Packages pull-down menu, all of the available MODFLOW packages are displayed. The titles
are also color coded: RED indicates that with the current settings, the package is required, but has
not been defined. GREEN also means it is required, but it is defined sufficiently for MODFLOW
to run (this does not mean all data entries are correct for the particular model). BLACK means
that the package is not currently needed, and it may be ignored.
NOTE: When first starting modmain, a project must be loaded, a Basic package file must be
loaded, or a Basic package must be created (Packages:Basic:Modify menu option)
before any package besides the Basic package can be read or created. This is
because the Basic package contains the grid row, column, and layer dimension
information that all of the other packages require.
Each package has a pull-down sub-menu with five menu options: Open Package, View Package,
Save, Save as, and Modify. The Open option generates a dialog similar to that shown in
Figure 5.2. The dialog works the same way, except that the default file name
extensions are different. For each package the default file name extensions are:
BASIC: *.bas
BLOCK CENTERED FLOW: *.bcf
DRAIN: *.drn
EVAPOTRANSPIRATION: *.evt
GENERAL HEAD: *.ghb
OUTPUT CONTROL: *.oc
PCG2: *.pcg
RECHARGE: *.rch
RIVER: *.riv
SIP: *.sip
SSOR: *.ssor
WELL: *.well
These file extensions are strictly conventions and do not have to be followed; it is recommended,
however, that you follow some consistent naming convention. The View menu option will
display the last saved or loaded version of the data file.
NOTE: Changes made to a package within the modmain application (using Modify below)
will not be reflected in the data file until the changes have been saved (see Save
below).
The Save menu option will save any modifications, overwriting the last opened or saved package
file. If no package file has been loaded or saved previously, a pop-up dialog will appear similar to
Figure 5.2, but showing the appropriate default file extension. To save the package file, select an
existing file, or enter a new file name, then press the Save button on the dialog. Save as is similar
to Save, except that you are queried for a file name. Modify generates a new pop-up dialog
which allows the user to enter all the appropriate data for that particular package. These
package dialogs are discussed below.
NOTE: The packages and dialogs discussed below explain how and where to enter data and
package parameter values. The meaning of different variables is not discussed, and
is left to the MODFLOW users manual (McDonald and Harbaugh, 1984).
WARNING:
As dialogs are generated, default values will be assumed. These values, though,
may have no meaning with regard to a particular model, and it is the modeler's
responsibility to ensure all entries are correct.
Basic
Selecting Packages:Basic:Modify will generate the pop-up dialog shown in Figure 15.2. This
dialog allows all the parameters needed for the Basic package to be defined. Listed below are the
MODFLOW variable names with a description of the equivalent dialog entry.
NOTE and WARNING: Before any grid/array can be defined, the number of rows, columns,
and layers must be defined. Also, once these values are defined, they cannot be
changed without restarting modmain. If the values are not defined, the error message
in Figure 15.3 will be displayed.
1). HEADNG(32): Heading (#1)
2). HEADNG (continued): Heading (#2)
3). NLAY: Layers
NROW: Rows
NCOL: Columns
NPER: Stress Periods
ITMUNI: Time Units: Pressing the appropriate menu toggle (Undefined,
Seconds, Minutes, Hours, Days, Years) will set the value
appropriately.
4). IUNIT: Packages Used and Solver Used: By pressing the toggle
button for each package, it may be selected and a unit
number will be assigned to that package. For the solver,
select either the SIP or the SSOR package. The default unit
numbers are as follows (these unit numbers must be
honored):
BAS = 1
BCF = 11
FIGURE 15-3. Grid Setup Error pop-up dialog. If the rows, columns, and layers defined in the
Basic Package all equal 1, a valid model grid has not been defined. Before an array can be edited,
the row, column, and layer dimensions must be defined.
DRN = 13
EVT = 15
GHB = 17
MT3D = 32
OC = 22
PCG2 = 23
RCH = 18
RIV = 14
SIP = 19
SOR = 21
WEL = 12
5). IAPART
ISTRT
6). IBOUND
7). HNOFLO
8). Shead
9). PERLEN,
NSTP, &
TSMULT
FIGURE 15-5. Block-Centered Flow pop-up dialog. This dialog is used to set parameters for
MODFLOW's Block-Centered Flow Package. Input variables consist of layer type and layer
specifications, grid dimensions, and whether the model is transient or steady-state.
IBCFCB
2). LAYCON
3). TRPY
4). DELR
5). DELC
6). sf1,
Tran,
HY,
BOT,
Vcont,
sf2,
TOP
Drain
Selecting Packages:Drain:Modify will generate the pop-up dialog shown in Figure 15.8. This
dialog allows all the parameters needed for the Drain package to be defined. Listed below are the
MODFLOW variable names with a description of the equivalent dialog entry:
1). MXDRN
IDRNCB
FIGURE 15-7. Layer by Layer Definition pop-up dialog. For each layer, depending on the layer
type, the position of the layer, whether the model is steady-state or transient, etc. various arrays
need to be defined for each layer. This dialog specifies which arrays are needed.
FIGURE 15-8. Drain Package pop-up dialog. This dialog is used to set drain parameters and cell-to-cell output (if desired) for MODFLOW's Drain Package.
Row,
Col,
Elevation,
Cond
Evapotranspiration
Selecting Packages:Evapotranspiration:Modify will generate the pop-up dialog shown in Figure
15.11. This dialog allows all the parameters needed for the Evapotranspiration package to be
defined. Listed below are the MODFLOW variable names with a description of the equivalent
dialog entry:
1). NEVTOP
IEVTCB
To define the remaining parameters, press the Time Dependent Evapotranspiration Data button.
This will create the dialog shown in Figure 15.12. This dialog allows data entry for each stress
period.
2). INSURF
INEVTR
INEXDP
INIEVT
NOTE:
FIGURE 15-12. Time Dependent Evapotranspiration Options pop-up dialog. For each stress period
the surface to apply ET, the maximum ET rate, the extinction depth, and the layer must be
specified.
3). SURF
4). EVTR
5). EXDP
6). IEVT
General Head
Selecting Packages:General Head:Modify will generate the pop-up dialog shown in Figure 15.13.
This dialog allows all the parameters needed for the General Head Boundary package to be defined.
Listed below are the MODFLOW variable names with a description of the equivalent dialog entry:
FIGURE 15-13. General Head Boundary Package pop-up dialog. This dialog is used to set fixed
head locations and cell-to-cell output (if desired) for MODFLOW's General Head Boundary
Package.
1). MXBND
IGHBCB
To define the remaining parameters, press the Set Active GHB Zones button. This will create the
dialog shown in Figure 15.14. This dialog allows data entry for each stress period.
FIGURE 15-14. Active GHB Zone
2). ITMP
3). Layer,
Row,
Col,
Head,
Cond
FIGURE 15-15. Stress Period GHB Zone Definitions pop-up dialog. For each general head
boundary zone in each stress period, the row, column, layer, elevation, and conductance must be
specified.
MT3D
If the MODFLOW output is being used as input for MT3D (See Chapter 16), a specially formatted
head file must be selected. This is done using the dialog shown in Figure 15.16.
FIGURE 15-16. MODFLOW Head and
Flow Output for MT3D pop-up dialog.
If MT3D is going to be used following
the MODFLOW simulation, the results
need to be written to the specially
formatted file named here.
Output Control
Selecting Packages:Output Control:Modify will generate the pop-up dialog shown in Figure 15.17.
This dialog allows all the parameters needed for the Output Control package to be defined. Listed
below are the MODFLOW variable names with a description of the equivalent dialog entry:
1). IHEDFM
Head Print Format menu list is used to select the file output
format for heads. This is the same format list used in the
utility package.
IDDNFM
IHEDUN
IDDNUN
To define the remaining parameters, select the desired Stress Period and press the Set Time Step
Data button. This will create the dialog shown in Figure 15.18 for the specified stress period. This
dialog allows data entry for each time step.
2). INCODE
IHDDFL
IBUDFL
ICBCFL
The Output Code may be specified as the same as the Last time
step (Not applicable for the first time step), as all layers
treated the Same, or on a By-Layer basis.
Head & Drawdown, when set to TRUE these terms will be
saved.
Print Budget, when set to TRUE the water budget will be
saved.
Cell-to-Cell, when set to TRUE will print data from the
packages saving cell-to-cell information.
FIGURE 15-18. Output Control - Stress Period pop-up dialog. This dialog is used to specify the
desired output for each time step for a specified stress period.
If the Output Code is set to all layers treated the Same or By-Layer, several more terms must be
specified for that time step. Press the Specification button to display these terms (Figure 15.19).
Note: this button will be highlighted in RED (undefined) or GREEN (already defined) if needed;
otherwise it will be faded GRAY and not pressable.
FIGURE 15-19. Output Control - Time Step
3). Hdpr
Ddpr
Hdsv
Ddsv
NOTE: If the modmain interface is going to be used to create head or drawdown maps, head
and drawdown must be printed to standard output. The standard output file is read
by modmain to determine the result values.
PCG
Selecting Packages:PCG:Modify will generate the pop-up dialog shown in Figure 15.20. This
dialog allows all the parameters needed for the Preconditioned Conjugate-Gradient 2 package to be
defined. Listed below are the MODFLOW variable names with a description of the equivalent
dialog entry:
FIGURE 15-20. Preconditioned Conjugate-Gradient 2 Package pop-up dialog. This
dialog is used to define the various parameters
for MODFLOW's PCG Package.
1). MXITER
ITER1
NPCOND
2). HCLOSE
RCLOSE
RELAX
NBPOL
IPRPCG
MUTPCG
IPCGCD
Recharge
Selecting Packages:Recharge:Modify will generate the pop-up dialog shown in Figure 15.21. This
dialog allows all the parameters needed for the Recharge package to be defined. Listed below are
the MODFLOW variable names with a description of the equivalent dialog entry:
FIGURE 15-21. Recharge Package pop-up dialog. This dialog is used to set recharge parameters
and cell-to-cell output (if desired) for MODFLOW's Recharge Package.
1). NRCHOP
IRCHCB
To define the remaining parameters, press the Time Dependent Recharge Data button. This will
create the dialog shown in Figure 15.22. This dialog allows data entry for each stress period.
FIGURE 15-22. Time Dependent
Recharge Options pop-up dialog.
For each stress period the recharge
rate and layer must be specified.
2). INRECH
NOTE:
3). RECH
The last array is only read if recharge is Applied to Highest Active Cell. If it is needed, the button
will be highlighted in RED (undefined) or GREEN (previously defined), or in faded GRAY if not
needed.
4). IRCH
River
Selecting Packages:River:Modify will generate the pop-up dialog shown in Figure 15.23. This
dialog allows all the parameters needed for the River package to be defined. Listed below are the
MODFLOW variable names with a description of the equivalent dialog entry:
1). MXRIVR
IRIVCB
FIGURE 15-23. River Package pop-up dialog. This dialog is used to set river reach information
and cell-to-cell output (if desired) for MODFLOW's River Package.
2). ITMP
3). Layer,
Row,
Col,
Stage,
Cond,
Rbot
FIGURE 15-25. Stress Period Reach Definitions pop-up dialog. For each river reach in each stress
period, the row, column, layer, stage, conductance, and bottom must be specified.
SIP
Selecting Packages:SIP:Modify will generate the pop-up dialog shown in Figure 15.26. This
dialog allows all the parameters needed for the Strongly Implicit Procedure package to be defined.
Listed below are the MODFLOW variable names with a description of the equivalent dialog entry:
1). MXITER
NPARM
2). ACCL
HCLOSE
IPCALC
WSEED
IPRSIP
SSOR
Selecting Packages:SSOR:Modify will generate the pop-up dialog shown in Figure 15.27. This
dialog allows all the parameters needed for the Slice-Successive Overrelaxation package to be
defined. Listed below are the MODFLOW variable names with a description of the equivalent
dialog entry:
FIGURE 15-27. Slice Successive Overrelaxation
1). MXITER
2). ACCL
HCLOSE
UNCERT Users Manual
IPRSOR
Printout Interval.
Well
Selecting Packages:Well:Modify will generate the pop-up dialog shown in Figure 15.28. This
dialog allows all the parameters needed for the Well package to be defined. Listed below are the
MODFLOW variable names with a description of the equivalent dialog entry:
FIGURE 15-28. Well Package pop-up dialog. This dialog is used to set well pumping/injection rate
information and cell-to-cell output (if desired) for MODFLOW's Well Package.
1). MXWELL
IWELCB
To define the remaining parameters, press the Set Active Wells button. This will create the dialog
shown in Figure 15.29.
2). ITMP
3). Layer,
Row,
Col,
Recharge
Utility
Many of the MODFLOW packages require 1D and 2D arrays. For these arrays MODFLOW has
several standard utility modules for inputting this information. These are implemented with pop-up
dialogs for U1DREL, U2DREL, and U2DINT, all very similar to the dialog shown in
Figure 15.31. The parameters are explained below and can be defined for each applicable layer
(layers can be stepped through using the Next and Previous buttons):
REAL ARRAYS (U1DREL and U2DREL):
1). LOCAT
Data Source (File/Constant): values for an array can be set as
Array = Constant Value, or values may be read from an
Unformatted File or a Formatted File. If the LOCAT does
not define a constant, the Define Array button will be
activated; RED indicates the array needs to be defined, and
GREEN indicates it has been previously defined. To enter
the array position values, press the Define Array button; it
will pop up the array editor discussed below. If LOCAT
does not define a constant, a file Unit Number ID (positive
only) must be specified. If the Unit Number ID does not
match the unit number ID for the package being modified
(see Packages:Basic - IUNIT section above), a Data
Filename associated with the Unit ID also needs to be
specified.
FIGURE 15-31. U1DREL, U2DREL and U2DINT array input pop-up dialog. This dialog controls
2-Dimensional REAL array input (The 1D real and 2D integer dialogs are very similar). This
dialog is used to specify the array data location or array value (a constant), and the input and output
data formats.
CNSTNT
FMTIN
IPRN
11G10.3
9G13.6
15F7.1
15F7.2
15F7.3
15F7.4
20F5.0
20F5.1
20F5.2
20F5.3
20F5.4
10G11.4
INTEGER ARRAYS (U2DINT):
1). LOCAT
Data Source (File/Constant): values for an array can be set as
Array = Constant Value, or values may be read from an
Unformatted File or a Formatted File. If the LOCAT does
not define a constant, the Define Array button will be
activated; RED indicates the array needs to be defined, and
GREEN indicates it has been previously defined. To enter
the array position values, press the Define Array button; it
will pop up the array editor discussed below. If LOCAT
does not define a constant, a file Unit Number ID (positive
only) must be specified. If the Unit Number ID does not
match the unit number ID for the package being modified
(See Packages:Basic - IUNIT section above), a Data
Filename associated with the Unit ID also needs to be
specified.
ICONST
If LOCAT is set to Array = Constant Value, the constant value
should be entered in the Constant for Array text field.
Otherwise, individual values are read for each array
position, and the multiplier should be entered in the
Multiplier text field.
FMTIN
Input FORTRAN77 FORMAT: This format is user specified for
reading the data file. It must be a valid FORTRAN77
format for INTEGER numbers, and there must be an entry.
IPRN
Output FORTRAN77 FORMAT: This button allows the user to
select the standard output file format for this array. For
U2DINT, the valid formats can be selected from the menu
list button. Valid formats are:
10I11
60I1
40I2
30I3
25I4
20I5
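For readers less accustomed to FORTRAN77 format descriptors: a format such as 20I5 means up to 20 integers per record, each right-justified in a 5-character field. As an illustrative sketch (not part of modmain), the same layout can be produced with shell printf field widths:

```shell
#!/bin/sh
# Sketch: lay out integers the way the FORTRAN77 format 20I5 would,
# i.e. each value right-justified in a 5-character field.
row="1 22 333 4444"
line=""
for v in $row; do
    line="$line$(printf '%5d' "$v")"
done
echo "$line"
```

Each value occupies exactly five columns, so columns line up regardless of the number of digits.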
Run
Once all the data packages are built, a script that runs MODFLOW with the current data files can
be generated, and MODFLOW can be executed.
NOTE and WARNING: The standard release of MODFLOW does not support the PCG2
solver or MT3D. If you want to use these packages, you need to update the software
yourself, or download the copy of MODFLOW released with UNCERT (See
Chapter 2, the Acquiring Software section).
Now
Run:Now will check to see that all package modifications have been saved, update the run script,
and make a system call to execute MODFLOW.
NOTE: MODFLOW is executed with a system call; as a result, modmain cannot
independently determine when MODFLOW is done or whether there has been a problem.
The standard MODFLOW messages, however, will still be printed to the xterm
window that launched modmain. When MODFLOW is complete, STOP will be
printed in the xterm window.
Save Script
Modmain does not contain MODFLOW; it is simply a user interface for MODFLOW. To execute
MODFLOW, modmain builds and executes a script. The script file copies all the active package
files from their user-defined names to their FORTRAN77 unit ID names (e.g. f93.bas, a Basic
package data file, would be copied to fort.1), and redirects standard output to a data file (default =
mod.out). Finally, the copied fort.* data files are deleted since they are no longer needed.
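As a sketch of the work such a generated script performs (the file names here are hypothetical, and a plain cat stands in for the modflow executable so the sketch is self-contained; see the Script File section near the end of this chapter for a real example):

```shell
#!/bin/sh
# Sketch of what a modmain-generated run script does.
# A dummy "modflow" stand-in keeps this runnable; a real script
# would invoke the actual modflow executable instead.
cd "$(mktemp -d)"
echo "basic package data" > f93.bas    # hypothetical Basic package file
echo "bcf package data"   > f93.bcf    # hypothetical BCF package file
# 1) copy user-named files to their FORTRAN77 unit ID names
cp f93.bas fort.1
cp f93.bcf fort.11
# 2) run MODFLOW (stand-in) and redirect standard output to mod.out
{ cat fort.1 fort.11; echo "STOP"; } > mod.out
# 3) delete the fort.* copies, which are no longer needed
rm fort.1 fort.11
tail -1 mod.out
```

The three numbered steps mirror the copy / execute-and-redirect / clean-up structure described above.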
View
Once MODFLOW has been run, the standard output data file can be read, and head data and
drawdown data may be stripped from the file and formatted into files compatible with contour,
surface, and block.
NOTE: These options are not available until MODFLOW has been run, using the Run:Now
menu-bar option.
WARNING:
Before you try to make a grid or block file, examine the file to ensure it has
completed successfully.
FIGURE 15-33. This is an example standard output file from a MODFLOW run (f93cor.prj)
NOTE: The X-Z and Y-Z Plane options are not installed.
NOTE: If a cell has gone dry, MODFLOW will mark it by giving it a head value of HNOFLO
(Basic Package). When creating the X, Y, Z, head value data file, any cells with
heads greater than HNOFLO will be considered dry and not included in the data file.
When a cell value is discarded, a WARNING message identifying the cell position
will be displayed in the log-status window.
NOTE: Because grid is an independent program module, it must be terminated separately
from modmain. Also, once grid is open, modmain can no longer inform it what files
to load. If the Grid Data button is pressed repeatedly with the Create XYZ Data File
and Grid option set, there will be multiple instances of grid running, using up
computer memory. It is therefore best to 1) quit grid after each grid data file is
created, or 2) set the Run Gridding Program option to Create XYZ Data File Only.
Once the XYZ data file is created, it is a simple matter to load the new XYZ data file
with grid.
Drawdown/Head - Block
View:Drawdown/Head:Block generates the pop-up dialog in Figure 15.35. This dialog allows the
user to specify which stress period to map. It is used to strip the
appropriate head/drawdown data from MODFLOW's redirected output (MODFLOW Output
Filename). Once the head/drawdown data is stripped, the head/drawdown values are matched to
their X, Y, and Z positions. If Grid is pressed, the stripped values and coordinates are passed to grid
(Chapter 10) as discussed in the View:Drawdown/Head:Contoured Surface section above. If the
Make Block Map button is pressed, a NODE CENTERED GRID file (Chapter 11) is created if
DELR and DELC (Block Centered Flow Package) are constant arrays; otherwise an
IRREGULAR_GRID_SPACING file (Chapter 13) is created. In either case the grid file is sent to
block (Chapter 13) for viewing.
FIGURE 15-35. 3D Block Map Generation pop-up
dialog. This dialog is used to specify the stress
period that the user wants to strip head data from
the MODFLOW standard output file to generate
block maps.
NOTE: If a cell has gone dry, MODFLOW will mark it by giving it a head value of
HNOFLO (Basic Package). When creating the X, Y, Z, head value data file, any
cells with heads greater than HNOFLO will be considered dry and not included in
the data file. When a cell value is discarded, a WARNING message identifying the
cell position will be displayed in the log-status window.
NOTE: Cells with values outside the minimum and maximum head/drawdown will not be
displayed when viewed using block.
NOTE: Because grid and block are independent program modules, they must be terminated
separately from modmain. Also, once open, modmain can no longer inform them
what files to load.
Simulator
NOT INSTALLED.
Network
NOT INSTALLED.
Log
The Log menu option allows the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (the log window is also a
simple text editor). The options include View Log, Save, Save as, and Print. View Log, Save, and
Save as are similar in operation to the menu options under File described above.
Help
Help lists topics about the program for which there is help. When an item is selected, a pop-up dialog
with a scrolled text area is generated, similar to Figure 5.15, with the desired information.
NOTE: Only one help window may be open at a time.
Help files are editable ASCII data files; for further information see Appendix D.
Editor
The array editor is composed of two sections: the menu-bar, and the drawing and editing screen. In the
drawing area, the current array is displayed with the array name at the top of the map and
either the row-column coordinates for the grid, or the actual X-Y distance coordinates (example:
Figure 15.36). In the upper-left corner of the drawing area, as the mouse pointer is moved over the
array, the current array value and row/column position for the cell are displayed.
Array
The editor's Array menu options control printing and quitting.
Print Setup
Array:Print Setup allows the user to define the print destination and the number of copies that will
be made. These features are explained in detail in Chapter 5; the pop-up dialog used is shown in
Figure 5.3.
Print
Array:Print generates a PostScript file of the array/map and, depending on how the print options are
defined in Print Setup, directs this file to the specified print queue or to the specified file.
Quit
Array:Quit terminates the array editor. Changes to the array are saved automatically.
Edit
Edit:New Value generates the pop-up dialog shown in Figure 15.37. This dialog controls the color
palette, the input values to the array, and the method by which values will be entered into the array.
FIGURE 15-37. Array Value Input
pop-up dialog. This dialog is used
to set the color palette for the array
editor, the starting and ending entry
values for data entry, and the
populate area rule for using the
mouse.
The color palette is set linearly from the minimum to the maximum data values in the array
(Normal Scaled Color Palette Gradation). Because many parameters vary over several orders of
magnitude (e.g. hydraulic conductivity: clay = 10^-5 ft/day, gravel = 10^3 ft/day), however, a linear
color scale will hide much of the detail. For this reason, the color scale can be Log Scaled (note:
any values <= 0.0 will be set to BLACK). If the array value range has been changed, pressing the Refit
Color Palette button will rescale the color palette.
To enter new values in the array, enter the desired value(s) in the Start Array Value and End Array
Value (optional) text fields. Pressing the Apply Start Value to All Cells button will apply the value in the
Start Array Value text field to all the cells in the array; this is a quick way to initialize the array to a
given value. Values may also be entered into the array using the mouse. When the left mouse
button is held down and the mouse is moved, a small rectangle is created. When the mouse button
is released, all cells under or within the rectangle will be reset based on the current
Populate Rule. Individual cells can be changed by pointing at a cell and pressing and releasing the
left mouse button. The rules are detailed below:
Constant (Start Value): All cells marked will be set to the value in the Start Array Value text
field.
Left to Right: When marking the rectangle, cells on the side of the rectangle where the mouse
was first pressed will be assigned the Start Array Value; those on the other side will
be assigned the End Array Value. Cells in between will be assigned values
interpolated linearly, based on the distance of each cell's center between the centers
of the left-most and right-most selected cells (Figure 15.38a).
a).
b).
c).
d).
e).
FIGURE 15-38 a-e. These plots demonstrate the different populate rules for applying mouse drawn
rectangular zones in the array editor: a) used the Left to Right rule, b) used the Top to Bottom rule,
c) and d) used the Start to End Vector rule, and e) used a combination of the Start to End Vector and
Constant rules.
Top to Bottom: When marking the rectangle, cells on the side of the rectangle where the mouse
was first pressed will be assigned the Start Array Value; those on the other side will
be assigned the End Array Value. Cells in between will be assigned values
interpolated linearly, based on the distance of each cell's center between the centers
of the top-most and bottom-most selected cells (Figure 15.38b).
Start to End Vector: When marking the rectangle, the cell in the corner of the rectangle where
the mouse was first pressed will be assigned the Start Array Value. The cell in the
opposite corner will be assigned the End Array Value. Cells in between will be
assigned values interpolated linearly along the vector from the center of the starting
cell to the center of the ending cell (Figures 15.38c & 15.38d).
A combination of the rules can be used to create more complex arrays as shown in Figure 15.38e or
Figure 15.36.
When modmain reads in f93.bas, it recognizes that the BCF, OC, RCH, RIV, and SIP packages are
also required. These are defined by the Packages Used and Solver Used toggles and toggle menu
(IUNIT variables, Figure 15.2). If you pull down the Packages menu-bar, you will note that all of
the above packages are highlighted in GREEN. This means that all of the required packages are
defined, and MODFLOW can be run.
To execute MODFLOW with these modules, select the Run:Now menu-bar option. A pop-up
dialog will appear (similar to Figure 5.2) asking for the name of a script file. A script file is simply
an ASCII text file with instructions the UNIX operating system can understand and execute. Select
f93cor.csh. Modmain will determine what files are needed, build the script file, and tell the UNIX
operating system to execute the script. In the log/status window a message will be printed:
Executing MODFLOW
SYSTEM CALL: $PWD/f93cor.csh &
NOTE: WAIT for STOP to appear in the xterm text window before continuing.
The system call executes the script. Because a shell script is being executed, however, modmain
has no way of knowing when MODFLOW is complete; this has to be determined by the user.
When MODFLOW is done, the word STOP will be printed in the xterm window where modmain
was executed (for this data set, you will only have to wait a few seconds).
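Since modmain cannot detect completion itself, you can also watch for completion from a shell. This sketch (assuming standard output was redirected to mod.out, the modmain default) polls the redirected output file for the STOP message rather than watching the xterm:

```shell
#!/bin/sh
# Sketch: wait until the word STOP appears in a MODFLOW output file.
wait_for_stop() {
    until grep -q "STOP" "$1" 2>/dev/null; do
        sleep 1
    done
    echo "MODFLOW run complete: $1"
}
# Demonstration with a stand-in output file that already contains STOP:
out="$(mktemp)"
echo "STOP" > "$out"
wait_for_stop "$out"
```

In practice you would call wait_for_stop on the real redirected-output file after launching the run script.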
Once MODFLOW is complete, modmain can be used to view the results. To view the text file,
select the View:MODFLOW output file menu-bar option. Press the View Output Filename button
(Figure 15.32) and the standard output file generated by MODFLOW will be displayed (Figure
15.33). To make a contour map of the model head results, select the View:as Contoured Surface
menu-bar option. The model was steady-state, so there is only one stress period; it was also only a
one-layer X-Y plane model, so the default values shown in Figure 15.34 are correct. To create a
data file containing the X, Y coordinates and head value for each cell, press the Grid Data button.
NOTE: In the log-status window, there are several WARNINGS. For six cells, the head
value was 1.0e30! This is how MODFLOW indicates the cell went dry. These
values are not included in the data set, because they have no real meaning with
regard to the water table level at those locations.
Once the file has been created, grid (Chapter 10) is called, and several parameters are
passed to that application. The most important are the X, Y, head value file name, and the number
of columns and rows to use to build the grid (By default, twice the MODFLOW rows and columns
plus 1 are used; this reduces some of the problems with the inverse-distance gridding algorithm
honoring the MODFLOW results). For this example, it is sufficient to select Method:Calculate
from the grid menu-bar. When the grid is calculated, select View:Contour Map from the grid
menu-bar, and save the file to junk.srf (See Chapter 10 for further detail on using grid). Once the
file is saved, the grid file will be passed to contour (Chapter 11) and displayed. A map similar to
Figure 15.39 will be displayed. In addition to the normal features of a contour map, the areas
outside of the flow region have been blanked out (description file = f93.blk), and the locations of
the calculated heads are marked by + symbols (description file = junk.dat; this is the X, Y, head value
data file created by modmain and used by grid to generate the displayed grid file). These filenames
are specified in the contour preference file, contour.prf.
NOTE: In the lower-left portion of the map area, where the regular grid would imply that
there should be some head measurement, the + symbol is missing. These are some
of the locations where the cells went dry.
For further details on using contour, refer to Chapter 11.
Setting up Files
In addition to the MODFLOW file formats, modmain adds two more files. One is a project file
which is used to keep track of all the files used in a project. The other is a shell script file which is
used to rename files for MODFLOW, run MODFLOW, redirect standard output, and clean up after
MODFLOW when it is done.
Project File
The project file saves three groups of files: package files, cell-to-cell flow files, and files associated
with the package files. These latter files are defined with a unit number in the 1D and 2D array
utility cards with a LOCAT file unit number different from the parent package. The files are listed
in the following order:
Associated Files
MODFLOW Package Files (Basic Package file first)
Cell-to-Cell Files
For each file, three pieces of information are needed: 1) the file's unit ID, 2) the file's name, and 3)
the unit ID of the owning package (e.g. block.ctc might be owned by the Block Centered Flow
Package, unit 11). A sample file might look like:
41   starting_head.dat   1
42   hyd_cond.dat        11
1    sample.bas          1
11   sample.bcf          11
43   block.ctc           11
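Because the project file is plain ASCII, its contents are easy to inspect from the shell. This sketch (which assumes the three-column unit-ID / filename / owner layout shown above; the exact format written by modmain may differ in detail) lists each entry:

```shell
#!/bin/sh
# Sketch: print the unit-ID / filename / owning-package triples from a
# project file laid out in three whitespace-separated columns.
prj="$(mktemp)"
cat > "$prj" <<'EOF'
41 starting_head.dat 1
42 hyd_cond.dat 11
1 sample.bas 1
11 sample.bcf 11
43 block.ctc 11
EOF
awk '{ printf "unit %-3s %-18s owner %s\n", $1, $2, $3 }' "$prj"
```

The same one-line awk command can be pointed at a real project file to verify which unit IDs are in use before running MODFLOW.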
Script File
The shell script file has three sections: 1) copying user-named files to fort.<unit ID> files, 2)
executing MODFLOW and redirecting standard output, and 3) cleaning up after MODFLOW is
done. For the above example (Project File), the shell script would look like:
\cp sample.bas fort.1
\cp sample.bcf fort.11
\cp starting_head.dat fort.41
\cp hyd_cond.dat fort.42
modflow > mod.out
\rm fort.1
\rm fort.11
\rm fort.41
\rm fort.42
\mv fort.43 block.ctc
Bibliography (modmain)
McDonald, M.G., and A.W. Harbaugh, 1984, A Modular Three-Dimensional Finite-Difference
Ground-Water Flow Model, U.S. Geological Survey Open-File Report 83-875.
MT3D Interface:
Mt3dmain
CHAPTER 16
Mt3dmain is a graphical user interface for MT3D, a modular three-dimensional transport program
developed by Papadopulos and Associates for the U.S. Environmental Protection Agency (EPA).
MT3D is designed to model contaminant transport based on a pre-solved ground-water flow model;
MODFLOW (Chapter 15) is often used to solve the ground-water flow equations, and MT3D bases
its transport results on the solution aquifer heads. In its basic form, MT3D can
be difficult or awkward to use. The mt3dmain program module is designed to simplify data entry,
model editing, and analysis of results.
This chapter goes into the details of using mt3dmain as an interface for MT3D. It does not, however,
explain the theory or use of MT3D; that is better left to the MT3D users manual (Zheng, 1990).
WARNING:
To use mt3dmain and MT3D effectively, you must have the MT3D users
manual, and it should be readily available whenever you are building data sets
with mt3dmain. In the current release, mt3dmain does not error check data file
formats; this can lead to incorrect values for any variable, and it can cause
segmentation faults which will terminate mt3dmain. To find such a problem,
the data files may have to be examined line by line. This can only be done if
the MT3D users manual is available!
The mt3dmain application is composed of two sections: the main menu-bar, and the status and log
text area. The menu-bar is used to select all mt3dmain commands, and the log/status area is used
by the program to report important messages or results. In addition to the main window and
supporting pop-up dialog windows, a graphical editor is available for creating and modifying
two-dimensional arrays. Mt3dmain also uses other UNCERT modules (grid, contour, surface, and
block) for visualizing model output.
NOTE: There is a public domain (Zheng, 1990, EPA) version of MT3D and a proprietary
version (Zheng, 1991, Papadopulos). Both versions are supported by this interface.
For mt3dmain there are eight items on the main menu: Project, Packages, Run, View,
Simulator, Network, Log, and Help (Figure 16.1). Project controls project file handling (opening,
saving, naming project files), and allows the user to quit the application. Project is not a feature of
MT3D, but it allows a complete set of MT3D data files to be handled as a set; this option controls
the loading, and saving of these project files. This menu-bar option also allows the user to quit
the application. Packages allows the user to individually load, modify, or save any MT3D package
data file. Run executes MT3D using the currently defined data files. View allows the user to view
the standard text output file or view the model results using grid (Chapter 10) and contour (Chapter
11), surface (Chapter 12), or block (Chapter 13). Simulator and Network are not currently
installed, but will allow MT3D to be run simultaneously on different computers over the network,
using different data files describing material distributions. Log allows the user to save to a file all
information printed to the log-status window. Help gives the user a selection of pop-up help topics.
Each menu item is fully described below with all the available options.
FIGURE 16-1. This is an example of the mt3dmain application window. The main menu-bar is on
the top of the application window, and the log/status window is in the lower portion.
Project
The Project sub-menu options control project file handling, and exiting the application. The
options include Open Project, View Project, Save Project, Save Preferences, Quit, and Quit Without
Saving.
Open Project
Selecting Project:Open Project generates a pop-up dialog which allows the user to select an
existing project file. This dialog functions like the File:Open dialog in Figure 5.2 (plotgraph -
Chapter 5). The default project file name extension, though, is *.prj.
View Project
Project:View Project pops up a simple screen editor with the last opened or saved version of the
project file.
Save Project
Project:Save Project saves the names of the MT3D files currently being used. If a save file has already been
opened, the data are simply saved. If a save file has not been selected yet, a pop-up dialog similar
to that used in File:Open (Figure 5.2) is created. The main difference between the Open and the
Save dialog is that to save a file, it does not have to pre-exist. For a description of how the dialog
works, see the File:Save section in Chapter 5.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
reasonable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, mt3dmain determines how all the
input variables are currently defined and writes them to the file mt3dmain.prf.
WARNING:
If mt3dmain.prf already exists, you will be warned that it is about to be overwritten.
If you do not want the old version destroyed, you must move it to a
new file (e.g. the UNIX command mv mt3dmain.prf mt3dmain.old.prf would
be sufficient). When you press OK the old version will be overwritten! This
cannot be done currently from within the application. To rename the file you will
have to execute the UNIX mv command from a UNIX prompt in another
window.
If mt3dmain.prf does not exist in the current directory, it is created. This is an ASCII file and
can be edited by the user. See Appendix C for details.
Quit
Project:Quit terminates the program, but if changes have been made to any MT3D package, the
user will first be queried to supply appropriate filenames for the modified files. Also, if packages
have been added, deleted, or substituted with a new file, the project file will also have to be saved.
Packages
To use MT3D, a number of different packages are available: Flow File, Basic,
Advection, Dispersion, Sink & Source Mixing, and Chemical Reaction. The only one strictly
required is the Basic package. When using the Packages pull-down menu, all of the available
MT3D packages are displayed. The titles are also color coded: RED indicates that with the current
settings the package is required, but has not been defined; GREEN means it is required and is
defined sufficiently for MT3D to run (this does not mean all data entries are correct for the
particular model); BLACK means that the package is not currently needed and may be ignored.
Each package has a pull-down sub-menu with five menu options: Open Package, View
Package, Save, Save as, and Modify. The Open option generates a dialog similar to that shown in
Figure 5.2 and works the same way, except that the default file name
extensions are different. For each package the default file name extensions are:
BASIC:                 *.bas
ADVECTION:             *.adv
DISPERSION:            *.dsp
SINK & SOURCE MIXING:  *.ssm
CHEMICAL REACTION:     *.rct
These file extensions are strictly conventions and do not have to be followed. It is
recommended, however, that you follow some consistent naming convention. The View menu option will
display the last saved or loaded version of the data file.
NOTE: Changes made to a package within the mt3dmain application (using Modify below)
will not be reflected in the data file until the changes have been saved (see Save
below).
The Save menu option will save any modifications, overwriting the last opened or saved package
file. If no package file has been loaded or saved previously, a pop-up dialog will appear similar to
Figure 5.2, but showing the appropriate default file extension. To save the package file, select an
existing file, or enter a new file name, then press the Save button on the dialog. Save as is similar
to Save, except that you are queried for a file name. Modify will generate a new pop-up dialog
which will allow the user to enter all the appropriate data for that particular package. These
package dialogs are discussed below.
NOTE: The packages and dialogs discussed below explain how and where to enter data and
package parameter values. The meaning of different variables is not discussed, and
is left to the MT3D users manual (Zheng, 1990).
WARNING:
As dialogs are generated, default values will be assumed. These values, though,
may have no meaning with regard to a particular model, and it is the modeler's
responsibility to ensure all entries are correct.
NOTE: The LOCAT (Chapter 15, Utility Section) identifiers allowed by MT3D of 100, 101,
102, and 103 will be read from existing files correctly, but the data will be saved
using conventional MODFLOW formatting. When specifying unit
numbers for the different MT3D packages, the following unit numbers must be used.
BTN= 1
ADV= 11
DSP= 12
SSM= 13
RCT= 14
FLOW= 15 (Ground water flow file)
Flow File
Selecting Packages:Flow File will generate the pop-up dialog shown in Figure 16.2. Not strictly a
package, but nevertheless needed, is a file defining the ground-water flow field. This file is
selected using this dialog and can be created with MODFLOW (modmain (Chapter 15)) or any
other flow package, provided the requirements in the MT3D Users Manual are met.
FIGURE 16-2. Pre-calculated Head and Flow File pop-up dialog.
Basic
Selecting Packages:Basic:Modify will generate the pop-up dialog shown in Figure 16.3. This
dialog allows all the parameters needed for the Basic package to be defined. Listed below are the
MT3D variable names with a description of the equivalent dialog entry:
1). HEADNG(32)         Heading (#1)
2). HEADNG(continued)  Heading (#2)
3). NLAY               Layers
    NROW               Rows
    NCOL               Columns
    NPER               Stress Periods
4). TUNIT              Name of Time Units (e.g. DAY or HOUR)
    LUNIT              Name of Length Units (e.g. FT or CM)
    MUNIT              Name of Mass Units (e.g. LBS or KG)
5). TRNOP(10)          Transport Options logical flags: Ten entries are required on
                       this line. Each entry is either a T or an F. The entries are:
                       TRNOP(1) = advection module
                       TRNOP(2) = dispersion module
                       TRNOP(3) = sink & source module
                       TRNOP(4) = chemical reaction module
                       TRNOP(5) - TRNOP(10) = Uninstalled
FIGURE 16-3. Basic Package Input pop-up dialog. This dialog is used to set parameters for
MT3D's Basic Package.
6). LAYCON             Layer Types button: To enter the layer type for each layer, the
                       dialog shown in Figure 16.4 is used. Note, only the top
                       layer can be unconfined (Type 1). If there are more than
                       ten layers, use the Next and Previous buttons to define all
                       layers.
FIGURE 16-4. Active Layer Definition pop-up dialog.
7). DELR   Cell Width (along rows (x)) button: This calls the 1D utility
           array editor (See Chapter 15, U1DREL).
8). DELC   Cell Height (along columns (y)) button: This calls the 1D
           utility array editor (See Chapter 15, U1DREL).
9). HTOP   Top Elevation button: This defines the top elevation of the first
           (top) layer. To define the layer, press the Define Array
           Control Data button. This calls the utility 2D real array
           editor (See Chapter 15, U2DREL).
The following items (#10 - #13) are entered using the 3-Dimensional Arrays button. There is one
entry for each item for each layer in the model. These arrays are selected using the pop-up dialog
shown in Figure 16.5. Each layer can be specified in the utility dialog.
FIGURE 16-5. Layer by Layer Definition pop-up dialog. For each
layer various arrays need to be defined. This dialog allows the user to
specify and edit each array.
10). DZ
11). PRSITY
12). ICBUND
13). SCONC
For the following items, there is only one entry (#14 - #16). These arrays are selected using the
pop-up dialog shown in Figure 16.3.
14). CINACT
15). IFMTCN
IFMTNP
IFMTRF
IFMTDP
SAVUCN
16). NPRS
If NPRS (record #16) is greater than 0, the next card needs to be defined for each output time.
17). TIMPRS   This dialog is used to explicitly define when calculated
              concentration information will be output.
18). NOBS
If NOBS (record #18) is greater than 0, the next card needs to be defined for each observation point.
The values are entered using the pop-up dialog shown in Figure 16.8.
19). KOBS    Observation Layer
     IOBS    Observation Row
     JOBS    Observation Column
20). CHKMAS  Save Mass Balance toggle: This flag indicates whether the
             mass balance information should be saved in the default
             unformatted file (MT3D.MAS).
For each stress period, there must be an entry for each of the following cards (#21 - #23). These
cards are defined by pressing the Define Stress Periods button, and filling the entries in the pop-up
dialog shown in Figure 16.9.
21). PERLEN
NSTP
TSMULT
If TSMULT for the stress period is less than or equal to 0.0, enter record #22. This is done by
pressing the appropriate Transport Time Steps button. The pop-up dialog in Figure 16.10 is used to
enter the time step length values.
FIGURE 16-10. Travel Time-Step Length pop-up dialog. This dialog is used to enter the
time-step length values.
22). TSLNGH
23). DT0
MXSTRN
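The PERLEN, NSTP, and TSMULT entries follow the usual MODFLOW-style time discretization. As a sketch only (a hypothetical helper, not part of mt3dmain), the individual time-step lengths implied by these three values form a geometric progression when TSMULT is greater than 1:

```python
def time_step_lengths(perlen, nstp, tsmult):
    """Return the NSTP individual time-step lengths for one stress period.

    Assumes the standard geometric progression: if TSMULT > 1, the first
    step is PERLEN*(TSMULT - 1)/(TSMULT**NSTP - 1) and each later step is
    TSMULT times the previous one; if TSMULT == 1 the steps are equal.
    """
    if tsmult == 1.0:
        dt0 = perlen / nstp
    else:
        dt0 = perlen * (tsmult - 1.0) / (tsmult ** nstp - 1.0)
    # Build the full list of step lengths; they sum to PERLEN.
    return [dt0 * tsmult ** i for i in range(nstp)]
```

For example, a 100-day stress period with NSTP = 4 and TSMULT = 1.5 yields four growing steps that sum back to 100 days.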
Advection
Selecting Packages:Advection:Modify will generate the pop-up dialog shown in Figure 16.11. This
dialog allows all the parameters needed for the Advection package to be defined. Listed below are
the MT3D variable names with a description of the equivalent dialog entry:
1). MIXELM   Advection Solution Scheme
    PERCEL   Courant Number
    MXPART   Maximum Number of Moving Particles
The information for the remaining cards is entered using the pop-up dialog shown in Figure 16.12.
This dialog is created by pressing the Set Conditional Solution Parameters button. Note that
depending on the Advection Solution Scheme selected, different options will be required; the
remaining menu items and text entry fields will be disabled.
FIGURE 16-12. Conditional Advection Parameter Input pop-up dialog. This dialog is used to
specify the different solution parameters based on whether a MOC, MMOC, Hybrid MOC/MMOC,
or finite-difference solution scheme is used.
If MIXELM equals 1, 2, or 3 (MOC, MMOC, or HMOC) enter the following record.
2). ITRACK
WD
Dispersion
Selecting Packages:Dispersion:Modify will generate the pop-up dialog shown in Figure 16.13.
This dialog allows all the parameters needed for the Dispersion package to be defined. Listed
below are the MT3D variable names with a description of the equivalent dialog entry:
The Longitudinal Dispersivity button allows the user to define card #1 for each item. The
individual layers are defined in the utility dialog (See Chapter 15, Figure 15.31).
1). AL
2). TRPT
3). TRPV
4). DMCOEF
Sink & Source Mixing
Selecting Packages:Sink & Source Mixing:Modify will generate the pop-up dialog for the Sink &
Source Mixing package. Card #1 contains a logical flag for each of the following flow-model options:
Well
Drain
Recharge
Evapotranspiration
River
General-Head-Dependent Boundary
If any of these options were used in the flow model, its representative flag must be set to T;
otherwise, it should be set to F.
2). MXSS
NOTE: This option is not user definable in the mt3dmain interface. This value is calculated
based on the settings in CARD #1.
The Stress Period Parameter button allows the user to define cards 3 - 8 for each stress period with
the pop-up dialog shown in Figure 16.15.
FIGURE 16-15. Sink & Source Mixing Stress Period Definitions pop-up dialog. For each stress
period, the recharge flux and evapotranspiration flux must be defined (when appropriate).
If FRCH = T, specify whether recharge flux must be described for the stress period.
3). INCRCH
If FRCH = T and INCRCH >= 0, an array entry is required for the recharge flux, described on card
4.
4). CRCH
If FEVT = T, specify whether evapotranspiration flux must be described for the stress period.
5). INCEVT
If FEVT = T and INCEVT >= 0, an array entry is required for the evapotranspiration flux, described
on card 6.
6). CEVT
7). NSS
If NSS > 0, the location, concentration, and type must be specified for each point source, using the
pop-up dialog in Figure 16.16.
FIGURE 16-16. Point Source Specification pop-up dialog. This dialog is used to specify, for each
sink and source in the flow model, its type and its location in the model grid.
8). KSS     Layer
    ISS     Row
    JSS     Column
    CSS     Concentration
    ITYPE   Type: Identify whether the concentration source is from a
            Constant-Head (CH), Well, Drain, River, or General-Head
            Boundary (GHB) cell. Only one option can be selected
            per point source ID. If more than one type is applicable,
            the point source must be entered multiple times.
Chemical Reaction
Selecting Packages:Chemical Reaction:Modify will generate the pop-up dialog shown in Figure
16.17. This dialog allows all the parameters needed for the Chemical Reaction package to be
defined. Listed below are the MT3D variable names with a description of the equivalent dialog
entry:
FIGURE 16-17. Chemical Reaction Package pop-up
dialog. This dialog is used to set parameters for MT3D's
Chemical Reaction Package.
1). ISOTHM   Sorption isotherm option:
             1 = Linear Isotherm
             2 = Freundlich Isotherm
             3 = Langmuir Isotherm
             0 = None (No sorption isotherm used)
IREACT
6). RC2
Utility
This section is exactly the same as described in the Utility section of modmain (Chapter 15).
Run
Once all the data packages are complete, a script to run MT3D with the current data files can be built
and MT3D can be executed.
Now
Run:Now will check to see that all package modifications have been saved, update the run script,
and make a system call to execute MT3D.
NOTE: MT3D is executed with a system call; as a result mt3dmain cannot independently
determine when MT3D is done or if there has been a problem. The standard MT3D
messages, however, will still be printed to the xterm window that launched
mt3dmain. When MT3D is complete, STOP will be printed in the xterm window.
Save Script
This section is exactly the same as described in the Editor section of modmain (Chapter 15).
View
Once MT3D has been run, the standard output data file can be read, and concentration data may be
stripped from the file and formatted into files compatible with contour, surface, and block.
NOTE: These options are not available until MT3D has been run.
Contoured Surface
This section is exactly the same as described in the Editor section of modmain (Chapter 15).
Block
This section is exactly the same as described in the Editor section of modmain (Chapter 15).
Simulator
NOT CURRENTLY INSTALLED.
Network
NOT CURRENTLY INSTALLED.
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, and Print. View Log, Save, and
Save as are similar in operation to the menu options under Project described above.
Help
Help lists topics about the program for which there is help. When an item is selected, a pop-up dialog
with a scrolled text area, similar to Figure 5.15, is generated with the desired information.
Editor
This section is exactly the same as described in the Editor section of modmain (Chapter 15).
In the xterm window where modmain was executed, when MODFLOW is done, the word STOP will
be printed (for this data set, you will only have to wait a few seconds).
Once MODFLOW is complete, the next step is to run mt3dmain. To do this, execute mt3dmain,
and load the project file jsmt.prj (Use the Project:Open Project menu-bar option). Opening the
project will again load several files into the application. The files listed below are displayed in the
log/status window:
jsmt.btn: BASIC INPUT PACKAGE
jsmt.adv: ADVECTION INPUT PACKAGE
jsmt.dsp: DISPERSION INPUT PACKAGE
jsmt.ssm: SINK & SOURCE MIXING INPUT PACKAGE
When mt3dmain reads in jsmt.btn, it recognizes that the ADV, DSP, and SSM packages are also
required. These are defined by the Packages Used and Solver Used toggles and toggle menu
(TRNOP variables, Figure 16.3). To execute MT3D with these modules, select the Run:Now menu-bar
option. A pop-up dialog will appear (similar to Figure 5.2) asking for the name of a script file.
Select jsmt.csh. Mt3dmain will determine what files are needed, build the script file, and tell the
UNIX operating system to execute the script. In the log/status window a message will be printed:
Executing MT3D
SYSTEM CALL: $PWD/jsmt.csh &
NOTE: WAIT for STOP to appear in the xterm text window before continuing.
To view the text file, select the View:MT3D output file menu-bar option. Press the View Output
Filename button (Figure 15.32) and the standard output file generated by MT3D will be displayed
(Figure 15.33). To make a contour map of the model concentration results, select the View:as Contoured
Surface menu-bar option. The model was steady-state, so there is only one stress period; it was
also only a one-layer X-Y plane model, so the default values shown in Figure 15.34 are correct. To
create a data file containing the X, Y coordinates and concentration value for each cell, press the Grid Data
button.
Once the file has been created, grid (Chapter 10) is called, and several parameters are passed to that
application. The most important are the X, Y, concentration value file name, and the number of columns and
rows to use to build the grid (by default, twice the MT3D rows and columns plus 1 are used; this
reduces some of the problems with the inverse-distance gridding algorithm honoring the MT3D
results). For this example, it is sufficient to select Method:Calculate from the grid menu-bar.
When the grid is calculated, select View:Contour Map from the grid menu-bar, and save the file to
junk.srf (See Chapter 10 for further detail on using grid). Once the file is saved, the grid file will be
passed to contour (Chapter 11) and displayed. A map similar to Figure 16.18 will be displayed.
FIGURE 16-18. This is an example MT3D contaminant concentration map. Cell concentration
values were stripped from the standard MT3D output file, gridded using grid (Chapter 10), and
contoured using contour (Chapter 11).
Setting up Files
In addition to the MT3D file formats, mt3dmain adds two more files. One is a project file which is
used to keep track of all the files used in a project. The other is a shell script file which is used to
rename files for MT3D, run MT3D, redirect standard output, and clean up after MT3D when it's
done.
Project File
The project file saves three groups of files: package files, the ground-water flow file, and files
associated with the package files. These latter files are defined with a unit number in the 1D and 2D
array utility cards with a LOCAT file unit number different from the parent package. The files are
listed in the following order:
Associated Files
MT3D Package Files (Basic Package file first)
Ground Water Flow File
For each file, two or three pieces of information are needed: 1) the file's unit ID, 2) the file's name,
and 3) the unit ID of the owning package (e.g. block.ctc might be owned by the Block Center Flow
Package, unit 11). The third item is only needed for associated files. A sample file might look
like:
41  top_elev.dat  1
1   sample.btn
11  sample.adv
15  modflow.flo
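As a sketch of how the two- or three-field lines described above might be read back (a hypothetical helper, not part of mt3dmain), each line carries a unit ID, a file name, and, for associated files only, the owning package's unit ID:

```python
def parse_project_line(line):
    """Split one project-file line into (unit_id, filename, owner_or_None).

    The third field (owning package unit ID) is present only for files
    associated with a package, e.g. an associated file line reads
    'unit filename owner' while a package line reads 'unit filename'.
    """
    fields = line.split()
    unit_id = int(fields[0])
    filename = fields[1]
    owner = int(fields[2]) if len(fields) > 2 else None
    return unit_id, filename, owner
```

For example, a line "41 top_elev.dat 1" parses to unit 41, name top_elev.dat, owned by the package on unit 1, while "15 modflow.flo" parses with no owner.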
Script File
The shell script file has four sections: 1) copying any files associated with packages to their fort.#
filenames, 2) executing MT3D, 3) supplying file names and answers as requested by MT3D using
standard input, and 4) cleaning up after the model run. For the above example (Project File), the
shell script would look like:
\cp top_elev.dat fort.41
mt3d << end-mt3d
mt3d.out
sample.btn
sample.adv
modflow.flo
Y
end-mt3d
\rm fort.41
Bibliography (mt3dmain)
Zheng, C., 1990, MT3D, A Modular Three-Dimensional Transport Model, United States
Environmental Protection Agency and S.S. Papadopulos and Associates, Inc., Rockville, Maryland.
Zheng, C., 1991, MT3D, A Modular Three-Dimensional Transport Model, S.S. Papadopulos and
Associates, Inc., Rockville, Maryland.
Grid Manipulation:
Array
CHAPTER 17
The Array module is used to mathematically manipulate one, two, or a series of block, sisim, contour,
or surface 2D or 3D grid files (for valid file formats, refer to Chapters 11 and 13). Depending on
the options selected, operations include addition, subtraction, multiplication, division, averaging,
minimum, maximum, probability value within a range, reclassification, and basic statistics. These
are basic grid tools used in Geographical Information Systems (GIS). This tool can be useful for
data preparation, or for data and result analysis. For example, by reclassifying a contaminant
plume map to a cost-of-remediation map, estimates can be made about site clean-up costs.
The array application is composed of two sections (Figure 17.1): the main menu-bar, and the text
status area. The menu-bar is used to select all array commands and the text area contains relevant
data about the status of the program or the state of on-going calculations.
FIGURE 17-1. This is an example of the array application window. The main menu-bar is on the
top of the application window, and the status/discussion window in the lower portion.
NOTE: Within this chapter, any time an array is referred to, a 2D or 3D grid is implied.
This program does not perform matrix algebra.
NOTE: For correct results on operations with two or more grids, all the grids must have
exactly the same grid dimensions (i.e. the number of columns, rows and layers must
be the same).
File
The File sub-menu options control file and print handling, and exiting the program. The options
include Save Preferences and Quit.
Save Preferences
When using programs with many user options, it is not possible for the program to always pick
reasonable default values for each parameter or input variable. For this reason preference files were
created (See Appendix C). These allow the user to define a unique set of defaults applicable to
the particular project. When File:Save Preferences is selected, array determines how all the
input variables are currently defined and writes them to the file array.prf.
WARNING:
If array.prf already exists, you will be warned that it is about to be overwritten.
If you do not want the old version destroyed, you must move it to a
new file (e.g. the UNIX command mv array.prf array.old.prf would be
sufficient). When you press OK the old version will be overwritten! This
cannot be done currently from within the application. To rename the file you will
have to execute the UNIX mv command from a UNIX prompt in another
window.
If array.prf does not exist in the current directory, it is created. This is an ASCII file and can be
edited by the user. See Appendix C for details.
Quit
File:Quit terminates the program, but if a grid has been calculated and not yet saved, the user will
first be queried to supply a file to save the changes in.
Array
Array-Array
Array:Array-Array is used to perform basic operations between two grids. The pop-up dialog is
shown in Figure 17.2. To use this option you must specify two input grids, Array 1 and Array 2,
and an Output Array file name. The output file will be of the same type as the input Array 1. There
are seven possible Operations between the two grids: +, -, x, /, average, minimum, and maximum.
Note that Order is important for the - and / operators. The Order can be set to Array 1 (operator)
Array 2 or Array 2 (operator) Array 1. The Output Format may also be specified as either Integer
or Real. Note: the Integer format will truncate real numbers to the lower integer value (i.e. 4.987
goes to 4). To perform the operation, press the Calculate button. When the calculation is complete,
the operation will be posted in the log status window (Figure 17.1), and the results will be saved to
the specified Output Array file.
Once an input file has been specified, or the output file has been calculated, the text file may be
edited/viewed by pressing the appropriate Edit button. A map of the file may also be displayed by
pressing the Block, Contour, or Surface buttons.
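The Array-Array operations are plain element-wise grid arithmetic. A minimal sketch (hypothetical code, not the array implementation) of the +, -, x, and / operators, including the role of Order and the Integer truncation behavior described above:

```python
import math

def array_array(grid1, grid2, op, order_12=True, integer=False):
    """Element-wise operation between two grids of identical dimensions.

    grid1, grid2: equal-shaped nested lists (rows of values).
    op: one of '+', '-', 'x', '/'. order_12 selects Array 1 (op) Array 2
    versus Array 2 (op) Array 1; it only matters for '-' and '/'.
    integer=True truncates each result to the lower integer (4.987 -> 4).
    """
    ops = {'+': lambda a, b: a + b,
           '-': lambda a, b: a - b,
           'x': lambda a, b: a * b,
           '/': lambda a, b: a / b}
    out = []
    for row1, row2 in zip(grid1, grid2):
        row = []
        for a, b in zip(row1, row2):
            if not order_12:
                a, b = b, a          # reverse the operand order
            v = ops[op](a, b)
            row.append(math.floor(v) if integer else v)
        out.append(row)
    return out
```

Note that the element-wise pairing is exactly why the two grids must share identical dimensions, as the note at the start of this chapter warns.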
Array-Constant
Array:Array-Constant is used to perform basic mathematical operations on a single grid. The
pop-up dialog is shown in Figure 17.3. To use this option you must specify an Input Array and an
Output Array file name. The output file will be of the same type as the input file. There are four
possible Operations between the grid and the constant: +, -, x, or /. Note that Order is important
for the - and / operators. The Order can be set to Array (operator) Constant or Constant (operator)
Array. The Output Format
may also be specified as either Integer or Real. Note: the Integer format will truncate real numbers
to the lower integer value (i.e. 4.987 goes to 4). The Constant is the value by which the grid will be
operated on. To perform the operation, press the Calculate button. When the calculation is
complete, the operation will be posted in the log status window (Figure 17.1), and the results will
be saved to the specified Output Array file.
Once an input file has been specified, or the output file has been calculated, the text file may be
edited/viewed by pressing the appropriate Edit button. A map of the file may also be displayed by
pressing the Block, Contour, or Surface buttons.
Array-Indicator
NOTE: This option is not yet installed!
Array-Reclassification
Array:Array-Reclassification is used to reclassify a range of values within a grid to a single new
value. The pop-up dialog is shown in Figure 17.4. To use this option you must specify an Input
Array and an Output Array file name. The output file will be of the same type as the input file.
With this option, all values between the Minimum and Maximum will be reset to the Reclassification value.
To perform the operation, press the Calculate button. When the calculation is complete, the
operation will be posted in the log status window (Figure 17.1), and the results will be saved to the
specified Output Array file.
FIGURE 17-4. Array-Reclassification pop-up dialog. This dialog is used to reclassify a range of values.
Once an input file has been specified, or the output file has been calculated, the text file may be
edited/viewed by pressing the appropriate Edit button. A map of the file may also be displayed by
pressing the Block, Contour, or Surface buttons.
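The reclassification rule itself is a one-line per-cell test. A sketch (hypothetical code, assuming the Minimum-to-Maximum range is inclusive, which the dialog text implies but does not state):

```python
def reclassify(grid, lo, hi, new_value):
    """Reset every grid value in [lo, hi] to new_value; leave others as-is.

    grid is a nested list of rows; lo/hi are the dialog's Minimum and
    Maximum; new_value is the Reclassification value.
    """
    return [[new_value if lo <= v <= hi else v for v in row] for row in grid]
```

For example, reclassifying the range 100-200 ppm of a plume map to a single remediation-cost value leaves cells outside that range untouched, which is how a concentration map becomes a cost map.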
Array-Series
Array-Array Parameters
Array:Array-Series:Array-Array Parameters is used to perform basic mathematical operations on a
series of grids. The pop-up dialog is shown in Figure 17.5. To use this option you must specify a
Series Prefix and a Series Extension (e.g. for a series conc.iso.1.srf, conc.iso.2.srf, ..., the prefix
would be conc.iso and the extension would be srf; note that a '.' is placed automatically on either side
of the file series number), the First and Last series numbers of concern (for best performance, it is
best if series numbers are consecutive), and the Output Array file name. There are five possible
Operations for the grid series: average, sum, minimum, maximum, and probability in range. For
Probability in Range the Minimum and Maximum range values of concern must also be specified.
The Output Format may also be specified as either Integer or Real. Note: the Integer format will
truncate real numbers to the lower integer value (i.e. 4.987 goes to 4). The output file will be of the
same type as the input file. To perform the operation, press the Calculate button. When the
calculation is complete, the operation will be posted in the log status window (Figure 17.1), and the
results will be saved to the specified Output Array file.
Once an input file has been specified, or the output file has been calculated, the text file may be
edited/viewed by pressing the appropriate Edit button. A map of the file may also be displayed by
pressing the Block, Contour, or Surface buttons.
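Of the five series Operations, Probability in Range is the one that differs from plain per-cell statistics: for each cell it counts how many of the series grids fall inside the Minimum-Maximum range. A sketch (hypothetical code, assuming an inclusive range):

```python
def probability_in_range(grids, lo, hi):
    """Per-cell fraction of series grids whose value falls in [lo, hi].

    grids: list of equal-shaped nested lists, one per series member
    (e.g. one per geologic realization).
    """
    n = len(grids)
    rows, cols = len(grids[0]), len(grids[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for g in grids:
        for i in range(rows):
            for j in range(cols):
                if lo <= g[i][j] <= hi:
                    out[i][j] += 1.0 / n
    return out
```

Over, say, 100 realizations of a contaminant plume, a cell whose concentration lands in the range in 75 of them gets probability 0.75; this is the kind of map used in the exploration-priority discussion at the end of this chapter.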
Array-Point Parameters
Array:Array-Series:Array-Point Parameters is used to determine the value of a series of 2D or 3D
grids at an arbitrary X, Y, Z location. The pop-up dialog is shown in Figure 17.6. To use this option
you must specify a Series Prefix and a Series Extension (e.g. for a series conc.iso.1.srf,
conc.iso.2.srf, ..., the prefix would be conc.iso and the extension would be srf; note that a '.' is placed
automatically on either side of the file series number), the First and Last series numbers of concern
(for best performance, it is best if series numbers are consecutive), and the Output Array file name.
NOTE: This option currently does not work on single, non-series files. The program can be
fooled by putting a number in the filename, and setting the First and Last series
number to the same number.
The X, Y, and Z Point Location must also be specified. The output file will be of the same type as
the input file. By default, a new file will be created (an old file, if it exists, will be overwritten), and
the value(s) at the designated location will be printed each time this option is calculated. It is
sometimes convenient to combine the results of several searches into a single file, and identify the
X, Y, and Z coordinates. These options can be selected by toggling the Append to Existing File
and/or Print X, Y, Z buttons. To perform the operation, press the Calculate button. When the calculation
is complete, the operation will be posted in the log status window (Figure 17.1), and the results will
be saved to the specified Output Array file.
NOTE: When determining the value at a particular location, different methods are applied
depending on the file type. If the grid is node centered, a point falling anywhere
within the node will be assigned the node value (i.e., there is no interpolation). If the
grid is grid centered, the point of interest will be estimated using a basis function
based on the four surrounding grid points.
Once an input file has been specified, or the output file has been calculated, the text file may be
edited/viewed by pressing the appropriate Edit button. A map of the file may also be displayed by
pressing the Block, Contour, or Surface buttons.
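The grid-centered estimate described in the note above is essentially a bilinear basis function over the four surrounding grid points. A sketch of that assumed form (the exact basis function array uses is not spelled out in this manual):

```python
def bilinear(x, y, x0, y0, dx, dy, v00, v10, v01, v11):
    """Bilinear estimate at (x, y) inside one grid cell.

    (x0, y0) is the lower-left grid point, (dx, dy) the grid spacing, and
    v00, v10, v01, v11 the values at (x0,y0), (x0+dx,y0), (x0,y0+dy),
    and (x0+dx,y0+dy) respectively.
    """
    tx = (x - x0) / dx          # fractional position along x, in [0, 1]
    ty = (y - y0) / dy          # fractional position along y, in [0, 1]
    return (v00 * (1 - tx) * (1 - ty) + v10 * tx * (1 - ty)
            + v01 * (1 - tx) * ty + v11 * tx * ty)
```

At a corner the estimate reproduces that corner's value exactly, and at the cell center it is the average of the four corners, which matches the nearest-value behavior the node-centered case degenerates to.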
Array-Transform
NOTE: This option is not yet installed!
Log
The Log menu option is supplied to allow the user to save, view, or print all text which has been
written to the log/status window by the program or added by the user (The log window is also a
simple text editor). The options include View Log, Save, Save as, and Print. View Log, Save, and
Save as are similar in operation to the menu options under File described above.
Help
Help lists topics about the program for which there is help. When an item is selected, a pop-up dialog
with a scrolled text area, similar to Figure 5.15, is generated with the desired information.
NOTE: Only one help window may be open at a time.
Help files are editable ASCII data files; for further information see Appendix E.
1). Zones that are shown to be highly contaminated (> 100 ppm) will probably still exceed
the EPA standard even after further investigation. These zones will have to be
remediated, and further investigation here will do little to define the extent of the
plume.
2). Zones that appear uncontaminated are unlikely to be contaminated and require little
further exploration. It may be wise to put in observation wells to ensure the plume
does not migrate off site, or to determine background contaminant levels, but those are
different issues than the ones described here.
3). Zones with high geologic certainty will yield little further information with continued
exploration, and the results will only minimally change the ground water flow and
contaminant transport model predictions.
4). Zones with low geologic certainty, and zones near the EPA standard, should be
explored first. Data at these locations will do the most to reduce geologic uncertainty
and to define the extent of the contaminant plume.
To create these maps, use the following steps:
1). Run array.
2). Make a file representing two contaminant concentration standard deviations.
3). Make the file representing the maximum likely contaminant concentration with 95%
probability.
FIGURE 17-7. This map is the result of adding two maps together. In this case, the average
contaminant concentration (determined from a series of 100 different geologic interpretations and
100 ground water flow and contaminant transport models) and two standard deviations at each
location were added together. This gives a single map indicating, at a 95% confidence level, the
maximum likely concentration at any location within the map area.
100: 10 ppm
Values between 0 and 10, and 10 and 100 ppm will be gradational.
Reclassify concentrations:
Trim maximum concentration down to 100.0.
Select the Array:Array-Reclassification menu item.
Define the Input Array as conc.iso.95max.srf.
Define the Output Array as junk1.srf.
Define the Minimum as 100.0.
Define the Maximum as 1000.0.
Define the Reclassification as 100.0.
Press Calculate.
Press Done.
Values now vary between 0 and 100.
Center EPA 10 ppm level about 0.0.
Select the Array:Array-Constant menu item.
Define the Input Array as junk1.srf.
Define the Output Array as junk2.srf.
Select the - Operation.
FIGURE 17-8. Using the map in Figure 17-7, a risk map can be created. In areas showing low
combined contaminant and error, risk of contamination is low. Likewise, in areas of high
concentration (> 100 ppm), these zones will be remediated, and the risk of leaving untreated water
is low. Areas near the cutoff (EPA standard = 10 ppm) pose the greatest risk and need further
investigation, because remediation would be a failure if zones over 10 ppm remain, but remediation
costs will be excessive if large areas are remediated that already meet the standard.
FIGURE 17-9. Based on a series of 100 simulations of the subsurface, the uncertainty in the
description of the subsurface geology can be defined. With this simple two unit model, a particular
cell may always be of a specific material (best case). A particular location may also be one material
half the time and the other material half the time (worst case; either material type is equally likely).
The uncertainty may also fall somewhere in between. Where the material type is always the same, risk
is low. Where uncertainty is high, risk is high.
FIGURE 17-10. Merging (multiplying) the risk maps in Figures 17-8 and 17-9 creates the risk map
shown here.
NOTES:
-af    = append file (0 = false, 1 = true)                      default = 0
-aor                                                            default = 0
-cor                                                            default = 0
-cst   = constant value                                         default = 1.0
-df    = destination file name                                  default =
-fmt   = output file format (0 = floating point, 1 = integer)   default = 1
-help
-lgf                                                            default = log.dat
-ota                                                            default = 0
-otc   (2 = divide, 3 = multiply)                               default = 0
-ots                                                            default = 0
-pmn   = minimum probability                                    default = 0.0
-pmx   = maximum probability                                    default = 1.0
-prf   = preference file name                                   default = array.prf
-pxyz  = print X, Y, Z coordinates (0 = false, 1 = true)        default = 0
-rcl   =
-rmn   =
-rmx   =
-runa  =
-runc  =
-runp  =
-runr  =
-runs  =
-runv  =
-s1f   =
-s2f   =
-sed   =
-sef   =
-ssf   =
-sst   =
-xpt   = X point location                                       default = 0.0
-ypt   = Y point location                                       default = 0.0
-zpt   = Z point location                                       default = 0.0
Array Mathematics
The mathematics used in array are fundamentally the same as those described in histo (Chapter 6).
CHAPTER 18
Utilities
This chapter describes the use of several utility programs: calc, lpr_ps, ps_merge, and editor.
calc
Calc is a programming and scientific RPN calculator with binary (base 2), octal (base 8), decimal
(base 10), and hexadecimal (base 16) support. It is shown in Figure 18.1. The calculator is composed of several
components, 1) a display log where the current entry and previous operations and results are shown,
2) the four calculator registers (Top, Z, Y, X respectively) and register shift keys, 3) the calculator
angle mode (Degrees or Radians), 4) the operation base (binary, octal, decimal, or hexadecimal),
and 5) the numerical and operation keypad.
FIGURE 18-1. This is an example of the calc calculator.
CHS   : Change the sign of X
EEX   : Enter an exponent
+     : Add Y + X
-     : Subtract Y - X
/     : Divide Y / X
*     : Multiply Y * X
e^x   : Raise e (2.71828) to the X power
ln    : Natural log of X
10^x  : Raise 10 to the X power
log   : Log (Base 10) of X
cos   : Cosine X
acos  : Arc-cosine X
sin   : Sine X
asin  : Arc-sine X
tan   : Tangent X
atan  : Arc-tangent X
sqrt  : Square root of X
x^2   : X squared
y^x   : Raise Y to the X power
1/x   : Reciprocal of X
x!    : Take the factorial of X (X is first truncated to the nearest integer)
PI    : Enter pi (3.1415926...)
deg   : Set the angle mode to degrees
rad   : Set the angle mode to radians
DEC   : Set the operation base to decimal
BIN   : Set the operation base to binary
OCT   : Set the operation base to octal
HEX   : Set the operation base to hexadecimal
FRAC  : Fractional portion of X
INT   : Integer portion of X
CLX   : Clear the X register
CLR   :
CLA   :
STO   : Store a value
RCL   : Recall a stored value
Enter : Enter a value (lift the stack)
Quit  : Terminate program.
lpr_ps
lpr_ps is a program for printing text files to a Postscript printer. The program adds utility over the
standard UNIX lpr command in that various print controls can be specified:
Font size.
File name on header.
Page number.
Date and time stamp on header.
Ability to print only specified lines of file.
Ability to number lines.
Landscape or portrait page orientation.
NOTE: This is not an X-windows/Motif application; it is run from the UNIX command line.
It is, however, called from several UNCERT modules to print ASCII text files.
The output uses a Courier font. Courier was selected because it is a standard font available on most
computers and it is non-proportionally spaced (i.e., all letters are the same width, so letters in
a given column will always be vertically aligned; this is not true for proportionally spaced fonts).
To run the application, the command line syntax is:
Syntax: lpr_ps [-bm #.#] [-fl #] [-hd #.#] [-hf #] [-ll #] [-lm #.#] [-ln #] [-lpc #] [-lpq ] [-ln]
[-out ] [-rm #.#] [-sd #] [-tb #] [-tf #] [-tm #.#] filename
Meaning of flag symbols:
#    = integer
#.#  = float
     = character string
{}   = variable is an array. Values must be separated by a , and no spaces are
       allowed. Do not use the { } symbols on the command line.
NOTES:
-bm #.#  = bottom margin                                       default = 1.0
-fl #    = first line to print                                 default = 1
-hd #.#  = distance of header from top margin                  default = 0.5
-hf #    = header font size (1 point = 1/72)                   default = 12
-ll #    = last line to print                                  default = last line
-ln      = print line numbers (if flag present)
-lm #.#  = left margin                                         default = 1.0
-lpc #   = number of copies                                    default = 1
-lpq     = print queue                                         default = ps
-ln      = print orientation (0 = portrait, 1 = landscape)     default = 0
-out     = output file                                         default =
-rm #.#  = right margin                                        default = 1.0
-sd #    = show date and time (0 = false, 1 = true)            default = 1
-tb #    =                                                     default = 8
-tf #    = text font size                                      default = 7
-tm #.#  = top margin                                          default = 1.0
NOTE: If no filename is given, the above list of parameters is scrolled to the console.
An example command line argument might be:
lpr_ps -fl 25 -ll 125 -ln -sd 1 -tf 8 lpr_ps.c
This would print 101 lines from the file lpr_ps.c starting at line 25 with line numbers. The
filename, page number, and date and time would be printed at the top of each page. The font would
be Courier 8 point.
ps_merge
Ps_merge is a utility program used to merge and scale two UNCERT Postscript files into a single
Postscript file. Previously merged files may also be merged again with raw Postscript files or other
previously merged files. So that the graphics from one figure do not overlap those of another
figure, each Postscript file can be independently translated and scaled. Note that the translation is
done before the scaling.
WARNING:
This program cannot be used to merge Postscript files generated by any
programs other than those included in UNCERT.
Syntax: ps_merge [-sx1 #.#] [-sy1 #.#] [-sx2 #.#] [-sy2 #.#] [-tx1 #.#] [-ty1 #.#] [-tx2 #.#]
[-ty2 #.#] file_1 file_2
Meaning of flag symbols:
#    = integer
#.#  = float
     = character string
{}   = variable is an array. Values must be separated by a , and no spaces are
       allowed. Do not use the { } symbols on the command line.
NOTES:
-sx1 #.#  = X scale factor for file_1          default = 1.0
-sy1 #.#  = Y scale factor for file_1          default = 1.0
-sx2 #.#  = X scale factor for file_2          default = 1.0
-sy2 #.#  = Y scale factor for file_2          default = 1.0
-tx1 #.#  = X translation for file_1           default = 0
-ty1 #.#  = Y translation for file_1           default = 0
-tx2 #.#  = X translation for file_2           default = 0
-ty2 #.#  = Y translation for file_2           default = 0
editor
Supplied with UNCERT is a simple text editor (Figure 18.2; a modified version of the example
editor in the Motif Programming Manual, Volume 6, by Heller, 1991). Editor is supplied mainly as
an X/Motif based tool to view text data and program result files. It is not recommended for
anything other than the most basic file editing.
To run the application, the command line syntax is:
Syntax: editor [filename]
C Print Formats
APPENDIX A
e = exponential
f = fixed
g = general
Examples
Number        8.2f        8.2e        8.2g
--------------------------------------------
128.3262      128.33      1.28e+02    1.28e+02
0.001234      0.00        1.23e-03    1.23e-03
256.0000      256.00      2.56e+02    256
APPENDIX B
Makefile
A Makefile is used to control the compilation of a program. It defines the location of source
libraries and include files, compilation flags, and the source files required for each particular
program. Below is an example Makefile. Makefiles for each UNCERT program may be found in
their respective src directories. NOTE: Makefiles are non-standard between UNIX platforms, so
this file may need some modification.
INC    = ../../inc/
WTLIBS = -lXm -lXt -lX11 -lXext ../../lib/libXs.a -lc -lm
CCOPTS = -g -c -I$(INC)
FCOPTS = -g -c -qextname
LDOPTS = -g -qextname -o
CC     = cc
F77    = xlf
CCF77  = xlf
OBJS   = axis.o files.o fonts.o graph_tools.o log.o menu.o messages.o mouse3d.o \
         normal.o parse.o pref.o print.o rotated.o uncertmisc.o widget_tools.o \
         xcolors.o zoom.o
LOBJS  = dgeco.o regres.o sort.o

plotgraph: graphmenu.o graphio.o graphplot.o $(LOBJS) $(OBJS)
	$(CCF77) $(LDOPTS) plotgraph graphmenu.o graphio.o graphplot.o \
	$(LOBJS) $(OBJS) $(WTLIBS)

.c.o:
	$(CC) $(CCOPTS) $*.c

.f.o:
	$(F77) $(FCOPTS) $*.f
This Makefile is for an IBM RS6000 running AIX (IBM's version of UNIX). Other platforms may
need to include additional libraries, libraries with slightly different names, or a PATH to the
libraries of interest. These changes should be made to the WTLIBS variable. CC is used to define
the ANSI C compiler; this variable will usually be set to cc or gcc. F77 is used to define the name
of the FORTRAN compiler; this is often f77. If you don't have a FORTRAN compiler, there is a
program called f2c which converts FORTRAN source code to ANSI C code (see Chapter 2 for
comments on retrieving this software). If f2c is used, several changes need to be made; the
affected lines are shown below:
CCOPTS = -DF2C -g -c -I$(INC)
FCOPTS = -g -c
LDOPTS = -DF2C -g -o
F77    = f2c
CCF77  = cc

.f.o:
	$(F77) $(FCOPTS) $*.f
	$(CC) $(CCOPTS) $*.c
	\rm $*.c
Source Code
Module Specific Files
arraymenu.c
arraystat.c
blockmenu.c
blockio.c
blockplot.c
calc.c
contourmenu.c
contourio.c
contourplot.c
blanking.c
gradient.c
blanking.h
gradient.h
Source Code
gmt_contour.c
gmt.h
gmt_extra.c
plane.c
post.c
profile.c
plane.h
post.h
profile.h
editor.c
gridmenu.c
gridcalc.c
gridio.c
invdist_menu.c
kt3d.c
ktb3d.c
ktb3dm.c
ord_menu.o
trend_menu.c
grid.h
kt3d.h
histomenu.c
histoio.c
histoplot.c
lpr_ps.c
mainmenu.c
modmenu.c
basic.c
block.c
drain.c
evt.c
ghb.c
oc.c
pcg.c
recharge.c
river.c
sip.c
ssor.c
well.c
modplot.c
modutil.c
modview.c
modflow.h
mt3dmenu.c
basic.c
advection.c
dispersion.c
mixing.c
reaction.c
modplot.c
modutil.c
modview.c
mt3d.h

graphmenu.c
graphio.c
graphplot.c
regres.o

ps_merge.c

sisimmenu.c
sisimstat.c
sisimview.c
smain22.c		sisim3d main
ssim22.c		sisim3d simulation calculations
simio22.c		sisim3d I/O
shead.h

surfacemenu.c		surface main and menu control
surfaceio.c		surface I/O
surfaceplot.c		surface screen plot and printer control
gradient.c		calculates the cell gradients for surface
plane.c			enables mapping on the X-Y, X-Z, and Y-Z planes
gradient.h
plane.h

variomenu.c		vario main and menu control and I/O
variocalc.c		vario calculator
variocalc3d.c		vario 3D control
varioio.c		vario I/O
varioplot.c		vario screen plot and printer control
calc_grid_vario.c	vario 3D calculator
jack.c			vario jackknifing calculator
params.c		displays vario parameters on graph
softcalc.c		vario soft data calculator
var_statistic.c		vario calculator
vario.h
jack.h
params.h

variofitmenu.c		variofit main and menu control and I/O
variofitplot.c		variofit screen plot and printer control
fitmenu.c		variofit automatic solver
gam_series.c		variofit series display tools
vario.h
gam_series.h
jackfit.c
latin.c
params.c
jackfit.h
params.h
Shared Files
axis.c
files.c
fonts.c
graph_tools.c
inet.c
log.c
messages.c
mouse3d.c
net.c
normal.c
parse.c
pref.c
print.c
rotated.c
socket.c
sort.c
statistic.c
uncertmisc.c
widget_tools.c
xcolors.c
zoom.c
libXs.a
libhtmlw.a
dgeco.f
dgefa.f
dgesl.f
uncert.h
axis.h
files.h
fonts.h
graph_tools.h
messages.h
mouse3d.h
normal.h
parse.h
pref.h
print.h
rotated.h
socket.h
uncertmisc.h
widget_tools.h
xcolors.h
libXs.h
HTML.h &
HTMLP.h
: sample map
: indicator map, Hanford Reservation, Washington
well.bck
well.bblk
wtstr.striped.dat
contour
Standard data files:
conc3.srf
head.srf
dig.grd
conc.blk
conc.blk
building.blk
conc.lbl
conc.lbl
label.lbl
conc3.prf
mirror.srf
grid
Standard data files:
conc3.dat
water.dat
histo
Standard data files:
data1.dat
modmain
Example MODFLOW calibration problem
f93.prj
f93.bas
f93.bcf
f93.csh
f93.oc
f93.rch
f93.riv
f93.sip
f93cor.prj
f93cor.bcf
f93cor.csh
f93cor.rch
f93cor.riv
f93.blk
contour.prf
mt3dmain
Example MT3D calibration problem
js.prj
js.bas
js.bcf
js.csh
js.oc
js.pcg
js.well
jsmt.prj
jsmt.adv
jsmt.btn
jsmt.csh
jsmt.dsp
jsmt.ssm
plotgraph
Two-column X-Y data file:
test.dat
sisim
Standard data files:
three.prj
three.dat
three.geom
three.var
three.set
three.unc
example.aniso.4.lhc
computer.net
surface
Input data files:
map.srf
dig.grd
vario
Scattered point data files:
water.dat
well.dat
Gridded data files:
geo.iso.1.sim - geo.iso.10.sim
variofit
Input data files:
clark.gam
well.gam
well.jack.gam
Gridded input data files:
geo.iso.1.gam - geo.iso.10.gam
Output data files:
clark.out
well.aniso.2.lhc
Preference Files
APPENDIX C
For many projects, the same graph specifications are used repeatedly; rather than having the user
change the program parameters each time the program is run, a preference file can be generated to
set default values for most user-specified variables. This file can be created by selecting the
File:Save Preferences menu option (recommended method), or by using a template such as the file
shown below.
Note that the default preference file name is plotgraph.prf. When plotgraph is executed, it will
search the current directory for this file; if it exists, the specified defaults will be used unless they
are superseded by a command-line argument. If plotgraph.prf is not present, plotgraph's internal
defaults will be used.
The preference file is an ASCII file and is user editable. There are several rules to these files:
Y Column          =2
Font #0           =Helvetica-Bold
Font Size         =24
Font #1           =Helvetica-Bold
Font Size         =15
Font #2           =Helvetica-Bold
Font Size         =15
Font #3           =Helvetica
Font Size         =15
Font #4           =Helvetica
Font Size         =10
Font #5           =Helvetica
Font Size         =12
Axes Type         =0           ; 0 = Normal, 1 = Log X, 2 = Log Y, 3 = Log-Log, 4 = Probability
Scale Priority    =0           ; 0 = X/Y SCALE, 1 = Y EXAGERATION
X/Y Ratio         =1.500000
Y Exag Scale      =19.373931
X Main Tic Freq   =10.000000
Y Main Tic Freq   =0.250000
X Minor Tic Freq  =5
Y Minor Tic Freq  =5
X Tic Origin      =0.000000
Y Tic Origin      =0.000000
X Minimum         =0.000000
X Maximum         =80.000000
Y Minimum         =0.000000
Y Maximum         =3.250000
Use Mesh          =0           ; 0 = FALSE, 1 = TRUE
Dash Mesh         =0           ; 0 = FALSE, 1 = TRUE
X Mesh Freq       =8.000000
Y Mesh Freq       =0.275284
X Mesh Origin     =0.000000
Y Mesh Origin     =0.000000
X-Axis Format     =.0f
Y-Axis Format     =.2f
Main Title        =Semivariogram of Hypothetical Data Set
Secondary Title   =Example: UNCERT plotgraph module
X Label           =distance
Y Label           =gamma (h)
Refresh           =0           ; 0 = EXPOSURE, 1 = UPDATE
Error-Bar Color   =7           ; 0 = Black, 1 = White, 2 = Blue, 3 = Magenta, 4 = Green, 5 = Yellow, 6 = Cyan, 7 = Red
Mean Symbol       =1           ; -1 = No Symbol, 0 = Circle, 1 = Cross, 2 = Diamond, 3 = Square, 4 = X
Symbol Color      =7           ; 0 = Black, 1 = White, 2 = Blue, 3 = Magenta, 4 = Green, 5 = Yellow, 6 = Cyan, 7 = Red
Thickness         =1.000000
Mean Symbol Width =15.000000
Extent Cross Width=10.000000
Line Color        =5 2 4       ; 0 = Black, 1 = White, 2 = Blue, 3 = Magenta, 4 = Green, 5 = Yellow, 6 = Cyan, 7 = Red
Line Type         =1 -1 0      ; -1 = No Line, 0 = Solid Line, 1 = Dashed, 2 = Double Dashed
Line Thickness    =1 1 3
Symbol Type       =-1 0 -1     ; -1 = No Symbol, 0 = Circle, 1 = Cross, 2 = Diamond, 3 = Square, 4 = X
Symbol Size       =10 8 10
Symbol Filled     =1 1 1
!
! Print Parameters
!
Print File        =junk.ps
PS Extension      =*.ps
Print Queue       =ps
Print Header      =0           ; 0 = FALSE, 1 = TRUE
Print Destination =0           ; 0 = PRINTER, 1 = FILE
Print Orientation =0           ; 0 = PORTRAIT, 1 = LANDSCAPE
PS Output         =0           ; 0 = BLACK & WHITE, 1 = COLOR
Print Copies      =1
Top Margin        =1.500000    ; in = inches
Bottom Margin     =1.500000    ; in = inches
Left Margin       =1.500000    ; in = inches
Right Margin      =1.000000    ; in = inches
Regression Order  =5
Regression Line   =0           ; 0 = FALSE, 1 = TRUE
Regression Type   =1
X Resource Files
APPENDIX D
The X application resource file helps describe the look of the application. Much of the program's
appearance is defined within the software, but some options are left to the user. For plotgraph,
these are defined in the file:
$UNCERT/app-defaults/Graph
The X resource file describes such things as the size of the application window, the colors of
different portions of the application, and the fonts used for the text in the application. For more
information on these resources refer to the OSF Programmer's Guide (1991) and Programmer's
Reference (1991).
Graph*XmTextField.fontList:             -*-times-medium-r-*--14-*
Graph*XmScale.fontList:                 -*-times-bold-r-*--14-*
Graph*XmLabel.fontList:                 -*-times-bold-r-*--14-*
Graph*XmLabelGadget.fontList:           -*-times-bold-r-*--14-*
Graph*XmCascadeButton.fontList:         -*-times-bold-r-*--14-*
Graph*XmCascadeButtonGadget.fontList:   -*-times-bold-r-*--14-*
Graph*XmPushButton.fontList:            -*-times-bold-r-*--14-*
Graph*XmPushButtonGadget.fontList:      -*-times-bold-r-*--14-*
Graph*XmToggleButton.fontList:          -*-times-bold-r-*--14-*
Graph*XmToggleButtonGadget.fontList:    -*-times-medium-r-*--14-*
!
Graph*topShadowColor: white
Graph*borderColor:    wheat3
!
Graph*foreground: black
!
Graph*Help*foreground: red
!
Graph*View Port*foreground: black
Graph*View Port*background: black
Help Files
APPENDIX E
Help files are user editable (assuming security write privileges are set appropriately) ASCII text
files which give brief descriptions about the use of each program (The example here is for
plotgraph). Appropriate sections of this file are displayed in the pop-up dialog when a particular
topic is requested. The help file plotgraph is located at:
$UNCERT/help/plotgraph.help
If the user (or administrator) wishes to modify the help file there are three basic components to
utilize. There is a separator comment line between topics (not required), the topic identifier (these
are hard coded and cannot be edited, deleted, or added to without modifying the source code), and
the topic help text.
A comment line starts with a ! in the first column of the line. In the file, comment lines can be
placed anywhere. Note though, that they will not be displayed in the help pop-up dialog.
Topic identifiers have :: in the first two columns of the line and are immediately followed by the
topic identifier. These lines cannot be modified or deleted, and adding new identifiers will do
nothing unless the source code is modified and recompiled.
The topic help text is any text between one topic identifier and the next topic identifier (commented
lines are ignored).
NOTE: Currently the help text for a topic can be composed of a maximum of 200 lines, and
each line cannot be more than 1024 characters.
Below is an abbreviated copy of plotgraph's help file.
PLOTGRAPH and menu.
By:
Bill Wingle
Date: December 15, 1991
1).
2).
3).
-esp
-exceed
APPENDIX F
Bibliography
Adobe Systems, Inc., 1990, Postscript Language Reference Manual, Second Edition, Addison-Wesley,
Reading, MA.
Adobe Systems, Inc. and G.C. Reid, 1988, Postscript Language Program Design, Addison-Wesley,
Reading, MA.
Alabert, F., 1987, Stochastic Imaging of Spatial Distributions Using Hard and Soft Information,
M.S. Thesis, Stanford University, Stanford, California.
Burden, R.L. and J.D. Faires, 1985, Numerical Analysis, Third Edition, Prindle, Weber, and
Schmidt, Boston, pp. 342-353.
Burrough, P.A., 1986, Principles of Geographical Information Systems for Land Resource
Assessment, Monographs on Soil and Resources Survey No. 12, Oxford Science Publications,
Clarendon Press, Oxford.
Clark, I., 1979, Practical Geostatistics, Elsevier Applied Science, London and New York.
Cohen, J.K. and J.W. Stockwell, 1994, The SU User's Manual, Center for Wave Phenomena,
Colorado School of Mines, Golden, Colorado.
Curry, D.A., 1988, Using C on the UNIX System, O'Reilly and Associates, Inc., Sebastopol,
California.
Davis, B.M., 1987, Uses and Abuses of Cross-Validation in Geostatistics, Mathematical Geology,
Vol. 19, No. 3, pp. 241-248.
Davis, J.C., 1986, Statistics and Data Analysis in Geology, Second Edition, John Wiley & Sons,
New York.
Deutsch, C.V. and A.G. Journel, 1992, GSLIB: Geostatistical Software Library and User's Guide,
Oxford University Press, New York.
Englund, E. and A. Sparks, 1988, GEO-EAS, U.S. Environmental Protection Agency,
Environmental Monitoring Systems Laboratory, EPA/600/4-88/033.
Foley, J.D., A. Van Dam, S.K. Feiner, and J.F. Hughes, 1990, Computer Graphics, Principles and
Practice, Addison-Wesley, Reading, Massachusetts.
Gilly, D. and T. O'Reilly, 1990, The X Window System in a Nutshell, O'Reilly and Associates,
Sebastopol, CA.
Gómez-Hernández, J.J. and R.M. Srivastava, 1990, ISIM3D: An ANSI-C Three Dimensional
Multiple Indicator Conditional Simulation Program, Computers & Geosciences, Vol. 16, No. 4,
pp. 395-440.
Gupta, S.K., C.R. Cole, C.T. Kincaid, and A.M. Monti, 1987, Coupled Fluid, Energy, Solute
Transport (CFEST) Model: Formulation and User's Manual, Office of Nuclear Waste Isolation,
Battelle Memorial Institute, Columbus, Ohio.
Hale, D. and C. Artley, 1991, pscontour (simple Postscript contouring program), Center for Wave
Phenomena, Department of Geophysics, Colorado School of Mines.
Heller, D., 1991, Motif Programming Manual (Volume 6), O'Reilly & Associates, Inc., Sebastopol,
California.
Isaaks, E. and R.M. Srivastava, 1988, Spatial Continuity Measures for Probabilistic and
Deterministic Geostatistics, Mathematical Geology, Vol. 20, No. 4, pp. 313-341.
Journel, A., 1986, Constrained Interpolation and Qualitative Information, Mathematical Geology,
Vol. 18, No. 3, pp. 269-286.
Journel, A.G. and Ch. J. Huijbregts, 1978, Mining Geostatistics, Academic Press, London.
Kirk, K.G., 1991, Residual Analysis for Evaluating the Robustness of Inverse Distance, Kriging,
and Minimum Tension Gridding Algorithms, GeoTech/GeoChautauqua 91 Conference
Proceedings, Lakewood, Colorado, pg. 15 (presentation).
Lillesand, T.M. and R.W. Kiefer, 1987, Remote Sensing and Image Processing, Second Edition,
John Wiley and Sons, New York.
McCuen, R.H., 1989, Hydrologic Analysis and Design, Prentice-Hall, Englewood Cliffs, New
Jersey.
McDonald, M.G. and A.W. Harbaugh, 1984, A Modular Three-Dimensional Finite-Difference
Ground-Water Flow Model, U.S. Geological Survey OFR 83-875.
McKay, M.D., R.J. Beckman, and W.J. Conover, 1979, A Comparison of Three Methods for
Selecting Values of Input Variables in the Analysis of Output From a Computer Code,
Technometrics, Vol. 21, No. 2, pp. 239-245.
McKenna, S.A., 1994, Utilization of Soft Data for Uncertainty Reduction in Groundwater Flow and
Transport Modeling, Ph.D. Dissertation, Colorado School of Mines.
Moler, C., 1978, LINPACK (Linear algebra FORTRAN77 sub-routines), University of New
Mexico, Argonne National Lab.
Nye, A., 1989, Xlib Programming Manual for Version 11 (Volume 1), O'Reilly and Associates,
Sebastopol, CA.
Nye, A., 1989, Xlib Reference Manual for Version 11 (Volume 2), O'Reilly and Associates,
Sebastopol, CA.
Nye, A. and T. O'Reilly, 1990, X Toolkit Intrinsics Programming Manual for X Version 11 (Volume
4), O'Reilly and Associates, Sebastopol, CA.
Open Software Foundation, Inc., 1991, Programmer's Guide: Release 1.1, Prentice-Hall,
Englewood Cliffs, NJ.
Open Software Foundation, Inc., 1991, Programmer's Reference: Release 1.1, Prentice-Hall,
Englewood Cliffs, NJ.
O'Reilly, T., 1990, X Toolkit Intrinsics Reference Manual for X Version 11 (Volume 5), O'Reilly
and Associates, Sebastopol, CA.
Press, W.H., S.A. Teukolsky, W.T. Vetterling, and B.P. Flannery, 1992, Numerical Recipes in C, The
Art of Scientific Computing, Second Edition, Cambridge University Press, New York.
Quercia, V. and T. O'Reilly, 1990, X Window System User's Guide for Version 11 (Volume 3),
O'Reilly and Associates, Sebastopol, CA.
Richardson, A., 1993, Xvertext Version 3.0, e-mail mppa3@uk.ac.sussex.syma.
Shafer, J.M. and M.D. Varljen, 1990, Approximation of Confidence Limits on Sample
Semivariograms From Single Realizations of Spatially Correlated Random Fields, Water
Resources Research, Vol. 26, No. 8, pp. 1787-1802.
Sutanthavibul, S., 1995, FIG: Facility for Interactive Generation of Figures (Software library),
M.I.T.
Wessel, P. and W.H.F. Smith, 1991, The GMT SYSTEM, The School of Ocean and Earth Sciences
and Technology, University of Hawaii.
Wingle, W.L., 1992, Examining Common Problems Associated with Various Contouring Methods,
Particularly Inverse-Distance Methods, Using Shaded Relief Surfaces, Geotech 92 Conference
Proceedings, 1992, Lakewood, Colorado, pp. 362-376.
Wingle, W.L. and E.P. Poeter, 1993, Evaluating Uncertainty Associated with Semivariograms
Applied to Site Characterization, Ground Water, Vol. 31, No. 5, pp. 725-734.
Young, D.A., 1985, The X Window System: Programming and Applications with Xt, OSF/Motif
Edition, Prentice-Hall, Englewood Cliffs, NJ.
Zheng, C., 1990, MT3D, A Modular Three-Dimensional Transport Model, United States
Environmental Protection Agency and S.S. Papadopulos and Associates, Inc., Rockville, Maryland.
Zheng, C., 1991, MT3D, A Modular Three-Dimensional Transport Model, S.S. Papadopulos and
Associates, Inc., Rockville, Maryland.
Software Contributions to
UNCERT
APPENDIX G
The following programs or libraries are included in the UNCERT software. These are Copyrighted
materials, but by condition of their Copyright notices, the following statements meet the conditions
of their Copyright.
This library is used to rotate X-windows text (rotated.c and rotated.h).
xvertext 3.0, Copyright (c) 1993 Alan Richardson (mppa3@uk.ac.sussex.syma)
Permission to use, copy, modify, and distribute this software and its documentation for any
purpose and without fee is hereby granted, provided that the above copyright notice appear
in all copies and that both the copyright notice and this permission notice appear in
supporting documentation. All work developed as a consequence of the use of this
program should duly acknowledge such use. No representations are made about the
suitability of this software for any purpose. It is provided "as is" without express or
implied warranty.
This modified library is used to select X-windows fonts (fonts.c and fonts.h).
FIG : Facility for Interactive Generation of figures
Copyright (c) 1985 by Supoj Sutanthavibul
Permission to use, copy, modify, distribute, and sell this software and its documentation
for any purpose is hereby granted without fee, provided that the above copyright notice
appear in all copies and that both that copyright notice and this permission notice appear
in supporting documentation, and that the name of M.I.T. not be used in advertising or
publicity pertaining to distribution of the software without specific, written prior
permission. M.I.T. makes no representations about the suitability of this software for any
purpose. It is provided as is without express or implied warranty.
This modified program is the program editor, described in Chapter 18.
Written by Dan Heller. Copyright 1991, O'Reilly & Associates.
This program is freely distributable without licensing fees and is provided without
guarantee or warrantee expressed or implied. This program is -not- in the public domain.
This library has not yet been, but soon will be, incorporated into the on-line help.
NCSA Mosaic for the X Window System
Software Development Group
National Center for Supercomputing Applications
University of Illinois at Urbana-Champaign
605 E. Springfield, Champaign IL 61820
mosaic@ncsa.uiuc.edu
Copyright (C) 1993, Board of Trustees of the University of Illinois
NCSA Mosaic software, both binary and source (hereafter, Software) is copyrighted by
The Board of Trustees of the University of Illinois (UI), and ownership remains with the
UI.
The UI grants you (hereafter, Licensee) a license to use the Software for academic,
research and internal business purposes only, without a fee. Licensee may distribute the
binary and source code (if released) to third parties provided that the copyright notice and
this statement appears on all copies and that no charge is associated with such copies.
Licensee may make derivative works. However, if Licensee distributes any derivative
work based on or derived from the Software, then Licensee will (1) notify NCSA
regarding its distribution of the derivative work, and (2) clearly notify users that such
derivative work is a modified version and not the original NCSA Mosaic distributed by the
UI (This version has been modified to eliminate WARNING messages during compilation.
These were strictly variable casting changes, and no substantive changes were made to the
code itself).
Any Licensee wishing to make commercial use of the Software should contact the UI, c/o
NCSA, to negotiate an appropriate license for such commercial use. Commercial use
includes (1) integration of all or part of the source code into a product for sale or license
by or on behalf of Licensee to third parties, or (2) distribution of the binary code or source
code to third parties that need it to utilize a commercial product sold or licensed by or on
behalf of Licensee.
UI MAKES NO REPRESENTATIONS ABOUT THE SUITABILITY OF THIS
SOFTWARE FOR ANY PURPOSE. IT IS PROVIDED "AS IS" WITHOUT EXPRESS
OR IMPLIED WARRANTY. THE UI SHALL NOT BE LIABLE FOR ANY
DAMAGES SUFFERED BY THE USERS OF THIS SOFTWARE.
By using or copying this Software, Licensee agrees to abide by the copyright law and all
other applicable laws of the U.S. including, but not limited to, export control laws, and the
terms of this license. UI shall have the right to terminate this license immediately by
written notice upon Licensee's breach of, or non-compliance with, any of its terms.
Licensee may be held legally responsible for any copyright infringement that is caused or
encouraged by Licensee's failure to abide by the terms of this license.