– DEMs
– Orthophotos
– Aerial imagery
– Vegetation surveys
– Soils and hydrology
– Disturbance information (fires, roads, etc.)
Step 2. Reconnoiter the area; sketch major features and plan measurement locations so that a statistically significant data set can be acquired in the time allotted.
2. Ground Reference Data (GRD)
GRD is necessary to train and validate photo-interpreted and remotely sensed products.
Accuracy
• How accurate is my map?
Efficiency
• Value to the analyst vs. feasibility in the field.
3. Preflight Fieldwork
a. Locate places to collect GCPs (pixel location depends on the accuracy of your GPS):
– Visible in imagery
– Buildings
– Intersections
– Structures
b. Locate image calibration targets:
– Large, homogeneous areas
– Dark and light targets
– Located throughout the study area
4. In Situ Measurements in Support of Airborne or Satellite Data
Remotely sensed data must usually be calibrated:
• Corrected geometrically (x, y, z) and radiometrically (e.g., to % reflectance); this allows comparison of remotely sensed data from different dates and supports quantitative analysis.
5. Spectral Measurements
In situ spectrometer measurements are calibrated against a "white" reference standard: a NIST (formerly NBS) standard Spectralon™ panel is used to calibrate images to reflectance over the 400–2500 nm range.
Transects across a parking lot are measured as invariant targets, used for second-stage RT calibration improvement.
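The white-reference calibration described above amounts to a simple ratio. A minimal sketch, assuming hypothetical digital-number (DN) readings and a ~99%-reflective Spectralon panel (four channels shown; real spectrometers record hundreds):

```python
import numpy as np

# Hypothetical raw spectrometer readings (digital numbers) for illustration only
target_dn = np.array([120.0, 340.0, 560.0, 610.0])   # ground target
panel_dn = np.array([800.0, 820.0, 850.0, 860.0])    # Spectralon white reference
panel_reflectance = 0.99   # assumed near-Lambertian panel reflectance

# Reflectance factor: target-to-reference ratio scaled by the panel's reflectance
reflectance = target_dn / panel_dn * panel_reflectance
```

Because the panel is measured under the same illumination as the target, the unknown solar/atmospheric terms cancel in the ratio.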
5.a. Spectralon: Approved NIST Standard
[Figure: measured field data — digitized numbers (DN) vs. channel number, channels 0–224]
[Figure: instrument calibration to radiance — calibrated radiance (µW/cm²/nm/sr) vs. wavelength, 400–2500 nm]
[Figure: reflectance (0.00–0.60) vs. wavelength, 400–2500 nm]
[Figure: LiCOR-PCA instrument]
12. Leaf or shoot water content = (fresh weight − dry weight) / dry weight
Species measured: Rhododendron, Quercus, Molinia, Pinus.
Evolution of (a) EWT (equivalent water thickness) and dry matter during 1997 and (b) FMC (fuel moisture content) for the same period (Ceccato et al. 2001, Remote Sensing of Environment).
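The gravimetric water-content formula above is a one-liner; the sample weights below are illustrative, not measured values:

```python
def water_content(fresh_weight_g: float, dry_weight_g: float) -> float:
    """Leaf or shoot water content relative to dry mass: (fresh - dry) / dry."""
    return (fresh_weight_g - dry_weight_g) / dry_weight_g

# e.g., a shoot weighing 1.8 g fresh and 1.0 g after oven-drying
print(water_content(1.8, 1.0))  # -> 0.8 (water mass is 80% of dry mass)
```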
13. Field Spectrometry
• Quantitative measurement of radiance, irradiance, or reflectance in the field.
Phragmites australis
14. Capture spatial/condition and spectral variation within each type, such as "grass".
14.a. Collect enough measurements to understand the variability: mean ± standard deviation of all dense-grass measurements.
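The mean ± one-standard-deviation summary above can be sketched as follows, using synthetic stand-in spectra (a real campaign would load spectrometer files; the band count and values here are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for 25 dense-grass field spectra, 2151 bands (350-2500 nm at 1 nm)
spectra = np.clip(0.30 + 0.05 * rng.standard_normal((25, 2151)), 0.0, 1.0)

mean_spec = spectra.mean(axis=0)   # mean spectrum across measurements
std_spec = spectra.std(axis=0)     # per-band spread

# The +/- 1 sigma envelope typically plotted around the mean spectrum
lower, upper = mean_spec - std_spec, mean_spec + std_spec
```

If the envelope is wide, you have not yet sampled the type adequately.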
14.c. Woody material and dry plant litter are often confused with bare soil; capture the variability in your samples.
14.d. Mean ± standard deviations for woody debris and dry plant litter.
14.e. "Bare soil" may be deceiving; be sure to collect the features you identify. Are these bare soils? What does the shadow tell you about the quality of this sample?
14.f. Bare soil spectra from a boreal forest in Sweden
14.g. Field spectra can help you classify and understand your image data.
[Figure: example cover fractions annotated on the image — 55% grass / 30% trees / 5% people; 100% grass; 10% sand]
16. Sample Units
Points
+ Higher spatial accuracy.
+ More efficient to sample.
– Difficult to locate individual pixels in the image.
Polygons
+ May be readily identifiable in the image.
– Increased variability: difficult to characterize with a single value, which may be inaccurate for individual pixels (cf. loss of spatial information from too-large sample units).
– Requires deciding how to delineate the polygon.
16.a. Sample Units – points
[Figure: 12 tree points and 17 non-tree points marked on the image]
16.b. Sample Units – polygons
[Figure: polygons labeled 75% trees, 0% trees, 100% trees]
16.c. Sample Units – polygons
[Figure: polygons labeled 100% trees, 75% trees, 80% trees, 85% trees, 0% trees]
17. Sampling Designs – Systematic
• Costly
• May require sampling inaccessible areas
• Unbiased
• May miss rare classes
17.b. Sampling Designs – Random
• Costly
• May require sampling inaccessible areas
• Unbiased
• May miss rare classes
17.c. Sampling Designs – Stratified Random
• Costly
• May require sampling inaccessible areas
• Unbiased
• Includes rare classes
• Requires a completed class map
17.d. Sampling Designs – Clustered Random
• Efficient
• Must be designed well to avoid bias, missing rare classes, etc.
17.e. Sampling Designs – Drive-by
• Efficient
• Requires effective coverage by the road network
• May be substantially biased
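The trade-off between simple random and stratified random sampling can be seen on a toy class map (the map, class labels, and sample sizes below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 100x100 class map: 0 = grass (common), 1 = shrub, 2 = wetland (rare)
class_map = rng.choice([0, 1, 2], size=(100, 100), p=[0.70, 0.28, 0.02])

# Simple random sampling: unbiased, but 30 points may miss the rare class
rr = rng.integers(0, 100, size=30)
cc = rng.integers(0, 100, size=30)
random_labels = class_map[rr, cc]

# Stratified random sampling: a fixed number of points drawn within each class,
# guaranteeing rare classes are represented (requires a completed class map)
def stratified_sample(cmap, n_per_class, rng):
    samples = {}
    for c in np.unique(cmap):
        rows, cols = np.nonzero(cmap == c)
        idx = rng.choice(len(rows), size=min(n_per_class, len(rows)), replace=False)
        samples[int(c)] = list(zip(rows[idx].tolist(), cols[idx].tolist()))
    return samples

stratified = stratified_sample(class_map, 10, rng)
```

Note that the stratification itself is only as good as the class map it is drawn from.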
18. Other Sources of Reference Data
When you can't go to the field…
• Existing land cover, geologic, or soil maps
• Topographic maps
• Plant surveys
• High-resolution photographs
• Historical/published data for the site
Inspect the quality of the data closely:
• How were they collected? At what scale?
• Do they have relevant thematic information?
• Are they well distributed throughout the image?
• Are they well-timed with the image?
• What is the associated error?
19. Validation of Image Interpretation – Continuous Data
Image products with a continuous range of values include:
• Temperature
• Turbidity
• % cover
Compare with GRD by regressing the field-estimated value for all test points against the image-derived values: compare pixel values against independently collected field data and assess accuracy with R² or RMSE statistics.
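A minimal sketch of that regression check, using made-up paired % cover values (a real validation would use your GRD test points):

```python
import numpy as np

# Hypothetical paired test points: field-estimated vs. image-derived % cover
field = np.array([5.0, 12.0, 20.0, 35.0, 50.0, 60.0, 75.0])
image = np.array([8.0, 10.0, 25.0, 30.0, 55.0, 58.0, 80.0])

# Least-squares line relating the two, plus the usual accuracy statistics
slope, intercept = np.polyfit(image, field, 1)
predicted = intercept + slope * image

ss_res = float(((field - predicted) ** 2).sum())
ss_tot = float(((field - field.mean()) ** 2).sum())
r2 = 1.0 - ss_res / ss_tot                              # coefficient of determination
rmse = float(np.sqrt(((field - predicted) ** 2).mean()))  # root-mean-square error
```

R² tells you how much of the field-measured variation the image explains; RMSE gives the typical error in the units of the variable itself.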
19.a. Validation – Continuous Data (example: % cover of a noxious weed)
[Figure: scatterplots of field-measured vs. image-derived % cover with fitted regressions:
y = −0.0016 + 0.0066x, R² = 0.663;
y = 0.06 + 0.0048x, R² = 0.368;
y = −0.31 + 0.0105x, R² = 0.554;
y = −0.0606 + 0.0136x, R² = 0.184]
20. Validation – Categorical Data: the Error (Confusion) Matrix
Rows: classified pixels; columns: ground truth (pixels).

Class         Tamarix  Creosote  Palo verde  Ironwood  Total
Unclassified       28        81          95        68    272
Tamarix           268         1          25         0    294
Creosote           11        14           9         1     35
Palo verde          4         2          82         0     88
Ironwood           18         0          27        30     75
Total             329        98         238        99    765
Overall accuracy presents high-biased accuracies because it does not correct for chance agreement. The kappa coefficient takes this into account:

κ = [N Σᵢ xᵢᵢ − Σᵢ (rowᵢ total)(colᵢ total)] / [N² − Σᵢ (rowᵢ total)(colᵢ total)]
  = (765×394 − [294×329 + 35×98 + 88×238 + 75×99]) / (765² − [294×329 + 35×98 + 88×238 + 75×99]) = 0.38

where Σᵢ xᵢᵢ = 268 + 14 + 82 + 30 = 394 is the sum of the diagonal (correctly classified) cells.

κ > 0.85: excellent agreement; 0.70 < κ ≤ 0.85: very good agreement;
0.55 < κ ≤ 0.70: good agreement; 0.40 < κ ≤ 0.55: fair agreement…
21. Class Conditional Kappa
If you're most interested in a single class, lump all the others together into an "other" category and calculate the class-conditional kappa on the resulting 2×2 table (e.g., for Tamarix).
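Following that lumping procedure, a sketch of the class-conditional kappa for Tamarix, collapsing the error matrix above to a 2×2 Tamarix-vs-other table (the Unclassified row is folded into "other", consistent with the overall-kappa arithmetic):

```python
import numpy as np

# Same error matrix as above: rows = classified, cols = ground truth (pixels)
matrix = np.array([
    [ 28, 81, 95, 68],   # Unclassified
    [268,  1, 25,  0],   # Tamarix
    [ 11, 14,  9,  1],   # Creosote
    [  4,  2, 82,  0],   # Palo verde
    [ 18,  0, 27, 30],   # Ironwood
])

def class_conditional_kappa(m, row, col):
    """Kappa on the 2x2 table formed by one class vs. everything else."""
    N = m.sum()
    a = m[row, col]                 # class labeled correctly
    row_tot = m[row].sum()          # all pixels classified as the class
    col_tot = m[:, col].sum()       # all ground-truth pixels of the class
    d = N - row_tot - col_tot + a   # 'other' labeled correctly
    chance = row_tot * col_tot + (N - row_tot) * (N - col_tot)
    return (N * (a + d) - chance) / (N**2 - chance)

# Tamarix: classified row 1, ground-truth column 0 in this layout
k_tamarix = class_conditional_kappa(matrix, 1, 0)
print(round(float(k_tamarix), 2))  # -> 0.76
```

Tamarix alone scores far better than the overall κ of 0.38, which the full matrix drags down through the weaker classes.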