
Neural Network Technology in Mineral Exploration

Ross Corben
Collaroy Computing
P.O. Box 387
Collaroy N.S.W. 2097
Australia
Tel: +61 2 9972-4429
Fax: +61 2 9972-4436
Email: ross@collaroy.com
Web Page: www.collaroy.com
ABSTRACT
Exploration data analysis plays a vital role in the acquisition, mapping, targeting,
prioritisation and management of mineral prospects. Given the high stakes and
intense competition within all areas of the global mining industry, informed
business decisions on the acquisition, exploration and exploitation of
prospective ground are more important than ever. The ability to make efficient
use of the multi-component exploration data that can be acquired at ever
increasing rates is fundamental to implementing successful management
strategies. Neural computers are good at analysing large amounts of data:
they can identify relationships, recognise patterns, make associations,
highlight anomalies, and make predictions automatically. Applied to the various
forms of exploration data, they provide a powerful analysis tool that is orders
of magnitude more time efficient than current, more manual tools.
This article presents the results of a study conducted using Prospect Explorer, a
neural software tool from Neural Mining Solutions. This exploration tool takes
an alternative approach to data analysis and visualisation, automating the
detection and prioritisation of anomalies and the relationships between their
components. Results from the project area around the Selwyn Cu-Au Mine in
central Queensland are given which clearly demonstrate the ability of neural
networks to quickly, effectively and consistently locate mineral deposits.

INTRODUCTION
Current methods of data processing, analysis and interpretation of exploration
data rely heavily on the ability of experts and are very time intensive. The two
main reasons for this are the inability to simultaneously display all the layers of
data for interpretation, and the time taken to analyse the results.
Image processing specialists are continuously developing more advanced
computing methods for preparing data for the specialist interpreter. These
processed data sets are displayed for interpretation by devices that have limited
simultaneous input capabilities. Consequently, the number of layers of
information (output) is restricted by the capabilities of the display device.


Complex algorithms compiled by experts have to be used to allow meaningful
information derived from multi-layered data sets to be displayed simultaneously.
These algorithms are naturally limited in their effectiveness by the talents of the
image processing experts.
Similarly, the interpretation of these data sets is limited by the expertise and
time resources of the interpreter. Conventional procedures involve making
numerous overlays, each depicting limited sets of data at the same scale and
projection, individually placing them on a light table and tracing patterns of
significant relative difference, and then integrating the interpreted results from
each overlay with those of all of the others. This process is cumbersome,
inconsistent, subjective and very time consuming.
NEURAL COMPUTING
Neural computing is a technology based on the processes of the biological brain
and has many human-like qualities (Kohonen, 1989). Since a neural computer
learns from data, it does not need to be programmed with fixed rules or
equations. It provides a radically different way of producing rapid solutions to
complex problems. It has the ability to turn data into internally held
relationships which can then be analysed and viewed. The mining industry
frequently faces problems characterised by uncertainties brought about by
chemical, physical and biological phenomena. Until recently, using computer
technology to help the human geologist has been difficult, lagging behind most
other process industries.
Conventional computing approaches are effective when the nature of the
problem and the steps that lead to its solution are well known and can be
explicitly described. However, it is not always possible to describe the solution
to a problem and all the possible forms that the inputs into that problem can
take. Geoscientists have often attempted to define mineral deposits by their
common attributes as a means of more easily locating prospective areas for
exploration. With the increasing dependence upon numerically defined
exploration data sets (magnetics, gravity, EM, geochemistry), attempts have
been made over the past three decades to define mineralisation in a more
quantitative sense. Methods such as expert systems, artificial intelligence and,
more recently, GIS analytical techniques attempt to define a single
mineralisation style according to a prescribed common set of rules. The
assumption here is that, given a large enough example dataset, the geoscientist
can then set expected numerical values for each data type (such as magnetic
response, soil geochemistry, etc.) that define mathematically that style of
mineralisation. Such systems have been proven to work very successfully in
areas where the local conditions are reasonably consistent. The drawbacks are
three-fold: first, they tend to perform poorly globally, i.e. rules that define
diamondiferous pipes in a Cainozoic volcanic belt in the tropics may not define
the same mineralisation style in Palaeozoic rocks in a temperate glacial terrain;
secondly, only one, very precise, definition has been input to the computer,
so other styles of mineralisation present in the exploration area go undetected;
thirdly, the methods are often labour intensive and therefore tend to be
somewhat impractical exploration techniques.

The neural approach solves problems in a uniquely different way to that of
conventional computing. Neural computers learn the key relationships in the
numerical data (rather than being given precise rules) and then generalise
from those relationships, building their own experience-based set of rules that
define not just one unique solution but rather multiple solutions derived from the
data. These can then automatically produce predictions or estimates based
upon their experience.
The ability of the neural computer to learn from experience, rather than having
to be explicitly programmed, means it possesses capabilities beyond those of
conventional computing, augmenting and enhancing existing manpower and
systems (Wasserman, 1993; Rumelhart and McClelland, 1986).
Two main classes of neural learning exist (Lawrence, 1993):
1. Supervised (or trained) - a neural model learns to predict a specified
pattern based on a selected group of input data (e.g. copper soil
concentrations can be predicted and mapped based on the learnt
relationships with other factors such as gravity, EM responses, etc.)
2. Unsupervised (untrained) - a neural model is only given input data and,
by learning from all the data (not just a pre-selected training set), forms a
structure defining the data's inter-relationships; this structure can then be
analysed to find common (clusters) and uncommon (anomalous)
patterns.
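The supervised case can be illustrated with a toy example: fit a model on sampled points where both the input layer and the target are known, then predict the target at unsampled points. All values below are invented, and a plain least-squares line stands in for a neural model; only the workflow matches the description above.

```python
# Toy illustration of supervised learning on exploration layers. All data
# are hypothetical, and ordinary least squares stands in for a neural model.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical training data: gravity response vs. measured Cu soil (ppm).
gravity = [0.5, 1.0, 1.5, 2.0, 2.5]
copper = [12.0, 21.0, 29.0, 41.0, 50.0]

a, b = fit_linear(gravity, copper)

def predict(x):
    return a * x + b

print(round(predict(3.0), 1))  # Cu estimate at an unsampled point: 59.4
```

A trained neural model would replace the linear fit, but the principle is unchanged: the learnt relationship between layers is used to map the predicted quantity across the whole grid.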
Another attribute of neural computing is its ability to generalise. Any analysis
tool that uses data is at risk of being too specific to that data. If so, then the
predictions made using that data will not be accurate for new examples. The
analysis will not generalise to new examples. Without the ability to generalise,
the neural approach is prone to the same problem conventional approaches
have of not being very useful at finding known patterns globally.

Neural computers, with their ability to learn from experience, are iteratively able
to improve their performance and to adapt their behaviour to new and changing
environments. They also tend to be more robust than their conventional
counterparts. They have the ability to cope well with incomplete (or "fuzzy")
data, handle noise in data well, and can deal with previously unspecified or
unencountered situations.
The ability to generalise means that once a model has been trained, it can
make predictions for previously unseen examples. Moreover, these new
examples need not precisely match an example it has seen before, making the
approach more flexible than a strictly rules-based system. This is achieved by
combining information from several similar examples and forming many broad,
general relationships. Although each relationship is simple (and so generalises
well), the total behaviour of the model is complex.
NEURAL NETWORKS IN MINERAL EXPLORATION
The fact that neural networks are good at handling large amounts of diverse
types of data, and at recognising patterns (both common and anomalous) and
relationships in those datasets, makes them ideally suited as tools for the
exploration industry. Neural networks are not new to the mining industry: they
are currently used in a task-oriented manner in processing plants, for
pre-processing of some geophysical data, and as basic classification tools. Until
now, the majority of their uses have been in a strongly supervised environment
and with highly variable success.
Neural Mining Solutions (NMS), an Australian-based company owned by Straits
Resources, has developed a user-friendly software package called Prospect
Explorer that utilises a combination of unsupervised and supervised neural
methods to aid in the detection and definition of mineralisation. The principle by
which it works is that, for any given data set, the majority of the data occurs in
common populations. These populations, although often not definable
statistically, are identifiable in the data distribution. When a number of data sets
are combined, all the variables can be analysed for the common populations.
Those points, or groups of points, that lie outside these common populations
are, by definition, anomalous.
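The principle can be sketched in a few lines: group the grid points into common populations, then score each point by its distance to the nearest population centre, so points far from every common population stand out as anomalous. The data and the small k-means routine below are purely illustrative, not the proprietary method.

```python
# Illustrative sketch of the common-populations principle: cluster the
# layer vectors, then score each point by distance to the nearest cluster.
import math

def kmeans(points, k, iters=20):
    """Very small k-means: returns k centroids (naive initialisation)."""
    centroids = points[:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            groups[i].append(p)
        centroids = [tuple(sum(v) / len(g) for v in zip(*g)) if g
                     else centroids[i] for i, g in enumerate(groups)]
    return centroids

def anomaly_score(p, centroids):
    """Distance to the nearest common population."""
    return min(math.dist(p, c) for c in centroids)

# Each grid point is a vector of co-located layer values (toy numbers),
# e.g. (magnetic response, potassium count).
grid = [(1.0, 1.1), (0.9, 1.0), (1.1, 0.9),   # common population A
        (5.0, 5.2), (5.1, 4.9), (4.9, 5.0),   # common population B
        (9.0, 0.5)]                           # lone unusual point
cents = kmeans(grid, 2)
scores = [anomaly_score(p, cents) for p in grid]
print(max(scores) == scores[-1])  # the lone point scores highest: True
```

Points inside either common population receive low scores; the final point, which belongs to no population, receives by far the highest score and would be flagged as an anomaly.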
Some of the key features of Prospect Explorer include:
- Integration of different types of multi-layered data sets, e.g. geophysical-geochemical-geological-topographical-satellite.
- Ability to search on known mineral occurrences for lookalike signatures.
- Search for mineral deposit types based on distinguishing characteristics, e.g. porphyry search: high K, Cu; low Ni, Cr; magnetic low; resistivity high; etc.
- Exclude unwanted zones from search, e.g. elevated Ni values to exclude ultramafics.
- Ability to identify any point and find the reasons for the response.
- Automatic lithological mapping through its ability to discriminate different patterns (clusters) in the data.
- Automatically identify and prioritise anomalies in minutes from large multi-layered data sets.
- Ability to recognise complex relationships and patterns in large multi-layered data sets which are used to define areas of anomalism, rather than just identifying outliers.
- Quickly identify highly complex relationships (correlation analysis) between many components from multiple survey datasets.
- Distinguish sub-units within mapped sequences.
- Analyse variables that are contributing to the mineral occurrences.
- Comprehensive coverage of all survey data; does not miss potentially important sites.
- Consistent, prioritised, non-subjective and accurate results.
The product is now being used by a number of mining companies to
successfully help them in their exploration efforts. The key benefits for these
companies lie in the following areas:
- Helping geologists in their exploration data management
- Increasing the usefulness of exploration data by rapid and accurate location of prioritised targets
- Reducing costs by decreasing exploration expenditure prior to discovery by drilling
- Improving quality and consistency and minimising subjectivity

These factors all strongly support the use of neural computing within the mining
industry, especially for exploration. The large quantities of data with very
complex inter-relationships, the lack of well-understood rules linking mineral
deposition to survey results, and the resources required to analyse data have
resulted in many occurrences of failure to discover ore bodies. It is an
unfortunately regular event that a region will be analysed many times by several
exploration companies before a mine is found.
The capabilities of neural computers to handle large, complex data sets and
provide highly applicable information for geologists means they are ideal as an
exploration tool.

EXAMPLE: Selwyn Cu-Au Deposit, Australia
NMS was contracted by Selwyn Mines Ltd (SML) to conduct a neural analysis
around its copper-gold mine in central Queensland. SML acquired the Selwyn
Project in January 2000 and commenced mining at Mt. Elliott in March 2000.
The Selwyn tenements cover an area of over 1,500 sq km with exploration
expenditure totalling over $45 m in the last 15 years. With this enormous
amount of information to sort through, SML required a tool that could efficiently
organise the data and then use it to thoroughly assess the entire tenement
holding in order to identify and prioritise potential exploration targets.

Prospect Explorer provides a number of neural analysis tools which process
gridded data to extract information to aid the geologist. The main tools utilised
in the Selwyn study were:
- Anomaly detection
- Cluster analysis
- Correlation analysis
- Relationship explorer
- Fuzzy search

SML supplied NMS with a number of data sets for analysis:
1. Regional Airborne Geophysics
   - Magnetics: TMI, RTP (see Figure 1) and 1st vertical derivative
   - Radiometrics: potassium (see Figure 1), thorium and uranium
   - Geotem: early-, mid- and late-time (see Figure 1) decay constants
2. Prospect Soil Geochemistry: copper, gold, lead and zinc soil sample
   values over a number of prospects.

Figure 1: Three of the airborne geophysical layers (magnetics RTP; radiometrics potassium; GeoTEM late-time decay)


The data was re-gridded, loaded into the software and a first-pass anomaly
analysis was run using all twelve layers. This technique utilises an
unsupervised neural search in which the software attempts to identify
commonly recurring patterns in the data layers, which it groups into clusters.
The clusters typically map out the underlying geology, as was the case at
Selwyn where they clearly identified the major geological units and even
differentiated between granites that were previously thought to be of a similar
type (see Figure 2).
Any uncommon patterns (or anomalies) are presented in a coloured contour
map (see Figures 3 and 4) that highlights the degree of anomalism.
The map grid points can then be interrogated using the mouse cursor to reveal
how much each layer is contributing to the anomaly. In many instances, the
anomaly may not just be a unique data instance such as an extreme value in
one layer but may be much more subtle where a combination of individual layer
values is unusual but not unique. The anomaly map can be exported directly
into GeoSoft, ER Mapper, ArcView or MapInfo for further image processing or
plotting.

Figure 2: Cluster analysis output; Cluster 1 maps the Mt Dore and Yellow Waterhole Granites
The Selwyn anomaly map highlighted a number of anomalies that had
previously not been identified, as well as picking out most of the known
deposits in the area.

Figure 3: Anomaly analysis map showing Mt Elliott layer values

Figure 4: Zoom-in of anomaly map using geophysics and copper soil geochemistry layers, showing mining lease boundaries
The next step in the analysis was to run some supervised fuzzy searches in
order to locate areas with similar patterns to known deposits. Using this
technique, the software identifies the fuzzy pattern of the data layer values lying
over the point of interest and then employs a supervised neural search (see
Figure 5) to scan across the whole data range and identify regions where the
relationships in the fuzzy pattern are similar. The output is presented in a
coloured contour map (see Figure 6) that highlights the similarity of each point
to the search point. Like the anomaly map, this similarity map can be exported
to other software packages.

Figure 5: Neural fuzzy search using the Mount Elliott mine as search point

Figure 6: Similarity map output from a fuzzy search


Supervised fuzzy search patterns can also be specified manually and a number
of these were run on the Selwyn data sets looking for areas with a high late time
decay constant, elevated soil geochem and high magnetic values.
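The similarity-map idea described above can be sketched simply. The actual fuzzy matching in Prospect Explorer is proprietary, so the sketch below assumes similarity is a Gaussian of the Euclidean distance between normalised layer vectors; the layer names and values are invented.

```python
# Toy similarity map: rate every grid cell by how closely its layer vector
# matches a chosen search point. Gaussian-of-distance is an assumption.
import math

def normalise(layers):
    """Scale each layer (column) to zero mean, unit variance."""
    out_cols = []
    for col in zip(*layers):
        m = sum(col) / len(col)
        s = (sum((v - m) ** 2 for v in col) / len(col)) ** 0.5 or 1.0
        out_cols.append([(v - m) / s for v in col])
    return [tuple(vals) for vals in zip(*out_cols)]

def similarity_map(grid, ref_index, width=1.0):
    """Similarity of every cell to the cell at ref_index, in (0, 1]."""
    norm = normalise(grid)
    ref = norm[ref_index]
    return [math.exp(-math.dist(p, ref) ** 2 / (2 * width ** 2))
            for p in norm]

# Layer vectors per cell (invented): (late-time decay, Cu soil, magnetics).
cells = [(0.2, 10.0, 100.0), (0.9, 85.0, 900.0),
         (0.8, 80.0, 870.0), (0.1, 5.0, 120.0)]
sim = similarity_map(cells, ref_index=1)  # search on the "mine" cell
print(sim[1] == 1.0 and sim[2] > sim[0])  # prints True
```

Cell 2, whose layer values resemble the search cell, scores near 1, while dissimilar cells fall towards 0; contouring these scores over the grid yields a similarity map of the kind shown in Figure 6.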
Both the anomaly and fuzzy search analysis tools identified an area called
Metal Ridge North lying to the south of the Mount Elliott mine. SML has recently
followed up this area with some detailed ground geophysics and geological
mapping. The company plans to run the results of this detailed work back
through Prospect Explorer before selecting drill targets.
The final technique used was to run a correlation analysis over a number of
regions of interest such as the Mount Elliott mine area. The correlation analysis
quickly identifies how various layers of data within the region of interest relate to
each other. These inter-relationships can then be used as input to the fuzzy
search tool to identify areas with similar correlations. This technique can be
very useful for identifying similar areas that may have much lower
concentrations than the specified search location.
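The correlation step can be illustrated with a plain Pearson correlation between two co-located layers inside a region of interest. The layer values are invented, and the tool's internal measure may differ; the sketch only shows what "how various layers relate to each other" means numerically.

```python
# Toy correlation analysis over a region of interest: Pearson correlation
# between two co-located data layers (values below are hypothetical).

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Cells inside the region of interest: Cu soil (ppm) and magnetics (nT).
cu_soil = [12.0, 30.0, 55.0, 80.0, 95.0]
magnetics = [210.0, 290.0, 400.0, 510.0, 570.0]

r = pearson(cu_soil, magnetics)
print(r > 0.99)  # strongly positive: the layers rise and fall together
```

Computing such coefficients for every pair of layers in the region gives the inter-relationships that can then be fed to the fuzzy search tool to find other areas with similar correlations, even where absolute concentrations are lower.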

Figure 7: Correlation analysis

Prospect Analysis
After completing an analysis of the entire region, a number of exploration
targets were examined separately by zooming in on the area of interest in order
to get more detailed results. One such area is called Amethyst Castle, where
previous exploration had identified a large area of alteration. Nine layers of
geophysical data were loaded into Prospect Explorer and an unsupervised
search run using all of the data layers. Twenty-two clusters were identified,
with cluster 18 clearly mapping out the alteration zone (Figure 8).

Figure 8
The cluster was then exported to another software package and overlaid
onto the topography (Figure 9).

Figure 9: Cluster 18 overlaid on topography


CONCLUSION
Sites found based on remotely sensed data require test drilling to discover and
parameterise the ore body. As exploration funds carry high risk and are
therefore limited, any means by which target detection and prioritisation costs
can be minimised gives the explorer greater capability to directly sample the
target and thereby make the discovery. Prospect Explorer has demonstrated
the ability to achieve this over numerous project areas within greatly varying
regions of Australia and with different data types.

REFERENCES

Kohonen, T. (1989). Self-Organisation and Associative Memory, 3rd edition. Springer-Verlag.
Lawrence, J. (1993). Introduction to Neural Computing. California Scientific Software Press.
Rumelhart, D.E. and McClelland, J.L. (1986). Parallel Distributed Processing, Vol. 1. MIT Press.
Wasserman, P.D. (1993). Advanced Methods in Neural Computing. Van Nostrand Reinhold, New York.
