
I can feel it…

I can sense it…


A new form of intelligence…
is emerging from…
the other side of the table !
From Brains to Machines
Introduction to Artificial Intelligence ( AI )

Dr. Janaka Wijayakulasooriya


Department of Electrical and Electronic Engineering
University of Peradeniya
Sri Lanka
Our Road Map

§ What is AI?
§ How did it evolve?
§ What can we do with it?
§ How can we use AI?
A Chinese Proverb

Tell me…I will forget


Show me…I will remember
Involve me…I will understand
What is Intelligence ?

Ability to…
§ Think
§ Analyse
§ Learn
§ Reason
§ Do creative work
§ Perceive
§…
What is Intelligence ?

Intelligence consists of the following skills:


§ ability to reason
§ ability to acquire and apply knowledge
§ ability to manipulate and communicate ideas
An Intelligent Entity

[Diagram: an intelligent entity]
§ INPUTS: senses the environment (see, hear, touch, taste, smell)
§ INTERNAL PROCESSES (the mind): has knowledge, has understanding/intentionality, can reason
§ OUTPUTS: exhibits behaviour
What is AI ?
§ The media shapes how the world sees AI today
§ AI =
§ 2001: A Space Odyssey
§ Star Trek
§ AI: The Movie
§ Often portrayed as
§ A property of an evil computer
§ Computers doing impossible things
§ Books and movies
§ Inspired many AI researchers
§ Raised the expectations of general public
What is AI ?
What others have to say
What is AI ?

§ Automation of human behaviour


§ Mimic human reasoning process
§ Knowledge representation in machines
§ Learning from mistakes ( adaptive )
…..

“ Is it replicating human intelligence ?”


What about Emotions ?
What about Emotions

“I want voice recognition software that will make
the computer fear me when I raise my voice”
It's 2007, but where is HAL?
Artificial intelligence is about more than talking computers and robots in search of love and laughter. In fact, AI is most useful in its simplest form.
“Is it replicating human intelligence?”

§ Not necessarily
§ Sometimes more…sometimes less
§ It is like
§ Physical vs electronic books
§ Actual vs virtual shopping
§ Birds vs airplanes

… or simply: something vs the essence of something


Strong AI Vs Weak AI
§ Strong AI
§ Computers can be made to think at a level at
least equal to humans
§ Weak AI
§ Adding “thinking like” features to machines to
make them more useful
§ Examples: Expert systems, speech recognition,
natural language processing…
Human/Biological Intelligence
§ Thinking humanly ( Cognitive Modeling )
§ Cognitive science
§ 1960s – information processing replaced behaviorism
as the dominant view in psychology
§ Cognitive neuroscience
§ Neurophysiologic basis of intelligence and
behaviour

§ Acting humanly ( Operational Intelligence )


§ Turing test
§ Required: Knowledge, reasoning, language
understanding, learning
§ Problem: It is not reproducible or amenable to
mathematical analysis → rather subjective
Why should I learn AI ?
Will it get me a Job ?
Goals of AI
Engineering Goal:
To solve real world problems
To build systems that
exhibit intelligent behaviour

Scientific Goal:
To understand computational
mechanisms needed for
modeling intelligent behaviour
Goals of AI
Intelligent Systems
Behaviourist View on Intelligent Machines

§ Many scientists believe that only things that can be directly observed
are “scientific”

§ Therefore if a machine behaves “as if it were intelligent” it is


meaningless to argue that this is an illusion.

§ Turing was of this opinion and proposed the “Turing Test”

§ This view can be summarized as: “If it walks like a
duck, quacks like a duck and looks like a duck - it is a duck”
A brief history of AI
§ Gestation (43-56):
§ automata theory, neural networks, checkers, theorem proving.
§ Shannon, Turing, Von Neumann, Newell and Simon, Minsky,
McCarthy, Dartmouth Workshop.
§ Great expectations (52-69):
§ computers can do more than arithmetic!
§ General Problem Solver (GPS), better checkers
§ LISP (LISt Processing language)
A brief history of AI

§ A dose of reality (66-74):


§ ELIZA: human-like conversation.
§ limitations of neural networks, genetic algorithms, machine
evolution.
§ acting in the real world: robotics.
§ Knowledge-based systems (69-79):
§ domain focus: expert systems vs. General Problem Solvers.
§ DENDRAL, MYCIN, XCON, etc.
A brief history of AI

§ Commercial AI: the ‘80s boom (80-90)


§ DEC’s R1 computer configuration program
§ many expert systems tools companies (mostly defunct): Symbolics,
Teknowledge, etc.
§ Japan’s 5th generation project: PROLOG.
§ limited success in autonomous robotics and vision systems.
[Chart: degree of motivation over time – Dartmouth Conference (1948), Japan's 5th Generation Computer project (1970s - 80s), AI Winter and new technology support (mid-1980s)]
Test for intelligence: Turing Test
Early Attempts: Eliza

§ Developed in 1964-1966 by Joseph Weizenbaum at MIT
§ Models (parodies) the rôle of a Rogerian psychotherapist engaged in an initial interview with a patient. Much of the technique of the Rogerian psychotherapist involves drawing the patient out by reflecting the patient's statements back at him.
Do you agree with Turing Test?
John Searle - Chinese Room

John Searle (1932-)

Philosophy professor,
University of California, says ...

§ Papers are passed in with


Chinese writing.
§ A human being (who does not
know Chinese) uses a book to
convert input squiggles to
output squiggles
§ Human being copies squiggles
onto papers and sends them out
of the room.
Outside the Chinese Room

§ Behaviour
§ Chinese questions go into the room.
§ Chinese answers come out of the room.
§ Functionalist interpretation
§ Whoever is in the room
understands Chinese.
Same situation with our Students ?

Do we really understand … or
Just CUT & PASTE
Objections to Turing Test
Foundations of AI
§ Philosophy: Aristotle, mechanistic views, materialism,
positivism, rationality.
§ Mathematics: algorithms, logic, formalization of
mathematics, incompleteness, decision theory.
§ Psychology: behaviorism, cognitive science.
§ Linguistics: grammars, syntax and semantics.
§ Computer Science: computers, software, theory
§ Others: neuroscience, economics, game theory.
Ready to play a Game ?

Tic-Tac-Toe

[Board position: two O's and one X]
Another Game ?
Game of Nim
§ Robots
§ Chess-playing programs
§ Voice recognition systems
§ Speech recognition systems
§ Grammar checkers
§ Pattern recognition
§ Medical diagnosis
§ System malfunction rectifiers
§ Game playing
§ Machine translation
§ Resource scheduling
§ Expert systems (diagnosis, advisory, planning, etc.)
§ Machine learning
§ Intelligent interfaces
A Robot Colony
§ Problem Definition:
Intelligent Agents

§ What is an intelligent agent?


§ Structure of intelligent agents
§ Environments
§ Examples
Intelligent agents: their environment and
actions
Ideal rational agents
§ For each possible percept sequence, an ideal rational
agent should take the action that is expected to maximize
its performance measure, based on evidence from the
percept sequence and its built-in knowledge.
§ Key concept:
mapping from perceptions to actions
§ Different architectures to realize the mapping
Structure of intelligent agents

§ Agent program: a program that implements the


mapping from percepts to actions
§ Architecture: the platform to run the program
(note: not necessarily the hardware!)
§ Agent = architecture + program
§ Examples:
§ medical diagnosis - part-picking robot
§ satellite image analysis- interactive tutor
§ refinery controller - flight simulator
Introduction

§ Humans solve problems by combining


§ Facts
§ Knowledge
§ They then find ways to process them to reach the
desired solution(s)
§ This process is called Reasoning
§ Def: Reasoning
§ The process of working with knowledge,
facts and problem solving strategies to
draw conclusions
Deductive Reasoning
§ Deduce new information from logically related
known information
§ Eg. Sherlock Holmes
§ Deductive Reasoning uses
§ Axioms (problem facts)
§ I am standing in the rain
§ Implications
(Related general knowledge in the form of rules)
Eg: If I stand in the rain THEN I will get wet
§ Conclusion:
We can conclude I will get wet

A ∧ (A → B) ⇒ B
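The rule above is modus ponens: from A and A → B, conclude B. As a rough illustrative sketch (Python code, not from the slides), a few lines can apply such implications to a set of known facts until nothing new follows:

# Deductive reasoning sketch: apply A and (A -> B) => B repeatedly.
facts = {"standing_in_rain"}                  # axioms (problem facts)
rules = [("standing_in_rain", "wet")]         # implications: IF premise THEN conclusion

def deduce(facts, rules):
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in derived and conclusion not in derived:
                derived.add(conclusion)       # modus ponens
                changed = True
    return derived

print(deduce(facts, rules))                   # {'standing_in_rain', 'wet'}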
Inductive Reasoning
§ Used to arrive at a general conclusion from a
limited set of facts
§ Eg:
§ Premise: Crows in Sri Lanka can fly
§ Premise: Crows in India can fly
§ Conclusion: Crows can fly
§ For a set of objects, X={a,b,c,d,...} if property P is true for
a, and if P is true for b and if P is true for c, then P is true for
all X
§ Not always true
Abductive Reasoning

§ Abduction is a form of deduction that allows for
plausible inference (i.e. conclusions might follow
from the available information, but they might be
wrong)
§ Eg.
§ Ground is wet IF it is raining
§ Ground is wet
§ Is it raining?
Analogical Reasoning

§ Formation of models by drawing analogies and
differences between two objects
§ Eg.
§ A lion is like a tiger
§ Both eat meat
§ Both live in India
§ However, they have different colours and belong to
different families
Common-Sense Reasoning

§ Relies on good judgment rather than on exact
logic
§ Referred to as heuristics; rules of thumb (best-first
search)
§ Eg.
§ A loose fan belt usually causes strange noises
§ A patient with TB usually has a light fever in the evenings
§ How can I drive from here to Kandy?
§ Valuable in applications that require quick
solutions
Non-Monotonic Reasoning

§ During the course of solving a problem,


the state of various facts remains
constant (Monotonic Reasoning)
§ In non-monotonic reasoning state of the
facts can change
§ Eg.
§ RULE: If it rains I will get wet
§ FACT: It is raining → I will get wet
§ FACT: It's not raining → I won't get wet
§ Non-monotonic reasoning can adjust its conclusions when the facts change
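As a small illustrative sketch (Python, not from the slides), the same rule yields different conclusions as the state of the facts changes, which is exactly what non-monotonic reasoning allows:

# Non-monotonic reasoning sketch: the conclusion "wet" is withdrawn
# when the fact "raining" is removed from the set of facts.
def conclusions(facts):
    concluded = set(facts)
    if "raining" in facts:        # RULE: if it rains I will get wet
        concluded.add("wet")
    return concluded

print(conclusions({"raining"}))   # {'raining', 'wet'}  -> I will get wet
print(conclusions(set()))         # set()               -> I won't get wet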
Inference
§ Def: The process used in an expert system to
derive solutions or new information from known
information
§ Inference Engine is the module used for this
§ It contains:
§ What questions to ask the user?
§ How to search the knowledge base
§ How to pick a rule to fire?
§ How the concluded information influences the search
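A minimal forward-chaining sketch of such an inference engine (the rules below are hypothetical, and a real engine adds question asking and conflict resolution):

# Each rule is (set of premises, conclusion); the engine repeatedly
# picks a rule whose premises are all known and fires it.
rules = [
    ({"fever", "cough"}, "flu_suspected"),
    ({"flu_suspected", "high_fever"}, "see_doctor"),
]

def infer(known_facts, rules):
    known = set(known_facts)
    fired = True
    while fired:
        fired = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)      # fire the rule
                fired = True
    return known

print(infer({"fever", "cough", "high_fever"}, rules))   # adds 'flu_suspected', 'see_doctor'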
The world of a wumpus

[Diagram: a 4x4 grid of caves containing the wumpus, several pits, and the prize; squares next to a pit feel a breeze, squares next to the wumpus smell a stench, and the hunter starts in square [1,1]]
Exploring the world of Wumpus
Knowledge Base:
Stench: No
Breeze: No
Prize: No
OK: [1,1]
PIT: Not in [1,1]
Wu: Not in [1,1]

Inference:
OK: [2,1] and [1,2]
PIT: Not in [1,2], [2,1]
Wu: Not in [1,2], [2,1]

Action:
→ Move to [1,2]
Exploring the world of Wumpus
Knowledge Base:
Stench: No
Breeze: Yes
Prize: No
OK: [1,2], [2,1]
PIT: Not in [1,1], [1,2], [2,1]
Wu: Not in [1,1], [1,2], [2,1]

Inference:
Pit in [3,1] or [2,2] or [1,1]
→ Pit in [3,1] or [2,2]
Wu: Not in [1,1], [2,2] and [3,1]

Actions:
§ → Move to [1,1]
§ → Move to [1,2]
Exploring the world of Wumpus
Knowledge Base:
Stench: Yes
Breeze: No
Prize: No
OK: [1,2], [2,1]
PIT: Not in [1,1], [1,2], [2,1]
     In [2,2] or [3,1] or both
Wu: Not in [1,1], [1,2], [2,1], [2,2], [3,3]

Inference:
Wu: in [1,3], [2,2] or [1,1]
⇒ Wu in [1,3]
Pit: not in [1,3], [2,2] or [1,1]
⇒ Pit in [3,1]
⇒ [2,2] OK

Action:
Move to [2,2]
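A tiny sketch of how such a knowledge base can be updated from percepts (assumptions: a 4x4 grid and only the "no breeze / no stench here" rules; the full hunt for the wumpus and the pits is not implemented):

# If a visited square has no breeze and no stench, every adjacent square
# can be marked free of pits and free of the wumpus, hence OK to explore.
def neighbours(x, y, size=4):
    cand = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(i, j) for i, j in cand if 1 <= i <= size and 1 <= j <= size]

no_pit, no_wumpus, ok = set(), set(), set()

def tell(square, breeze, stench):
    no_pit.add(square); no_wumpus.add(square)      # the agent survived this square
    if not breeze:
        no_pit.update(neighbours(*square))
    if not stench:
        no_wumpus.update(neighbours(*square))
    ok.update(no_pit & no_wumpus)

tell((1, 1), breeze=False, stench=False)
print(sorted(ok))    # [(1, 1), (1, 2), (2, 1)]  -- safe squares, as in the first slide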
Illustrative example: taxi driver

Agent Type: Taxi driver
Percepts: Camera, GPS, speedometer, sonar, video cameras
Actions: Steer, accelerate, brake
Goals: Safety, speed, legal, profit
Environment: Roads, drivers, traffic, pedestrians, customers
Table-Driven Agents
function Table-Driven-Agent(percept) returns action
  static: percepts, a sequence, initially empty
          table, indexed by percept sequences (given)
  append percept to the end of percepts
  action := LOOKUP(percepts, table)
  return action

• Keeps a list of all percepts seen so far


• Table too large
• takes too long to build
• might not be available
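A direct Python rendering of the pseudocode above; the lookup table here is a toy, since a real table indexed by whole percept sequences would be far too large:

percepts = []                                   # sequence of all percepts seen so far

table = {                                       # indexed by percept sequences (given)
    ("dirty",): "clean",
    ("dirty", "clean"): "move",
}

def table_driven_agent(percept):
    percepts.append(percept)                    # append percept to the end of percepts
    return table.get(tuple(percepts), "no-op")  # action := LOOKUP(percepts, table)

print(table_driven_agent("dirty"))              # clean
print(table_driven_agent("clean"))              # move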
Artificial Neural Networks
Brain and Machine
• The Brain
– Pattern
Recognition
– Association
– Complexity
– Noise Tolerance
•The Machine
–Calculation
–Precision
–Logic
The contrast in architecture
• The Von Neumann architecture
uses a single processing unit;
– Tens of millions of operations per
second
– Absolute arithmetic precision

•The brain uses many slow


unreliable processors acting in
parallel
The Structure of Neurones

[Diagram: a neurone showing dendrites, cell body, nucleus, axon, and synapses]
The Structure of Neurones
• A neurone only fires if its input signal
exceeds a certain amount (the threshold) in
a short time period.
• Synapses vary in strength
– Good connections allowing a large signal
– Slight connections allow only a weak signal.
– Synapses can be either excitatory or inhibitory.
A Classic Artificial Neuron

[Diagram: inputs a0 = +1, a1, a2, …, an with weights wj0, wj1, wj2, …, wjn feed a summing unit Sj and an activation function f]

Sj = Σi wji·ai ,  xj = f(Sj)
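A minimal sketch of this neuron in Python, assuming a sigmoid for the activation f (the slides do not fix a particular f):

import math

def neuron(inputs, weights, f=lambda s: 1.0 / (1.0 + math.exp(-s))):
    a = [1.0] + list(inputs)                    # a0 = +1 carries the bias weight wj0
    s = sum(w * x for w, x in zip(weights, a))  # Sj = sum_i wji * ai
    return f(s)                                 # xj = f(Sj)

print(neuron([0.5, -1.2], weights=[0.1, 0.8, 0.3]))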
Multilayer Perceptron

[Diagram: input signals (external stimuli) enter the input layer, pass through adjustable weights, and the output layer produces the output values]


§ General architecture: single layer

[Diagram: inputs x1 … xn with weights w1 … wn and bias b feed a single output unit Y]

net input to Y:

  net = b + Σ(i=1..n) xi·wi

(the bias b is treated as the weight from a special unit with constant output 1)

The output y, compared against a threshold θ, classifies (x1, …, xn) into one of two classes:

  y = f(net) = 1 if net ≥ θ ; −1 if net < θ

§ Decision region/boundary
For n = 2, b ≠ 0, θ = 0,

  b + x1·w1 + x2·w2 = 0, i.e. x2 = −(w1/w2)·x1 − b/w2

is a line, called the decision boundary, which partitions the plane into two decision regions.

If a point/pattern (x1, x2) is in the positive region, then b + x1·w1 + x2·w2 ≥ 0 and the output is 1 (belongs to class one); otherwise b + x1·w1 + x2·w2 < 0 and the output is −1 (belongs to class two).

n = 2, b = 0, θ ≠ 0 would result in a similar partition.
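A few lines of Python make the thresholding and the decision boundary concrete (the weights, bias and test points below are illustrative values, not from the slides):

def classify(x, w, b, theta=0.0):
    net = b + sum(wi * xi for wi, xi in zip(w, x))   # net = b + sum_i xi*wi
    return 1 if net >= theta else -1                 # y = f(net)

w, b = [1.0, 1.0], -1.0                 # decision boundary: x1 + x2 = 1
print(classify([0.8, 0.9], w, b))       #  1  (positive region, class one)
print(classify([0.1, 0.2], w, b))       # -1  (negative region, class two)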
Case Studies
PC Based Fetal Monitoring Unit
Heart Beat wave shape
Neural Network

[Diagram: the sampled heart-beat waveform passes through Z⁻¹ delay elements so that 15 samples at a time form the inputs; these feed a first and a second layer of 10 neurons each (weights W1, W2), and a third layer (weights W3) produces the output]
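A rough sketch of the forward pass this structure suggests, assuming 15 delayed samples feed two 10-unit layers and a single output unit; the weights below are random placeholders, not the trained network:

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(10, 15))            # first layer weights
W2 = rng.normal(size=(10, 10))            # second layer weights
W3 = rng.normal(size=(1, 10))             # third (output) layer weights

def forward(window):                      # window: the 15 most recent samples
    h1 = np.tanh(W1 @ window)
    h2 = np.tanh(W2 @ h1)
    return np.tanh(W3 @ h2)               # single output value

signal = rng.normal(size=100)             # stand-in for the sampled heart-beat waveform
print(forward(signal[-15:]))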
Training the Neural Network
Training performances
Recognition

Neural Network output


Input heart beat variation
User Interface

[Screenshot: Fetal Heart Monitor – Patient No. 03; Heart rate: 110, Base line: 110, B.L. Var.: +1; DECELERATION WARNING!]
Example
Automated character recognition

C:\ocrsamp\janaka7.tif
C:\ocrsamp\janaka11.tif
Example
software to recognize handwritten
characters...
Methodology
Step 1: Taking the letters apart
Methodology
Step 2: Standardizing the letters
1. Skeletonizing

2. Normalizing
Methodology
Step 3: Extracting Features

16 features
Feature Space

[Scatter plot: handwritten letters plotted by Mean (F1..F8) on the horizontal axis against Mean (F9..F16) on the vertical axis]
Feature Space

[Scatter plot: the vowels a, e, i, o, u plotted by Mean (F1..F8) against Mean (F9..F16)]
Self Adaptive Artificial Neural Network (SAANN) Classifier

[Diagram: character features are fed to the SAANN, which outputs the recognized character and improves with EXPERIENCE – trained with only 10 type fonts, then with 1 user, then with 3 new users]

Example
Intelligent Power Quality Monitoring System
Research Background
The voltage and current waveforms of the electrical supply are considered to be purely sinusoidal.

However, in practice these waveforms deviate from their pure sinusoidal form for many reasons.
Research Background

The deviation of a voltage or current waveform from its pure sinusoidal form can be defined as a Power Quality (PQ) disturbance.
Research Background
During the last few decades, both equipment sensitive to PQ disturbances and equipment causing PQ disturbances have increased dramatically...
…..The research has developed a novel Intelligent Power
Quality Monitor ( IPQM ), which is capable of…..
Detecting

Capturing

Classifying

….PQ disturbances
Locating

Analysing

….Harmonic sources
IPQMS

[Diagram: at each bus (Bus 1, Bus 2, …, Bus N) a Distributed Monitoring Unit (DMU) monitors the voltage waveforms; the captured PQ events from all DMUs are sent to the Centralised Monitoring Unit (CMU)]
Distributed Monitoring Unit

[Diagram: the bus voltage waveform V(t) enters the DMU's Disturbance Extraction Module]
Disturbance Extraction Module
Extracts PQ disturbances from the supply voltage signal:

  e(t) = v(t) − [ Re(Cr)·Vr(t) + Im(Cr)·Vr(t − T/4) ]

where

  C = ∫₀ᵀ [ v(t)·Vr(t) + j·v(t)·Vr(t − T/4) ] dt ,  Cr = C / |C|

[Plots: the voltage signal v(t) and the extracted disturbance e(t), in p.u., against time (s)]
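A numerical sketch of these equations (assumptions: a 50 Hz supply, a 10 kHz sampling rate, a unit-amplitude reference Vr(t) = cos(2·pi·50·t), and a synthetic test signal with an injected sag; none of these values come from the slides):

import numpy as np

fs, f0 = 10_000, 50
T = 1 / f0
t = np.arange(0, 10 * T, 1 / fs)                       # ten cycles of signal

v = np.cos(2 * np.pi * f0 * t + 0.3)                   # supply voltage v(t), p.u.
v[1000:1200] *= 0.6                                    # inject one cycle of voltage sag

Vr  = np.cos(2 * np.pi * f0 * t)                       # reference Vr(t)
Vrq = np.cos(2 * np.pi * f0 * (t - T / 4))             # Vr(t - T/4)

cycle = slice(0, int(fs / f0))                         # one fundamental period
C  = np.sum(v[cycle] * Vr[cycle] + 1j * v[cycle] * Vrq[cycle]) / fs
Cr = C / abs(C)

e = v - (Cr.real * Vr + Cr.imag * Vrq)                 # extracted disturbance e(t)
print(round(float(np.max(np.abs(e))), 2))              # ~0.4, the depth of the injected sag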
Examples

[Plots: for several recorded events, the supply voltage V(t) (p.u.) and the corresponding extracted disturbance e(t) against time (s)]
Distributed Monitoring Unit

[Diagram: within the DMU, the bus voltage waveform V(t) passes through the Disturbance Extraction Module; the extracted disturbance waveform e(t) then enters the Event Identification Module]
Event Identification Module
Capturing PQ Disturbances

[Diagram: e(t) is buffered cycle by cycle (… e(i-1)(T), e(i)(T), e(i+1)(T) …) and passed to the Event Identification Module (EIM)]

State Model

[State diagram: the EIM moves between a Steady State and a Transition State, via Intermediate Transition and Intermediate Steady states; captured events are produced as the model changes state]
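A minimal sketch of such a state machine (the cycle-energy test and its threshold are hypothetical stand-ins for the EIM's actual decision rule):

import numpy as np

STEADY, TRANSITION = 1, 2

def classify_cycles(e_cycles, threshold=0.05):
    states, prev = [], STEADY
    for cycle in e_cycles:
        energy = float(np.mean(np.asarray(cycle) ** 2))     # energy of buffered cycle e_i(T)
        state = TRANSITION if energy > threshold else STEADY
        if state != prev:
            print("captured event at cycle", len(states))   # event captured on a state change
        states.append(state)
        prev = state
    return states

cycles = [np.zeros(200), np.full(200, 0.4), np.zeros(200)]
print(classify_cycles(cycles))    # [1, 2, 1]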
Examples

[Plots, panels (a)-(d): the supply voltage V(t) (pu), the extracted disturbance e(t) (pu), the EIM state, and the extracted event waveform, against time (s)]
Distributed Monitoring Unit

[Diagram: the bus voltage waveform V(t) enters the Disturbance Extraction Module; the extracted disturbance waveform e(t) enters the Event Identification Module, which outputs captured transition events and captured steady-state events]
IPQMS

[Diagram, repeated: the captured PQ events from the DMUs at each bus are sent to the Centralised Monitoring Unit (CMU)]
Centralised Monitoring Unit

[Diagram: captured PQ events from the DMUs enter the CMU's Feature Extraction Module]
Feature Extraction Module (FEM)
Extracting features from PQ disturbances

[Diagram: captured transition event waveforms go to a Transition Feature Extractor (using DWT), producing a 63-element transition disturbance feature vector; captured steady-state event waveforms go to a Steady State Feature Extractor (using FFT), producing a 2-element steady-state disturbance feature vector]
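A minimal sketch of the two extractors. The thesis-specific details (which wavelet, which decomposition level, and exactly which 63 and 2 numbers make up the vectors) are not given above, so the choices below (db4 wavelet, energy per sub-band, fundamental magnitude plus a distortion ratio) are illustrative assumptions only:

import numpy as np
import pywt                                               # PyWavelets

def transition_features(event, wavelet="db4", level=5):
    coeffs = pywt.wavedec(event, wavelet, level=level)    # DWT sub-bands
    return np.array([np.sum(c ** 2) for c in coeffs])     # energy per sub-band

def steady_state_features(event, fs=10_000, f0=50):
    spectrum = np.abs(np.fft.rfft(event)) / len(event)
    k = int(round(f0 * len(event) / fs))                  # fundamental bin
    fundamental = spectrum[k]
    distortion = np.sqrt(np.sum(spectrum ** 2) - fundamental ** 2) / fundamental
    return np.array([fundamental, distortion])            # a 2-element vector

t = np.arange(400) / 10_000
event = np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 250 * t)
print(transition_features(event))
print(steady_state_features(event))                       # approx [0.5, 0.1]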
Example: Impulsive Transient
Example: Oscillatory transient
Centralised Monitoring Unit

[Diagram: within the CMU, captured PQ events pass through the Feature Extraction Module; the extracted features then enter the Event Classification Module]
Event Classification
Classifying PQ Disturbances

[Diagram: the 63-element transition event feature vector is classified by the Transition Event Classifier (SAANN-1); the 2-element steady-state event feature vector is classified by the Steady State Event Classifier (SAANN-2)]

Event classes include: oscillatory transient, impulsive transient, momentary supply interruption, over voltage, under voltage, supply interruption, harmonic distortion, voltage sag, voltage swell, and normal condition.
SIMULATION
Biometric Recognition System

§ Facial Recognition
§ Voiceprint Recognition
§ Fingerprint Recognition
§ Retinal Patterns Recognition
§ DNA Recognition
ECG signal classification
Thank You !
