Technical Report
#AALab0502
m2takami [at] yahoo.com
AA Lab, 2005
Contents

Chapter 1: Preprocessing .............................................. 4
  1.1  Introduction ................................................... 4
  1.2  Normalization .................................................. 5
  1.3  Whitening ...................................................... 6
  1.4  Dimensionality Reduction ....................................... 7
       1.4.1  Exhaustive Subset Search ................................ 9
       1.4.2  Feature Relevance and Redundancy ........................ 10
       1.4.3  Class-Separability Criteria ............................. 11
       1.4.4  Search Strategies ....................................... 12
       1.4.5  PCA ..................................................... 13
Chapter 2: Classifiers ................................................ 23
  2.1  Introduction ................................................... 23
  2.2  The Multilayer Perceptron (MLP) ................................ 25
       2.2.1  Background .............................................. 25
       2.2.2  The Neuron Model ........................................ 26
       2.2.3  Perceptron Learning ..................................... 28
       2.2.4  Single-Layer Networks ................................... 31
       2.2.5  Multilayer Networks ..................................... 33
  2.3  Radial Basis Function Networks ................................. 38
       2.3.1  Structure ............................................... 38
  2.4  Support Vector Classifiers ..................................... 42
       2.4.1  Optimal Separating Hyperplanes .......................... 42
       2.4.2  Soft-Margin Hyperplanes ................................. 45
       2.4.3  Support Vector Machines (SVM) ........................... 48
       2.4.4  Nonlinear Mappings ...................................... 49
  2.5  Bayesian Classifiers ........................................... 54
       2.5.1  Introduction ............................................ 54
       2.5.2  The Bayes Decision Rule ................................. 54
       2.5.3  Gaussian Densities ...................................... 56
  2.6  k-Nearest-Neighbor Methods ..................................... 57
       2.6.1  Introduction ............................................ 57
       2.6.3  The kNN Classifier ...................................... 58
  2.7  Parzen Windows ................................................. 60
       2.7.1  Introduction ............................................ 60
       2.7.2  Parzen Density Estimation ............................... 60
Chapter 1: Preprocessing

1.1 Introduction

Raw measurements are rarely fed to a classifier directly: the features may be on very different scales, may be correlated, and may be too numerous for reliable training. The data are therefore preprocessed first [3]. This chapter treats the three preprocessing steps used most often: normalization, whitening, and dimensionality reduction.
1.2 Normalization

Features measured on different scales can dominate one another in any distance-based computation, so each feature is first brought to a comparable range. A simple approach is linear (min-max) scaling, which maps feature i to [0,1] or [-1,1] using its extreme values x_{i,min} and x_{i,max}; a nonlinear alternative is softmax scaling, which squashes the values smoothly into [0,1] [26].

The most common choice is to standardize each feature by its sample mean and variance [26], [4]. Let x_{ij} denote the value of feature i for sample j, and let \bar{x}_i and \sigma_i be the mean and standard deviation of feature i over the n training samples. The normalized value is

    y_{ij} = (x_{ij} - \bar{x}_i) / \sigma_i     (2)

After the transformation (2), every feature has zero mean and unit variance. Normalization of this kind should be applied before any method whose result depends on the relative scales of the features.
1.3 Whitening

Normalization treats each feature independently; whitening additionally removes the correlation between features. A whitening transformation maps the data to a coordinate system in which the covariance matrix is the identity, and it can be chosen to be optimal in the mean-square-error (MSE) sense [9]. Let C be the m x m covariance matrix of the data, with element c_{ij} the covariance of features i and j. Since C is symmetric, it has the eigendecomposition

    C = V \Lambda V^T     (3)

where the columns of the unitary matrix V are the eigenvectors of C and \Lambda is the diagonal matrix of its eigenvalues. The whitened data are then [19]

    x_w = \Lambda^{-1/2} V^T (x - \mu)     (4)

where \mu is the mean vector. The covariance matrix of x_w is the identity matrix.
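The transformation (3)-(4) can be sketched as follows; the data here are a hypothetical correlated sample, for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical correlated 2-D data (illustration only).
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 0.0], [1.5, 0.5]])

mu = X.mean(axis=0)
C = np.cov(X, rowvar=False)            # m x m covariance matrix

# Eigendecomposition C = V Lambda V^T  (eq. 3)
lam, V = np.linalg.eigh(C)

# Whitening transform x_w = Lambda^{-1/2} V^T (x - mu)  (eq. 4)
Xw = (X - mu) @ V @ np.diag(lam ** -0.5)

Cw = np.cov(Xw, rowvar=False)          # close to the identity matrix
```

Because the whitening matrix is built from the sample covariance itself, the covariance of the whitened sample equals the identity up to floating-point error.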
1.4 Dimensionality Reduction

A large number of features increases both the computational cost of a classifier and the amount of training data needed to estimate its parameters reliably. Dimensionality reduction, studied under names such as multivariate analysis, ordination, and (geometrical) multidimensional scaling, replaces the m original features by a smaller set of d features [27]. Two families of methods are distinguished: feature selection, which keeps a subset of d of the original features, and feature extraction, which builds d new features as functions of all m original ones (for linear combinations, the classical Karhunen-Loeve transform).

Figure 1. Feature selection (left) keeps a subset of the inputs x_1,...,x_p; feature extraction (right) maps them to new features f_1, f_2.
Both approaches need a criterion J that measures how good a candidate feature set is. For feature selection, let X_d denote a subset of d of the m features x_1,...,x_m. The best subset \tilde{X}_d satisfies

    J(\tilde{X}_d) = max_{X_d} J(X_d)     (5)

For feature extraction, a mapping A from the m original features to d new ones is sought such that

    J(\tilde{A}) = max_A J(A(X))     (6)

The criterion may be supervised (using the class labels) or unsupervised. Given the chosen mapping, the reduced data are Y = A(X).
1.4.1 Exhaustive Subset Search

The direct way to find the best d features out of m is to evaluate the criterion J for every possible subset and keep the best one. The number of subsets of size d is

    n_d = m! / ((m - d)! d!)     (7)

which grows very quickly with m and d: even choosing d = 5 features out of m = 10 requires evaluating 252 subsets. Exhaustive search guarantees the optimal subset (for the chosen criterion), but it is feasible only for small problems. Two issues therefore dominate practical feature selection:

1. the choice of criterion, which should predict the generalization ability of the classifier built on the subset;
2. the choice of a search strategy that examines only part of the subsets.
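The count in (7) is just a binomial coefficient; a minimal check of the m = 10, d = 5 example:

```python
from math import comb, factorial

m, d = 10, 5
# Number of d-element subsets of m features (eq. 7)
n_d = factorial(m) // (factorial(m - d) * factorial(d))
assert n_d == comb(m, d)   # same quantity via the stdlib binomial
# n_d == 252 subsets must be scored even in this small case
```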
1.4.2 Feature Relevance and Redundancy

Whether a feature is worth keeping depends on the other features selected with it. In the terminology surveyed in [11], going back to work of Gennari and others in 1989, a feature is relevant if it carries information about the class that the remaining features do not, and redundant if the information it carries is already supplied by other selected features. A good subset contains relevant, mutually non-redundant features. Finding the globally optimal subset is rarely affordable, so suboptimal procedures guided by such notions are used.
1.4.3 Class-Separability Criteria

For supervised selection, the criterion J should measure how well the classes are separated in the candidate feature space. Scatter matrices provide such measures.

1. The within-class scatter matrix for c classes is

       S_w = \sum_{i=1}^{c} p_i C_i     (8)

   where C_i is the covariance matrix of class \omega_i and p_i is its a-priori probability, estimated from the n_i samples of class \omega_i among the n training samples:

       p_i = n_i / n     (9)

   trace{S_w} measures the average spread of the classes around their own means.

2. The between-class scatter matrix is

       S_b = \sum_{i=1}^{c} p_i (m_i - m_0)(m_i - m_0)^T     (10)

   where m_i is the mean of class \omega_i and m_0 is the global mean:

       m_0 = \sum_{i=1}^{c} p_i m_i     (11)

   trace{S_b} measures the spread of the class means around the global mean.

3. The mixture scatter matrix is the covariance of all samples regardless of class:

       S_m = E[(x - m_0)(x - m_0)^T]     (12)

From (8), (10), and (12) it follows that

    S_m = S_w + S_b     (13)

A natural separability criterion is therefore

    J_1 = trace{S_m} / trace{S_w}     (14)

Large values of J_1 correspond to compact, well-separated classes. Replacing the trace by the determinant in (14) gives

    J_2 = |S_m| / |S_w| = |S_w^{-1} S_m|     (15)

and a related criterion is

    J_3 = trace{S_w^{-1} S_m}     (16)

Unlike J_1, the criteria J_2 and J_3 are invariant under invertible linear transformations of the features.
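The scatter matrices (8)-(16) can be computed directly; the sketch below uses two hypothetical, well-separated 2-D classes for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two hypothetical, well-separated 2-D classes (illustration only).
X1 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
X2 = rng.normal(loc=[4.0, 4.0], scale=0.5, size=(100, 2))
X, y = np.vstack([X1, X2]), np.array([0] * 100 + [1] * 100)

n = len(y)
m0 = X.mean(axis=0)                                    # global mean (eq. 11)
Sw = np.zeros((2, 2))                                   # within-class (eq. 8)
Sb = np.zeros((2, 2))                                   # between-class (eq. 10)
for c in (0, 1):
    Xc = X[y == c]
    p = len(Xc) / n                                     # prior (eq. 9)
    mc = Xc.mean(axis=0)
    Sw += p * np.cov(Xc, rowvar=False, ddof=0)
    Sb += p * np.outer(mc - m0, mc - m0)

Sm = np.cov(X, rowvar=False, ddof=0)                    # mixture (eq. 12)
J1 = np.trace(Sm) / np.trace(Sw)                        # eq. 14
J3 = np.trace(np.linalg.inv(Sw) @ Sm)                   # eq. 16
```

With population (ddof=0) covariances the identity S_m = S_w + S_b of (13) holds exactly, and for well-separated classes J_1 is much larger than 1.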
1.4.4 Search Strategies

When exhaustive search is infeasible, suboptimal strategies explore only part of the space of feature subsets. The common ones build the subset incrementally: forward methods start from the empty set and repeatedly add the feature that improves the criterion most; backward methods start from the full set and repeatedly discard the least useful feature; and combined schemes alternate additions and removals to escape the nesting effect of purely sequential search (a feature added early may become useless later, and vice versa). These strategies evaluate far fewer subsets than (7) requires, at the price of possibly missing the optimal subset.
1.4.5 PCA

Principal Components Analysis (PCA) is the best known linear feature-extraction method. It finds the orthogonal directions in which the data vary most and projects the data onto the first few of them, so that most of the variance is retained by a small number of derived features. The idea goes back to Pearson and was developed into its modern form by Hotelling in 1933. Three properties make PCA attractive:

1. The principal components are mutually uncorrelated.
2. They are ordered by variance, so the first d components retain as much of the total variance as any d linear combinations can.
3. They are computed from the covariance (or correlation) matrix alone, without class labels.
Let X = [x_1,...,x_m]^T be the (zero-mean) feature vector with covariance matrix \Sigma. PCA forms the linear combinations

    \alpha_i = \sum_{j=1}^{m} a_{ij} x_j,   i = 1,...,m     (17)

or, in matrix form, \alpha = A^T X, where the columns of A are the coefficient vectors. The first principal component is

    \alpha_1 = \sum_{j=1}^{m} a_{1j} x_j = a_1^T X     (18)

and a_1 is chosen to maximize the variance of \alpha_1:

    var(\alpha_1) = E[\alpha_1^2] - E[\alpha_1]^2 = a_1^T \Sigma a_1     (19)

Since var(\alpha_1) can be made arbitrarily large by scaling a_1, the maximization is carried out under the normalization constraint a_1^T a_1 = ||a_1||^2 = 1. Introducing a Lagrange multiplier \lambda, the stationary points of a_1^T \Sigma a_1 - \lambda (a_1^T a_1 - 1) satisfy

    \Sigma a_1 - \lambda a_1 = 0     (20)

so a_1 must be an eigenvector of \Sigma and \lambda one of its eigenvalues \lambda_1 >= \lambda_2 >= ... >= \lambda_m >= 0, the roots of |\Sigma - \lambda I| = 0. Which eigenvector? Multiplying (20) from the left by a_1^T gives

    a_1^T \Sigma a_1 = \lambda a_1^T a_1 = \lambda     (21)

so the variance of \alpha_1 equals \lambda, and a_1 must be the eigenvector belonging to the largest eigenvalue \lambda_1.
The second component \alpha_2 = a_2^T X is required to have maximal variance subject to being uncorrelated with \alpha_1:

    E[\alpha_2 \alpha_1] - E[\alpha_2] E[\alpha_1] = 0     (22)

which, since \Sigma a_1 = \lambda_1 a_1, is equivalent to

    a_2^T a_1 = 0     (23)

i.e. a_2 must be orthogonal to a_1. With Lagrange multipliers \lambda and \phi, the function to maximize is

    a_2^T \Sigma a_2 - \lambda (a_2^T a_2 - 1) - \phi a_2^T a_1     (24)

Setting its derivative with respect to a_2 to zero gives

    2 \Sigma a_2 - 2 \lambda a_2 - \phi a_1 = 0     (25)

and multiplying (25) from the left by a_1^T yields

    2 a_1^T \Sigma a_2 - \phi = 0     (26)

Because a_1^T \Sigma a_2 = \lambda_1 a_1^T a_2 = 0, it follows that \phi = 0 and hence \Sigma a_2 = \lambda a_2: a_2 is also an eigenvector of \Sigma, orthogonal to a_1, and the variance of \alpha_2 is maximized by \lambda = \lambda_2, the second largest eigenvalue.

Continuing in the same way, the k-th principal component is

    \alpha_k = a_k^T X     (27)

where a_k is the eigenvector of \Sigma belonging to \lambda_k, and var(\alpha_k) = \lambda_k. The eigenvectors form an orthonormal set, so all components are obtained at once as

    \alpha = A^T X     (28)

with A = [a_1,...,a_m] the orthogonal matrix of eigenvectors.
Since the total variance is preserved by the orthogonal transformation,

    \sum_{i=1}^{m} var(\alpha_i) = \sum_{i=1}^{m} \lambda_i     (29)

the fraction of the total variance retained by the first k components is

    r_k = \sum_{i=1}^{k} \lambda_i / \sum_{i=1}^{m} \lambda_i     (30)

Dimensionality reduction keeps only the first d components, choosing d as the smallest k for which r_k exceeds a threshold,

    d = min { k : r_k >= threshold }     (31)

and projecting onto the corresponding eigenvectors,

    \alpha = A_d^T X     (32)

where A_d contains the first d columns of A. How should the threshold be chosen? A common rule is to retain enough components to explain a fixed percentage, say 90%, of the variance of X; Jackson (1991) surveys such rules [14], and Prakash and Murty (1995) select components with a genetic algorithm [20]. Another popular device is the scree test: the eigenvalues are plotted in decreasing order and d is taken at the "elbow" where the curve flattens, the remaining small eigenvalues being attributed to noise. Cut-off rules based on the average eigenvalue are also used.
Several practical points about PCA deserve mention.

1. Covariance versus correlation: PCA can be applied to the covariance matrix or to the correlation matrix. The covariance matrix is dominated by the features with the largest numerical ranges, so when the features are measured in different units or scales, the correlation matrix (equivalently, standardized features) is usually preferred.

2. Normalization: for the same reason, the normalization of Section 1.2 should be applied before PCA whenever the feature scales are not comparable.

3. Mean removal: equation (28) assumed zero-mean data. In general the mean \mu of X is subtracted first:

       \alpha = A^T (X - \mu)     (33)
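The whole procedure, eigendecomposition of the covariance matrix, variance ratio (30), and projection (33), can be sketched on hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical correlated 2-D data (illustration only).
X = rng.normal(size=(300, 2)) @ np.array([[3.0, 0.0], [1.0, 0.3]])

mu = X.mean(axis=0)
C = np.cov(X, rowvar=False)

# Eigenvectors of C, sorted by decreasing eigenvalue.
lam, A = np.linalg.eigh(C)
order = np.argsort(lam)[::-1]
lam, A = lam[order], A[:, order]

# Fraction of variance retained by the first component (eq. 30).
retained = lam[0] / lam.sum()

# Project onto the first d = 1 component, after mean removal (eq. 33).
alpha = (X - mu) @ A[:, :1]
```

The variance of the projected component equals the corresponding eigenvalue, and for data this elongated the first component carries almost all the variance.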
4. Computation by SVD: in practice the eigenvectors are often obtained from the singular value decomposition (SVD) of the data matrix rather than by forming the covariance matrix explicitly. Let X be the n x m data matrix (one sample per row), m its mean row vector, and 1 the n x 1 vector of ones. The sample covariance matrix is

       C = (1/(n-1)) \sum_{i=1}^{n} (x_i - m)(x_i - m)^T = (1/(n-1)) (X - 1 m^T)^T (X - 1 m^T)     (34)

   Defining

       Z = (1/\sqrt{n-1}) (X - 1 m^T)     (35)

   we have C = Z^T Z, so the right singular vectors of Z are the eigenvectors of C and the squared singular values are its eigenvalues. For correlation-based PCA the centered columns are additionally divided by their standard deviations,

       Z = (1/\sqrt{n-1}) (X - 1 m^T) D^{-1}     (36)

   where D is the diagonal matrix whose entries d_ii are the standard deviations of the features. Truncating the SVD Z = U \Sigma V^T to its first r terms and adding back the mean,

       X_r = \sqrt{n-1} U_r \Sigma_r V_r^T + 1 m^T     (37)

   gives the best rank-r reconstruction of the data from r principal components. The SVD route is numerically more stable than an explicit eigendecomposition of C.
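The equivalence between the SVD of Z in (35) and the eigendecomposition of C can be checked numerically; the data are again a hypothetical sample:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical 3-D data (illustration only).
X = rng.normal(size=(200, 3)) @ np.array([[2.0, 0.0, 0.0],
                                          [0.5, 1.0, 0.0],
                                          [0.2, 0.1, 0.05]])
n, m = X.shape
mean = X.mean(axis=0)

# Z = (X - 1 m^T) / sqrt(n - 1), so that C = Z^T Z  (eq. 35)
Z = (X - mean) / np.sqrt(n - 1)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)

# Squared singular values of Z = eigenvalues of the sample covariance.
C = np.cov(X, rowvar=False)
lam = np.linalg.eigvalsh(C)[::-1]      # descending order
```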
1.4.6 PCA with Neural Networks

PCA can also be carried out by a multilayer perceptron (MLP). An autoassociative network is trained to reproduce its input at its output through a hidden "bottleneck" layer of d < m units (Figure 2); after training, the hidden-layer activations span the same subspace as the first d principal components. With linear units this reproduces standard PCA; with additional nonlinear layers the network extracts nonlinear generalizations of it.

Figure 2. An MLP with a bottleneck hidden layer performing PCA.
1.4.7 Kernel PCA

PCA extracts only linear structure. Kernel PCA (kPCA) generalizes it to nonlinear feature extraction [23]. The input vectors are first mapped into a high-dimensional feature space F by a nonlinear map

    \Phi : R^m -> F     (38)

and ordinary linear PCA is performed in F. Assume for the moment that the mapped data \Phi(x_1),...,\Phi(x_n) are centered, \sum_{i=1}^{n} \Phi(x_i) = 0. The covariance matrix in F is

    \bar{C} = (1/n) \sum_{i=1}^{n} \Phi(x_i) \Phi(x_i)^T     (39)

and PCA in F requires the eigenvectors V in F \ {0} and eigenvalues \lambda >= 0 of

    \lambda V = \bar{C} V     (40)

Because \bar{C} V = (1/n) \sum_i (\Phi(x_i) \cdot V) \Phi(x_i) by (39), every eigenvector V with \lambda != 0 lies in the span of \Phi(x_1),...,\Phi(x_n). Equation (40) is therefore equivalent to the n equations

    \lambda (\Phi(x_i) \cdot V) = (\Phi(x_i) \cdot \bar{C} V),   i = 1,...,n     (41)

and V can be expanded with coefficients \alpha_1,...,\alpha_n as

    V = \sum_{i=1}^{n} \alpha_i \Phi(x_i)     (42)

Defining the n x n kernel matrix K by

    K_{ij} = \Phi(x_i) \cdot \Phi(x_j)     (43)

and substituting (39) and (42) into (41) gives

    n \lambda K \alpha = K^2 \alpha     (44)

where \alpha = (\alpha_1,...,\alpha_n)^T. All solutions of (44) that matter for the expansion (42) are obtained from the simpler eigenvalue problem

    n \lambda \alpha = K \alpha     (45)

Let \lambda_1 <= \lambda_2 <= ... <= \lambda_n denote the eigenvalues of K in (45) with eigenvectors \alpha^1,...,\alpha^n, and let \lambda_p be the first nonzero eigenvalue. The eigenvectors V^q in F are normalized, (V^q \cdot V^q) = 1 for q = p,...,n, which by (42) and (43) translates into a condition on the coefficients:

    1 = \sum_{i,j} \alpha_i^q \alpha_j^q (\Phi(x_i) \cdot \Phi(x_j)) = \alpha^q \cdot K \alpha^q = \lambda_q (\alpha^q \cdot \alpha^q)     (46)

To extract features, the image \Phi(x) of a test point x is projected onto the eigenvectors V^q in F:

    V^q \cdot \Phi(x) = \sum_{i=1}^{n} \alpha_i^q (\Phi(x_i) \cdot \Phi(x))     (47)

Note that neither (43) nor (47) needs \Phi explicitly: both use only dot products in F. These can be computed by a kernel function k(x,y) without ever carrying out the map \Phi, provided k corresponds to a dot product in some space F (see Section 2.4.5),

    k(x, y) = \Phi(x) \cdot \Phi(y)

With such a kernel, the q-th kernel principal component (q = p,...,n) of a point x is

    (kPC)_q(x) = V^q \cdot \Phi(x) = \sum_{i=1}^{n} \alpha_i^q k(x_i, x)     (48)

Figure 3 shows the resulting computation: the first layer evaluates the kernel values k(x, x_1),...,k(x, x_M) against the sample points, and the feature is a weighted sum of these values with the weights \alpha_i^q.

Figure 3. Feature extraction by kernel PCA: the kernel values k(x, x_i) are combined with the weights obtained from the eigenvalue problem (45) for the matrix K.
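The steps (43), (45), (46), and (48) can be sketched with an RBF kernel; the data and the kernel width are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical data on a noisy circle (a nonlinear 1-D structure in 2-D).
t = rng.uniform(0, 2 * np.pi, size=60)
X = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.normal(size=(60, 2))
n = len(X)

# RBF kernel matrix K_ij = k(x_i, x_j)  (eq. 43)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 0.5)

# Center the kernel matrix (the derivation assumed centered Phi(x_i)).
One = np.ones((n, n)) / n
Kc = K - One @ K - K @ One + One @ K @ One

# Solve the eigenvalue problem n*lambda*alpha = K*alpha  (eq. 45).
eigvals, alphas = np.linalg.eigh(Kc)           # ascending order

# Normalize the leading eigenvector so lambda_q (alpha^q . alpha^q) = 1 (eq. 46).
a1 = alphas[:, -1] / np.sqrt(eigvals[-1])

# First kernel principal component of each training point (eq. 48).
kpc1 = Kc @ a1
```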
Chapter 2: Classifiers

2.1 Introduction

Once features have been extracted and preprocessed, a classifier assigns each feature vector to one of a set of classes. Statistical pattern classification has been an active field since about 1960, and classifiers of very different character are available.

A supervised classifier is built from labeled examples, and the available data are split into several sets:

- the training set: the labeled samples from which the classifier's parameters are estimated;
- the test set: labeled samples kept apart from training and used only to estimate performance on unseen data;
- the validation set: samples used during training to decide when to stop, since training too long on the training set leads to overlearning (the classifier memorizes the training samples and generalizes poorly).

The sections below describe the classifiers most often used in practice: the multilayer perceptron (MLP), radial basis function (RBF) networks, Bayesian classifiers, the k-nearest-neighbor (kNN) rule, and support vector machines (SVM).
2.2 The Multilayer Perceptron (MLP)

2.2.1 Background

The MLP has its roots in the earliest work on artificial neurons. McCulloch and Pitts proposed a mathematical model of the neuron in 1943, showing that networks of simple threshold units can compute logical functions. Adaptive linear elements trained by error correction followed in 1959, and in 1962 Rosenblatt published the perceptron together with a convergence proof for its learning rule. In 1969, however, Minsky and Papert showed that a single-layer perceptron can realize only linearly separable functions and in particular cannot compute the XOR function; this result sharply reduced interest in neural networks for more than a decade. Interest revived in 1982 with Hopfield's recurrent network and, soon afterwards, with practical training algorithms for feedforward networks with hidden layers [25].

The MLP is such a feedforward network: simple processing units arranged in layers, each layer feeding the next. Its appeal lies in its ability to learn an essentially arbitrary input/output mapping from examples; the "knowledge" of the network is stored in its connection weights.
2.2.2 The Neuron Model

The basic processing unit has m inputs x_1,...,x_m with associated weights w_1,...,w_m, and a bias w_0 that is treated as the weight of a constant input x_0 = 1 (Figure 4).

Figure 4. A single neuron: inputs x_1,...,x_m, weights w_1,...,w_m, bias w_0.

For an input vector x, the unit first forms the activation

    a = \sum_{i=0}^{m} w_i x_i     (49)

and then produces the output (activity) y = f(a), where f is the activation function. Common choices for f are:

1. the linear function

       f(a) = a     (50)

2. the (logistic) sigmoid, which squashes the activation into (0,1),

       f(a) = 1 / (1 + e^{-a})     (51)

3. the hyperbolic tangent, with outputs in (-1,1),

       f(a) = tanh(a)     (52)

4. the hard limiter (threshold function),

       f(a) = 1 if a >= 0,  0 if a < 0     (53)

   and its symmetric version

       f(a) = 1 if a >= 0,  -1 if a < 0     (54)

Figure 5 plots these functions.

Figure 5. Activation functions: linear function, standard (logistic) sigmoid, and hard limiter.
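The neuron of (49)-(53) is small enough to write out in full; the weights below are arbitrary illustrative values:

```python
import math

def linear(a):            # eq. 50
    return a

def sigmoid(a):           # eq. 51
    return 1.0 / (1.0 + math.exp(-a))

def hard_limiter(a):      # eq. 53
    return 1 if a >= 0 else 0

def neuron(x, w, f=sigmoid):
    # Activation a = sum_i w_i x_i with the bias w[0] as weight of x_0 = 1 (eq. 49),
    # followed by the activation function y = f(a).
    a = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return f(a)

# a = 0.5 + 1.0*1.0 + 0.25*(-2.0) = 1.0, so the hard limiter outputs 1.
y = neuron([1.0, -2.0], [0.5, 1.0, 0.25], f=hard_limiter)
```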
A single neuron thus implements a discriminant function f(x;w) that maps an input x to an output y, with the weight vector w as its parameters [10]; learning means adjusting w from examples. With a hard-limiting activation the neuron separates two classes c_1 and c_2 by a linear boundary, as described next.

2.2.3 Perceptron Learning

Consider a two-class problem with m features. Collect the inputs and weights into the augmented vectors x = [1, x_1, x_2,...,x_m]^T and w = [w_0, w_1,...,w_m]^T, where the leading 1 multiplies the bias w_0. The perceptron assigns x to class c_1 or c_2 (assuming the classes are linearly separable) according to

    w^T x >= 0  =>  x in c_1     (55)
    w^T x < 0   =>  x in c_2     (56)

The equation w^T x = 0 defines a hyperplane in the space of x_1,...,x_m, and w is normal to this hyperplane. Learning consists of finding a w that satisfies (55) and (56) for all training samples. Let the k labeled samples be {(x_1,t_1), (x_2,t_2),...,(x_k,t_k)} with targets t_i in {0,1}. The perceptron learning rule visits the samples one at a time (sample x_i at step i) and updates the weights as follows:

1. If x_i is classified correctly, the weights are left unchanged:

       w_{i+1} = w_i     if w^T x_i >= 0 and x_i in c_1     (57)
       w_{i+1} = w_i     if w^T x_i < 0 and x_i in c_2     (58)

2. Otherwise the weights are corrected:

       w_{i+1} = w_i - \eta_i x_i     if w^T x_i >= 0 and x_i in c_2     (59)
       w_{i+1} = w_i + \eta_i x_i     if w^T x_i < 0 and x_i in c_1     (60)

Here \eta_i > 0 is the learning rate at step i; it may be held constant (\eta_i = \eta) or decreased during training. If the classes are linearly separable, this procedure converges to a separating weight vector in a finite number of steps [13].
Figure 6 gives a geometric picture of the rule. The inner product of w and a sample x can be written

    w^T x = ||w|| ||x|| cos(\theta)     (61)

where \theta is the angle between w and x, so the sign of w^T x says whether x lies within 90 degrees of w. If a sample with target t = 1 yields sign(w^T x) = -1, the angle between w and x exceeds 90 degrees, and adding a fraction of x to w (step (60)) rotates w toward x; subtracting it (step (59)) rotates w away. Figure 7 shows how successive corrections turn the separating hyperplane, which is always perpendicular to w, until all samples are classified correctly.

Figure 6. Rotation of the weight vector w toward a misclassified sample.
Figure 7. Successive positions of the weight vector and decision boundary during training.
The four cases (57) to (60) can be written as a single error-correction rule:

    w_{i+1} = w_i + \Delta w_i = w_i + \eta_i (t_i - y_i) x_i     (62)

where y_i is the output of the perceptron for x_i. The difference t_i - y_i vanishes for correctly classified samples and is +1 or -1 otherwise, reproducing (57) to (60); the learning rate satisfies 0 < \eta <= 1. In this form the rule adjusts w in proportion to the error e = t - y and the input that caused it.
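The error-correction rule (62) can be sketched end-to-end on a hypothetical linearly separable problem:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical linearly separable data (illustration only).
X1 = rng.normal(loc=[2, 2], scale=0.4, size=(20, 2))    # class c1, t = 1
X2 = rng.normal(loc=[-2, -2], scale=0.4, size=(20, 2))  # class c2, t = 0
X = np.vstack([np.c_[np.ones(20), X1],                  # augmented with x_0 = 1
               np.c_[np.ones(20), X2]])
t = np.array([1] * 20 + [0] * 20)

w = np.zeros(3)
eta = 0.5
for _ in range(100):                      # passes over the training set
    errors = 0
    for xi, ti in zip(X, t):
        yi = 1 if w @ xi >= 0 else 0      # hard-limiter output (eqs. 55-56)
        w += eta * (ti - yi) * xi         # error-correction rule (eq. 62)
        errors += int(ti != yi)
    if errors == 0:                       # converged: all samples correct
        break

pred = (X @ w >= 0).astype(int)
```

Because the classes are separable, the loop terminates with a weight vector that classifies every training sample correctly, as the convergence theorem guarantees.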
2.2.4 Single-Layer Networks

A single neuron separates only two classes. For multi-class problems several neurons are used in parallel, forming a single-layer network (Figure 8): all neurons share the inputs x_0 = 1, x_1,...,x_m, each output y_j has its own weight vector, and each output is trained with the error-correction rule of the previous section.

Figure 8. A single-layer network with inputs x_0,...,x_m and outputs y_1,...,y_c.

The classes are encoded in the target vectors. With binary coding, four classes need only two outputs, with targets t_i = [0,0]^T, [0,1]^T, [1,0]^T, and [1,1]^T. Figure 9 shows the result for four classes A, B, C, and D in the plane: each output realizes one linear decision boundary, and the two boundaries together cut the plane into four regions, one per code word. In general, c classes can be coded in about log2(c) outputs, but every boundary implied by the code must itself be realizable by a single neuron, which restricts the class geometries this coding can handle.

Figure 9. Decision regions of a two-output single-layer network for four classes A, B, C, D (one linear boundary per output).

With three outputs and targets running from t_i = [0,0,0]^T to t_i = [1,1,1]^T, up to 7 classes can be coded. Figure 10 shows such a case: the boundaries L1 and L2 realized by two of the outputs divide the plane into regions, the region with code [1,1,0]^T among them. The alternative is one-of-c coding, with one output per class and the target of class i having a 1 only in position i; it needs more outputs but places far weaker demands on the geometry.

Figure 10. Decision regions formed by the boundaries L1 and L2.
2.2.5 Multilayer Networks

Single-layer networks realize only linear boundaries. Adding one or more hidden layers between input and output (Figure 11) removes this limitation: the hidden units transform the input nonlinearly, and the output layer combines the transformed values. The networks considered here are fully connected, each unit receiving the outputs of all units in the previous layer.

Figure 11. A fully connected network with one hidden layer.

A theoretical justification is provided by Kolmogorov's theorem of 1957 on the representation of continuous functions, which in network terms states: "any continuous function f:[0,1]^n -> R^m can be represented exactly by a network with n inputs, one hidden layer of 2n+1 units, and m outputs", for suitable problem-dependent unit functions. The theorem guarantees representational power but gives no practical way of finding the unit functions or the weights; in practice the hidden units use fixed sigmoidal activations and the weights are learned from examples [4].
2.2.6 Training

An MLP is trained by minimizing a cost that measures the disagreement between the network outputs and the targets t_i over the training set. The usual cost is the sum-of-squares error

    E(w) = (1/2) \sum_r e_r^2     (63)

where e_r is the error of the network on the r-th training sample and w collects all weights. Training is thus a nonlinear optimization problem in w.

Gradient methods are the standard tool: at each step the weights are moved a short distance in the direction of steepest descent of E(w), as sketched in Figure 12.

Figure 12. Gradient descent on the error surface E(w).

Two difficulties of the error surface deserve mention:

1. Local minima: the surface generally has many minima, and gradient descent finds only a local one (Figure 13); restarting from several initial weight vectors is the common remedy.
2. Flat regions: plateaus of the surface make the gradient small and progress slow.

Figure 13. An error surface with local and global minima.

More generally, learning procedures for neural networks fall into five basic classes:

1. Error-Correction Learning
2. Memory-Based Learning
3. Hebbian Learning
4. Competitive Learning
5. Boltzmann Learning

The MLP is trained by error correction, in the form of the back-propagation (BP) algorithm, which propagates the output errors backwards through the layers to compute the gradient of (63) with respect to every weight.
2.2.7 Variants of Back Propagation

Plain gradient descent is slow, and many refinements of BP are in common use: adding a momentum term, adapting the learning rate during training, and second-order methods. The Levenberg-Marquardt (L.M.) algorithm is the best known of the latter; it approximates the Hessian (the matrix of second derivatives of the cost (63)) from first derivatives and interpolates between gradient descent and Newton's method, converging much faster than plain BP on small and medium networks at the price of more memory. Regularization, adding to the cost a term that penalizes large weights, can be combined with any of these methods to improve generalization.
2.3 Radial Basis Function Networks

2.3.1 Structure

A radial basis function (RBF) network has a single hidden layer whose units compute radially symmetric functions of the distance between the input and a unit-specific center; a Gaussian is the usual choice. The output layer forms linear combinations of the hidden-unit responses. RBF networks have been applied to classification, function approximation, and time-series prediction [24]. Compared with the MLP, each hidden unit responds only locally (to inputs near its center), and training is typically much faster because the two layers can be trained separately.

2.3.2 Training

Training an RBF network involves three groups of parameters: the centers of the hidden units, their widths, and the output-layer weights. The centers are usually placed by an unsupervised clustering of the training inputs, after which the output weights are found by solving a linear least-squares problem. The width (spread) parameter controls how far from its center a unit responds; for a Gaussian unit of the form exp(-x^2), the response falls to 0.5 at distance sqrt(-ln(0.5)) times the spread, and the spreads are chosen so that neighboring units overlap smoothly. Two clustering algorithms commonly used to place the centers are k-Means and Fuzzy C-Means [2].
k-Means: k-Means clustering partitions n vectors x_k into c groups G_i, i = 1, 2,...,c, each represented by a center c_i, so as to minimize a cost built from a dissimilarity measure d(x_k, c_i) between each vector and the center of its group:

    J = \sum_{i=1}^{c} J_i = \sum_{i=1}^{c} \sum_{k, x_k in G_i} d(x_k, c_i)     (64)

With the squared Euclidean distance as dissimilarity measure, the cost becomes

    J = \sum_{i=1}^{c} J_i = \sum_{i=1}^{c} \sum_{k, x_k in G_i} ||x_k - c_i||^2     (65)

The partition can be described by a c x n membership matrix U whose element u_ij is 1 if vector x_j belongs to group i and 0 otherwise. Since each vector belongs to the group with the nearest center,

    u_ij = 1 if ||x_j - c_i||^2 <= ||x_j - c_k||^2 for each k != i, and 0 otherwise     (66)

Every vector belongs to exactly one group, so U satisfies

    \sum_{i=1}^{c} u_ij = 1,   j = 1,...,n     (67)
    \sum_{i=1}^{c} \sum_{j=1}^{n} u_ij = n     (68)

For a fixed partition, the cost-minimizing center of group i is the mean of its members:

    c_i = (1/|G_i|) \sum_{k, x_k in G_i} x_k     (69)

where |G_i| is the number of vectors in G_i. The algorithm alternates the two steps: choose c initial centers; assign each vector to the nearest center, giving U; recompute the centers by (69); and repeat until U (or J) no longer changes. The result depends on the initial centers, so the algorithm is often run several times from different initializations and the best run is kept.
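The alternation of the assignment step (66) and the update step (69) can be sketched on hypothetical data; the corner-based initialization is an illustrative choice that keeps the example deterministic:

```python
import numpy as np

rng = np.random.default_rng(6)
# Two hypothetical, well-separated blobs (illustration only).
X = np.vstack([rng.normal([0, 0], 0.3, (50, 2)),
               rng.normal([5, 5], 0.3, (50, 2))])
c = 2
# Simple deterministic initialization at the corners of the data range.
centers = np.array([X.min(axis=0), X.max(axis=0)])

for _ in range(50):
    # Assignment step (eq. 66): each vector joins the nearest center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    labels = d2.argmin(axis=1)
    # Update step (eq. 69): each center becomes the mean of its members.
    new_centers = np.array([X[labels == i].mean(axis=0) for i in range(c)])
    if np.allclose(new_centers, centers):
        break
    centers = new_centers

J = ((X - centers[labels]) ** 2).sum()   # final cost (eq. 65)
```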
Fuzzy C-Means (FCM): Fuzzy C-Means generalizes k-Means by letting each vector belong to every cluster with a degree of membership between 0 and 1. The membership matrix U now has entries u_ij in [0,1], still subject to the constraint

    \sum_{i=1}^{c} u_ij = 1,   j = 1,...,n     (70)

The cost to minimize is

    J(U, c_1, c_2,...,c_c) = \sum_{i=1}^{c} J_i = \sum_{i=1}^{c} \sum_{j=1}^{n} u_ij^m d_ij^2     (71)

where d_ij = ||c_i - x_j|| is the distance between center c_i and vector x_j, and the weighting exponent m in (1, infinity) controls the fuzziness of the partition. Minimizing (71) subject to (70) yields the update equations

    c_i = \sum_{j=1}^{n} u_ij^m x_j / \sum_{j=1}^{n} u_ij^m     (72)

    u_ij = 1 / \sum_{k=1}^{c} (d_ij / d_kj)^{2/(m-1)}     (73)

The algorithm starts from a membership matrix U satisfying (70), computes the centers by (72), evaluates the cost (71), updates the memberships by (73), and iterates until U (or the cost) changes by less than a tolerance. Like k-Means, FCM converges only to a local minimum and is sensitive to the initialization.
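The alternating updates (72) and (73) can be sketched as follows; the data, exponent m = 2, and tolerance are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(7)
# Two hypothetical, well-separated blobs (illustration only).
X = np.vstack([rng.normal([0, 0], 0.3, (40, 2)),
               rng.normal([4, 0], 0.3, (40, 2))])
n, c, m = len(X), 2, 2.0

# Random memberships satisfying sum_i u_ij = 1  (eq. 70)
U = rng.random((c, n))
U /= U.sum(axis=0)

for _ in range(100):
    Um = U ** m
    centers = (Um @ X) / Um.sum(axis=1, keepdims=True)          # eq. 72
    d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
    d = np.maximum(d, 1e-12)                                    # avoid 0-division
    # u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))                    (eq. 73)
    U_new = 1.0 / ((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1))).sum(axis=1)
    if np.abs(U_new - U).max() < 1e-8:
        U = U_new
        break
    U = U_new
```

The update (73) keeps the column sums of U exactly 1, and on well-separated data the two centers converge to the two blobs.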
2.4 Support Vector Classifiers

2.4.1 Optimal Separating Hyperplanes

Consider a two-class problem with n training pairs (x_1,y_1), (x_2,y_2),...,(x_n,y_n), where x_i in R^m and the labels are y_i in {-1,1}. A separating hyperplane is the set

    {x : f(x) = x^T w + w_0 = 0},   ||w|| = 1     (74)

and the classifier assigns the label +1 or -1 according to the side of this decision boundary on which x falls. When the classes are linearly separable, infinitely many separating hyperplanes exist; the optimal one is the hyperplane whose margin, the distance to the closest training point of either class, is largest, since a large margin can be expected to give good generalization. Formally [12]:

    max_{w, w_0, ||w||=1} C
    subject to  y_i (x_i^T w + w_0) >= C,   i = 1,...,n     (75)

The constraints require every point to lie at signed distance at least C from the boundary defined by w and w_0. The condition ||w|| = 1 can be dropped by dividing through by ||w||,

    (1/||w||) y_i (x_i^T w + w_0) >= C     (76)

or, after redefining w_0,

    y_i (x_i^T w + w_0) >= C ||w||     (77)

Any positively scaled (w, w_0) satisfying (77) can be rescaled, so we may set ||w|| = 1/C, and (75) becomes the convex quadratic problem

    min_{w, w_0} (1/2) ||w||^2
    subject to  y_i (x_i^T w + w_0) >= 1,   i = 1,...,n     (78)

The margin of the resulting hyperplane is 1/||w|| on each side (Figure 14). The Lagrangian (primal) function, to be minimized over w and w_0, is

    L_P = (1/2) ||w||^2 - \sum_{i=1}^{n} \alpha_i [ y_i (x_i^T w + w_0) - 1 ]     (79)

Setting the derivatives with respect to w and w_0 to zero gives

    w = \sum_{i=1}^{n} \alpha_i y_i x_i     (80)
    0 = \sum_{i=1}^{n} \alpha_i y_i     (81)

and substituting these back into (79) yields the Wolfe dual

    L_D = \sum_{i=1}^{n} \alpha_i - (1/2) \sum_{i=1}^{n} \sum_{k=1}^{n} \alpha_i \alpha_k y_i y_k x_i^T x_k     (82)

which is maximized subject to \alpha_i >= 0 and (81), a simpler convex problem posed on the positive orthant. The solution must also satisfy the Karush-Kuhn-Tucker complementarity condition

    \alpha_i [ y_i (x_i^T w + w_0) - 1 ] = 0,   i = 1,...,n     (83)

which implies:

- if \alpha_i > 0, then y_i (x_i^T w + w_0) = 1, i.e. x_i lies exactly on the margin boundary;
- if y_i (x_i^T w + w_0) > 1, i.e. x_i is away from the margin, then \alpha_i = 0.

By (80), w is a linear combination of only those x_i with \alpha_i > 0; these points are the support vectors. The offset w_0 is obtained from (83) using any support vector. In Figure 14 three points are support vectors; all the remaining points could be moved (without crossing the margin) or removed without changing the solution.

Figure 14. The optimal separating hyperplane f(x) = w^T x + w_0 = 0 with margin boundaries H_1: w^T x + w_0 = +1 and H_2: w^T x + w_0 = -1, at distance 1/||w|| on each side.

Given the solution, the classifier thresholds f(x) = x^T w + w_0 at zero:

    G(x) = sign f(x)     (84)

That points away from the margin have no influence on the boundary distinguishes this classifier from methods such as LDA, where the boundary depends on all the data through the class means and covariances.
2.4.2 Soft-Margin Hyperplanes

When the classes overlap in feature space, no separating hyperplane exists and the constraints of (75) cannot all be met. The standard remedy is to allow some points to violate the margin by slack amounts \xi = (\xi_1, \xi_2,...,\xi_n) (Figure 15). Two ways of relaxing the constraint in (75) suggest themselves:

    y_i (x_i^T w + w_0) >= C - \xi_i     (85)
    y_i (x_i^T w + w_0) >= C (1 - \xi_i)     (86)

with \xi_i >= 0 and \sum_{i=1}^{n} \xi_i bounded. The two choices lead to different problems; the second, (86), measures each violation as a fraction of the margin and leads to a convex problem, the standard support vector classifier. A point has \xi_i = 0 when it is on the correct side of its margin, 0 < \xi_i <= 1 when it is inside the margin but still correctly classified, and \xi_i > 1 when it is misclassified. Thus \xi_i is the proportional amount by which the prediction f(x_i) = x_i^T w + w_0 falls on the wrong side of its margin, and bounding \sum_i \xi_i <= K bounds the total violation, so that at most K training points can be misclassified.

Figure 15. Overlapping classes: points inside the margin or on the wrong side of the boundary receive slacks \xi_i > 0.

As before, set C = 1/||w||; then (78) becomes

    min_{w, w_0} (1/2) ||w||^2
    subject to  y_i (x_i^T w + w_0) >= 1 - \xi_i,  \xi_i >= 0,  \sum_i \xi_i <= K     (87)

Problem (87) is again convex (quadratic objective, linear constraints) and reduces to the separable problem (78) when the classes are separable. For computation an equivalent form is used, with the bound on \sum \xi_i replaced by a penalty with coefficient \gamma:

    min_{w, w_0} (1/2) ||w||^2 + \gamma \sum_{i=1}^{n} \xi_i
    subject to  \xi_i >= 0,  y_i (x_i^T w + w_0) >= 1 - \xi_i,   i = 1,...,n     (88)

where \gamma plays the role of K in (87); the separable case corresponds to \gamma = infinity. The Lagrangian (primal) is

    L_P = (1/2) ||w||^2 + \gamma \sum_{i=1}^{n} \xi_i - \sum_{i=1}^{n} \alpha_i [ y_i (x_i^T w + w_0) - (1 - \xi_i) ] - \sum_{i=1}^{n} \mu_i \xi_i     (89)

to be minimized over w, w_0, and the \xi_i. Setting the respective derivatives to zero gives

    w = \sum_{i=1}^{n} \alpha_i y_i x_i     (90)
    0 = \sum_{i=1}^{n} \alpha_i y_i     (91)
    \alpha_i = \gamma - \mu_i,   for all i     (92)

together with the positivity constraints \alpha_i, \mu_i, \xi_i >= 0 for all i. Substituting (90) to (92) into (89) gives the Wolfe dual objective

    L_D = \sum_{i=1}^{n} \alpha_i - (1/2) \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j x_i^T x_j     (93)

which is maximized subject to 0 <= \alpha_i <= \gamma and \sum_{i=1}^{n} \alpha_i y_i = 0. The Karush-Kuhn-Tucker conditions add the complementarity constraints

    \alpha_i [ y_i (x_i^T w + w_0) - (1 - \xi_i) ] = 0     (94)
    \mu_i \xi_i = 0     (95)
    y_i (x_i^T w + w_0) - (1 - \xi_i) >= 0     (96)

for i = 1,...,n. Together, (90) to (96) characterize the solution w = \sum_i \alpha_i y_i x_i. As in the separable case, only the points with \alpha_i > 0 contribute to w; these support vectors either lie exactly on the margin (\xi_i = 0, and then 0 < \alpha_i <= \gamma by (92) and (95)) or violate it (\xi_i > 0, which forces \mu_i = 0 and hence \alpha_i = \gamma). Points strictly outside the margin have \alpha_i = 0. Any margin support vector (one with 0 < \alpha_i < \gamma, \xi_i = 0) can be used in (94) to compute w_0.

Maximizing the dual (93) under its box and equality constraints is a standard convex quadratic program, solvable with general-purpose quadratic programming software (e.g. Murray, 1981). Given the solutions w and w_0, the classifier is

    G(x) = sign[f(x)] = sign[x^T w + w_0]     (98)

The tuning parameter \gamma in (88) controls the trade-off between a wide margin and few margin violations; it is usually chosen by cross validation.
2.4.3 Support Vector Machines (SVM)

The support vector classifier of the previous sections is linear. The support vector machine (SVM) extends it by mapping the inputs into a high-dimensional feature space and constructing the optimal hyperplane there, so that a linear boundary in the feature space corresponds to a nonlinear boundary in the input space. Two observations make this practical:

1. The dual problem and the resulting classifier involve the training points only through inner products, so the mapping never needs to be carried out explicitly.
2. Although the feature space may have very high (even infinite) dimension, the margin maximization controls the complexity of the solution, so the classifier does not necessarily overfit.

The next two sections develop these points: first the explicit construction with basis functions, then its kernel formulation.
2.4.4 Nonlinear Mappings

Let the input vector x with m components be mapped through M basis functions \phi_j(x), j = 1,...,M, usually with M > m: by Cover's theorem, a classification problem cast nonlinearly into a higher-dimensional space is more likely to be linearly separable there. Each training point x_i, i = 1,...,n, is represented by the vector \phi(x_i) = [\phi_1(x_i), \phi_2(x_i),...,\phi_M(x_i)]^T, and the support vector classifier is built in the new space exactly as before. The result is the (nonlinear in x) function f(x) = \phi(x)^T w + w_0, with \phi_0(x) = 1 accounting for w_0, and the classifier G(x) = sign f(x).
2.4.5 Kernels

The SVM optimization problem is that of (89) with x_i replaced by \phi(x_i). In the dual (93) the mapped vectors enter only through inner products:

    L_D = \sum_{i=1}^{n} \alpha_i - (1/2) \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j < \phi(x_i), \phi(x_j) >     (99)

and by (90) the solution function has the same property:

    f(x) = \phi(x)^T w + w_0 = \sum_{i=1}^{n} \alpha_i y_i < \phi(x), \phi(x_i) > + w_0     (100)

where w_0 can be determined from (100) using any point x_i with 0 < \alpha_i < \gamma, for which y_i f(x_i) = 1. Thus (99) and (100) involve \phi(x) only through inner products, and it suffices to know the kernel function

    K(x, x') = < \phi(x), \phi(x') > = \phi(x)^T \phi(x') = \sum_j \phi_j(x) \phi_j(x')     (101)

that computes these inner products directly in the input space; the basis functions themselves are never needed. The classifier function becomes

    f(x) = \sum_{i=1}^{n} \alpha_i y_i K(x, x_i) + w_0     (102)

Figure 16 shows the corresponding network interpretation: a layer of kernel units K(x, x_i), one per support vector, feeding a linear output [13].

Figure 16. The SVM as a network of kernel units K(x, x_1),...,K(x, x_M) acting on the input x = [x_1, x_2,...,x_m]^T [13].

Three popular kernels are the polynomial kernel of degree d (103), the radial basis function (RBF, Gaussian) kernel (104), and the sigmoid kernel (105):

    K(x, x') = (1 + <x, x'>)^d     (103)
    K(x, x') = exp( -||x - x'||^2 / c )     (104)
    K(x, x') = tanh( \kappa_1 <x, x'> + \kappa_2 )     (105)
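The three kernels (103)-(105) are one-liners; the parameter values below are illustrative defaults, not prescriptions:

```python
import numpy as np

def poly_kernel(x, xp, d=2):
    # Polynomial kernel of degree d (eq. 103)
    return (1.0 + x @ xp) ** d

def rbf_kernel(x, xp, c=1.0):
    # RBF (Gaussian) kernel with width parameter c (eq. 104)
    return np.exp(-np.sum((x - xp) ** 2) / c)

def sigmoid_kernel(x, xp, k1=1.0, k2=0.0):
    # Sigmoid kernel with parameters kappa_1, kappa_2 (eq. 105)
    return np.tanh(k1 * (x @ xp) + k2)

x = np.array([1.0, 0.0])
xp = np.array([0.0, 1.0])
```

Plugging any of these into (102) in place of the inner product turns the linear support vector classifier into a nonlinear one.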
A few remarks on kernels:

1. Not every function K(x, x') is admissible: K must correspond to an inner product in some feature space, which by Mercer's condition requires K to be symmetric and positive semidefinite. The polynomial and RBF kernels satisfy this for all parameter values; the sigmoid kernel only for some.
2. The kernel replaces the explicit choice of basis functions, so the feature space is selected implicitly and may have very high or even infinite dimension without any computational penalty.
3. With an RBF kernel the SVM resembles an RBF network, and with a sigmoid kernel an MLP; the difference lies in how the "hidden units" are determined. In the SVM they are placed on the support vectors and weighted by the \alpha_i y_i obtained from the convex dual problem, so the conceptual problem of local minima and the computational problem of choosing the number of hidden units in advance both disappear: they are settled by the optimization itself.
Figure 17. SVM with a 4th-degree polynomial kernel: decision boundary and margins in the input space (left) and the corresponding linear separation in the feature space (right).
-5-2
-1-5-2
54
2 3
. ) (
.
...
.
.
.
-2-5-2
c )( c ... 2 1 x
. c ) i = 1,..., c P(i x .
x ) . a
(posteriori x (.
) P(i x .
)
( .
) ( c = 2 :
) ) P(1 ) ( P(2 : k
n1 n 2 1 2
:
)(106
n1
n
, P(2 ) = 2
n
n
= ) P(1
bayesian
Bayes
3
Probability Density Function
2
55
:
)(108
) p( x i ) P( i
) p(x
= ) P( i x
) P( i x )
( ) (108 .
. :
) P(1 | x ) > P( 2 | x x 1
) P(1 | x ) < P( 2 | x x 2
) (108
) p( x | 1 ) P( 1 ) > p(x | 2 ) P( 2 x 1
) p( x | 1 ) P( 1 ) < p(x | 2 ) P( 2 x 2
. p( 1 ) = p( 2 ) = 1 ) p(x | 1 ) p(x | 2
) .(18 18 x < x 0
x > x 0 .
56
) p(x | 1
) p(x | 2
.
PDF
. PDF
.
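The two-class rule above can be sketched with hypothetical one-dimensional Gaussian likelihoods; the means, widths, and class counts are illustrative values only:

```python
import math

def gauss(x, mu, sigma):
    # One-dimensional normal density, used here as p(x | w_i).
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Priors from hypothetical class counts (eq. 106)
n1, n2 = 60, 40
P1, P2 = n1 / 100, n2 / 100

def classify(x):
    # Compare p(x | w_i) P(w_i); the common denominator p(x) cancels (eq. 108).
    return 1 if gauss(x, 0.0, 1.0) * P1 > gauss(x, 4.0, 1.0) * P2 else 2

label = classify(0.5)
```

Points near one class mean are assigned to that class; the threshold between the two decisions plays the role of x_0 in Figure 18.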
2.5.3 Gaussian Densities

The commonest parametric assumption is that each class-conditional density is a multivariate normal distribution in the m features:

    p(x | \omega_i) = (1 / ((2\pi)^{m/2} |\Sigma_i|^{1/2})) exp( -(1/2) (x - \mu_i)^T \Sigma_i^{-1} (x - \mu_i) ),   i = 1,...,c     (109)

where \mu_i = E[x] is the mean vector of class \omega_i and \Sigma_i is its m x m covariance matrix:

    \Sigma_i = E[ (x - \mu_i)(x - \mu_i)^T ]     (110)

Both are estimated from the training samples of class \omega_i.
2.6 k-Nearest-Neighbor Methods

2.6.1 Introduction

When no parametric form can be assumed for the class-conditional densities, they can be estimated nonparametrically, directly from the training samples. The k-nearest-neighbor approach estimates the density at a point from the volume needed to enclose its k nearest training samples.

2.6.2 kNN Density Estimation

Let p(x) be the density to be estimated from n samples. The probability that a sample falls in a region of volume V around x is [27]

    P = \int_V p(x') dx'     (111)

If the region is small enough that p is approximately constant over it,

    P ~ p(x) V     (112)

On the other hand, if k of the n samples fall in the region, the probability is estimated by the relative frequency

    P ~ k / n     (113)

Combining (112) and (113) gives the density estimate

    p(x) ~ k / (n V)     (114)

In the kNN method, k is fixed and V is taken as the volume of the smallest region around x that contains k samples, i.e. the sphere through the k-th nearest neighbor of x. The frequency estimate (113) improves as n grows, and for the overall estimate to converge to the true density, k must grow with n, but more slowly:

    lim_{n -> inf} k = inf     (115)
    lim_{n -> inf} k / n = 0     (116)

A common choice satisfying both conditions is k = sqrt(n).
2.6.3 The kNN Classifier

The estimate (114) turns into a classifier as follows. Around the point x to be classified, take the smallest sphere, of volume V, that contains k training samples, and let k_i of them belong to class \omega_i (so \sum_{i=1}^{c} k_i = k). If n_i of the n training samples belong to \omega_i (\sum_{i=1}^{c} n_i = n), the class-conditional densities are estimated by

    p(x | \omega_i) = k_i / (n_i V)     (117)

and the priors by

    P(\omega_i) = n_i / n     (118)

The Bayes rule assigns x to \omega_i when

    p(x | \omega_i) P(\omega_i) >= p(x | \omega_j) P(\omega_j),   for all j     (119)

that is, substituting (117) and (118),

    (k_i / (n_i V)) (n_i / n) >= (k_j / (n_j V)) (n_j / n),   for all j     (120)

which reduces to

    k_i > k_j,   for all j != i     (121)

so x is simply assigned to the class most frequent among its k nearest neighbors. The resulting procedure for classifying a point x given n labeled samples from c classes is:

- compute the distances from x to all n training samples;
- find the k samples nearest to x;
- count the number k_i of them belonging to each class \omega_i, i = 1,...,c (with \sum_i k_i = k);
- assign x to the class with the largest k_i.

For k = 1 this is the nearest-neighbor rule: x receives the class of its single nearest training sample. Despite its simplicity the kNN rule performs remarkably well; for large n its error rate is at most about twice the optimal (Bayes) error rate [8].
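The four steps above can be sketched directly; the training data are a hypothetical two-class sample:

```python
import numpy as np

rng = np.random.default_rng(8)
# Hypothetical training data: two well-separated 2-D classes.
X = np.vstack([rng.normal([0, 0], 0.5, (30, 2)),
               rng.normal([3, 3], 0.5, (30, 2))])
y = np.array([0] * 30 + [1] * 30)

def knn_classify(x, X, y, k=5):
    # Distances from x to all n training samples.
    d = np.linalg.norm(X - x, axis=1)
    # Classes of the k nearest neighbors.
    nearest = y[np.argsort(d)[:k]]
    # Assign the class with the largest count k_i  (eq. 121).
    return np.bincount(nearest).argmax()

label = knn_classify(np.array([2.8, 3.1]), X, y, k=5)
```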
2.7 Parzen Windows

2.7.1 Introduction

The kNN estimate of the previous section fixes k and lets the volume V adapt to the data. The Parzen-window method makes the opposite choice: the region around x, and hence its volume V, is fixed, and the number k of samples falling in it is counted. Both are nonparametric PDF estimates, and either estimated density can be inserted into (117) and the Bayes rule to build a classifier.

2.7.2 Parzen Density Estimation

Start again from the estimate of the density at x from n samples [8],

    p(x) = k / (n V)     (122)

but now fix V and count k. Let the region be a hypercube of side h in the m-dimensional feature space, so that

    V = h^m     (123)

and define the window function

    \varphi(u) = 1 if |u_j| < 1/2 for every component u_j, and 0 otherwise     (124)

so that \varphi(u) is 1 inside the unit hypercube centered at the origin and 0 outside it.
With this window, \varphi((x - x_i)/h) equals 1 exactly when the sample x_i falls inside the hypercube of side h centered at x, so the number of samples in the region is

    k = \sum_{i=1}^{n} \varphi( (x - x_i) / h )     (125)

and substituting into (122) gives the Parzen estimate

    p(x) = (1/n) \sum_{i=1}^{n} (1/V) \varphi( (x - x_i) / h )     (126)

The estimate is thus a superposition of window functions, one centered on each sample. The hypercube window (124) produces discontinuous estimates; any function satisfying

    \varphi(u) >= 0     (127)
    \int \varphi(u) du = 1     (128)

can be used instead, conditions (127) and (128) guaranteeing that the estimate p(x) is itself a valid density. A Gaussian window is the usual smooth choice.

The width h controls the smoothness of the estimate: a small h gives a spiky p(x) that follows the individual samples, while a large h oversmooths and hides the structure of p(x). Figure 19 shows Gaussian-window estimates computed from 5 samples for three widths.

Figure 19. Parzen estimates from 5 samples with a Gaussian window: (a) h = 0.2, (b) h = 0.5, (c) h = 1.

In summary: the Parzen method fixes the volume V and estimates p(x) from the number of samples falling in the region, by (122), whereas the kNN method fixes k and lets the volume adapt; under the appropriate conditions both converge to p(x) as n grows.
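The estimate (126) with a Gaussian window can be sketched in one dimension; the five sample values reproduce the Figure 19 setting with hypothetical numbers:

```python
import numpy as np

# Five hypothetical 1-D samples (m = 1), as in the Figure 19 setting.
samples = np.array([-1.0, -0.5, 0.0, 0.8, 2.0])
n = len(samples)

def parzen(x, h):
    # Gaussian window satisfying (127)-(128); here V = h^m with m = 1,
    # so each term is phi((x - x_i)/h) / (n h)  (eq. 126).
    u = (x - samples) / h
    phi = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    return phi.sum() / (n * h)

xs = np.linspace(-4, 5, 901)
p = np.array([parzen(x, 0.5) for x in xs])

# The estimate is a density: nonnegative and integrating to ~1.
dx = xs[1] - xs[0]
area = p.sum() * dx
```

Re-evaluating with h = 0.2 or h = 1 instead of 0.5 reproduces the under- and over-smoothed curves of Figure 19.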
References

[1] [Persian-language source], 1383 (2004).
[4] Bishop C. M., Neural Networks for Pattern Recognition, Oxford University Press, 1995.
[5] Brown M., Layered Perceptron Networks and the Error Back Propagation
Algorithm, Tutorial, Oct., 1996, available here:
neuron.tuke.sk/math.chtf.stuba.sk/pub/vlado/NN_books_texts/Brown_tutotrial_perceptr.pdf
[6] Couto A., Current Status of Electronic Nose: Research and Applications,
Report on Microelectronic Sensors, 2000.
[7] Demuth H. and Beale M., Neural Network Toolbox for Use with MATLAB (Release 13).
[8] Duda R. O., Hart P. E., and Stork D. G., Pattern Classification, 2nd edition,
John Wiley, 2000.
[9] Eldar Y.C. and Oppenheim A.V., MMSE Whitening and Subspace Whitening,
IEEE Trans. Inform. Theory 49, pp. 1846-1851, Jul., 2003.
[10] Hagan M. T., Demuth H. B., and Beale M. H., Neural Network Design,
PWS Pub., Boston, MA., 1996.
[11] Hall M. A., Correlation-based Feature Selection for Discrete and Numeric
Class Machine Learning, PhD thesis, Department of Computer Science,
University of Waikato, Hamilton, New Zealand, 1998.
[12] Hastie T., Tibshirani R., and Friedman J., The Elements of Statistical
Learning: Data Mining, Inference, and Prediction, Springer, New York, 2001.
[13] Haykin S., Neural Networks- A Comprehensive Foundation, Macmillan
College Publishing Company Inc, New York, 1994.
[14] Jackson J. E., A User's Guide to Principal Components, Wiley, New York,
1991.
[15] Jantzen J., Introduction to Perceptron Networks, Tech. report No. 98-H 873,
Technical University of Denmark, Oct., 1998.
[16] Jollife I.T., Principal Components Analysis, Springer-Verlag, New York,
1986.
[17] Kermani B. Gh., On Using Artificial Neural Networks And Genetic Algorithm
Electronic Nose, PhD Dissertation, North Carolina State University, 1996.
[18] Ma J., Zhao Y., and Ahalt S. C., OSU SVM Classifier Matlab Toolbox (ver
3.00), available at: http://www.ece.osu.edu/~maj/osu_svm/
[19] Oursland A., De Paula J., and Mahmood J., Case Studies of Independent
Component Analysis, report for CS383C Numerical Analysis of Linear
Algebra, available here: http://www.oursland.net/tutorials/ica/index.html
[20] Prakash M. and Murty N., A genetic approach for selection of (near-) optimal
subsets of principal components for discrimination, Pattern Recogn. Lett. 16,
pp. 781-787, 1995.
[21] Rose-Pehrsson, Susan L. and Di Lella, Smart sensor system and method using
surface acoustic wave vapor sensor array and pattern recognition for selective
trace organic vapor detection, US patent No. 5,469,369, 1995.
[22] Scholkopf B., Smola A., and Muller K.R, Nonlinear Component Analysis as a
Kernel Eigenvalue Problem, Technical Report No. 44, Max-Planck-Institute,
Dec., 1996.
[23] Scholkopf B., Smola A., and Muller K.R., Kernel Principal Component
Analysis, In B. Scholkopf, C. J. C. Burges and A. J. Smola, editors, Advances
in Kernel Methods - Support Vector Learning, pp. 327-352, MIT Press,
1999.
[24] Sahin F., A Radial Basis Function Approach to a Color Image Classification
Problem in a Real Time Industrial Application, Master's thesis, Virginia
Polytechnic Institute, June, 1997. available here:
http://scholar.lib.vt.edu/theses/available/etd-6197-223641/
[25] Sima J., Introduction to Neural Networks, Technical report, No. V-755, Aug.
1998.
[26] Theodoridis S. and Koutroumbas K., Pattern Recognition, Academic Press,
1999.
[27] Webb A., Statistical Pattern Recognition, 2nd edition, John Wiley, 2002.
[28] Zhao R., Chemical sensors for the detection of volatile organic compounds
(VOCs), ECE, Auburn University, AL 36849.
[29] Electrochemistry Encyclopedia, http://electrochem.cwru.edu/ed/encycl.
[30] Electronic Nose., http://www.eng.warwick.ac.uk/srl/electronic_nose.htm
[31] Nose II, http://www.nose-network.org/review/
[32] Memory and Senses,
http://www.agen.ufl.edu/~chyn/age2062/lect/lect_24/lecture_24.htm