
Chapter 8

Fuzzy Associative Memories


Li Lin
2004-11-24

CONTENTS

Review
Fuzzy systems as between-cube mappings
Fuzzy and neural function estimators
Fuzzy Hebb FAMs
Adaptive FAMs

Review

Chapter 2 introduced the BAM theorem
Chapter 7 discussed fuzzy sets as points in the unit hypercube
What are associative memories?

Fuzzy systems

Kosko: fuzzy systems as between-cube mappings, from an input universe of discourse to an output universe of discourse

Fig.1 A fuzzy system

Continuous fuzzy systems behave as associative memories, or fuzzy associative memories (FAMs).

Fuzzy and neural function estimators

Fuzzy and neural systems estimate sampled functions and behave as associative memories

Similarities:
1. They are model-free estimators
2. They learn from samples
3. They are numerical, unlike symbolic AI

Differences:
They differ in how they estimate the sampled function
1. During system construction
2. The kind of samples used
3. Application
4. How they represent and store those samples
5. How they perform associative inference

Fig.2 Function f maps domain X to range Y

Neural vs. fuzzy representation of structured knowledge

Neural network problems:
1. computational burden of training
2. system inscrutability
   There is no natural inferential audit trail; the network behaves like a computational black box.
3. sample generation

Neural vs. fuzzy representation of structured knowledge

Fuzzy systems:
1. directly encode the linguistic sample (HEAVY, LONGER) in a matrix
2. combine the numerical approach with the symbolic one

The fuzzy approach does not abandon neural networks; it limits them to unstructured parameter and state estimation, pattern recognition, and cluster formation.

FAMs as mappings

Fuzzy associative memories are transformations
FAMs map fuzzy sets to fuzzy sets, unit cube to unit cube
The associative matrices are accessed in parallel and stored separately
Numerical point inputs permit this simplification: binary input-output FAMs, or BIOFAMs

FAMs as mappings

Fig.3 Three possible fuzzy subsets of traffic density (Light, Medium, Heavy over the space X = [0, 200]) and of green-light duration (Short, Medium, Long over the space Y = [0, 40])

Fuzzy vector-matrix multiplication: max-min composition

Max-min composition:
B = A ∘ M
where A = (a_1, ..., a_n), B = (b_1, ..., b_p), and M is an n-by-p fuzzy matrix (a point in I^(n×p)):

b_j = max_{1≤i≤n} min(a_i, m_ij)

Fuzzy vector-matrix multiplication: max-min composition

Example
Suppose A = (.3 .4 .8 1) and

        .2  .8  .7
M  =    .7  .6  .6
        .8  .1  .5
        0   .2  .3

Then B = A ∘ M = (.8 .4 .5)

Max-product composition:
b_j = max_{1≤i≤n} a_i m_ij
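Both compositions are easy to check numerically. A minimal Python sketch (the function names are my own, not from the chapter) reproducing the example above:

```python
# Fuzzy vector-matrix multiplication.
# max_min:     b_j = max_i min(a_i, m_ij)
# max_product: b_j = max_i (a_i * m_ij)

def max_min(A, M):
    return [max(min(a, row[j]) for a, row in zip(A, M))
            for j in range(len(M[0]))]

def max_product(A, M):
    return [max(a * row[j] for a, row in zip(A, M))
            for j in range(len(M[0]))]

A = [.3, .4, .8, 1]
M = [[.2, .8, .7],
     [.7, .6, .6],
     [.8, .1, .5],
     [0,  .2, .3]]

print(max_min(A, M))  # → [0.8, 0.4, 0.5], matching B = A ∘ M
```

Max-product replaces the pairwise min with an ordinary product, so each output component reflects the input magnitudes rather than clipping them.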

Fuzzy Hebb FAMs

Classical Hebbian learning law:
ṁ_ij = -m_ij + S_i(x_i) S_j(y_j)

Correlation-minimum encoding:
m_ij = min(a_i, b_j), i.e. M = A^T ∘ B

Example

               .3                   .3 .3 .3
M = A^T ∘ B =  .4  ∘ (.8 .4 .5)  =  .4 .4 .4
               .8                   .8 .4 .5
               1                    .8 .4 .5

In general, the i-th row of M = A^T ∘ B is a_i ∧ B, and the j-th row of M^T = B^T ∘ A is b_j ∧ A.

The bidirectional FAM theorem for correlation-minimum encoding

The height of fuzzy set A: H(A) = max_{1≤i≤n} a_i
Fuzzy set A is normal if H(A) = 1

Correlation-minimum bidirectional FAM theorem: if M = A^T ∘ B, then
(i)   A ∘ M = B       iff H(A) ≥ H(B)
(ii)  B ∘ M^T = A     iff H(B) ≥ H(A)
(iii) A' ∘ M ⊂ B      for any A'
(iv)  B' ∘ M^T ⊂ A    for any B'

The bidirectional FAM theorem for correlation-minimum encoding

Proof

A ∘ A^T = max_{1≤i≤n} min(a_i, a_i) = max_{1≤i≤n} a_i = H(A)

Then

A ∘ M = A ∘ (A^T ∘ B) = (A ∘ A^T) ∘ B = H(A) ∘ B = H(A) ∧ B

So

H(A) ∧ B = B iff H(A) ≥ H(B)

Correlation-product encoding

Correlation-product encoding provides an alternative fuzzy Hebbian encoding scheme:
M = A^T B, with m_ij = a_i b_j

Example

             .3                  .24 .12 .15
M = A^T B =  .4  (.8 .4 .5)  =   .32 .16 .2
             .8                  .64 .32 .4

Correlation-product encoding preserves more information than correlation-minimum encoding

Correlation-product encoding

Correlation-product bidirectional FAM theorem: if M = A^T B and A and B are non-null fit vectors, then
(i)   A ∘ M = B       iff H(A) = 1
(ii)  B ∘ M^T = A     iff H(B) = 1
(iii) A' ∘ M ⊂ B      for any A'
(iv)  B' ∘ M^T ⊂ A    for any B'

FAM system architecture

[Figure: FAM SYSTEM — an input activates FAM Rule 1, ..., FAM Rule m, encoding the pairs (A_1, B_1), ..., (A_m, B_m), in parallel; the recalled outputs B_1, ..., B_m are weighted, summed, and passed through a defuzzifier to produce the output y_j]

Superimposing FAM rules

Suppose there are m FAM rules or associations (A_k, B_k)
The natural neural-network approach maximizes or adds the m associative matrices M_k into a single matrix M:

M = max_{1≤k≤m} M_k   or   M = Σ_k M_k

This superimposition scheme fails for fuzzy Hebbian encoding
The fuzzy approach to the superimposition problem additively superimposes the m recalled vectors B_k' instead of the fuzzy Hebb matrices M_k:

B_k' = A ∘ M_k = A ∘ (A_k^T ∘ B_k) ⊂ B_k
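The fuzzy superimposition scheme can be sketched as follows. The two FAM rules and the input fit vector are made-up illustrative values, not from the chapter; each rule is recalled in parallel and the recalled vectors, not the Hebb matrices, are added.

```python
# Additive superimposition of recalled vectors B_k' = A ∘ M_k,
# where each M_k is a correlation-minimum Hebb matrix.

def correlation_minimum(A, B):
    return [[min(a, b) for b in B] for a in A]

def max_min(A, M):
    return [max(min(a, row[j]) for a, row in zip(A, M))
            for j in range(len(M[0]))]

rules = [([.3, .4, 1.0], [.8, .4, .5]),   # (A_1, B_1) — illustrative
         ([1.0, .2, .1], [.2, .9, .3])]   # (A_2, B_2) — illustrative

A = [1.0, .3, .2]                         # input fit vector
recalled = [max_min(A, correlation_minimum(Ak, Bk)) for Ak, Bk in rules]
B = [sum(col) for col in zip(*recalled)]  # add recalled vectors, not matrices
print(B)
```

Note that B is no longer confined to the unit cube; the defuzzification step discussed next maps it back to a single output value.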

Superimposing FAM rules

Disadvantages:
Separate storage of FAM associations consumes space
Advantages:
1. provides an audit trail of the FAM inference procedure
2. avoids crosstalk
3. provides knowledge-base modularity
4. a fit-vector input A activates all the FAM rules in parallel, but to different degrees

Recalled outputs and defuzzification

The recalled output B equals a weighted sum of the individual recalled vectors B_k':

B = Σ_{k=1}^{m} w_k B_k'

How to defuzzify?
1. Maximum-membership defuzzification:
m_B(y_max) = max_{1≤j≤p} m_B(y_j)

Simple, but it has two fundamental problems:
- the mode of the B distribution is not unique
- it ignores the information in the waveform B
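Maximum-membership defuzzification amounts to an argmax over the range values. A minimal sketch; the range values y_j and the output fit values here are illustrative, not from the chapter:

```python
# Maximum-membership defuzzification: pick the y_j where the
# output fit value of B peaks (ties resolve to the first peak,
# which is exactly the non-uniqueness problem noted above).
ys = [0, 10, 20, 30, 40]          # green-light durations (illustrative)
mB = [0.0, 0.2, 0.6, 0.5, 0.1]    # recalled output fit values (illustrative)
y_max = ys[mB.index(max(mB))]
print(y_max)  # → 20
```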

Recalled outputs and defuzzification

2. Fuzzy centroid defuzzification:

ȳ = Σ_{j=1}^{p} y_j m_B(y_j) / Σ_{j=1}^{p} m_B(y_j)

The fuzzy centroid is unique and uses all the information in the output distribution B
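The centroid formula translates directly to code. A minimal sketch; the range values y_j and the output fit values are illustrative, not from the chapter:

```python
# Fuzzy centroid defuzzification: membership-weighted average
# of the range values y_j.

def centroid(ys, mB):
    num = sum(y * m for y, m in zip(ys, mB))
    den = sum(mB)
    return num / den

ys = [0, 10, 20, 30, 40]          # green-light durations (illustrative)
mB = [0.0, 0.2, 0.6, 0.5, 0.1]    # recalled output fit values (illustrative)
print(centroid(ys, mB))  # ≈ 23.57, between the two largest fit values
```

Unlike the maximum-membership output, the centroid shifts continuously as any fit value in B changes, so no recalled information is discarded.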

Thank you!
