
Optimization Algorithms for Association

Rule Mining (ARM)


K. INDIRA
Principal
E.S. Engg. College
Villupuram

ROAD MAP
INTRODUCTION
ASSOCIATION RULE MINING
TRADITIONAL METHOD - APRIORI
OPTIMIZATION
GA BASED ARM
PSO BASED ARM
REFERENCES
DATA MINING
Data mining is the extraction of interesting information or patterns from data in large databases.
ASSOCIATION RULE MINING
Association rule mining finds interesting associations and/or correlation relationships among large sets of data items.
Proposed by Agrawal et al. in 1993.
An important data mining model, studied extensively by the database and data mining community.
Assumes all data are categorical; there is no good algorithm for numeric data.
Initially used for market basket analysis, to find how items purchased by customers are related.

Bread → Milk [sup = 5%, conf = 100%]
The model: data
I = {i1, i2, …, im}: a set of items.
Transaction t: a set of items such that t ⊆ I.
Transaction database T: a set of transactions T = {t1, t2, …, tn}.
Transaction Data: Supermarket Data
Market basket transactions:
t1: {bread, cheese, milk}
t2: {apple, eggs, salt, yogurt}
…
tn: {biscuit, eggs, milk}
Concepts:
An item: an item/article in a basket.
I: the set of all items sold in the store.
A transaction: the items purchased in a basket; it may have a TID (transaction ID).
A transactional dataset: a set of transactions.
The Model: Rules
An itemset is a set of items, e.g., X = {milk, bread, cereal}.
A transaction t contains X, a set of items (itemset) in I, if X ⊆ t.
An association rule is an implication of the form
X → Y, where X, Y ⊂ I and X ∩ Y = ∅.
Rule strength measures
Support: the rule holds with support sup in T (the transaction data set) if sup% of transactions contain X ∪ Y; sup = Pr(X ∪ Y).
Confidence: the rule holds in T with confidence conf if conf% of transactions that contain X also contain Y; conf = Pr(Y | X).
An association rule is a pattern stating that when X occurs, Y occurs with a certain probability.
Support and Confidence
Support count: the support count of an itemset X, denoted X.count, in a data set T is the number of transactions in T that contain X. Assume T has n transactions. Then:

support = (X ∪ Y).count / n
confidence = (X ∪ Y).count / X.count
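As an illustration of these two measures (not part of the original slides), a minimal Python sketch, assuming transactions are represented as sets of items:

# Illustrative sketch: support and confidence of a rule X -> Y
# over a list of transactions, each a set of items.
def support(transactions, itemset):
    # Fraction of transactions containing every item in `itemset`.
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def confidence(transactions, X, Y):
    # Pr(Y | X) = support(X u Y) / support(X).
    return support(transactions, X | Y) / support(transactions, X)

T = [{"bread", "cheese", "milk"}, {"apple", "eggs", "salt", "yogurt"},
     {"biscuit", "eggs", "milk"}]
print(support(T, {"milk"}))                 # 2/3
print(confidence(T, {"milk"}, {"bread"}))   # 0.5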
Goal and key features
Goal: find all rules that satisfy the user-specified minimum support (minsup) and minimum confidence (minconf).
Key features:
Completeness: find all rules.
No target item(s) on the right-hand side.
Many mining algorithms
There are a large number of them!
They use different strategies and data structures, but their resulting sets of rules are all the same: given a transaction data set T, a minimum support and a minimum confidence, the set of association rules existing in T is uniquely determined.
Any algorithm should find the same set of rules, although their computational efficiencies and memory requirements may differ.
We study only one: the Apriori algorithm.
The Apriori algorithm
Probably the best-known algorithm. Two steps:
1. Find all itemsets that have minimum support (frequent itemsets, also called large itemsets).
2. Use the frequent itemsets to generate rules.

E.g., a frequent itemset
{Chicken, Clothes, Milk} [sup = 3/7]
and one rule from the frequent itemset
Clothes → Milk, Chicken [sup = 3/7, conf = 3/3]
Step 1: Mining all frequent itemsets
A frequent itemset is an itemset whose support is ≥ minsup.
Key idea: the Apriori property (downward closure property): every subset of a frequent itemset is also frequent.
[Itemset lattice over items A, B, C, D: singletons A B C D; pairs AB AC AD BC BD CD; triples ABC ABD ACD BCD.]
The Algorithm
An iterative algorithm (also called level-wise search): find all 1-item frequent itemsets, then all 2-item frequent itemsets, and so on.
In each iteration k, only consider itemsets that contain some frequent (k−1)-itemset.
Find frequent itemsets of size 1: F1.
From k = 2:
Ck = candidates of size k: those itemsets of size k that could be frequent, given Fk−1.
Fk = those candidates that are actually frequent, Fk ⊆ Ck (needs one scan of the database).
Example: finding frequent itemsets (minsup = 0.5)
Dataset T:
TID    Items
T100   1, 3, 4
T200   2, 3, 5
T300   1, 2, 3, 5
T400   2, 5

1. Scan T → C1: {1}:2, {2}:3, {3}:3, {4}:1, {5}:3
   F1: {1}:2, {2}:3, {3}:3, {5}:3
   C2: {1,2}, {1,3}, {1,5}, {2,3}, {2,5}, {3,5}
2. Scan T → C2: {1,2}:1, {1,3}:2, {1,5}:1, {2,3}:2, {2,5}:3, {3,5}:2
   F2: {1,3}:2, {2,3}:2, {2,5}:3, {3,5}:2
   C3: {2,3,5}
3. Scan T → C3: {2,3,5}:2; F3: {2,3,5}
Details: ordering of items
The items in I are sorted in lexicographic order (which is a total order).
The order is used throughout the algorithm in each itemset.
{w[1], w[2], …, w[k]} represents a k-itemset w consisting of items w[1], w[2], …, w[k], where w[1] < w[2] < … < w[k] according to the total order.
Details: the algorithm
Algorithm Apriori(T)
  C1 ← init-pass(T);
  F1 ← {f ∈ C1 | f.count/n ≥ minsup};    // n: no. of transactions in T
  for (k = 2; Fk−1 ≠ ∅; k++) do
    Ck ← candidate-gen(Fk−1);
    for each transaction t ∈ T do
      for each candidate c ∈ Ck do
        if c is contained in t then
          c.count++;
      end
    end
    Fk ← {c ∈ Ck | c.count/n ≥ minsup}
  end
  return F ← ∪k Fk;
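A compact runnable Python sketch of this level-wise search, shown only as an illustration of the pseudocode above (the actual implementation behind these slides is not given):

from itertools import combinations

# Level-wise Apriori sketch: returns all frequent itemsets with counts.
def apriori(T, minsup):
    n = len(T)
    items = sorted({i for t in T for i in t})
    F = {}                                   # frozenset -> support count
    Ck = [frozenset([i]) for i in items]     # C1: candidate 1-itemsets
    k = 1
    while Ck:
        counts = {c: sum(1 for t in T if c <= t) for c in Ck}
        Fk = {c: cnt for c, cnt in counts.items() if cnt / n >= minsup}
        F.update(Fk)
        # candidate-gen: join Fk with itself, keep (k+1)-sets whose
        # k-subsets are all frequent (downward closure)
        prev = set(Fk)
        Ck = list({a | b for a in prev for b in prev
                   if len(a | b) == k + 1
                   and all(frozenset(s) in prev
                           for s in combinations(a | b, k))})
        k += 1
    return F

T = [{1, 3, 4}, {2, 3, 5}, {1, 2, 3, 5}, {2, 5}]   # the example dataset
print(apriori(T, 0.5))    # includes frozenset({2, 3, 5}) with count 2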
EXISTING SYSTEM: LIMITATIONS
Candidate sets may not fit in memory and are expensive to build.
Traverses the database many times, giving high I/O overhead and computational complexity.
Cannot meet the requirements of large-scale database mining.
OPTIMIZATION
Optimization is the act of obtaining the best result under given circumstances.
Optimization can also be defined as the process of finding the conditions that give the maximum or minimum of a function.
OPTIMIZATION
Provides a robust and efficient approach to exploring large search spaces.
Applicable to problems where no (good) method is available.
Most suitable for problems where multiple solutions are required.
Parallel implementation is easier.
EVOLUTIONARY COMPUTING
Evolutionary computing techniques mostly involve metaheuristic optimization algorithms.
Evolutionary algorithms:
Gene expression programming
Genetic Algorithm
Genetic programming
Evolutionary programming
Evolution strategy
Differential evolution
Differential search algorithm [6]
Eagle strategy
Swarm intelligence:
Ant colony optimization
Particle Swarm Optimization
Bees algorithm
Cuckoo search
GA AND PSO: AN INTRODUCTION
Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) are effective population-based stochastic search algorithms, which include heuristics and an element of nondeterminism in traversing the search space.
DATASETS
Lenses
Haberman's Survival
Car Evaluation
Post Operative Patient
Zoo
(from the University of California Irvine Repository)
DATASETS
Dataset Name             No. of Instances   No. of Attributes   Attribute characteristics
Lenses                   24                 3                   Categorical
Haberman's Survival      306                3                   Integer
Car Evaluation           1728               6                   Categorical
Post Operative Patient   90                 8                   Categorical, Integer
Zoo                      101                17                  Categorical, Integer
SYSTEM TEMPLATE
[Diagram: input template → proposed system → output template.]
GENETIC ALGORITHM
A Genetic Algorithm (GA) is a procedure used to
find approximate solutions to search problems
through the application of the principles of
evolutionary biology.
GENETIC ALGORITHM
[Cycle diagram: population → selection → crossover → mutation.]
Flowchart of ARM using GA
[Flowchart figure.]
Components of a GA
A problem to solve, and ...
Encoding technique (gene, chromosome)
Initialization procedure (creation)
Evaluation function (environment)
Selection of parents (reproduction)
Genetic operators (mutation, recombination)
Parameter settings (practice and art)

Simple Genetic Algorithm
{
  initialize population;                 // random chromosomes
  evaluate population;                   // compute fitness of each member
  while Termination Criteria Not Satisfied
  {
    select parents for reproduction;     // fitness-biased selection
    perform recombination and mutation;  // produce offspring
    evaluate population;                 // re-compute fitness
  }
}
The GA Cycle of Reproduction
[Cycle diagram: the population supplies parents; reproduction yields children; modification yields modified children; evaluation yields evaluated children, which re-enter the population while discarded members are deleted.]
Population

Chromosomes could be:
Bit strings (0101 ... 1100)
Real numbers (43.2 -33.1 ... 0.0 89.2)
Permutations of elements (E11 E3 E7 ... E1 E15)
Lists of rules (R1 R2 R3 ... R22 R23)
Program elements (genetic programming)
... any data structure ...
Reproduction
Parents are selected at random, with selection chances biased in relation to chromosome evaluations.
Chromosome Modification
Modifications are stochastically triggered. Operator types are:
Mutation
Crossover (recombination)
Mutation: Local Modification
Before: (1 0 1 1 0 1 1 0)
After:  (0 1 1 0 0 1 1 0)

Before: (1.38 -69.4 326.44 0.1)
After:  (1.38 -67.5 326.44 0.1)
Mutation causes movement in the search space (local or global) and restores lost information to the population.
Crossover: Recombination
Crossover is a critical feature of genetic algorithms: it greatly accelerates search early in the evolution of a population.
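To make the two operators concrete, a small illustrative Python sketch (not from the slides), assuming bit-string chromosomes stored as lists:

import random

# One-point crossover: children swap suffixes at a random cut point.
def crossover(p1, p2):
    point = random.randrange(1, len(p1))
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

# Bit-flip mutation: each gene flips independently with probability pm.
def mutate(chrom, pm=0.05):
    return [1 - g if random.random() < pm else g for g in chrom]

c1, c2 = crossover([1, 0, 1, 1, 0, 1, 1, 0], [0, 1, 1, 0, 0, 1, 1, 0])
print(mutate(c1), mutate(c2))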
Evaluation
The evaluator decodes a chromosome and assigns it a fitness measure.
The evaluator is the only link between a classical GA and the problem it is solving.
Deletion
Generational GA: the entire population is replaced with each iteration.
Steady-state GA: a few members are replaced each generation.
GA based ARM
Parameter Tuning for ARM
GA with Elitism
Adaptive GA
PARAMETERS OF GA
Parameter Name                        Parameter Role
Population size                       Fixes the number of chromosomes and, indirectly, the crossover
Selection                             Selects the chromosomes for crossover
Mutation rate (pm)                    The mutation operation is based on the mutation rate
Crossover rate (pc)                   The crossover points are fixed by the crossover rate
Minimum support, minimum confidence   Set by the user, for fitness calculation
ARM by Parameter Tuning
Methodology
Selection: Tournament
Crossover probability: Fixed (tested with 3 values)
Mutation probability: No mutation
Fitness function:
Population: Fixed (tested with 3 values)
RESULT ANALYSIS
[Chart: population size vs. predictive accuracy (%) for ARM with GA on the Lenses, Haberman, Car Evaluation, Postop and Zoo datasets; population sizes tested: no. of instances, no. of instances × 1.25, and no. of instances × 1.5.]
RESULT ANALYSIS
[Chart: minimum support and confidence vs. predictive accuracy (%) for ARM with GA, for the (sup, conf) combinations (0.2, 0.2), (0.9, 0.9), (0.9, 0.2) and (0.2, 0.9), across the five datasets.]
RESULT ANALYSIS
Comparison based on variation in crossover probability for ARM using GA:

Dataset                  Pc = .25             Pc = .5              Pc = .75
                         Acc.%  Generations   Acc.%  Generations   Acc.%  Generations
Lenses                   95     8             95     16            95     13
Haberman                 69     77            71     83            70     80
Car Evaluation           80     80            81     83            81     85
Post Operative Patient   74     57            74     63            73     68
Zoo                      81     90            80     88            81     85
GA based ARM
Parameter Tuning for ARM
GA with Elitism
Adaptive GA
Concept of Elitism
[Diagram: the population feeds a mating pool via selection; crossover and mutation produce new solutions, while the elite members pass into the next population unchanged.]
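A minimal sketch of elitist replacement (illustrative; `breed` is a hypothetical helper standing in for selection, crossover and mutation):

# Elitism sketch: the best `elite_n` chromosomes bypass selection,
# crossover and mutation and enter the next generation unchanged.
# `breed(pool, k)` (hypothetical helper) produces k new solutions.
def next_generation(population, fitness, breed, elite_n=2):
    ranked = sorted(population, key=fitness, reverse=True)
    elites = ranked[:elite_n]                        # preserved as-is
    children = breed(ranked, len(population) - elite_n)
    return elites + children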
RESULTS ANALYSIS
[Chart: predictive accuracy (%) for mining ARs based on GA with elitism vs. plain GA, on the Lenses, Haberman's Survival, Car Evaluation, Post Operative Patient and Zoo datasets.]
GA based ARM
Parameter Tuning for ARM
GA with Elitism
Adaptive GA
MINING AR USING AGA
Methodology
Selection: Roulette wheel
Crossover probability: Fixed
Mutation probability: Adaptive
Fitness function:
Population: Fixed
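Since the methodology above uses roulette-wheel selection, a short illustrative sketch of that operator (assuming non-negative fitness values):

import random

# Roulette-wheel selection: each chromosome is chosen with probability
# proportional to its fitness.
def roulette_select(population, fitnesses):
    total = sum(fitnesses)
    r = random.uniform(0, total)
    acc = 0.0
    for chrom, f in zip(population, fitnesses):
        acc += f
        if acc >= r:
            return chrom
    return population[-1]   # guard against floating-point round-off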
FLOWCHART OF AGA
[Flowchart: initial population → evaluate fitness → select survivors → crossover → mutation (self-adaptive) → max generation? If not, repeat; else output results.]
RESULT ANALYSIS: INFERENCES
[Chart: predictive accuracy (%) comparison between GA, AGA and GA with parameters set to the termination values of AGA, across the five datasets.]
PARTICLE SWARM OPTIMIZATION
PSO's mechanism is inspired by the social and cooperative behavior displayed by various species, such as birds and fish, including human beings.
PARTICLE SWARM OPTIMIZATION
[Diagram: updating of each particle's velocity in each iteration, showing a particle and the best particle of the swarm moving toward the target (solution) over generations 1, 2, …, N.]
Introduction to the PSO: Algorithm
1. Create a population of agents (particles) uniformly distributed over the search space X.
2. Evaluate each particle's position according to the objective function.
3. If a particle's current position is better than its previous best position, update it.
4. Determine the best particle (according to the particles' previous best positions).
5. Update the particles' velocities (standard form):
   v_i ← v_i + c1·r1·(pbest_i − x_i) + c2·r2·(gbest − x_i)
6. Move the particles to their new positions:
   x_i ← x_i + v_i
7. Go to step 2 until the stopping criteria are satisfied.
Velocity Updation in PSO
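A runnable sketch of steps 1-7 on a generic minimization problem (illustrative, not the slides' implementation: the sphere function stands in for the ARM fitness, and the inertia weight w corresponds to the weighted variant discussed later; set w = 1 for the plain update):

import random

def pso(f, dim, n=30, iters=100, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    # Step 1: particles uniformly distributed over the search space
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                        # personal best positions
    pbest = [f(x) for x in X]                    # personal best values
    g = min(range(n), key=lambda i: pbest[i])    # index of global best
    gbest, G = pbest[g], P[g][:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Steps 5-6: velocity then position update
                V[i][d] = (w * V[i][d] + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest[i]:                    # Step 3: update pbest
                pbest[i], P[i] = fx, X[i][:]
                if fx < gbest:                   # Step 4: update gbest
                    gbest, G = fx, X[i][:]
    return G, gbest

best, val = pso(lambda x: sum(v * v for v in x), dim=3)
print(best, val)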
PSO STATES
[Diagrams: four swarm states relative to the best particle of the swarm: exploration, exploitation, convergence, and jumping out.]
Introduction to the PSO: Algorithm - Example
[Animation: a sequence of slides stepping through the algorithm on an example swarm.]
Introduction to the PSO: Algorithm Characteristics
Advantages:
Simple implementation.
Easily parallelized for concurrent processing.
Very few algorithm parameters.
Very efficient global search algorithm.
Disadvantages:
Tendency toward fast, premature convergence at locally optimal points.
Slow convergence in the refined search stage (weak local search ability).
PSO based ARM
Parameter tuning: PSO & WPSO
Modifications in methodology: CPSO, SAPSO1 & 2, NPSO, SACPSO
Mining ARs using PSO
Methodology
Each data itemset is represented as a particle.
The particles move based on their velocity:
v_i ← v_i + c1·r1·(pbest_i − x_i) + c2·r2·(gbest − x_i)
The particles' positions are updated based on
x_i ← x_i + v_i

Weighted PSO
The velocity update equation is modified with an inertia weight w:
v_i ← w·v_i + c1·r1·(pbest_i − x_i) + c2·r2·(gbest − x_i)
FLOWCHART OF PSO
Flowchart depicting the general PSO algorithm:
Start: initialize particles with random position and velocity vectors.
Loop until all particles are exhausted: for each particle's position p, evaluate fitness; if fitness(p) is better than fitness(pbest), then pbest = p.
Set the best of the pbests as gbest.
Update each particle's velocity and position.
Loop until max iterations, then stop, giving gbest as the optimal solution.
RESULTS ANALYSIS
[Chart: execution time (s) comparison between GA and PSO on the Lenses, Post Operative Patient, Zoo, Haberman's Survival and Car Evaluation datasets.]
RESULTS ANALYSIS
[Chart: predictive accuracy (%) comparison when altering the inertia weight w from 0.2 to 0.9, for the five datasets.]
PSO based ARM
Parameter tuning: PSO & WPSO
Modifications in methodology: CPSO, SAPSO1 & 2, NPSO, SACPSO
MINING ARS USING CHAOTIC PSO
Methodology
The new chaotic map model is formulated as:
Initial points u0 and v0 are set to 0.1.
The velocity of each particle is updated by the chaotic map-driven update equation.
RESULT ANALYSIS
[Chart: predictive accuracy (%) comparison of CPSO with PSO on the Lenses, Car Evaluation, Haberman's Survival, Post Operative Patient and Zoo datasets.]
PSO based ARM
Parameter tuning: PSO & WPSO
Modifications in methodology: CPSO, SAPSO1 & 2, NPSO, SACPSO
Mining ARs Using NPSO
Methodology
The concept of a local best particle (lbest) replacing the particle best (pbest) is introduced.
The neighborhood best (lbest) selection is as follows:
1. Calculate the distance of the current particle from the other particles.
2. Based on the calculated distances, find the nearest m particles as the neighbors of the current particle.
3. Choose the local optimum lbest among the neighborhood in terms of fitness values.
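A sketch of this lbest selection (illustrative; Euclidean distance between particle positions is assumed):

import math

# Neighborhood-best selection sketch: for particle i, find the m nearest
# particles by Euclidean distance and return the fittest pbest among them.
def lbest(i, positions, pbest_positions, pbest_fitness, m=3):
    others = sorted((j for j in range(len(positions)) if j != i),
                    key=lambda j: math.dist(positions[i], positions[j]))
    neighbors = others[:m]
    best = max(neighbors, key=lambda j: pbest_fitness[j])  # maximize fitness
    return pbest_positions[best]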
FLOWCHART FOR NPSO
[Flowchart: at k = 1, initialize x_i(k) and v_i(k) and compute f(x_i(k)); reorder the particles and generate neighborhoods; for each particle i, determine the best particle in the neighborhood of i, compute x_i(k+1) and f(x_i(k+1)), and update the previous best if necessary; repeat for i = 1..N and k = 1..K, then stop.]
RESULT ANALYSIS
[Chart: predictive accuracy (%) comparison for dynamic neighborhood selection (NPSO) against PSO on the five datasets.]
PSO based ARM
Parameter tuning: PSO & WPSO
Modifications in methodology: CPSO, SAPSO1 & 2, NPSO, SACPSO
ROLE OF CONTROL PARAMETERS
Parameter Name                  Parameter Role
Inertia weight (w)              Controls the impact of the velocity history on the new velocity
Acceleration coefficient (c1)   Maintains the diversity of the swarm
Acceleration coefficient (c2)   Drives convergence towards the global optimum
MINING AR USING SAPSO
The inertia weight in the velocity update equation is made adaptive.
SAPSO1:
SAPSO2:
SACPSO:
where g is the generation index representing the current number of evolutionary generations, and G is a predefined maximum number of generations. Here, the maximal and minimal weights w_max and w_min are set to 0.9 and 0.4, based on experimental study.
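The SAPSO1, SAPSO2 and SACPSO formulas themselves are not reproduced in this extract. One common form consistent with the description above (a weight decreasing linearly from w_max to w_min over G generations) is, as an assumption only:

# Assumed form only: a linearly decreasing inertia weight, consistent with
# the w_max = 0.9, w_min = 0.4 settings stated above. The exact SAPSO1/2
# and SACPSO formulas are not given in this extract.
def inertia_weight(g, G, w_max=0.9, w_min=0.4):
    return w_max - (w_max - w_min) * g / G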
RESULT ANALYSIS
[Chart: predictive accuracy (%) of PSO, SAPSO1, SAPSO2 and SACPSO on the Lenses dataset over 10-100 iterations.]
RESULT ANALYSIS
[Chart: predictive accuracy (%) of PSO, SAPSO1, SAPSO2 and SACPSO on the Haberman's Survival dataset over 10-100 iterations.]
RESULT ANALYSIS
[Chart: predictive accuracy (%) of PSO, SAPSO1, SAPSO2 and SACPSO on the Car Evaluation dataset over 10-100 iterations.]
RESULT ANALYSIS
[Chart: predictive accuracy (%) of PSO, SAPSO1, SAPSO2 and SACPSO on the Zoo dataset over 10-100 iterations.]
RESULT ANALYSIS
[Chart: predictive accuracy (%) of PSO, SAPSO1, SAPSO2 and SACPSO on the Postoperative Patient dataset over 10-100 iterations.]
ARM with PSO and GA
PSO + SFLA
GPSO (GA/PSO Hybrid)
Adaptive PSO (data dependent)
MINING AR USING APSO
Estimation of the evolutionary state is done using a distance measure d_i and an estimator e.
Classify which state the particle belongs to, and adapt the acceleration coefficients and inertia weight accordingly:
Exploration
Exploitation
Convergence
Jumping out
MINING AR USING APSO
Adapt the acceleration coefficients as given in the table:

State / Acceleration Coefficient   c1         c2
Exploration                        Increase   Decrease
Exploitation                       Increase   Decrease
Convergence                        Increase   Increase
Jumping out                        Decrease   Increase

The inertia weight is adjusted as given in the equation.
Parameter Adaptation
[Charts: evaluation factor (e) vs. iteration number; acceleration coefficient values c1 and c2 vs. generation number; inertia weight vs. iteration number.]


RESULT ANALYSIS
[Chart: predictive accuracy (%) comparison of Adaptive PSO with PSO on the Lenses, Car Evaluation, Haberman's Survival, Post Operative Patient and Zoo datasets.]
ARM with PSO and GA
PSO + SFLA
GPSO (GA/PSO Hybrid)
Adaptive PSO (data dependent)
HYBRID GA/PSO (GPSO) MODEL

                Genetic Algorithm                          Particle Swarm Optimization
Advantages      Global optimization: works on a            Converges easily; has no overlapping
                population of possible solutions and       and mutation calculation; has memory
                does not tend to be easily trapped
                by local optima
Disadvantages   Cannot assure constant optimization        Easily suffers from partial optimism;
                response times; mutation and crossover     weak local search ability
                at times create children far away from
                good solutions
HYBRID GA/PSO (GPSO) MODEL
[Diagram: the initial population is evaluated for fitness and ranked; one part (upper) of the ranked population passes through the Genetic Algorithm and the other (lower) through Particle Swarm Optimization, yielding the updated population.]
HYBRID GA/PSO (GPSO) MODEL
For 1 to Elite:
    x <- copy(x_best)
For 1 to (pop_size - Elite) * breed_ratio:
    x1 <- Select an Individual
    x2 <- Select an Individual
    Crossover(x1, x2)
    Mutate(x1, x2)
For 1 to (pop_size - Elite) * (1 - breed_ratio):
    x <- Select an Individual
    x <- Update Velocity
    x <- Update Position
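A sketch of one GPSO generation following the pseudocode above (illustrative; `select`, `breed_ga` and `update_pso` are hypothetical helpers standing in for the GA and PSO operators described earlier):

# GPSO generation sketch: elites are copied, a breed_ratio fraction of the
# remainder is produced by GA operators, and the rest is moved by PSO updates.
def gpso_generation(pop, fitness, select, breed_ga, update_pso,
                    elite=2, breed_ratio=0.5):
    ranked = sorted(pop, key=fitness, reverse=True)
    nxt = ranked[:elite]                      # x <- copy(x_best)
    n_ga = int((len(pop) - elite) * breed_ratio)
    for _ in range(n_ga // 2):                # GA part: crossover + mutation
        nxt.extend(breed_ga(select(ranked), select(ranked)))
    while len(nxt) < len(pop):                # PSO part: velocity/position
        nxt.append(update_pso(select(ranked)))
    return nxt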
RESULT ANALYSIS
[Chart: predictive accuracy (%) comparison of GPSO with GA and PSO on the Car, Haberman, Lens, PO Care and Zoo datasets.]
ARM with PSO and GA
PSO + SFLA
GPSO (GA/PSO Hybrid)
Adaptive PSO (data dependent)
Mining AR using PSO + SFLA
The Shuffled Frog Leaping Algorithm (SFLA) is adopted to perform the local search.
Here the particles are allowed to gain some experience, through a local search, before being involved in the evolutionary process.
The shuffling process allows the particles to gain information about the global best.
FLOWCHART FOR PSO + SFLA
Generate the initial population (P) and evaluate the fitness of each particle.
Update the velocity and position of the particles.
SFLA local search: sort the population in descending order of fitness value; distribute the frogs into M memeplexes; iteratively update the worst frog in each memeplex; combine all frogs to form a new population.
If the termination criteria are satisfied, determine the best solution; otherwise repeat.
SHUFFLED FROG LEAPING ALGORITHM (SFLA)
[Diagram: sorted frogs (Frog 1 ... Frog 8) are distributed round-robin into Memeplexes 1-3.]

Updation of Worst Particles
X_b: position of the group best / global best
X_w: position of the worst frog in the group
D_i: calculated new position (step) of the worst frog
The position of the particle with the worst fitness is modified using these quantities.
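A sketch of this worst-frog update, assuming the standard SFLA step D_i = rand() × (X_b − X_w) with X_w(new) = X_w + D_i, clamped to a maximum step size D_max:

import random

# SFLA worst-frog update sketch (standard form, assumed here):
#   D_i = rand() * (X_b - X_w);  X_w_new = X_w + D_i, with |D_i| <= D_max
def update_worst(x_w, x_b, d_max=1.0):
    r = random.random()
    step = [max(-d_max, min(d_max, r * (b - w))) for w, b in zip(x_w, x_b)]
    return [w + s for w, s in zip(x_w, step)]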
RESULT ANALYSIS
[Chart: predictive accuracy comparison of PSO, APSO, PSO+SFLA and APSO+SFLA on the Lenses, Haberman, Car, Postop and Zoo datasets.]
ARM with PSO and GA
PSO + SFLA
GPSO (GA/PSO Hybrid)
Adaptive PSO (data dependent)
QUANTUM BEHAVED PARTICLE SWARM OPTIMIZATION
As per classical PSO, the state of a particle is depicted by its position vector x_i and velocity vector v_i, which determine the trajectory of the particle.
In quantum mechanics, according to the Uncertainty Principle, x_i and v_i cannot be determined simultaneously.
In the quantum model of PSO, the state of a particle is instead depicted by a wave function ψ(x, t).
QUANTUM BEHAVED PARTICLE SWARM OPTIMIZATION
[Flowchart: Start → initialize the swarm → calculate the mean best (mbest) → update particle positions → update local best → update global best → if the termination criteria are reached, stop; otherwise repeat from the mbest calculation.]
PERFORMANCE ANALYSIS
Predictive accuracy measures the effectiveness of the rules mined.
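A minimal sketch of this measure, assuming a hypothetical predict(rules, record) helper that applies the mined rules to a record:

# Predictive accuracy sketch: fraction of test records predicted correctly.
# `predict(rules, record)` is a hypothetical helper applying the mined rules.
def predictive_accuracy(rules, test_set, predict):
    correct = sum(1 for record, label in test_set
                  if predict(rules, record) == label)
    return 100.0 * correct / len(test_set)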
Comparison of the Predictive Accuracy Achieved by All Methods
[Chart: predictive accuracy (%) of GA, SAGA, Elitism GA, PSO, WPSO, CPSO, DPSO, SAPSO1, SAPSO2, SACPSO, GPSO, APSO, PSO+SFLA and APSO+SFLA on the Lenses, Haberman's Survival, Car Evaluation, Post Operative Patient and Zoo datasets.]
References
Jing Li, Han Rui-feng, "A Self-Adaptive Genetic Algorithm Based on Real-Coded," International Conference on Biomedical Engineering and Computer Science, pp. 1-4, 2010.

Chuan-Kang Ting, Wei-Ming Zeng, Tzu-Chieh Lin, "Linkage Discovery through Data Mining," IEEE Computational Intelligence Magazine, Vol. 5, February 2010.

Caises, Y., Leyva, E., Gonzalez, A., Perez, R., "An Extension of the Genetic Iterative Approach for Learning Rule Subsets," 4th International Workshop on Genetic and Evolutionary Fuzzy Systems, pp. 63-67, 2010.

Shangping Dai, Li Gao, Qiang Zhu, Changwu Zhu, "A Novel Genetic Algorithm Based on Image Databases for Mining Association Rules," 6th IEEE/ACIS International Conference on Computer and Information Science, pp. 977-980, 2007.

Peregrin, A., Rodriguez, M.A., "Efficient Distributed Genetic Algorithm for Rule Extraction," Eighth International Conference on Hybrid Intelligent Systems (HIS '08), pp. 531-536, 2008.

Mansoori, E.G., Zolghadri, M.J., Katebi, S.D., "SGERD: A Steady-State Genetic Algorithm for Extracting Fuzzy Classification Rules from Data," IEEE Transactions on Fuzzy Systems, Vol. 16, No. 4, pp. 1061-1071, 2008.

Xiaoyuan Zhu, Yongquan Yu, Xueyan Guo, "Genetic Algorithm Based on Evolution Strategy and the Application in Data Mining," First International Workshop on Education Technology and Computer Science (ETCS '09), Vol. 1, pp. 848-852, 2009.

Hong Guo, Ya Zhou, "An Algorithm for Mining Association Rules Based on Improved Genetic Algorithm and its Application," 3rd International Conference on Genetic and Evolutionary Computing (WGEC '09), pp. 117-120, 2009.

Genxiang Zhang, Haishan Chen, "Immune Optimization Based Genetic Algorithm for Incremental Association Rules Mining," International Conference on Artificial Intelligence and Computational Intelligence (AICI '09), Vol. 4, pp. 341-345, 2009.

Miguel Rodriguez, Diego M. Escalante, Antonio Peregrin, "Efficient Distributed Genetic Algorithm for Rule Extraction," Applied Soft Computing 11 (2011), pp. 733-743.

Hamid Reza Qodmanan, Mahdi Nasiri, Behrouz Minaei-Bidgoli, "Multi objective association rule mining with genetic algorithm without specifying minimum support and minimum confidence," Expert Systems with Applications 38 (2011), pp. 288-298.

Yamina Mohamed Ben Ali, "Soft Adaptive Particle Swarm Algorithm for Large Scale Optimization," IEEE, 2010.