EE561
Sensor Fusion
Are you ready for a 2nd opinion?
Janaka Wijayakulasooriya
PhD, MIEEE, Senior Lecturer, Department of Electrical and Electronic Engineering, University of Peradeniya, jan@ee.pdn.ac.lk
Janaka Wijayakulasooriya, Department of Electrical and Electronic Engineering, University of Peradeniya
Sensor Fusion: Definition
Sensor fusion is the combining of sensory data, or data derived from sensory data, such that the resulting information is in some sense (e.g., accuracy, robustness) better than would be possible if these sources were used individually.
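A minimal sketch of this definition, under the assumption that two sensors give independent Gaussian measurements of the same quantity: inverse-variance weighting yields a fused estimate whose variance is lower than either sensor's alone, i.e., "better in some sense" (here, accuracy). The readings and variances are invented for illustration.

```python
def fuse(z1, var1, z2, var2):
    """Fuse two independent Gaussian measurements of the same quantity
    by inverse-variance weighting (illustrative, not from the slides)."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    z_fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    var_fused = 1.0 / (w1 + w2)   # always smaller than min(var1, var2)
    return z_fused, var_fused

# Example: a coarse sensor (variance 4.0) and a finer one (variance 1.0)
z, var = fuse(10.2, 4.0, 9.8, 1.0)
# var = 0.8, smaller than either 4.0 or 1.0
```

The fused variance 1/(1/var1 + 1/var2) never exceeds the smaller individual variance, which is the quantitative sense in which the combination outperforms each source on its own.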
Sensor Fusion Models: Complementary Type
Sensors do not depend on each other directly. They can be combined to establish a more complete picture of the phenomenon being observed, so the combined sensor data set is more complete.
Examples:
- multiple cameras, each observing a different part of a room
- four radars around a geographical region, together providing a complete picture of the area surrounding the region
Sensor Fusion Models: Cooperative Type
Data provided by two independent sensors are used to derive information that would not be available from either sensor alone.
Example: a stereoscopic vision system. By combining the two-dimensional (2D) images from two cameras that view the scene from slightly different angles of incidence (two image planes), a three-dimensional (3D) image of the observed scene can be determined.
Cooperative sensor fusion is difficult to design, and the resulting data are sensitive to inaccuracies in all of the individual sensors.
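The stereo example above can be sketched with the standard pinhole-camera relation for a rectified stereo pair, depth = focal length × baseline / disparity. The focal length, baseline, and disparity values below are assumed for illustration; the slides do not give any.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth Z = f * B / d for a rectified stereo pair (pinhole model).
    focal_px: focal length in pixels; baseline_m: camera separation in
    metres; disparity_px: horizontal pixel shift between the two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Assumed values: 700 px focal length, 12 cm baseline, 21 px disparity
Z = depth_from_disparity(focal_px=700.0, baseline_m=0.12, disparity_px=21.0)
# Z = 4.0 metres
```

This also illustrates the sensitivity noted above: a one-pixel error in disparity changes the recovered depth noticeably, and neither camera alone carries any depth information at all.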
Stereoscopic Camera
Fusion Methods
- Probabilistic and statistical models: Bayesian reasoning, evidence theory, robust statistics, recursive operators
- Kalman filter, optimization, regularization, uncertainty ellipsoids
- ANNs, fuzzy logic, approximate reasoning, computer vision techniques
- Heuristic methods
[Figure: two sensors T1 and T2 observe the same unknown target. Each sensor model yields a posterior P(x|z1) and P(x|z2); these are fused into a combined measurement P(x|z1,z2).]
Example
Bayesian Decision Theory
Let's revisit the example of sea bass and salmon. Let w represent the class:
w = w1: sea bass
w = w2: salmon
The prior probabilities P(w1) and P(w2) reflect our prior knowledge of how likely we are to get a sea bass or a salmon before the fish actually appears. Can we make a decision based only on the priors?
Improving the simple classifier
Can we use P(w1) > P(w2) or P(w1) < P(w2) to decide whether the next fish is a sea bass or a salmon? That rule always picks the same class, which is a poor strategy if we have to classify many fish. Suppose that, in addition to the priors, we have a measurement (feature) vector x = {x1, ..., xN} on the subject
(example: x1 = lightness, x2 = length).
Bayes Formula
P(wj | x) = p(x | wj) P(wj) / p(x)
Bayes Classifier
Bayes' formula shows that by observing the value of x we can convert the prior probability P(wj) into the a posteriori (posterior) probability P(wj | x). For the purpose of classification, what is important is the product of the likelihood p(x | wj) and the prior P(wj).
The evidence p(x) can be considered just a scaling factor which makes sure the individual posterior probabilities sum to 1.
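A small sketch of Bayes' formula for the two-class fish example. The likelihood values p(x|w) at some observed lightness x are invented for illustration; only the structure (likelihood × prior, normalized by the evidence) comes from the formula above.

```python
# Assumed priors and likelihoods at one observed feature value x
priors = {"sea_bass": 0.6, "salmon": 0.4}          # P(w)
likelihoods = {"sea_bass": 0.3, "salmon": 0.7}     # p(x|w), assumed

# Evidence p(x) = sum over classes of p(x|w) * P(w)
evidence = sum(likelihoods[w] * priors[w] for w in priors)

# Posterior P(w|x) = p(x|w) * P(w) / p(x)
posteriors = {w: likelihoods[w] * priors[w] / evidence for w in priors}
# The evidence is just a scaling factor: the posteriors sum to 1
```

Note how the larger likelihood for salmon overturns the larger prior for sea bass: the posterior for salmon (0.28/0.46 ≈ 0.61) exceeds that for sea bass.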
In order to justify this decision, we calculate the probability of error whenever we make a decision based on the measurement x and the priors:
P(error | x) = P(w2 | x) if we decide w1, and P(error | x) = P(w1 | x) if we decide w2.
So, if P(error | x) is minimized for each individual decision, the overall P(error) is minimized. If we apply the Bayes decision rule, then P(error | x) = min[ P(w1 | x), P(w2 | x) ].
Bayes Decision Rule: elimination of p(x)
Since the evidence p(x) in Bayes' formula just ensures that P(w1 | x) + P(w2 | x) = 1, we can eliminate it from the decision rule. Hence, the decision can be made based only on:
p(x | w1) P(w1) > p(x | w2) P(w2): decide w1
p(x | w1) P(w1) < p(x | w2) P(w2): decide w2
Special cases:
- When p(x | w1) = p(x | w2), the decision is based only on the priors
- When P(w1) = P(w2), the decision is based only on the likelihoods
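The decision rule above can be sketched end to end, assuming Gaussian class-conditional densities for the lightness feature. The means, standard deviations, and priors are assumptions for illustration, not values from the slides.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Class-conditional density p(x|w), assumed Gaussian here."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def decide(x, priors=(0.5, 0.5), params=((4.0, 1.0), (7.0, 1.5))):
    """Bayes decision rule without the evidence term: compare
    p(x|w1)*P(w1) against p(x|w2)*P(w2). params holds assumed
    (mean, sigma) for w1 = sea bass and w2 = salmon."""
    score1 = gaussian_pdf(x, *params[0]) * priors[0]
    score2 = gaussian_pdf(x, *params[1]) * priors[1]
    return "w1" if score1 > score2 else "w2"

decide(4.2)   # near w1's mean -> 'w1'
decide(7.0)   # near w2's mean -> 'w2'
```

With equal priors, as here, the rule reduces to comparing the likelihoods, matching the second special case above.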
Example Consider an air surveillance detector, which can have 3 states (x):
Sensor Model: Likelihood Matrix
The sensor can be modeled in the form of a likelihood matrix, which gives P(z|x). Consider P(z|x) for two sensors.
Posterior Probability
Combining Sensors
Assuming that the sensor measurements are conditionally independent given the state x:
P(z1, z2, ..., zN | x) = P(z1 | x) P(z2 | x) ... P(zN | x)
Therefore, for two sensors:
P(x | z1, z2) = C . P(z1 | x) P(z2 | x) P(x)
where C is a normalizing constant.
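A sketch of this fusion rule for the three-state surveillance example: the posterior is proportional to the product of the two sensors' likelihoods and the prior, with C fixed by normalization. The state names, likelihood values, and the uniform prior are assumptions for illustration.

```python
# Assumed three surveillance states and a uniform prior P(x)
states = ("no_target", "possible_target", "target")
prior = {x: 1.0 / 3.0 for x in states}

# One row of each sensor's likelihood matrix: P(z|x) for the observed z
lik1 = {"no_target": 0.1, "possible_target": 0.3, "target": 0.6}
lik2 = {"no_target": 0.2, "possible_target": 0.3, "target": 0.5}

# P(x|z1,z2) = C * P(z1|x) * P(z2|x) * P(x)
unnorm = {x: lik1[x] * lik2[x] * prior[x] for x in states}
C = 1.0 / sum(unnorm.values())            # normalizing constant
posterior = {x: C * v for x, v in unnorm.items()}
```

Because both sensors assign their highest likelihood to "target", the fused posterior concentrates on that state more sharply than either sensor's posterior would alone, which is the point of combining them.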