Title: Semi-nonnegative INDSCAL analysis
Ahmad Karfoul (1), Julie Coloigner (2,3), Laurent Albera (2,3), Pierre Comon (4,5)
(1) Faculty of Mech. Elec. Engineering, University AL-Baath, Syria
(2) Laboratory LTSI - INSERM U642, France
(3) University of Rennes 1, France
(4) Laboratory I3S - CNRS, France
(5) University of Nice Sophia Antipolis, France
Slide 2: Outline
- Preliminaries and problem formulation
- A compact matrix form of derivatives
- Optimization methods
- Numerical results
- Conclusion
Slide 3: Preliminaries and problem formulation
Outer product
Ex., order 3: $(a \circ b \circ c)_{ijk} = a_i\, b_j\, c_k$
Ex., order q: $(a^{(1)} \circ \cdots \circ a^{(q)})_{i_1 \cdots i_q} = a^{(1)}_{i_1} \cdots a^{(q)}_{i_q}$
The outer product of q vectors is a rank-one q-th order tensor.
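To make the definition concrete, here is a minimal numpy sketch (illustrative, not from the slides) building a rank-one third-order tensor as an outer product:

```python
import numpy as np

# Rank-one third-order tensor as an outer product: T[i,j,k] = a[i]*b[j]*c[k].
a, b, c = np.random.randn(4), np.random.randn(5), np.random.randn(6)
T = np.einsum('i,j,k->ijk', a, b, c)

print(T.shape)                              # (4, 5, 6)
print(np.linalg.matrix_rank(T[:, :, 0]))    # 1: every slice has rank one
```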
Slide 4: Preliminaries and problem formulation
Tensor-to-rectangular-matrix transformation (unfolding according to the i-th mode): a tensor $\mathcal{T} \in \mathbb{R}^{I_1 \times \cdots \times I_q}$ is rearranged into a matrix $T_{(i)}$ of size $I_i \times \prod_{j \neq i} I_j$.
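A short sketch of mode-i unfolding, assuming the usual convention of moving mode i first and flattening the remaining modes (column orderings differ across references):

```python
import numpy as np

def unfold(T, mode):
    # Move the chosen mode first, then flatten the rest: the result is an
    # I_mode x (product of the other dimensions) matrix.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

T = np.random.randn(4, 5, 6)
print(unfold(T, 0).shape)  # (4, 30)
print(unfold(T, 2).shape)  # (6, 20)
```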
Slide 5: Preliminaries and problem formulation
CANonical Decomposition (CAND) [Hitchcock 1927; Carroll & Chang 1970; Harshman 1970]
CAND: linear combination of a minimal number of rank-1 terms:
$\mathcal{T} = \sum_{p=1}^{P} a_p \circ b_p \circ c_p$
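A hedged sketch of the CAND model, reconstructing the tensor from hypothetical factor matrices A, B, C whose p-th columns are $a_p$, $b_p$, $c_p$:

```python
import numpy as np

def cand(A, B, C):
    # T = sum_p a_p o b_p o c_p, written as one einsum contraction.
    return np.einsum('ip,jp,kp->ijk', A, B, C)

P = 3
A, B, C = np.random.randn(4, P), np.random.randn(5, P), np.random.randn(6, P)
print(cand(A, B, C).shape)  # (4, 5, 6): a tensor of rank at most P
```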
Slide 6: Preliminaries and problem formulation
INDSCAL decomposition [Carroll & Chang 1970]:
$\mathcal{T} = \sum_{p=1}^{P} a_p \circ a_p \circ c_p$
Slide 7: Preliminaries and problem formulation
CANonical Decomposition (CAND): $\mathcal{T} = \sum_{p=1}^{P} a_p \circ b_p \circ c_p$
INDSCAL decomposition: $\mathcal{T} = \sum_{p=1}^{P} a_p \circ a_p \circ c_p$
INDSCAL = CAND of a third-order tensor symmetric in two of its three modes.
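A sketch of the INDSCAL model under the same conventions; each frontal slice equals $A \operatorname{diag}(c_k) A^{\mathsf T}$, hence the symmetry in the first two modes:

```python
import numpy as np

def indscal(A, C):
    # T = sum_p a_p o a_p o c_p: a CAND with B = A.
    return np.einsum('ip,jp,kp->ijk', A, A, C)

A, C = np.random.randn(4, 3), np.random.randn(5, 3)
T = indscal(A, C)
print(np.allclose(T[:, :, 0], T[:, :, 0].T))  # True: symmetric slices
```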
Slide 8: Preliminaries and problem formulation
(Semi-)nonnegative INDSCAL decomposition for (semi-)nonnegative BSS
Example: diagonalizing a set of covariance matrices.
Observation model: $x = A s$, where $A$ is the $(N \times P)$ mixing matrix and $s$ is a zero-mean random vector of $P$ statistically independent components.
Covariance matrix: $R_x = A\, R_s\, A^{\mathsf T}$, with $R_s$ diagonal by independence; stacking a set of such covariance matrices yields a third-order tensor following the INDSCAL model.
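An illustrative construction (names and sizes are assumptions) showing how a set of covariance matrices sharing a nonnegative mixing matrix stacks into an INDSCAL-structured tensor:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, K = 6, 3, 10                         # sensors, sources, matrices
A = np.abs(rng.standard_normal((N, P)))    # nonnegative mixing matrix

# R_k = A D_k A^T with D_k diagonal (independent sources); stacking the
# K slices gives a third-order tensor following the INDSCAL model.
D = rng.uniform(0.5, 2.0, size=(K, P))     # source variances per matrix
R = np.stack([A @ np.diag(D[k]) @ A.T for k in range(K)], axis=2)
print(R.shape)                             # (N, N, K)
```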
Slide 9: Preliminaries and problem formulation
Problem at hand
- Constrained problem: minimize the INDSCAL fitting error subject to $A \geq 0$ (nonnegativity imposed on the common factor).
- Unconstrained problem: the change of variable $A = B \boxdot B$ (element-wise square) enforces $A \geq 0$ for any real $B$, turning the constrained problem into an unconstrained one; a sketch follows.
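A one-line sketch of the change of variable, assuming the Hadamard-square parametrization described above:

```python
import numpy as np

B = np.random.randn(6, 3)   # free, unconstrained variable
A = B * B                   # A = B*B (Hadamard square) is nonnegative
print((A >= 0).all())       # True: no explicit constraint needed
```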
Slide 10: Preliminaries and problem formulation
- Solution: minimize the cost function
$\Psi(A, C) = \sum_{k=1}^{K} \left\| T_k - A \operatorname{diag}(c_k)\, A^{\mathsf T} \right\|_F^2$
- Some iterative algorithms require the first- and second-order derivatives of $\Psi$.
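The cost function in code, a direct transcription of the formula above (the tensor layout is an assumption):

```python
import numpy as np

def cost(A, C, T):
    # Psi(A, C) = sum_k || T_k - A diag(c_k) A^T ||_F^2,
    # where T_k = T[:, :, k] is the k-th frontal slice.
    model = np.einsum('ip,jp,kp->ijk', A, A, C)
    return np.sum((T - model) ** 2)
```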
Slide 11: Optimization methods
Global line search (1/2)
- Look for the global optimum of $\Psi$ in a given direction.
- Update rules: $A \leftarrow A + \mu_A G_A$ and $C \leftarrow C + \mu_C G_C$, where $G_A$ and $G_C$ are the directions given by the iterative algorithm with respect to $A$ and $C$, respectively.
Slide 12: Optimization methods
Global line search (2/2): in the symmetric case, $\Psi$ restricted to a fixed direction is a polynomial in the step size, so its global minimizer is obtained by rooting the polynomial derivative and keeping the best real root.
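A minimal sketch of the global line search, assuming a common step mu for A and C: along a fixed direction, $\Psi(\mu)$ is then a degree-6 polynomial, so 7 samples determine it exactly and the global minimizer lies among the real roots of its derivative:

```python
import numpy as np

def global_line_search(A, C, GA, GC, T):
    # Psi on the line (A + mu*GA, C + mu*GC) is a degree-6 polynomial in mu:
    # interpolate it exactly from 7 samples, root the derivative, and keep
    # the best real stationary point.
    def psi(mu):
        An, Cn = A + mu * GA, C + mu * GC
        return np.sum((T - np.einsum('ip,jp,kp->ijk', An, An, Cn)) ** 2)

    mus = np.arange(-3.0, 4.0)                   # 7 interpolation nodes
    coeffs = np.polyfit(mus, [psi(m) for m in mus], 6)
    roots = np.roots(np.polyder(coeffs))
    candidates = roots[np.abs(roots.imag) < 1e-9].real
    return min(candidates, key=psi)              # global minimizer over R
```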
Slide 13: Optimization methods
Steepest Descent (SD)
- Optimization by searching for stationary points of $\Psi$ based on the first-order approximation (i.e., the gradient).
- Update rules: $A \leftarrow A - \mu_A \nabla_A \Psi$ and $C \leftarrow C - \mu_C \nabla_C \Psi$.
Slide 14: Optimization methods
Steepest Descent (SD), continued: the gradient directions are combined with the global line search to select the step sizes; the compact gradient expressions are given on the next slide.
Slide 15: Compact matrix form of derivatives
Gradient computation of $\Psi(A, C)$ via the differential $d\Psi$. With $E_k = T_k - A \operatorname{diag}(c_k) A^{\mathsf T}$:
$\nabla_A \Psi = -2 \sum_{k=1}^{K} \left(E_k + E_k^{\mathsf T}\right) A \operatorname{diag}(c_k), \qquad (\nabla_C \Psi)_{kp} = -2\, a_p^{\mathsf T} E_k\, a_p.$
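A sketch of these gradients in code for the symmetric case (each $T_k$ symmetric, so $E_k + E_k^{\mathsf T} = 2 E_k$):

```python
import numpy as np

def gradients(A, C, T):
    # grad_A = -4 sum_k E_k A diag(c_k); grad_C[k] = -2 diag(A^T E_k A).
    GA, GC = np.zeros_like(A), np.zeros_like(C)
    for k in range(T.shape[2]):
        E = T[:, :, k] - A @ np.diag(C[k]) @ A.T
        GA += -4.0 * E @ A @ np.diag(C[k])
        GC[k] = -2.0 * np.diag(A.T @ E @ A)
    return GA, GC
```

A finite-difference check on small random instances is a quick way to validate such derivations.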
Slide 16: Optimization methods
Newton
- Optimization including the second-order approximation (the Hessian) to accelerate convergence.
- Update rules: $A \leftarrow A - \mu_A H_A^{-1} \nabla_A \Psi$ and $C \leftarrow C - \mu_C H_C^{-1} \nabla_C \Psi$.
Slide 17: Optimization methods
Newton
- Convergence requirement: the Hessians must be positive definite matrices.
- Solution: regularization is necessary, e.g., an Eigen-Value Decomposition (EVD)-based technique.
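A sketch of one standard EVD-based regularization (the exact variant used in the slides is not specified): eigendecompose the Hessian and lift small or negative eigenvalues so that the Newton direction is a descent direction:

```python
import numpy as np

def make_positive_definite(H, floor=1e-6):
    # Symmetric eigendecomposition; clip the eigenvalues from below.
    w, V = np.linalg.eigh(H)
    return (V * np.maximum(w, floor)) @ V.T
```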
Slide 18: Optimization methods
Levenberg-Marquardt (LM)
- Update rule: $\theta \leftarrow \theta - (J^{\mathsf T} J + \lambda I)^{-1} J^{\mathsf T} r$, where $r$ is the residual vector, $J$ its Jacobian, and $\lambda$ a damping parameter interpolating between gradient descent and Gauss-Newton.
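A generic LM step for a vectorized parameter (the textbook update, not necessarily the slides' exact implementation):

```python
import numpy as np

def lm_step(J, r, lam):
    # Solve (J^T J + lam*I) delta = -J^T r: large lam ~ gradient descent,
    # small lam ~ Gauss-Newton.
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ r)
```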
Slide 19: Numerical results
Convergence speed vs. SNR
- Random third-order tensors generated according to the noise-free model, then corrupted by additive noise at a prescribed SNR.
- Results averaged over 200 Monte Carlo realizations.
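For reproducibility, a small helper (an assumption about the noise protocol) that corrupts a tensor at a prescribed SNR in dB:

```python
import numpy as np

def add_noise(T, snr_db, rng=np.random.default_rng()):
    # Scale white Gaussian noise so that 20*log10(||T|| / ||noise||) = snr_db.
    noise = rng.standard_normal(T.shape)
    scale = np.linalg.norm(T) / (np.linalg.norm(noise) * 10 ** (snr_db / 20))
    return T + scale * noise
```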
Slide 20: Numerical results
Convergence speed vs. SNR (plot), SNR = 0 dB.
Slide 21: Numerical results
Convergence speed vs. SNR (plot), SNR = 15 dB.
Slide 22: Numerical results
Convergence speed vs. SNR (plot), SNR = 30 dB.
Slide 23: Conclusion
- An unconstrained formulation of the semi-nonnegative INDSCAL problem is solved.
- The differential concept is a powerful tool for deriving compact matrix forms of the derivatives.
- In the symmetric case, the global line search attains the global optimum in the considered direction.
- Iterative algorithms combined with the global line search take a suitable step toward the global optimum.