This paper defines an analogy measure between two random variables and discusses the principle and algorithm of maximizing the non-Gaussianity of the observed data through a linear transformation in order to estimate the independent components serially. It also proves the non-polynomial moment theorem in a generalized way and, based on this theorem, establishes the feasibility of substituting the analogy with the expectation of a non-quadratic smooth even function. A formula for computing the sign used in the above algorithm is given; the algorithm resolves the contradiction between the objective function and the sign-computation formula. Compared with maximum likelihood ICA, the analogy is the maximum likelihood function of a single source under pre-whitening.
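As a concrete illustration of serial (deflation-based) estimation by maximizing non-Gaussianity on pre-whitened data, the sketch below assumes a FastICA-style one-unit fixed-point update with the non-quadratic smooth even function G(u) = log cosh(u); the function name `deflation_ica`, the choice of G, and the stopping rule are illustrative assumptions, not the paper's own algorithm or sign formula.

```python
import numpy as np

def deflation_ica(X, n_components, max_iter=200, tol=1e-6):
    """Estimate independent components one at a time (deflation) by
    maximizing a non-Gaussianity surrogate E[G(w^T z)] on pre-whitened
    data z, with the smooth even non-quadratic G(u) = log cosh(u)
    (an assumed contrast; the paper's analogy measure may differ)."""
    # Pre-whiten: zero mean, identity covariance
    Xc = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(Xc))
    Z = E @ np.diag(d ** -0.5) @ E.T @ Xc            # whitened data

    g = np.tanh                                      # G'(u)
    g_prime = lambda u: 1.0 - np.tanh(u) ** 2        # G''(u)

    W = np.zeros((n_components, Z.shape[0]))
    for k in range(n_components):
        w = np.random.randn(Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(max_iter):
            y = w @ Z
            # Fixed-point (approximate Newton) step for maximizing E[G(w^T z)]
            w_new = (Z * g(y)).mean(axis=1) - g_prime(y).mean() * w
            # Deflation: keep w orthogonal to the components already found
            w_new -= W[:k].T @ (W[:k] @ w_new)
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < tol
            w = w_new
            if converged:
                break
        W[k] = w
    # Estimated sources and the unmixing matrix on the whitened data
    return W @ Z, W
```

Each row of W is constrained to unit norm and orthogonalized against the previously extracted rows, so the components are recovered one by one, matching the serial estimation described above.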