ICA is a much more powerful technique, however, capable of finding the underlying factors or sources when the classic methods fail completely. In reality, the data often does not follow a Gaussian distribution, and the situation is not as simple as factor analysis, projection pursuit, or PCA assumes. Many real-world data sets have super-Gaussian distributions: the probability density of the data is peaked at zero and has heavier tails than a Gaussian density of the same variance.
This is the starting point of ICA, where we try to find statistically independent components in the realistic case where the data is non-Gaussian. In this paper we present the different estimation principles of ICA and their algorithms. The simulation results of ICA are carried out in MATLAB.

Keywords- Nonlinear system, PCA, ICA, Statistical independence, Non-Gaussian

Introduction

Linearity is a specification of a field of activity; nonlinearity is a "non-specification" and its field is unbounded. In nature, nonlinearity is the rule rather than the exception, while linearity is a simplification adopted for analysis.
Indeed, the complex structure of dynamic systems makes it almost impossible to represent them accurately with linear models. Nonlinear models are designed to provide a better mathematical way to characterize the inherent nonlinearity in real dynamic systems. In mathematics, a nonlinear system is a system which is not linear, i.e., a system which does not satisfy the superposition principle. Less technically, a nonlinear system is any problem where the variable(s) to be solved for cannot be written as a linear sum of independent components.
In mathematics, a linear function f(x) is one which satisfies both of the following properties: Additivity: f(x + y) = f(x) + f(y); Homogeneity: f(ax) = a f(x). Nonlinear algebraic problems are often exactly solvable, and if not they usually can be thoroughly understood through qualitative and numerical analysis. One of the greatest difficulties of nonlinear problems is that it is not generally possible to combine known solutions into new solutions.
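The two defining properties above can be checked numerically; a minimal sketch, where the linear map f and the nonlinear map g are purely illustrative:

```python
import numpy as np

# Hypothetical linear map f(x) = 3x and nonlinear map g(x) = x**2.
f = lambda x: 3.0 * x
g = lambda x: x ** 2

x, y, a = 2.0, 5.0, 4.0

# Additivity and homogeneity hold for f ...
assert np.isclose(f(x + y), f(x) + f(y))
assert np.isclose(f(a * x), a * f(x))

# ... but fail for g, so g does not satisfy the superposition principle.
assert not np.isclose(g(x + y), g(x) + g(y))
assert not np.isclose(g(a * x), a * g(x))
```

Any candidate system can be probed the same way: if either property fails for some inputs, the system is nonlinear.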
PCA is a mainstay of modern data analysis: a useful statistical tool for finding patterns in high-dimensional data, where a dimension refers to a particular measurement type. PCA is one of the multivariate methods of analysis and has been used widely with large multidimensional data sets. Principal component analysis is a vector space transform often used to reduce multidimensional data sets to lower dimensions for analysis. PCA involves a mathematical procedure that transforms a number of correlated variables into a smaller number of uncorrelated variables called principal components.
The lack of correlation is a useful property, as it means that the PCs are measuring different "dimensions" in the data set. However, the PCA technique only uses second-order statistical information, which makes the principal components decorrelated but not truly independent. Recently it has been realized that ICA rather than PCA is the more appropriate technique, as ICA involves higher-order statistics for the extraction of independent components, which makes the components reveal more useful information than principal components.
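The distinction between decorrelation and independence can be sketched numerically; here the two variables are illustrative, constructed so that the second is a deterministic function of the first:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two variables that are uncorrelated yet strongly dependent:
# x2 is a deterministic function of x1, so knowing x1 fixes x2 exactly.
x1 = rng.uniform(-1.0, 1.0, 100_000)
x2 = x1 ** 2 - np.mean(x1 ** 2)

# Second-order statistics (all that PCA sees): essentially zero correlation.
corr = np.corrcoef(x1, x2)[0, 1]
print(abs(corr) < 0.02)                    # True: decorrelated

# A higher-order statistic exposes the dependence:
# E[x1^2 * x2] is far from E[x1^2] * E[x2] (which is 0 here).
print(abs(np.mean(x1 ** 2 * x2)) > 0.05)   # True: not independent
```

Decorrelation alone would leave this dependence intact, which is why ICA's use of higher-order statistics matters.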
Independent Component Analysis

Independent Component Analysis is a statistical technique for decomposing a complex data set into independent sub-parts. Independent component analysis (ICA) is a computational method for separating a multivariate signal into additive subcomponents, supposing the mutual statistical independence of the non-Gaussian source signals. ICA can be seen as an extension of principal component analysis and factor analysis. ICA is a much more powerful technique, however, capable of finding the underlying factors or sources when these classic methods fail completely.
In ICA the source signals are to be reconstructed from the mixed signals, i.e., mixtures of the source signals subject to determination. ICA is a way of finding a linear non-orthogonal co-ordinate system in any multivariate data. ICA defines a generative model for the observed multivariate data, which is typically given as a large database of samples. In the model, the data variables are assumed to be linear mixtures of some unknown latent variables, and the mixing system is also unknown.
The latent variables are assumed non-Gaussian and mutually independent, and they are called the independent components of the observed data. These independent components, also called sources or factors, can be found by ICA. Some of the characteristics of ICA are summarized as follows:
1) ICA can only separate linearly mixed sources.
2) Since ICA is dealing with clouds of points, changing the order in which the points are plotted has virtually no effect on the outcome of the algorithm.
3) Since ICA separates sources by maximizing their non-Gaussianity, perfectly Gaussian sources cannot be separated.
4) Even when the sources are not independent, ICA finds a space where they are maximally independent.
The goal of ICA is to recover independent sources given only sensor observations that are unknown linear mixtures of the unobserved independent source signals. In contrast to PCA, ICA not only decorrelates the signals but also reduces higher-order statistical dependencies, attempting to make the signals as independent as possible.
In other words, "ICA is a way of finding a linear non-orthogonal co-ordinate system in any multivariate data." ICA is a technique of data transformation that finds independent sources of activity in recorded mixtures of sources. ICA promises to improve our ability to extract neural signals from recorded mixtures. The mixtures could be, for example, sound recordings from microphones at a cocktail party or, in the case of optical recordings, output from photodiode detectors.
The success of ICA depends on one key assumption regarding the nature of the physical world: that independent variables or signals are generated by different underlying physical processes. If two signals are independent, then the value of one signal cannot be used to predict anything about the corresponding value of the other signal. As it is not usually possible to measure the output of a single physical process, it follows that most measured signals must be mixtures of independent signals. Given such a set of measured signals (i.e., mixtures), ICA works by finding a transformation of those mixtures that produces independent signal components, on the assumption that each of these independent component signals is associated with a different physical process. ICA is an area of study in which source signals s(t) are to be reconstructed from the mixed signals x(t), i.e., mixtures of the source signals subject to determination [1-2]. In general terms: recover the source vector s(t), given the m-dimensional mixture vector x(t), according to

x(t) = A s(t)     (1)

where A is the mixing matrix.
In the context of this problem, the de-mixing or separating matrix W is defined so that the recovered signals are

y(t) = W x(t)     (2)

The main aim is to find the de-mixing matrix W in order to recover the original signals.

Fig. 1 Block diagram of the ICA model

Here, n hypothetical source signals (S) are mixed linearly and instantaneously by an unknown mixing process (M). The mixed sources are recorded by a set of n detectors (D). Independent component analysis (W) transforms the detected signals into n independent components (C). If the assumptions are correct, then the independent components are the original signals, except that the scale, sign, and order are not preserved.
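The generative model in (1)-(2) can be sketched numerically. In this illustrative example the mixing matrix A and the Laplacian sources are assumed, and W is computed from the known A; a real ICA algorithm must estimate W blindly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical, independent, non-Gaussian (Laplacian) sources s(t).
S = rng.laplace(size=(2, 1000))

# Mixing matrix A (unknown in practice): x(t) = A s(t)  -- Eq. (1)
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])
X = A @ S

# With a perfect de-mixing matrix W = A^-1: y(t) = W x(t)  -- Eq. (2)
W = np.linalg.inv(A)
Y = W @ X

# In this idealized setting the sources are recovered exactly.
print(np.allclose(Y, S))   # True
```

Because ICA never sees A, the estimated W can only equal A^-1 up to the scale, sign, and order ambiguities described above.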
The Number of Sources and Mixtures

Basically, there must usually be at least as many different mixtures of a set of source signals as there are source signals. For the example of speech signals, this implies that there must be at least as many microphones (different voice mixtures) as there are voices (source signals).

Method to find independent components

The signals from each detector form the rows of the data (or detector) matrix D, and ICA finds the square matrix W (n = m = the number of detectors) such that WD = C.
The rows of C are called 'independent components' because they are forced to be as independent as possible. The independent components are the same length as the data, and there are as many independent components as there are detectors. This can be represented schematically: Fig. 2 Representation of independent component analysis. W is called the 'unmixing matrix' because it unmixes the detected signals in D, which are assumed to be mixtures of signals from different sources. Each row of W unmixes the detected signals (D) into one independent component (row of C).
If the assumptions of the ICA model are correct, the rows of C will be the original source signals, but neither the sign nor the scale will be preserved, and the independent components will be shuffled with respect to the original sources.

Preprocessing for ICA

Before applying an ICA algorithm to the data, it is usually very useful to do some preprocessing that makes the problem of ICA estimation simpler and better conditioned. The necessary preprocessing steps for ICA are centering and whitening. Centering: the most basic and necessary preprocessing is to center x, i.e., subtract its mean vector so as to make x a zero-mean variable. Whitening: another useful preprocessing step in ICA is to whiten the observed variables. This means that before the application of the ICA algorithm (and after centering), we transform the observed vector x linearly so that we obtain a new vector which is white, i.e., its components are uncorrelated and their variances equal unity. Whitening reduces the number of parameters to be estimated. Some of the ICA algorithms used for the implementation of independent source signals are: I.
Infomax, II. FastICA.

Infomax algorithm

There exist two main approaches to ICA: statistically based algorithms and neural-network-based ICA algorithms. Compared to the statistical approach, neural network methods are considered to be more computationally efficient. The Infomax algorithm is one of the most popular neural-network-based approaches. The Infomax method uses a gradient-based algorithm, which leads to low complexity in terms of implementation [1]. The Infomax algorithm maximizes the information transferred in a network of nonlinear units.
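A minimal sketch of one such gradient-based update is the Bell-Sejnowski natural-gradient rule with a logistic nonlinearity; the sources, mixing matrix, learning rate, and iteration count below are illustrative assumptions, not values from this paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed setup: two Laplacian (super-Gaussian) sources, linearly mixed.
S = rng.laplace(size=(2, 2000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S
X -= X.mean(axis=1, keepdims=True)           # centering

n = X.shape[1]
W = np.eye(2)                                 # initial unmixing estimate
lr = 0.01

# Natural-gradient Infomax update with a logistic transfer function:
#   W <- W + lr * (I + (1 - 2*g(u)) u^T / n) W,  where u = W x
for _ in range(300):
    U = W @ X
    Y = 1.0 / (1.0 + np.exp(-U))              # logistic nonlinearity g(u)
    W += lr * (np.eye(2) + (1.0 - 2.0 * Y) @ U.T / n) @ W

# W now approximates an unmixing matrix; the rows of W @ X estimate the
# sources up to sign, scale, and order.
```

The update drives the rows of W toward directions whose outputs maximize the information passed through the logistic units, which for super-Gaussian sources coincides with separating them.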
The nonlinear transfer function is able to pick up higher-order moments of the input distributions; in the Infomax algorithm the transfer function is a fixed logistic function.

FastICA algorithm

The FastICA algorithm is a computationally highly efficient method for performing the estimation of ICA. It uses a fixed-point iteration scheme that has been found in independent experiments to be 10-100 times faster than conventional gradient-descent methods for ICA. The technique combines the good convergence properties of the one-unit case with symmetric normalization.
The algorithm finds directly independent components of (practically) any non-Gaussian distribution using any nonlinearity. This is in contrast to many algorithms, where some estimate of the probability density function has to be available first, and the nonlinearity must be chosen accordingly.

Simulation Results of ICA

The graphs show the separation of linearly mixed sources without preprocessing of the data. Fig. 3 shows the two source signals, Fig. 4 shows the mixing of the two source signals, and Fig. 5 shows the reconstructed original two sources.

Interpretation of the result

In the first case there are two independent sources A and B, which are mixed linearly by some mixing process. FastICA is applied and is able to uncover the original activations of A and B, but the algorithm cannot recover the exact amplitude and sign of the source activities. In the second case there are two voice signal sources, which are mixed linearly by some mixing process. FastICA is applied and is able to reconstruct the original sources, but again the algorithm cannot recover the exact amplitude and sign of the source activities.
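The centering, whitening, and fixed-point iteration described above can be sketched as follows; this is an illustrative one-unit FastICA with the tanh nonlinearity, where the sources, mixing matrix, and iteration count are all assumed for the example:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed setup: two super-Gaussian (Laplacian) sources, linearly mixed.
S = rng.laplace(size=(2, 5000))
A = np.array([[1.0, 0.7], [0.5, 1.0]])
X = A @ S

# Preprocessing: centering, then whitening via the eigendecomposition of
# the covariance matrix, so Z has identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# One-unit FastICA fixed-point iteration with g = tanh:
#   w <- E[z * g(w^T z)] - E[g'(w^T z)] * w, then renormalize w.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    wz = w @ Z
    g, g_prime = np.tanh(wz), 1.0 - np.tanh(wz) ** 2
    w = (Z * g).mean(axis=1) - g_prime.mean() * w
    w /= np.linalg.norm(w)

# The extracted component matches one source up to sign and scale.
y = w @ Z
corr = max(abs(np.corrcoef(y, S[0])[0, 1]),
           abs(np.corrcoef(y, S[1])[0, 1]))
print(corr > 0.9)   # True: one source recovered (sign/scale not preserved)
```

This mirrors the behavior reported above: the source activation is recovered, but its exact amplitude and sign are not.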
Conclusion

The simulation results of ICA have been obtained and studied. This technique improves the ability to extract neural signals from recorded mixtures, e.g., optical recordings, sound recordings, etc. The recorded data yields the independent components, although their scale, sign, and order are not preserved.

Scope of Future Work

A nonlinear system can be successfully handled by various techniques such as artificial neural networks, in which it is desired to have proper supervised as well as unsupervised features in various learning methods.
The ICA technique can be used in brain imaging, machine fault diagnosis, speech recognition systems, and many more. At times it is not possible to use the ANN technique alone, and ICA offers a better alternative for feature extraction, segmentation of the data, and revealing of hidden dynamics. For handling large data sets, a genetic algorithm along with neural networks, PCA, and ICA could be a good choice.

References:
[1] Glen D. Brown, Satoshi Yamada and Terrence J. Sejnowski: "Independent component analysis at the neural cocktail party".
[2] Yiu-ming Cheung, Lei Xu: "Independent component ordering in ICA time series analysis".
[3] ICA tutorial pages, www.cis.hut.fi.
[4] S.-I. Amari, Aapo Hyvärinen, Soo-Young Lee, Te-Won Lee and V. David Sánchez A. (Guest Editorial Team): "Blind signal separation and independent component analysis".
[5] R. F. Li, X. Z. Wang: "Dimension reduction of process dynamic trends using independent component analysis".