About Eigen Values and Eigen Vectors: Part 1

Vandana Rajan
4 min read · Aug 15, 2019

The concept of Eigen values and Eigen vectors is very important in the world of signal processing and machine learning. Eigen vector computation forms the core of Principal Component Analysis (PCA) and Independent Component Analysis (ICA) algorithms. In this post, I will explain some basics about this concept.

The word Eigen means ‘own’ in German and Dutch. You will soon see why this name suits our eigen values and vectors.

Let us start from a simple question. What happens to a vector x when it is multiplied by a matrix A? Well, most vectors change their direction. Let us see an example. Let x = [4, 7] and A = [[3, 1], [2, 5]]. Then Ax = [19, 43]. The direction of the original x (obtained as atan(y/x) for a vector [x, y]) is 1.0517 radians, while the direction of Ax is 1.1547 radians. Thus, after multiplication with the matrix, our vector changed its direction. Likewise, there are infinitely many examples that illustrate this change.
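If you would like to verify these numbers yourself, here is a quick sketch using numpy (the tooling choice is mine; any way of computing the product and the angles would do):

```python
import numpy as np

A = np.array([[3, 1],
              [2, 5]])
x = np.array([4, 7])

Ax = A @ x  # matrix-vector product: [19, 43]

# Direction of a 2D vector, in radians from the positive x-axis.
print(np.arctan2(x[1], x[0]))    # 1.0517 -> direction of x
print(np.arctan2(Ax[1], Ax[0]))  # 1.1547 -> direction of Ax (different)
```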

Now, let us take another pair of vector and matrix. Let x = [6, 4] and A = [[8, 3], [2, 7]]. The direction of x is 0.5880 radians. What about the direction of Ax = [60, 40]? It turns out to be exactly the same. It seems as if A and x are specially made for each other! In other words, x is A’s “own” vector, or eigen vector.

In the previous example, you might have noticed that even though the direction remains unchanged, the magnitude of x changes after the multiplication: Ax = [60, 40] = 10 [6, 4]. The factor by which the magnitude changes, here 10, is known as the eigen value. It indicates how much the vector x has been squeezed, stretched, or reversed.
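The same kind of numpy sketch confirms both the unchanged direction and the scale factor:

```python
import numpy as np

A = np.array([[8, 3],
              [2, 7]])
x = np.array([6, 4])

Ax = A @ x  # [60, 40], which is exactly 10 * [6, 4]

print(np.arctan2(x[1], x[0]))    # 0.5880 -> direction of x
print(np.arctan2(Ax[1], Ax[0]))  # 0.5880 -> direction of Ax (unchanged)
print(Ax / x)                    # [10. 10.] -> the eigen value is 10
```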

Thus, mathematically, we can write the whole thing as follows:

Ax = λx

where λ (lambda) is the eigen value.

If you think about it, you realize that the action of the matrix on such a vector is now represented by a combination of a scalar and a vector, i.e., a pair of eigen value and eigen vector. Isn’t that cool?

Now, how do we calculate the eigen value and eigen vector of a matrix?

For that, let us start from the previous equation. If we rearrange it, we get

(A − λI)x = 0

where I is the identity matrix having the same dimension as A. Now, look at this equation. It says that (A − λI) transforms the vector x into the zero vector. Since an eigen vector must be nonzero, this is possible only if the determinant of (A − λI) is zero; otherwise (A − λI) would be invertible, and the only vector it could map to zero would be the zero vector itself.

Thus we have the following equation, which is known as the ‘characteristic equation’ of matrix A:

det(A − λI) = 0

Let us see an example now.
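As a concrete illustration, take A = [[2, 1], [1, 2]], a matrix whose eigen values happen to be 1 and 3 (the particular matrix is my choice here; any matrix with those eigen values would serve). The comments below walk through the characteristic equation by hand, and numpy confirms the result:

```python
import numpy as np

# Example matrix (chosen for illustration; its eigen values are 1 and 3).
A = np.array([[2, 1],
              [1, 2]])

# Step 1: characteristic equation det(A - lambda*I) = 0
#   (2 - lambda)(2 - lambda) - 1*1 = 0
#   lambda^2 - 4*lambda + 3 = 0
#   (lambda - 1)(lambda - 3) = 0   ->   lambda = 1 or lambda = 3
print(np.roots([1, -4, 3]))  # [3. 1.]

# Step 2: eigen vector for lambda = 1, i.e. solve (A - 1*I)x = 0
#   [[1, 1],
#    [1, 1]] x = 0   ->   x1 + x2 = 0   ->   x = t * [1, -1], t != 0
x = np.array([1, -1])
print(A @ x)  # [ 1 -1], i.e. 1 * x, so (1, [1, -1]) is an eigen pair

# Cross-check with numpy's built-in solver.
print(np.linalg.eig(A)[0])  # eigen values: [3. 1.]
```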

I hope the steps are self-explanatory. I have calculated the eigen vector corresponding to only one eigen value; as a small exercise, you can calculate the eigen vector associated with the eigen value 3.

In the example above, we saw that the matrix has 2 eigen values. You might have also noticed that, corresponding to eigen value 1, there are an infinite number of eigen vectors for that matrix: any nonzero scalar multiple of an eigen vector is again an eigen vector. Let us take note of an important point here.

A matrix can have more than one eigen vector sharing the same eigen value.
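With the illustration matrix from above, this is easy to check: every nonzero multiple of [1, −1] satisfies Ax = 1 · x.

```python
import numpy as np

A = np.array([[2, 1],
              [1, 2]])

# Every nonzero scalar multiple of [1, -1] is an eigen vector
# of A for the eigen value 1.
for t in (1.0, -2.0, 0.5, 100.0):
    v = t * np.array([1.0, -1.0])
    print(np.allclose(A @ v, 1 * v))  # True for every t
```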

Interesting, right? Now, can the reverse be true? I mean, can one eigen vector correspond to multiple eigen values? Let us, for a moment, assume that it is true, i.e., that Ax = λ1 x and Ax = λ2 x for the same eigen vector x. Then,

(λ1 − λ2) x = 0

Look at the above equation. It can be true only if both the eigen values are equal, since an eigen vector cannot be the zero vector. Thus, we can say with confidence that one eigen vector cannot correspond to more than one eigen value.
