Chapter 6: Eigenvalues and Eigenvectors
This post is based on the book Introduction to Linear Algebra by Gilbert Strang.
1. Definition of Eigenvalues and Eigenvectors
For a square matrix A, an eigenvector v is a non-zero vector that satisfies:

Av = λv

where λ is the corresponding eigenvalue. This equation means that multiplying A by v only scales v by the factor λ; the direction of v is unchanged (or reversed, if λ is negative).
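To make the definition concrete, here is a minimal sketch that checks Av = λv numerically with NumPy (the matrix and vector below are illustrative choices of mine, not from the book):

import numpy as np

# An illustrative matrix with an obvious eigenpair:
# v = [1, 1] is an eigenvector of A with eigenvalue 3,
# since A @ [1, 1] = [3, 3] = 3 * [1, 1].
A = np.array([[2, 1],
              [1, 2]])
v = np.array([1, 1])
lam = 3

print(A @ v)                        # [3 3]
print(lam * v)                      # [3 3]
print(np.allclose(A @ v, lam * v))  # True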
2. How to Find Eigenvalues and Eigenvectors
To find eigenvalues and eigenvectors:
1. Solve the characteristic equation:
det(A − λI) = 0
where I is the identity matrix; the roots λ of this polynomial are the eigenvalues.
2. For each eigenvalue λ, solve the system:
(A − λI)v = 0
to find the corresponding eigenvector v.
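Before the worked example, here is a minimal sketch of this two-step procedure for 2×2 matrices (the function name eig_2x2 is mine, not from the book). It uses the fact that for a 2×2 matrix the characteristic polynomial is λ² − trace(A)·λ + det(A), and extracts a null-space vector of A − λI from the SVD:

import numpy as np

def eig_2x2(A):
    """Sketch for a 2x2 matrix: eigenvalues from the characteristic
    polynomial, eigenvectors from the null space of (A - lam*I)."""
    # Step 1: det(A - lam*I) = lam^2 - trace(A)*lam + det(A) for 2x2
    coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
    eigenvalues = np.roots(coeffs)
    vecs = []
    for lam in eigenvalues:
        # Step 2: the right singular vector belonging to the smallest
        # singular value spans the null space of (A - lam*I).
        _, _, vh = np.linalg.svd(A - lam * np.eye(2))
        vecs.append(vh[-1].conj())
    return eigenvalues, np.column_stack(vecs)

eigenvalues, eigenvectors = eig_2x2(np.array([[4.0, 1.0], [2.0, 3.0]]))
print(eigenvalues)   # [5. 2.] (order may vary)
print(eigenvectors)  # columns proportional to (1, 1) and (1, -2), up to sign

In practice np.linalg.eig (used below) is the tool to reach for; this sketch only mirrors the textbook procedure.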
Example 1: 2×2 Matrix
Let’s compute the eigenvalues and eigenvectors of:

A = | 4   1 |
    | 2   3 |
Step 1: Find Eigenvalues
Solve the characteristic equation det(A − λI) = 0:

det | 4 − λ    1    |
    |   2    3 − λ  |  =  (4 − λ)(3 − λ) − (1)(2) = 0

Expanding gives the quadratic equation:

λ² − 7λ + 10 = 0

which factors as (λ − 5)(λ − 2) = 0, so the eigenvalues are λ1 = 5 and λ2 = 2.
Step 2: Find Eigenvectors
For λ1 = 5, solve (A − 5I)v = 0 with v = (x, y):

A − 5I = | −1    1 |
         |  2   −2 |

This simplifies to:

−x + y = 0, so y = x (both rows give the same condition).

So, the eigenvector corresponding to λ1 = 5 is any nonzero multiple of:

v1 = | 1 |
     | 1 |
For λ2 = 2, solve (A − 2I)v = 0 with v = (x, y):

A − 2I = | 2   1 |
         | 2   1 |

This simplifies to:

2x + y = 0, so y = −2x.

So, the eigenvector corresponding to λ2 = 2 is any nonzero multiple of:

v2 = |  1 |
     | −2 |
Python Code:
import numpy as np
# Define the matrix
A = np.array([[4, 1],
              [2, 3]])
# Compute eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)
print("Eigenvalues:")
print(eigenvalues)
print("\nEigenvectors:")
print(eigenvectors)
Note: NumPy normalizes each eigenvector to unit length and returns them as the columns of eigenvectors, so the hand-computed (1, 1) and (1, −2) reappear here scaled to unit length, possibly with flipped signs.
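Continuing from the variables in the code above, a quick sanity check (a sketch) confirms that each returned column really satisfies Av = λv:

# Each eigenvector is a column; check A v = lambda v for every pair
for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    v = eigenvectors[:, i]
    print(np.allclose(A @ v, lam * v))  # True for each pair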
3. Applications of Eigenvalues and Eigenvectors
- Diagonalization: Expressing a matrix A as A = PDP⁻¹, where D is a diagonal matrix of eigenvalues and P is a matrix whose columns are the corresponding eigenvectors (see the sketch after this list).
- Principal Component Analysis (PCA): A dimensionality reduction technique in machine learning that uses eigenvectors of the covariance matrix.
- Stability Analysis: In differential equations, the eigenvalues of the system matrix determine whether solutions decay or grow (for a continuous-time linear system, eigenvalues with negative real parts mean the system is stable).
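As a minimal sketch of diagonalization, reusing the 2×2 matrix from the example, NumPy can rebuild A from its eigendecomposition and exploit it for cheap matrix powers:

import numpy as np

A = np.array([[4, 1],
              [2, 3]])

eigenvalues, P = np.linalg.eig(A)   # columns of P are eigenvectors
D = np.diag(eigenvalues)            # diagonal matrix of eigenvalues

# Reconstruct A from its eigendecomposition: A = P D P^-1
A_reconstructed = P @ D @ np.linalg.inv(P)
print(np.allclose(A, A_reconstructed))  # True

# Diagonalization makes powers cheap: A^k = P D^k P^-1
A_cubed = P @ np.diag(eigenvalues**3) @ np.linalg.inv(P)
print(np.allclose(A_cubed, np.linalg.matrix_power(A, 3)))  # True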
Happy learning :-)