Compute eigenvalues and eigenvectors of any square matrix with detailed step-by-step solutions. Understand the spectral structure of your matrix through the decomposition A = VDV⁻¹.
An eigenvector of a square matrix A is a non-zero vector v that, when multiplied by A, yields a scalar multiple of itself. That scalar is called the eigenvalue. In equation form:

Av = λv
The word "eigen" comes from the German word meaning "own" or "characteristic." Eigenvalues and eigenvectors reveal the intrinsic properties of a linear transformation: the directions along which the transformation acts as simple scaling, and the scale factors themselves.
Every n×n matrix has exactly n eigenvalues (counted with multiplicity) over the complex numbers, a fundamental result guaranteed by the Fundamental Theorem of Algebra. Real matrices may have complex eigenvalues, which always appear in conjugate pairs.
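The 2×2 rotation matrix is the classic case: every entry is real, yet no real direction is left unchanged, so both eigenvalues are complex. A quick NumPy check (an illustrative sketch, not part of this calculator) shows the conjugate pair:

```python
import numpy as np

# A 90-degree rotation matrix: all entries real, yet no real
# direction maps to a multiple of itself, so the eigenvalues
# are the complex conjugate pair i and -i.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals = np.linalg.eigvals(R)
print(eigvals)  # the conjugate pair i and -i
```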
If a square matrix A has n linearly independent eigenvectors, it can be factored as:

A = VDV⁻¹
where V is a matrix whose columns are the eigenvectors of A, and D is a diagonal matrix containing the corresponding eigenvalues along its diagonal. This factorization is called the eigendecomposition or spectral decomposition.
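The factorization is easy to verify numerically. The sketch below (NumPy, with an arbitrary example matrix) rebuilds A from its eigenvectors and eigenvalues:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues w and a matrix V whose
# columns are the corresponding eigenvectors.
w, V = np.linalg.eig(A)
D = np.diag(w)

# Reconstruct A as V D V^-1 and confirm the round trip.
A_rebuilt = V @ D @ np.linalg.inv(V)
assert np.allclose(A_rebuilt, A)
```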
Not every matrix can be eigendecomposed. A matrix is diagonalizable if and only if it has a full set of n linearly independent eigenvectors. Defective matrices (those lacking a complete set of independent eigenvectors) cannot be diagonalized, though they can be reduced to Jordan normal form.
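A standard defective example is the 2×2 shear matrix: its eigenvalue 1 has algebraic multiplicity 2, but the eigenspace is only one-dimensional. A short NumPy check (illustrative only) makes the defect visible:

```python
import numpy as np

# Shear matrix: characteristic polynomial (1 - lambda)^2, so the
# eigenvalue 1 is repeated, but only multiples of (1, 0) satisfy Av = v.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

w, V = np.linalg.eig(A)
print(w)  # both eigenvalues equal 1

# The two returned eigenvector columns are numerically parallel,
# so V is singular and A cannot be diagonalized.
print(abs(np.linalg.det(V)))  # effectively zero
```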
Geometrically, multiplying by a matrix represents a linear transformation: rotations, reflections, scalings, and shears. Eigenvectors are the special directions that remain unchanged (or simply reverse) under the transformation. The eigenvalue tells you the stretching factor along that direction: |λ| > 1 stretches, |λ| < 1 compresses, a negative λ flips the direction, and λ = 0 collapses it entirely.
Think of a matrix as a machine that distorts space. The eigenvectors point along the axes of that distortion, and the eigenvalues measure how much distortion happens along each axis.
Eigenvalues are the roots of the characteristic polynomial, obtained by solving:

det(A − λI) = 0
For an n×n matrix, this produces a degree-n polynomial in λ. For a 2×2 matrix [[a, b], [c, d]], the characteristic polynomial is:

λ² − (a + d)λ + (ad − bc) = 0
where (a + d) is the trace and (ad - bc) is the determinant. For larger matrices, directly solving the characteristic polynomial is numerically unstable, which is why iterative algorithms like QR iteration are used in practice.
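For the 2×2 case, the quadratic formula solves the characteristic polynomial directly. The sketch below (a hypothetical helper, not this calculator's internals) compares it against NumPy's general-purpose solver:

```python
import numpy as np

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] as the roots of
    lambda^2 - (a + d)*lambda + (ad - bc) = 0."""
    tr = a + d              # trace
    det = a * d - b * c     # determinant
    # scimath.sqrt returns a complex root when the discriminant is negative
    disc = np.lib.scimath.sqrt(tr * tr - 4.0 * det)
    return (tr + disc) / 2.0, (tr - disc) / 2.0

# Trace 7, determinant 10, so the roots are 5 and 2.
print(eig2x2(4.0, 1.0, 2.0, 3.0))
print(np.linalg.eigvals([[4.0, 1.0], [2.0, 3.0]]))
```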
The QR algorithm is the standard numerical method for computing eigenvalues. It works by repeatedly factoring a matrix into an orthogonal matrix Q and an upper triangular matrix R, then multiplying them in reverse order:

Aₖ = QₖRₖ,   Aₖ₊₁ = RₖQₖ
The QR algorithm preserves eigenvalues at each step because Aₖ₊₁ = RₖQₖ = QₖᵀAₖQₖ is a similarity transformation. In practice, the algorithm is accelerated using Wilkinson shifts and an initial Hessenberg reduction, bringing the complexity from O(n⁴) down to roughly O(n³) per run.
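The bare (unshifted) iteration takes only a few lines. This sketch omits the shifts and Hessenberg reduction that production code relies on:

```python
import numpy as np

def qr_algorithm(A, iters=200):
    """Unshifted QR iteration: factor A_k = Q_k R_k, set A_{k+1} = R_k Q_k.
    For matrices with real eigenvalues of distinct magnitudes, A_k tends
    to upper triangular form with the eigenvalues on the diagonal."""
    Ak = np.array(A, dtype=float)
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q  # similarity transform: same eigenvalues as A
    return np.diag(Ak)

A = [[4.0, 1.0], [2.0, 3.0]]
print(sorted(qr_algorithm(A)))        # approaches the true eigenvalues 2 and 5
print(sorted(np.linalg.eigvals(A)))
```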
Real symmetric matrices (where A = Aᵀ) enjoy especially nice eigendecomposition properties guaranteed by the Spectral Theorem: every eigenvalue is real, eigenvectors belonging to distinct eigenvalues are orthogonal, and A factors as QΛQᵀ with Q orthogonal and Λ real diagonal.
The symmetric eigendecomposition is the foundation of Principal Component Analysis (PCA), where the covariance matrix (always symmetric positive semi-definite) is decomposed to find the directions of greatest variance in a dataset.
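A minimal PCA sketch (synthetic data, NumPy only; real pipelines often use an SVD-based routine instead) shows the idea:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data correlated along the (1, 1) direction.
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 1.5],
                                          [1.5, 2.0]])

# Covariance matrices are symmetric, so eigh applies: it guarantees
# real eigenvalues and orthonormal eigenvectors, in ascending order.
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# The eigenvector with the largest eigenvalue (last column) is the
# first principal component: the direction of greatest variance.
pc1 = eigvecs[:, -1]
print(pc1)  # roughly aligned with (1, 1), up to sign
```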
Eigenvalues and eigenvectors appear across virtually every area of science and engineering.
Computing eigenvalues and eigenvectors is significantly more expensive than many other matrix operations: for a dense n×n matrix, standard algorithms take O(n³) time.
For the matrices supported by this calculator (up to 5×5), computation is instantaneous. Industrial applications use optimized LAPACK routines for matrices with thousands or millions of rows.
Eigendecomposition is deeply connected to the other matrix decompositions available on this site: the QR factorization is the engine of the QR eigenvalue algorithm, and the singular values of A are the square roots of the eigenvalues of AᵀA.
Explore more matrix decomposition methods with dedicated calculators and step-by-step solutions.
LU Decomposition: Factor a matrix into lower (L) and upper (U) triangular matrices. Essential for solving systems of linear equations efficiently.
Open calculator →

QR Decomposition: Decompose into an orthogonal matrix Q and upper triangular R. Used in least squares regression and eigenvalue algorithms.
Open calculator →

SVD (Singular Value Decomposition): The most general matrix decomposition. Factorize any matrix into UΣVᵀ. Powers recommendation systems and data compression.
Open calculator →

Cholesky Decomposition: Efficient factorization for symmetric positive definite matrices into LLᵀ. Widely used in Monte Carlo simulations and optimization.
Open calculator →