Factor any square matrix into the product A = QTQᵀ where Q is an orthogonal matrix and T is upper triangular, revealing the eigenvalues along the diagonal. Step-by-step explanations included.
Schur decomposition is a fundamental matrix factorization that expresses any square matrix A as the product A = QTQᵀ (or A = QTQ* in the complex case), where Q is a unitary (orthogonal, in the real case) matrix and T is an upper triangular matrix. Named after the German mathematician Issai Schur, who proved its existence in 1909, the Schur decomposition reveals the eigenvalues of A along the diagonal of T while maintaining excellent numerical stability through orthogonal transformations.
Unlike eigendecomposition, which requires a matrix to have a full set of linearly independent eigenvectors, the Schur decomposition exists for every square matrix without exception. This universality makes it one of the most reliable tools in numerical linear algebra.
In this factorization, Q is an orthogonal matrix (QᵀQ = I), meaning its columns form an orthonormal basis, and T is an upper triangular matrix whose diagonal entries are the eigenvalues of A (for matrices with real eigenvalues) or whose diagonal contains 1×1 and 2×2 blocks encoding real and complex conjugate eigenvalue pairs (in the real Schur form).
The Schur decomposition occupies a central position in computational linear algebra because it provides a numerically stable way to compute eigenvalues. Direct computation of eigenvalues by finding roots of the characteristic polynomial is notoriously ill-conditioned: small perturbations in the matrix entries can lead to large errors in the computed eigenvalues when working through the polynomial. The Schur decomposition avoids the characteristic polynomial entirely, instead relying on orthogonal similarity transformations that preserve the 2-norm of the matrix and do not amplify rounding errors.
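A small numerical illustration of this point, assuming NumPy is available: recovering eigenvalues through the characteristic polynomial (here via `np.poly` and `np.roots`) amplifies rounding error badly, while the orthogonal-transformation route used by `np.linalg.eigvals` does not. The diagonal test matrix is chosen so the exact eigenvalues are known.

```python
import numpy as np

A = np.diag(np.arange(1.0, 21.0))        # exact eigenvalues 1, 2, ..., 20

# Route 1: characteristic polynomial coefficients, then polynomial roots
coeffs = np.poly(A)
via_poly = np.sort(np.roots(coeffs).real)

# Route 2: LAPACK-based eigenvalue solver (orthogonal similarity transforms)
via_eig = np.sort(np.linalg.eigvals(A).real)

exact = np.arange(1.0, 21.0)
err_poly = np.max(np.abs(via_poly - exact))
err_eig = np.max(np.abs(via_eig - exact))
assert err_eig < err_poly   # the polynomial route amplifies rounding error
```

This is essentially Wilkinson's classic example: the coefficients of the degree-20 characteristic polynomial are so sensitive that its computed roots drift visibly from the integers 1 through 20, while the eigenvalue solver recovers them to machine precision.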
Every major numerical linear algebra library, including LAPACK, MATLAB, NumPy, and Julia's LinearAlgebra module, computes eigenvalues by first reducing the matrix to Schur form. The eigenvalues then simply appear on the diagonal of the triangular factor T. This approach is both more stable and more efficient than alternative methods for general matrices.
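As a sketch of how this looks in practice (assuming SciPy is installed), `scipy.linalg.schur` with `output='complex'` returns the factors Q and T, and the eigenvalues can be read directly from the diagonal of T:

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[4.0, 1.0, 2.0],
              [0.5, 3.0, 1.0],
              [1.0, 0.0, 2.0]])

T, Q = schur(A, output='complex')          # A = Q T Q*
assert np.allclose(Q @ T @ Q.conj().T, A)  # factorization reproduces A

# The diagonal of T matches the eigenvalues of A (up to ordering)
print(np.sort_complex(np.diag(T)))
print(np.sort_complex(np.linalg.eigvals(A)))
```

Note that SciPy returns the factors in the order (T, Q), so the reconstruction is `Q @ T @ Q.conj().T`.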
The existence of the Schur decomposition is guaranteed by a fundamental theorem in matrix analysis:
Theorem (Schur, 1909): For every square matrix A in ℂⁿˣⁿ, there exists a unitary matrix Q and an upper triangular matrix T such that A = QTQ*. The diagonal entries of T are the eigenvalues of A (in any prescribed order).
The proof proceeds by induction on the matrix dimension. For a 1×1 matrix, the result is trivial. For an n×n matrix, one takes any eigenvalue λ with corresponding unit eigenvector q₁, extends q₁ to an orthonormal basis, and forms the unitary matrix U = [q₁ | U₂]. Then U*AU has λ in the (1,1) position and zeros below it in the first column. The remaining (n−1)×(n−1) submatrix can be decomposed by the inductive hypothesis, completing the construction.
This theorem is remarkable because it imposes no conditions on A whatsoever: the matrix need not be symmetric, normal, diagonalizable, or even invertible. Every square matrix has a Schur decomposition.
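A quick numerical check of this universality (NumPy/SciPy assumed): a Jordan block has a repeated eigenvalue and no full basis of eigenvectors, so it has no eigendecomposition, yet its Schur decomposition exists and exposes the eigenvalue on the diagonal.

```python
import numpy as np
from scipy.linalg import schur

# 3x3 Jordan block: single eigenvalue 2, defective (not diagonalizable)
J = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])

T, Q = schur(J)                              # real Schur form J = Q T Q^T
assert np.allclose(Q @ T @ Q.T, J)           # factorization reproduces J
assert np.allclose(Q.T @ Q, np.eye(3))       # Q is orthogonal
assert np.allclose(np.diag(T), [2, 2, 2])    # eigenvalue appears on the diagonal
```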
When working with real matrices (entries in R rather than C), the eigenvalues may include complex conjugate pairs. Since the Schur decomposition aims to produce an upper triangular matrix with eigenvalues on the diagonal, the complex Schur form requires complex arithmetic even when the input matrix is real.
The real Schur form addresses this by allowing T to be quasi-upper triangular rather than strictly upper triangular. In the real Schur decomposition A = QTQᵀ, the matrix Q is real orthogonal and T is block upper triangular with 1×1 blocks for real eigenvalues and 2×2 blocks for complex conjugate pairs. In the standardized form, each 2×2 diagonal block has the form

[ a   b ]
[ c   a ]

with bc < 0, and its eigenvalues are the complex conjugate pair a ± i√(−bc).
The real Schur form is preferred in practice because it avoids complex arithmetic entirely while still revealing all the eigenvalue information. The 2×2 diagonal blocks encode complex conjugate eigenvalue pairs, and all eigenvalues can be extracted from T without ever leaving real arithmetic.
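A sketch of this extraction (SciPy assumed), using a 90-degree rotation matrix whose eigenvalues are the conjugate pair ±i. The real Schur form keeps a single 2×2 block, and the pair can be recovered from its entries without complex arithmetic until the very last step:

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])          # 90-degree rotation, eigenvalues ±i

T, Q = schur(A, output='real')       # Q real orthogonal, T quasi-triangular
assert np.allclose(Q @ T @ Q.T, A)

# Standardized 2x2 block [[a, b], [c, a]] with b*c < 0 encodes a ± i*sqrt(-b*c)
a, b, c = T[0, 0], T[0, 1], T[1, 0]
pair = np.array([a + 1j * np.sqrt(-b * c), a - 1j * np.sqrt(-b * c)])
print(np.sort_complex(pair))         # conjugate pair extracted from T
```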
The relationship between the Schur decomposition and eigenvalues is direct and elegant. Since A = QTQᵀ is a similarity transformation (A and T are similar matrices), they share the same eigenvalues. For the triangular matrix T, the eigenvalues are simply the diagonal entries. This gives us an immediate and computationally stable way to read off eigenvalues.
For normal matrices (matrices satisfying AAᵀ = AᵀA, which includes symmetric, orthogonal, and skew-symmetric matrices), the Schur form T is actually diagonal. This means the Schur decomposition of a normal matrix coincides with its eigendecomposition, and the columns of Q are the eigenvectors. For non-normal matrices, the off-diagonal entries of T encode information about the defectiveness of eigenvalues and the sensitivity of the eigenvalue problem.
The departure from normality, often measured by the Frobenius norm of the strictly upper triangular part of T, quantifies how far a matrix is from being diagonalizable by a unitary transformation. This measure is important in perturbation theory and the analysis of non-normal operators.
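A minimal sketch of this measure (NumPy/SciPy assumed; the helper name `departure_from_normality` is ours, not a library function): for a normal matrix the strictly upper triangular part of T vanishes, while for a non-normal matrix it does not.

```python
import numpy as np
from scipy.linalg import schur

def departure_from_normality(A):
    """Frobenius norm of the strictly upper triangular part of the Schur factor T."""
    T, _ = schur(A, output='complex')
    return np.linalg.norm(np.triu(T, k=1))   # matrix norm defaults to Frobenius

S = np.array([[2.0, 1.0],
              [1.0, 3.0]])    # symmetric, hence normal: T is diagonal
N = np.array([[1.0, 5.0],
              [0.0, 2.0]])    # non-normal: T has a nonzero off-diagonal entry

assert np.isclose(departure_from_normality(S), 0.0)
assert departure_from_normality(N) > 1.0
```

For the matrix N the measure is in fact an invariant: it equals √(‖N‖_F² − Σ|λᵢ|²), independent of the particular Schur form computed.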
The standard computational method for finding the Schur decomposition is the QR algorithm, one of the most important algorithms in numerical analysis. The basic QR algorithm works as follows:

1. Set A₀ = A.
2. Compute the QR factorization Aₖ = QₖRₖ.
3. Form Aₖ₊₁ = RₖQₖ, which is an orthogonal similarity transformation of Aₖ.
4. Repeat; under mild conditions, Aₖ converges to an upper (quasi-)triangular matrix T.
The accumulated product Q = Q₀Q₁Q₂⋯ gives the orthogonal factor in the Schur decomposition. In practice, the algorithm is enhanced with several critical optimizations: an initial reduction to upper Hessenberg form, shift strategies that accelerate convergence, deflation of converged eigenvalues, and (for real matrices) the implicit double-shift technique that keeps all arithmetic real.
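The iteration above can be sketched in a few lines of NumPy. This is the unshifted textbook version for illustration only; without Hessenberg reduction, shifts, and deflation it converges slowly and is far from production quality:

```python
import numpy as np

def qr_algorithm(A, iters=200):
    """Unshifted QR iteration: returns (Q_total, T) with A ≈ Q_total @ T @ Q_total.T."""
    Ak = A.astype(float).copy()
    Q_total = np.eye(A.shape[0])
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)    # factor A_k = Q_k R_k
        Ak = R @ Q                 # form A_{k+1} = R_k Q_k, similar to A_k
        Q_total = Q_total @ Q      # accumulate Q = Q_0 Q_1 Q_2 ...
    return Q_total, Ak

# Symmetric test matrix with well-separated eigenvalues, so convergence is clean
A = np.array([[5.0, 2.0, 1.0],
              [2.0, 4.0, 0.0],
              [1.0, 0.0, 3.0]])
Q, T = qr_algorithm(A)
# T is (numerically) upper triangular with the eigenvalues on its diagonal
```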
The Schur decomposition appears throughout scientific computing and engineering, from control theory (solving Sylvester and Lyapunov matrix equations) and the stability analysis of dynamical systems to the algorithms behind matrix-function routines such as the expm, logm, and sqrtm functions.

The full Schur decomposition of an n×n matrix requires the following computational effort:

- Reduction to upper Hessenberg form costs roughly 10n³/3 floating-point operations.
- Accumulating the orthogonal transformations adds roughly 4n³/3 to the cost.
- The QR iterations on the Hessenberg matrix account for the remainder.

The overall cost is approximately 25n³ floating-point operations when both Q and T are computed, or about 10n³ when only eigenvalues are needed. This makes the Schur decomposition more expensive than LU factorization (2n³/3 operations) but comparable to the full SVD.
The Schur decomposition and eigendecomposition are closely related but differ in important ways: the eigendecomposition A = VΛV⁻¹ exists only when A has a full set of linearly independent eigenvectors, and its factor V is generally not orthogonal, whereas the Schur decomposition exists for every square matrix and uses a perfectly conditioned orthogonal (unitary) factor Q. For normal matrices the two decompositions coincide.
Factor into lower and upper triangular matrices with partial pivoting. Essential for solving linear systems.
Open calculator →
Decompose into orthogonal Q and upper triangular R. Ideal for least squares problems.
Open calculator →
The most general decomposition. Factor any matrix into UΣVᵀ.
Open calculator →
Efficient factorization for symmetric positive definite matrices.
Open calculator →
Find eigenvalues and eigenvectors of square matrices.
Open calculator →