/ˈaɪˌɡənˌvɛk.tər/
noun … “the direction that refuses to bend under transformation.”
An eigenvector is a non-zero vector that, when a linear transformation represented by a matrix is applied, changes only in scale (by its corresponding eigenvalue) but not in direction. In other words, if A is a square matrix representing a linear transformation and v is an eigenvector, then A·v = λv, where λ is the associated eigenvalue. Eigenvectors reveal intrinsic directions along which a system stretches, compresses, or flips a vector without altering its line of action.
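The defining relation A·v = λv can be checked directly. A minimal sketch in NumPy, using an arbitrary diagonal matrix chosen for illustration:

```python
import numpy as np

# A simple 2x2 matrix chosen for illustration.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# v = [1, 0] is an eigenvector of A with eigenvalue lambda = 2:
v = np.array([1.0, 0.0])

# Applying A scales v by 2 without changing its direction.
print(A @ v)            # [2. 0.]
print(np.allclose(A @ v, 2 * v))   # True
```

A vector such as [1, 1] is not an eigenvector of this matrix: A maps it to [2, 3], which points in a different direction.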
In practice, eigenvectors are central to numerous areas of mathematics, physics, and machine learning. In Principal Component Analysis, eigenvectors of the covariance matrix indicate the directions of maximal variance, providing a basis for dimensionality reduction. In dynamics and control systems, they reveal modes of motion and stability. In quantum mechanics, eigenvectors of operators describe the fundamental states of a system. In each case, the corresponding eigenvalues quantify the magnitude of these effects.
Computing eigenvectors involves solving the characteristic equation det(A − λI) = 0 to find the eigenvalues, then finding the vectors v satisfying (A − λI)v = 0. For real symmetric matrices (including covariance and other positive-definite matrices), eigenvectors corresponding to distinct eigenvalues are orthogonal, forming a natural coordinate system that simplifies many computations, such as diagonalization, spectral decomposition, and solving systems of differential equations.
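In practice the characteristic equation is rarely solved by hand; numerical routines such as NumPy's `np.linalg.eig` return eigenvalues and eigenvectors together. A sketch using a small symmetric matrix (chosen as an example) that also verifies the orthogonality noted above:

```python
import numpy as np

# A real symmetric matrix, so its eigenvectors are orthogonal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` is a unit eigenvector of A:
# it satisfies (A - lambda*I) v = 0, i.e. A v = lambda v.
for i in range(len(eigenvalues)):
    lam, v = eigenvalues[i], eigenvectors[:, i]
    assert np.allclose(A @ v, lam * v)

# Orthogonality check for the symmetric case: the dot product is zero.
print(eigenvectors[:, 0] @ eigenvectors[:, 1])   # ~0.0
```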
Eigenvectors intersect with related concepts such as Eigenvalue, Linear Algebra, Covariance Matrix, Principal Component Analysis, and Singular Value Decomposition. They serve as the backbone for algorithms in data science, signal processing, computer graphics, and machine learning, providing the axes along which data or transformations behave in the simplest, most interpretable way.
Example conceptual workflow for using eigenvectors in data analysis:
compute covariance matrix of dataset
solve characteristic equation to find eigenvalues
for each eigenvalue, find corresponding eigenvector
sort eigenvectors by decreasing eigenvalue magnitude
project original data onto top eigenvectors for dimensionality reduction

Intuitively, an eigenvector is like a resilient rod embedded in a flexible sheet: when the sheet is stretched, bent, or twisted, the rod maintains its orientation while only lengthening or shortening. It defines the natural directions along which the system acts, revealing the geometry hidden beneath complex transformations.
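The conceptual workflow above can be sketched end to end in NumPy. The dataset here is synthetic and purely illustrative; `np.linalg.eigh`, specialized for symmetric matrices such as covariance matrices, takes the place of solving the characteristic equation by hand:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic dataset: 200 samples, 3 features (illustrative only);
# feature 2 is made to correlate strongly with feature 0.
X = rng.normal(size=(200, 3))
X[:, 2] = X[:, 0] + 0.1 * rng.normal(size=200)

# 1. Compute the covariance matrix of the centered data.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)

# 2-3. eigh returns the eigenvalues (ascending) and orthonormal
#      eigenvectors of the symmetric covariance matrix.
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# 4. Sort eigenvectors by decreasing eigenvalue magnitude.
order = np.argsort(eigenvalues)[::-1]
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

# 5. Project the data onto the top-2 eigenvectors (principal components).
X_reduced = Xc @ eigenvectors[:, :2]
print(X_reduced.shape)   # (200, 2)
```

This is the core of Principal Component Analysis: the top eigenvectors are the axes of maximal variance, and projecting onto them discards the least informative directions.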