Eigenvalues & Eigenvectors Made Practical: From Matrices to Phase Portraits
Eigenvalues and eigenvectors reveal the natural directions of a linear transformation. They power applications from vibration analysis to population models and are the engine behind solving linear systems of differential equations \(\dot{\mathbf{x}} = M\mathbf{x}\). This article blends a freshman-friendly explanation with IB Math AIHL depth: steady states, coupled systems, phase portraits, stability, and complex-eigenvalue behavior.
What are eigenvalues and eigenvectors?
For a square matrix \(A\), a nonzero vector \(\mathbf{x}\) is an eigenvector with eigenvalue \(\lambda\) if
\(A\mathbf{x}=\lambda \mathbf{x}\)
Rearranging gives \((A-\lambda I)\mathbf{x}=0\). A nontrivial solution exists only when the characteristic equation holds:
\(\det(A-\lambda I)=0\)
The roots of this polynomial are the eigenvalues. For an \(n\times n\) matrix there are \(n\) roots (counted with multiplicity), possibly complex. If \(A\) is real, complex roots occur in conjugate pairs \(a\pm bi\).
Two core facts you’ll use constantly
- Trace–sum: \(\displaystyle\sum_{i=1}^{n}\lambda_i=\text{tr}(A)\) (sum of diagonal entries).
- Determinant–product: \(\lambda_1\lambda_2\cdots\lambda_n=\det(A)\).
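Both checks are easy to verify numerically. Below is a minimal standard-library sketch (the helper name `eig2` is ours, not a library function) that finds the eigenvalues of a \(2\times2\) matrix from its characteristic polynomial \(\lambda^2-\text{tr}(A)\lambda+\det(A)=0\):

```python
import cmath

def eig2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the characteristic
    polynomial lambda^2 - tr*lambda + det = 0 (quadratic formula)."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)   # complex-safe square root
    return (tr + disc) / 2, (tr - disc) / 2

# Example: M = [[3, -4], [1, -2]] has trace 1 and determinant -2,
# so the eigenvalues are the roots of lambda^2 - lambda - 2 = 0.
l1, l2 = eig2(3, -4, 1, -2)
print(l1, l2)            # the two roots, 2 and -1
print(l1 + l2, l1 * l2)  # trace-sum and determinant-product checks
```

Using `cmath` rather than `math` means the same function also returns complex eigenvalue pairs \(a\pm bi\) without modification.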
Why they matter
- Vibrations & stability: In \(M\ddot{\mathbf{Y}}=K\mathbf{Y}\), the eigenvalues of \(M^{-1}K\) set the natural frequencies that control how structures vibrate.
- Long-run behavior: Markov-like population models stabilize along the eigenvector with eigenvalue 1; the other modes decay because their \(|\lambda|<1\).
- Data science: PCA finds eigenvectors of a covariance matrix to identify directions of maximum variance.
From Linear Algebra to Differential Equations: \(\dot{\mathbf{x}}=M\mathbf{x}\)
General solution when \(M\) has distinct real eigenvalues
Let \(M\) be \(2\times2\) with distinct real eigenvalues \(\lambda_1,\lambda_2\) and corresponding eigenvectors \(\mathbf{p}_1,\mathbf{p}_2\). Then every solution of the coupled system
\(\dot{\mathbf{x}}=M\mathbf{x}\)
has the form
\(\mathbf{x}(t)=A e^{\lambda_1 t}\mathbf{p}_1+B e^{\lambda_2 t}\mathbf{p}_2\)
where \(A,B\in\mathbb{R}\) are determined by the initial condition \(\mathbf{x}(0)\).
Warm-up (uncoupled ➜ coupled)
- Uncoupled: \(dx/dt=2x\), \(x(0)=5\) gives \(x(t)=5e^{2t}\).
- Diagonal system: \(\dot{x}=2x,\ \dot{y}=3y\) with \((x(0),y(0))=(3,5)\) yields \((x(t),y(t))=(3e^{2t},5e^{3t})\) — each equation evolves independently, so the solution is already in the eigenform above.
Coupled system written as a matrix
Any linear system like
\(\begin{cases}\dot{x}=a_{11}x+a_{12}y\\[4pt] \dot{y}=a_{21}x+a_{22}y\end{cases}\)
is \(\dot{\mathbf{x}}=M\mathbf{x}\) with \(M=\begin{bmatrix}a_{11}&a_{12}\\ a_{21}&a_{22}\end{bmatrix}\) and \(\mathbf{x}=\begin{bmatrix}x\\y\end{bmatrix}\).
IB-style model: competing fungi
Suppose areas (cm\(^2\)) covered by two fungi \(X\) and \(Y\) evolve as
\(\dot{x}=0.4x-0.2y,\qquad \dot{y}=0.1x+0.1y\)
Then \(M=\begin{bmatrix}0.4&-0.2\\ 0.1&0.1\end{bmatrix}\).
Write the system as \(\dot{\mathbf{x}}=M\mathbf{x}\), find \(\lambda_{1,2}\) from \(\det(M-\lambda I)=0\), then construct
\(\mathbf{x}(t)=A e^{\lambda_1 t}\mathbf{p}_1+B e^{\lambda_2 t}\mathbf{p}_2\)
and use \(\mathbf{x}(0)=(10,6)^T\) to solve for \(A,B\). Long-term ratios \(\displaystyle\frac{x(t)}{y(t)}\) are governed by the eigenvector associated with the eigenvalue of largest real part.
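As a numerical check on this recipe (revealing the eigenvalues and the long-run ratio, but not the full solution), here is a pure-Python sketch for this \(M\):

```python
import math

# Fungi model M = [[0.4, -0.2], [0.1, 0.1]]: eigenvalues are roots of
# lambda^2 - tr*lambda + det = 0.
a, b, c, d = 0.4, -0.2, 0.1, 0.1
tr, det = a + d, a * d - b * c                 # 0.5 and 0.06
disc = math.sqrt(tr * tr - 4 * det)            # sqrt(0.01) = 0.1
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2  # 0.3 and 0.2

# Eigenvector for lam1 from the first row: (a - lam1)*x + b*y = 0,
# i.e. 0.1x - 0.2y = 0, so x : y = 2 : 1 -- the long-run ratio x/y.
ratio = -b / (a - lam1)
print(lam1, lam2, ratio)
```

Since both eigenvalues are real, distinct, and positive, both fungi grow without bound, with the ratio \(x/y\) settling toward the dominant eigenvector's \(2\!:\!1\).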
Worked example
Given
\(\dot{x}=3x-4y,\qquad \dot{y}=x-2y,\qquad (x(0),y(0))=(4,-2)\)
we have \(M=\begin{bmatrix}3&-4\\1&-2\end{bmatrix}\).
Solve
\(\det(M-\lambda I)
=\left|\begin{array}{cc}
3-\lambda & -4\\
1 & -2-\lambda
\end{array}\right|
=(3-\lambda)(-2-\lambda)+4
=\lambda^2-\lambda-2=0,\)
so \(\lambda_1 = 2,\ \lambda_2 = -1\).
Eigenvectors: for \(\lambda=2\), \((M-2I)\mathbf{p}_1=\mathbf{0}\) gives \(\mathbf{p}_1=\begin{bmatrix}4\\1\end{bmatrix}\); for \(\lambda=-1\), \(\mathbf{p}_2=\begin{bmatrix}1\\1\end{bmatrix}\). Thus
\(\begin{bmatrix}x\\y\end{bmatrix}=A e^{2t}\begin{bmatrix}4\\1\end{bmatrix}+B e^{-t}\begin{bmatrix}1\\1\end{bmatrix}\)
Apply \((x(0),y(0))=(4,-2)\) to get \(4=4A+B\) and \(-2=A+B\), hence \(A=2,\ B=-4\). Therefore
\(\begin{bmatrix}x\\y\end{bmatrix}=2e^{2t}\begin{bmatrix}4\\1\end{bmatrix}-4e^{-t}\begin{bmatrix}1\\1\end{bmatrix}\)
Long-term ratio. As \(t\to\infty\), the \(e^{-t}\) term vanishes; solutions align with \(\mathbf{p}_1\), so \(\displaystyle\frac{x}{y}\to \frac{4}{1}=4\!:\!1.\)
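The closed-form answer can be sanity-checked by confirming the initial condition and verifying that its derivative equals \(M\mathbf{x}\) at a sample time — a quick sketch:

```python
import math

def x_sol(t):
    """Closed-form solution x(t) = 2 e^{2t} (4,1) - 4 e^{-t} (1,1)."""
    return (8 * math.exp(2 * t) - 4 * math.exp(-t),
            2 * math.exp(2 * t) - 4 * math.exp(-t))

def x_dot(t):
    """Term-by-term derivative of x_sol."""
    return (16 * math.exp(2 * t) + 4 * math.exp(-t),
            4 * math.exp(2 * t) + 4 * math.exp(-t))

# Check the initial condition and that x' = Mx with M = [[3, -4], [1, -2]].
t = 0.7                       # any sample time works
x, y = x_sol(t)
mx = (3 * x - 4 * y, x - 2 * y)
print(x_sol(0))               # the initial condition (4, -2)
print(x_dot(t), mx)           # these two pairs should agree
```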
Phase Portraits & Stability
Equilibrium & classification
For \(\dot{\mathbf{x}}=M\mathbf{x}\) the origin \((0,0)\) is always an equilibrium. Its stability depends on eigenvalues:
- Both real and negative: stable node (sink). Trajectories move toward the origin as \(t\to\infty\).
- Both real and positive: unstable node (source). Trajectories move away from the origin.
- Opposite signs: saddle point. Flow in along the eigenvector with negative eigenvalue; flow out along the positive one.
- Complex \(a\pm bi\) with \(a<0\): stable spiral (inward spirals).
- Complex \(a\pm bi\) with \(a>0\): unstable spiral (outward spirals).
- Purely imaginary \(\pm bi\): center. Closed orbits (ellipses/circles), neutrally stable.
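Because \(\text{tr}(M)=\lambda_1+\lambda_2\) and \(\det(M)=\lambda_1\lambda_2\), the whole classification above can be read off the trace and determinant without solving for the eigenvalues. A small sketch (the function name `classify` is ours; degenerate cases with \(\det M=0\) or repeated eigenvalues are ignored):

```python
def classify(tr, det):
    """Classify the origin of x' = Mx from the trace and determinant of
    a real 2x2 matrix M. Eigenvalues solve l^2 - tr*l + det = 0.
    Assumes det != 0 and skips repeated-eigenvalue borderline cases."""
    disc = tr * tr - 4 * det
    if det < 0:
        return "saddle"                  # real eigenvalues, opposite signs
    if disc >= 0:                        # real eigenvalues, same sign as tr
        return "stable node" if tr < 0 else "unstable node"
    if tr == 0:
        return "center"                  # purely imaginary pair +/- bi
    return "stable spiral" if tr < 0 else "unstable spiral"

# M = [[3, -4], [1, -2]] from the worked example: tr = 1, det = -2.
print(classify(1, -2))
```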
Reading the portrait quickly
- Eigenvectors draw the special straight-line trajectories through the origin.
- The sign of \(\text{Re}(\lambda)\) decides in/out; the magnitude decides speed toward/away from the origin.
- To determine rotation direction for spirals, evaluate \(dy/dt\) on the \(x\)-axis or \(dx/dt\) on the \(y\)-axis at a test point.
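The last test is one line of arithmetic: pick the point \((1,0)\) on the positive \(x\)-axis and look at the sign of \(\dot{y}\) there. A sketch with a hypothetical matrix chosen for illustration:

```python
# Rotation-direction test for x' = Mx: evaluate dy/dt at (x, y) = (1, 0).
# A positive dy/dt means trajectories cross the positive x-axis moving
# upward, i.e. counterclockwise rotation.
def rotation(M):
    dydt = M[1][0] * 1 + M[1][1] * 0   # dy/dt at the test point (1, 0)
    return "counterclockwise" if dydt > 0 else "clockwise"

# Hypothetical example: M = [[1, -2], [2, 1]] gives dy/dt = 2 > 0 at (1, 0).
print(rotation([[1, -2], [2, 1]]))
```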
General solution with complex eigenvalues
If \(\lambda=a\pm bi\) and \(\mathbf{v}_{1,2}\) are (possibly complex) eigenvectors, then solutions can be written as
\(\mathbf{x}(t)=e^{at}\!\left(A e^{ibt}\mathbf{v}_1+B e^{-ibt}\mathbf{v}_2\right)\)
Using \(e^{ibt}=\cos(bt)+i\sin(bt)\) (with \(A,B\) chosen as a conjugate pair so that \(\mathbf{x}(t)\) is real) shows oscillation with period \(\displaystyle\frac{2\pi}{b}\) inside the exponential envelope \(e^{at}\).
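For a concrete illustration (the matrix below is a hypothetical example, not from the text), the quadratic-formula computation extends unchanged to the complex case:

```python
import cmath
import math

# Hypothetical M = [[1, -2], [2, 1]]: trace 2, determinant 5, so the
# eigenvalues solve lambda^2 - 2*lambda + 5 = 0, giving 1 +/- 2i.
tr, det = 2, 5
lam = (tr + cmath.sqrt(tr * tr - 4 * det)) / 2   # 1 + 2i (conjugate: 1 - 2i)
a, b = lam.real, lam.imag

period = 2 * math.pi / b    # oscillation period 2*pi/b = pi
print(lam, period)          # envelope e^{at} grows here since a = 1 > 0
```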
Example: Population Mixing & Steady State
Married–single model (Markov flavor)
Let
\(A=\begin{bmatrix}0.7&0.2\\0.3&0.8\end{bmatrix},\quad \mathbf{w}_0=\begin{bmatrix}8000\\2000\end{bmatrix}\)
Iterating \(\mathbf{w}_{n+1}=A\mathbf{w}_n\) drives the population toward \(\begin{bmatrix}4000\\6000\end{bmatrix}\) — by \(n=12\) it matches to the nearest person, and thereafter it barely changes. This limit is the steady-state vector: an eigenvector with eigenvalue 1, scaled to the total population of 10000. The other eigenvalue is \(0.5\), so any initial distribution converges; the remaining component decays like \(0.5^n\).
Trace, Determinant, and Similarity — quick checks
- Check your work: If you compute eigenvalues numerically, verify that their sum equals \(\text{tr}(A)\) and their product equals \(\det(A)\).
- Coordinate change: If \(B=S^{-1}AS\) (similar matrices), then \(A\) and \(B\) share the same eigenvalues. You’ve only changed basis.
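The similarity check is easy to demonstrate for \(2\times2\) matrices, where trace and determinant pin down the eigenvalues. A sketch using the worked-example matrix and an arbitrary invertible \(S\) of our choosing:

```python
# Similar matrices B = S^{-1} A S share trace and determinant (hence
# the same eigenvalues). A is the worked-example matrix; S is a sample
# invertible change-of-basis matrix chosen for illustration.
A = [[3.0, -4.0], [1.0, -2.0]]
S = [[1.0, 2.0], [0.0, 1.0]]

def mul(P, Q):
    """2x2 matrix product."""
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

detS = S[0][0] * S[1][1] - S[0][1] * S[1][0]
Sinv = [[ S[1][1] / detS, -S[0][1] / detS],
        [-S[1][0] / detS,  S[0][0] / detS]]

B = mul(mul(Sinv, A), S)
trB = B[0][0] + B[1][1]
detB = B[0][0] * B[1][1] - B[0][1] * B[1][0]
print(trB, detB)   # same trace 1 and determinant -2 as A
```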
Further Reading
- MIT OCW 18.06: Linear Algebra — authoritative lectures and notes.
- Khan Academy: Eigenvalues & Eigenvectors — visual introductions and exercises.
- Wolfram MathWorld: Eigenvalue — concise reference and properties.
- Wikipedia: Phase portrait — phase-plane behavior and classification.
- Paul’s Online Notes: Systems of DEs — worked examples for \(\dot{\mathbf{x}}=M\mathbf{x}\).
- 3Blue1Brown: Essence of Linear Algebra (Eigenvectors) — superb geometric intuition.



