We will now work through an example where the principal components cannot easily be determined by inspection.

Given 4 data points in 2 dimensions:

[mathjaxinline]\displaystyle \mathbf{x}^{(1)}[/mathjaxinline] [mathjaxinline]\displaystyle =[/mathjaxinline] [mathjaxinline]\displaystyle (0,2)[/mathjaxinline]
[mathjaxinline]\displaystyle \mathbf{x}^{(2)}[/mathjaxinline] [mathjaxinline]\displaystyle =[/mathjaxinline] [mathjaxinline]\displaystyle (0,-2)[/mathjaxinline]
[mathjaxinline]\displaystyle \mathbf{x}^{(3)}[/mathjaxinline] [mathjaxinline]\displaystyle =[/mathjaxinline] [mathjaxinline]\displaystyle (1,1)[/mathjaxinline]
[mathjaxinline]\displaystyle \mathbf{x}^{(4)}[/mathjaxinline] [mathjaxinline]\displaystyle =[/mathjaxinline] [mathjaxinline]\displaystyle (-1,-1)[/mathjaxinline]
By inspection, roughly estimate the direction in which the (empirical) variance is the largest. (There is no answer box for this question.)

Find the spectral decomposition of the empirical covariance matrix [mathjaxinline]\mathbf{S}[/mathjaxinline]. That is, find the eigenvalues and their corresponding eigenvectors.

Enter the eigenvalues in decreasing order (so [mathjaxinline]\lambda _1>\lambda _2[/mathjaxinline].)

[mathjaxinline]\lambda ^{(1)}=\quad [/mathjaxinline]

[mathjaxinline]\lambda ^{(2)}=\quad [/mathjaxinline]

Find the eigenvectors [mathjaxinline]\mathbf v_{\lambda _1}[/mathjaxinline] and [mathjaxinline]\mathbf v_{\lambda _2}[/mathjaxinline]. (All scalar multiples will be accepted)

[mathjaxinline]\mathbf v_{\lambda _1}=\quad[/mathjaxinline]

[mathjaxinline]\mathbf v_{\lambda _2}=\quad[/mathjaxinline]

Answer:

By inspection, the variance is largest in a direction close to the vertical ([mathjaxinline]x_2[/mathjaxinline]) axis, tilted slightly toward [mathjaxinline](1,1)[/mathjaxinline]: the points [mathjaxinline](0,\pm 2)[/mathjaxinline] spread the data mostly vertically, while [mathjaxinline](1,1)[/mathjaxinline] and [mathjaxinline](-1,-1)[/mathjaxinline] introduce a positive correlation between the two coordinates. To make this precise, we calculate the covariance matrix of the data points and find its eigenvalues and eigenvectors.

First, we calculate the mean of the data points:

[mathjaxinline]\bar{\mathbf{x}} = \frac{1}{4} \left( \mathbf{x}^{(1)} + \mathbf{x}^{(2)} + \mathbf{x}^{(3)} + \mathbf{x}^{(4)} \right) = \frac{1}{4} \left( (0,2) + (0,-2) + (1,1) + (-1,-1) \right) = \left( 0,0 \right)[/mathjaxinline]

Next, we calculate the empirical covariance matrix [mathjaxinline]\mathbf{S}[/mathjaxinline], using the [mathjaxinline]1/n[/mathjaxinline] convention. (With the unbiased [mathjaxinline]1/(n-1)[/mathjaxinline] convention every eigenvalue below is multiplied by [mathjaxinline]4/3[/mathjaxinline]; the eigenvectors are unchanged.)

[mathjaxinline]\mathbf{S} = \frac{1}{4} \sum_{i=1}^{4} (\mathbf{x}^{(i)} - \bar{\mathbf{x}}) (\mathbf{x}^{(i)} - \bar{\mathbf{x}})^T[/mathjaxinline]

Plugging in the values (since [mathjaxinline]\bar{\mathbf{x}} = (0,0)[/mathjaxinline], the centered points are the points themselves):

[mathjaxinline]\mathbf{S} = \frac{1}{4} \left( \begin{pmatrix} 0 \\ 2 \end{pmatrix} \begin{pmatrix} 0 & 2 \end{pmatrix} + \begin{pmatrix} 0 \\ -2 \end{pmatrix} \begin{pmatrix} 0 & -2 \end{pmatrix} + \begin{pmatrix} 1 \\ 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \end{pmatrix} + \begin{pmatrix} -1 \\ -1 \end{pmatrix} \begin{pmatrix} -1 & -1 \end{pmatrix} \right)[/mathjaxinline]

Simplifying each outer product:

[mathjaxinline]\mathbf{S} = \frac{1}{4} \left( \begin{pmatrix} 0 & 0 \\ 0 & 4 \end{pmatrix} + \begin{pmatrix} 0 & 0 \\ 0 & 4 \end{pmatrix} + \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} + \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} \right) = \frac{1}{4} \begin{pmatrix} 2 & 2 \\ 2 & 10 \end{pmatrix}[/mathjaxinline]

[mathjaxinline]\mathbf{S} = \begin{pmatrix} \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & \frac{5}{2} \end{pmatrix}[/mathjaxinline]
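As an optional sanity check (not part of the original exercise), here is a minimal NumPy sketch that reproduces this matrix, assuming the [mathjaxinline]1/n[/mathjaxinline] convention for [mathjaxinline]\mathbf{S}[/mathjaxinline]:

```python
import numpy as np

# Data points from the problem, one per row.
X = np.array([[ 0.0,  2.0],
              [ 0.0, -2.0],
              [ 1.0,  1.0],
              [-1.0, -1.0]])

xbar = X.mean(axis=0)        # sample mean, (0, 0) for this data set
Xc = X - xbar                # centered data
S = Xc.T @ Xc / len(X)       # empirical covariance matrix, 1/n convention
print(S)                     # [[0.5 0.5]
                             #  [0.5 2.5]]

# With the unbiased 1/(n-1) convention (np.cov(X, rowvar=False)),
# every eigenvalue is multiplied by 4/3; the eigenvectors are unchanged.
```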

Now, we find the eigenvalues [mathjaxinline]\lambda^{(1)}[/mathjaxinline] and [mathjaxinline]\lambda^{(2)}[/mathjaxinline] of [mathjaxinline]\mathbf{S}[/mathjaxinline]:

[mathjaxinline]\text{det}(\mathbf{S} - \lambda \mathbf{I}) = 0[/mathjaxinline]

[mathjaxinline]\begin{vmatrix} \frac{1}{2} - \lambda & \frac{1}{2} \\ \frac{1}{2} & \frac{5}{2} - \lambda \end{vmatrix} = 0[/mathjaxinline]

[mathjaxinline]\left(\frac{1}{2} - \lambda\right)\left(\frac{5}{2} - \lambda\right) - \left(\frac{1}{2}\right)\left(\frac{1}{2}\right) = 0[/mathjaxinline]

[mathjaxinline]\lambda^2 - 3\lambda + \frac{5}{4} - \frac{1}{4} = 0[/mathjaxinline]

[mathjaxinline]\lambda^2 - 3\lambda + 1 = 0[/mathjaxinline]

Solving this quadratic for [mathjaxinline]\lambda[/mathjaxinline]:

[mathjaxinline]\lambda = \frac{3 \pm \sqrt{9 - 4}}{2} = \frac{3 \pm \sqrt{5}}{2}[/mathjaxinline]

The eigenvalues are [mathjaxinline]\lambda^{(1)} = \frac{3+\sqrt{5}}{2} \approx 2.618[/mathjaxinline] and [mathjaxinline]\lambda^{(2)} = \frac{3-\sqrt{5}}{2} \approx 0.382[/mathjaxinline].
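These roots can be checked numerically; the short sketch below (again optional, not required by the problem) uses np.linalg.eigvalsh, which returns the eigenvalues of a symmetric matrix in ascending order:

```python
import numpy as np

S = np.array([[0.5, 0.5],
              [0.5, 2.5]])

# Eigenvalues of the symmetric matrix S, returned in ascending order.
eigvals = np.linalg.eigvalsh(S)
print(eigvals)                                      # approx [0.382 2.618]

# Compare with the closed-form roots of lambda^2 - 3*lambda + 1 = 0.
print((3 - np.sqrt(5)) / 2, (3 + np.sqrt(5)) / 2)   # 0.3819..., 2.6180...
```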

To find the corresponding eigenvectors, we solve the equation [mathjaxinline](\mathbf{S} - \lambda \mathbf{I})\mathbf{v} = \mathbf{0}[/mathjaxinline].

For [mathjaxinline]\lambda^{(1)}[/mathjaxinline]:

[mathjaxinline]\begin{pmatrix} \frac{1}{2} - \frac{3+\sqrt{5}}{2} & \frac{1}{2} \\ \frac{1}{2} & \frac{5}{2} - \frac{3+\sqrt{5}}{2} \end{pmatrix} \begin{pmatrix} v_{1} \\ v_{2} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}[/mathjaxinline]

[mathjaxinline]\begin{pmatrix} -\frac{2+\sqrt{5}}{2} & \frac{1}{2} \\ \frac{1}{2} & \frac{2-\sqrt{5}}{2} \end{pmatrix} \begin{pmatrix} v_{1} \\ v_{2} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}[/mathjaxinline]

The first row gives:

[mathjaxinline]-\frac{2+\sqrt{5}}{2} v_{1} + \frac{1}{2} v_{2} = 0 \quad \Longrightarrow \quad v_{2} = (2+\sqrt{5})\, v_{1}[/mathjaxinline]

(the second row gives the same condition), so an eigenvector is [mathjaxinline]\mathbf{v}_{\lambda^{(1)}} = \begin{pmatrix} 1 \\ 2+\sqrt{5} \end{pmatrix}[/mathjaxinline].

For [mathjaxinline]\lambda^{(2)}[/mathjaxinline]:

[mathjaxinline]\begin{pmatrix} \frac{1}{2} - \frac{3-\sqrt{5}}{2} & \frac{1}{2} \\ \frac{1}{2} & \frac{5}{2} - \frac{3-\sqrt{5}}{2} \end{pmatrix} \begin{pmatrix} v_{1} \\ v_{2} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}[/mathjaxinline]

[mathjaxinline]\begin{pmatrix} \frac{\sqrt{5}-2}{2} & \frac{1}{2} \\ \frac{1}{2} & \frac{2+\sqrt{5}}{2} \end{pmatrix} \begin{pmatrix} v_{1} \\ v_{2} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}[/mathjaxinline]

The first row gives:

[mathjaxinline]\frac{\sqrt{5}-2}{2} v_{1} + \frac{1}{2} v_{2} = 0 \quad \Longrightarrow \quad v_{2} = (2-\sqrt{5})\, v_{1}[/mathjaxinline]

so an eigenvector is [mathjaxinline]\mathbf{v}_{\lambda^{(2)}} = \begin{pmatrix} 1 \\ 2-\sqrt{5} \end{pmatrix}[/mathjaxinline]. As a check, [mathjaxinline]\mathbf{v}_{\lambda^{(1)}} \cdot \mathbf{v}_{\lambda^{(2)}} = 1 + (2+\sqrt{5})(2-\sqrt{5}) = 1 + (4-5) = 0[/mathjaxinline], so the two eigenvectors are orthogonal, as they must be for a symmetric matrix.

Therefore, with the normalized eigenvectors [mathjaxinline]\mathbf{u}_{i} = \mathbf{v}_{\lambda^{(i)}} / \|\mathbf{v}_{\lambda^{(i)}}\|[/mathjaxinline], the spectral decomposition of [mathjaxinline]\mathbf{S}[/mathjaxinline] is:

[mathjaxinline]\mathbf{S} = \lambda^{(1)} \mathbf{u}_{1} \mathbf{u}_{1}^T + \lambda^{(2)} \mathbf{u}_{2} \mathbf{u}_{2}^T = \frac{3+\sqrt{5}}{2}\, \mathbf{u}_{1} \mathbf{u}_{1}^T + \frac{3-\sqrt{5}}{2}\, \mathbf{u}_{2} \mathbf{u}_{2}^T[/mathjaxinline]

The eigenvalues in decreasing order are [mathjaxinline]\lambda^{(1)} = \frac{3+\sqrt{5}}{2} \approx 2.618[/mathjaxinline] and [mathjaxinline]\lambda^{(2)} = \frac{3-\sqrt{5}}{2} \approx 0.382[/mathjaxinline].

The eigenvectors (up to scalar multiples) are [mathjaxinline]\mathbf{v}_{\lambda^{(1)}} = \begin{pmatrix} 1 \\ 2+\sqrt{5} \end{pmatrix}[/mathjaxinline] and [mathjaxinline]\mathbf{v}_{\lambda^{(2)}} = \begin{pmatrix} 1 \\ 2-\sqrt{5} \end{pmatrix}[/mathjaxinline]. This agrees with the inspection estimate: the first principal direction is close to the vertical axis, tilted toward [mathjaxinline](1,1)[/mathjaxinline].
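As a final optional check, the sketch below verifies that these vectors satisfy [mathjaxinline]\mathbf{S}\mathbf{v} = \lambda \mathbf{v}[/mathjaxinline] and that the normalized eigenvectors reconstruct [mathjaxinline]\mathbf{S}[/mathjaxinline] through the spectral decomposition:

```python
import numpy as np

S = np.array([[0.5, 0.5],
              [0.5, 2.5]])

lam1 = (3 + np.sqrt(5)) / 2
lam2 = (3 - np.sqrt(5)) / 2
v1 = np.array([1.0, 2.0 + np.sqrt(5)])   # eigenvector for lambda_1 (any scalar multiple works)
v2 = np.array([1.0, 2.0 - np.sqrt(5)])   # eigenvector for lambda_2

# Check the eigenvector equations S v = lambda v.
print(np.allclose(S @ v1, lam1 * v1))    # True
print(np.allclose(S @ v2, lam2 * v2))    # True

# Normalize and check the reconstruction S = lam1 u1 u1^T + lam2 u2 u2^T.
u1 = v1 / np.linalg.norm(v1)
u2 = v2 / np.linalg.norm(v2)
print(np.allclose(lam1 * np.outer(u1, u1) + lam2 * np.outer(u2, u2), S))  # True
```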