🔬 Tutorial problems epsilon
Note
These problems are designed to help you practice the concepts covered in the lectures. Not all problems may be covered in the tutorial; those left out are for additional practice on your own.
ε.1
What is the rank of the
What about the upper-triangular matrix whose diagonal elements are 1?
Check all relevant definitions
By definition,
Draft of the proof for the second question:
For the upper triangular matrix start by showing that the columns are linearly independent, and because there are
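As a quick numerical sanity check of the second claim, here is a small sketch (the 4×4 matrix below is a hypothetical example, not the matrix from the problem statement): an upper-triangular matrix with ones on the diagonal has nonzero determinant, so its columns are linearly independent and it has full rank.

```python
import numpy as np

# Hypothetical 4x4 upper-triangular matrix with ones on the diagonal
# (the specific matrix from the problem statement is not reproduced here).
U = np.array([
    [1.0, 2.0, 3.0, 4.0],
    [0.0, 1.0, 5.0, 6.0],
    [0.0, 0.0, 1.0, 7.0],
    [0.0, 0.0, 0.0, 1.0],
])

# The determinant of a triangular matrix is the product of its diagonal
# entries, so det(U) = 1 != 0 and the columns are linearly independent.
print(np.linalg.det(U))          # 1.0 (up to floating-point error)
print(np.linalg.matrix_rank(U))  # 4, i.e. full rank
```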
ε.2
Show that if
Check all relevant definitions
Let
In the proof we will exploit the fact that
So pick any vectors
Using these definitions, linearity of
This chain of equalities confirms
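Since the precise statement is not reproduced here, the sketch below only spot-checks numerically the defining linearity identity that the chain of equalities relies on, using an arbitrary matrix map as a stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical matrix map T(x) = A @ x, used only to spot-check the
# defining identity T(a*x + b*y) = a*T(x) + b*T(y) numerically.
A = rng.normal(size=(3, 3))
T = lambda x: A @ x

x, y = rng.normal(size=3), rng.normal(size=3)
a, b = 2.0, -0.5

lhs = T(a * x + b * y)
rhs = a * T(x) + b * T(y)
print(np.allclose(lhs, rhs))  # True: the identity holds for this sample
```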
ε.3
Choose an orthonormal basis in
To come up with an orthonormal basis in
Denote
- the 45 degree line between the first two coordinate axes,
- the remaining (third) axis itself, since it is normal to the whole plane spanned by the first two axes and thus orthogonal to the first chosen line,
- lastly, the 45 degree line between the negative side of the first axis and the positive side of the second axis.
In other words, choose the following vectors to form the basis:
It is clear that
Computing the norm of each of the vectors and dividing by it, we get the following orthonormal basis:
The transformation matrix is formed by placing the vectors of the new basis, expressed in the coordinates of the old basis, into its columns. The matrix is
To show that the matrix is orthogonal, we need to show that
Alternatively, by performing Gauss-Jordan elimination, you could derive the inverse matrix
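A small numerical cross-check of the last step (the vectors below are one plausible reconstruction of the basis described above, chosen along the 45 degree lines and the remaining axis; they may differ from the exact vectors in the solution):

```python
import numpy as np

# One possible orthonormal basis of R^3 along the lines described above
# (a plausible reconstruction, since the exact vectors are not reproduced here):
# the 45 degree line in the plane of the first two axes, the third axis, and
# the 45 degree line between the negative first axis and the positive second axis.
s = 1 / np.sqrt(2)
Q = np.column_stack([
    [s, s, 0.0],     # (1, 1, 0) / sqrt(2)
    [0.0, 0.0, 1.0],
    [-s, s, 0.0],    # (-1, 1, 0) / sqrt(2)
])

# Orthogonality: Q^T Q = I, hence Q^{-1} = Q^T.
print(np.allclose(Q.T @ Q, np.eye(3)))
print(np.allclose(np.linalg.inv(Q), Q.T))
```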
ε.4
For each of the linear maps defined by the following matrices
perform the following tasks:
Find eigenvalues
Find at least one eigenvector for each eigenvalue
Form a new basis from the eigenvectors (normalized or not)
Compute the transformation matrix to the new basis
Find the matrix
in the new basis and verify that it is diagonal
See the example in the lecture notes; a numerical sketch of the whole procedure is given below.
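The sketch below walks through the five steps for a hypothetical 2×2 matrix (the matrices from the problem statement are not reproduced here); it is meant only as a template for checking hand computations with numpy.linalg.eig.

```python
import numpy as np

# Hypothetical 2x2 example; the matrices in the problem statement are not
# reproduced here.  numpy.linalg.eig returns eigenvalues and (normalized)
# eigenvectors, which we use only to check the hand computation.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors
P_inv = np.linalg.inv(P)

# In the eigenvector basis the map is represented by P^{-1} A P,
# which should be diagonal with the eigenvalues on the diagonal.
D = P_inv @ A @ P
print(eigvals)
print(np.round(D, 10))
```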
1.
To find eigenvalues solve
Therefore the eigenvalues are
2.
To find eigenvectors, plug the eigenvalues one by one into
Thus, any value of
Therefore, the vector
Obviously, all vectors of the form
Now, all vectors of the form
3.
We have chosen the eigenvectors in such a way that they are already normalized, i.e. they have length 1. To verify, observe
It’s easy to verify that vectors
form a normalized basis in
4.
The transformation matrix is a matrix with columns formed from the vectors of the new basis expressed in coordinates ("the language") of the old basis.
5.
The matrix
In any case, we need
We can find the solutions of all three systems at once by Gaussian elimination, performing elementary row operations on an augmented matrix with three extra columns in place of the usual right-hand side.
Therefore, the inverse of the
Additional exercise: verify that
Now we can compute
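A minimal sketch of the augmented-matrix (Gauss-Jordan) procedure for the inverse, applied to a hypothetical 3×3 matrix rather than the one from the problem; it assumes the pivots encountered are nonzero, which holds for this example.

```python
import numpy as np

# Gauss-Jordan sketch for A^{-1}: row-reduce the augmented matrix [A | I]
# until the left block is the identity; the right block is then A^{-1}.
# The matrix here is a hypothetical 3x3 example, not the one from the problem.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]
M = np.hstack([A, np.eye(n)])

for i in range(n):
    M[i] = M[i] / M[i, i]                  # scale the pivot row (no pivoting here)
    for j in range(n):
        if j != i:
            M[j] = M[j] - M[j, i] * M[i]   # eliminate column i in the other rows

A_inv = M[:, n:]
print(np.allclose(A @ A_inv, np.eye(n)))   # True
```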
1.
To find eigenvalues solve
(expanding along the top row)
Therefore the eigenvalues are
2.
To find eigenvectors, plug the eigenvalues one by one into
Doing Gauss-Jordan elimination we have
In other words, eigenvector
In other words, eigenvector
In other words, eigenvector
3.
This time we have chosen eigenvectors that are not normalized; let's see whether this approach still results in a diagonal matrix
form a basis in
4.
The transformation matrix is a matrix with columns formed from the vectors of the new basis expressed in coordinates ("the language") of the old basis.
5.
Again, the matrix
We find
Therefore, the inverse of the
Additional exercise: verify that
Now we can compute
1.
To find eigenvalues solve
Therefore the only eigenvalue is
2.
To find eigenvectors, plug the eigenvalues one by one into
Because the eigenvalue is repeated, we should expect difficulties finding enough eigenvectors to form a new basis — we need at least three linearly independent eigenvectors in
Doing Gauss-Jordan elimination we have
In other words, all eigenvectors have the form
In order to form a basis from eigenvectors, we need three linearly independent ones, which is impossible in this case because there is only one free parameter! In other words, all eigenvectors we can come up with lie on the same line (one degree of freedom), and thus we cannot even have two linearly independent eigenvectors, let alone three.
3. - 5.
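Steps 3 to 5 therefore cannot be carried out. As a numerical way to see the problem (using a hypothetical defective 3×3 matrix, since the matrix from the problem statement is not reproduced here), one can compare the geometric multiplicity of the repeated eigenvalue with the dimension of the space:

```python
import numpy as np

# Hypothetical defective 3x3 example (a single Jordan block); the matrix
# in the problem statement is not reproduced here.
lam = 2.0
A = np.array([[lam, 1.0, 0.0],
              [0.0, lam, 1.0],
              [0.0, 0.0, lam]])

# Geometric multiplicity of lam = dim ker(A - lam I) = n - rank(A - lam I).
n = A.shape[0]
geom_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(geom_mult)  # 1 < 3: not enough independent eigenvectors to form a basis
```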
1.
First of all, note that
Therefore the eigenvalues are
2.
To find eigenvectors, plug the eigenvalues one by one into
Because the eigenvalue is repeated, we should expect to do more work than usual, but it should still be possible to find a basis of eigenvectors.
In other words, the eigenvectors corresponding to
It is immediately clear that the only restriction imposed by this linear system of equations is that
3.
In order to form a basis from eigenvectors, we need three linearly independent ones. Fortunately, there are enough degrees of freedom in the parameters (one from the first eigenvalue and two from the second) to have three linearly independent eigenvectors. For example,
4.
The transformation matrix is a matrix with columns formed from the vectors of the new basis expressed in coordinates ("the language") of the old basis.
5.
Again, the matrix
We find
Additional exercise: verify that
Now we can compute
We see again that
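For contrast with the previous (non-diagonalizable) case, here is a numerical sketch with a hypothetical 3×3 matrix whose repeated eigenvalue has geometric multiplicity 2, so a basis of eigenvectors exists and the matrix in that basis is diagonal; it is not the matrix from the problem statement.

```python
import numpy as np

# Hypothetical 3x3 example with eigenvalue 2 repeated twice (and eigenvalue 5);
# the matrix from the problem statement is not reproduced here.
A = np.array([[3.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 3.0]])

n = A.shape[0]
# For the repeated eigenvalue 2, the geometric multiplicity equals 2, so
# together with the simple eigenvalue we get 3 independent eigenvectors.
print(n - np.linalg.matrix_rank(A - 2.0 * np.eye(n)))  # 2

eigvals, P = np.linalg.eig(A)
print(np.round(np.linalg.inv(P) @ A @ P, 10))  # diagonal, as expected
```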
ε.5
Compute the 10th power of the following matrix
You should not compute the tenth power directly.
Consider diagonalization of
Let’s diagonalize the matrix
and computing
Let’s find the eigenvalues of
Set
Therefore, the eigenvalues given by
Now we need to find a basis formed of eigenvectors. Each time let’s solve the corresponding system
We conclude that the vectors of the form
For
In this case the vectors of the form
For
Vectors of the form
The transformation matrix
Performing Gaussian elimination further to get the inverse matrix
Now we have all the components to compute
where
The final answer is
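A quick numerical check of the same recipe, with a hypothetical 2×2 matrix in place of the one from the problem: once A = P D P^{-1}, the tenth power is A^10 = P D^10 P^{-1}, and raising the diagonal matrix D to the tenth power only requires raising its diagonal entries.

```python
import numpy as np

# Hypothetical 2x2 example; the matrix in the problem statement is not
# reproduced here.  The point is the identity A^10 = P D^10 P^{-1}.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, P = np.linalg.eig(A)
D10 = np.diag(eigvals ** 10)          # raising a diagonal matrix to a power
A10 = P @ D10 @ np.linalg.inv(P)

print(np.allclose(A10, np.linalg.matrix_power(A, 10)))  # True
```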
ε.6
A stochastic matrix is a square matrix with non-negative entries whose rows each sum to 1.
Consider the following
where
Show that the maximum eigenvalue of
Both a direct proof and a proof by mathematical induction will work. In both cases it is worth starting with the simple case of
Start with
The eigenvalues
We have
The factorization in the last line can be obtained from the quadratic formula or from Vieta's formulas relating the product and the sum of the roots of a quadratic equation.
It is clear from the characteristic equation that the two eigenvalues are
Now consider the general case of
Expanding this determinant along the second column we get
And again expanding this determinant along the second column we get
And again and so forth until
In the end we are left with nearly the same determinant as in the case of
The eigenvalues which are the roots of the equation
are
Given that
Therefore
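A numerical illustration of the result (the 5×5 matrix below is randomly generated, not the matrix from the problem): the vector of ones is always an eigenvector with eigenvalue 1 because the rows sum to 1, and no eigenvalue exceeds 1 in absolute value.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 5x5 stochastic matrix: non-negative entries, rows summing to 1.
P = rng.random((5, 5))
P = P / P.sum(axis=1, keepdims=True)

# The vector of ones is an eigenvector with eigenvalue 1, since each row sums to 1.
ones = np.ones(5)
print(np.allclose(P @ ones, ones))

# And the largest eigenvalue in absolute value is 1.
eigvals = np.linalg.eigvals(P)
print(np.max(np.abs(eigvals)))  # 1.0 (up to floating-point error)
```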