Summary of lectures and homework assignments
Here is what we have done so far:
- 8/30: We talked about fields. Most of linear algebra can be done using scalars in any field (a notable exception being inner products, which require, for example, a notion of positivity). That said, after this mention of other fields to broaden your horizons, we shall concentrate on the fields R (the real numbers) and C (the complex numbers). We also discussed the algebraic rules satisfied by matrices. They are almost the same as for fields, except that AB=BA need not hold and A^{-1} does not exist for some nonzero matrices A. I handed out a test of some concepts from first semester linear algebra which I want you to hand in on 9/6. The material I covered roughly corresponds to chapter 1 of H&K.
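A quick numerical illustration (not from class) of those two failures of the field axioms for matrices, using numpy; the matrices are just convenient examples:

```python
import numpy as np

# Two matrices that do not commute: AB != BA.
A = np.array([[1, 1],
              [0, 1]])
B = np.array([[1, 0],
              [1, 1]])
print(A @ B)              # [[2 1], [1 1]]
print(B @ A)              # [[1 1], [1 2]]

# A nonzero matrix with no inverse: its determinant is 0.
C = np.array([[1, 2],
              [2, 4]])
print(np.linalg.det(C))   # 0.0, so C has no inverse even though C != 0
```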
- 9/1: 2.1,2.2 We talked about vector spaces and subspaces.
Since H&K does not give a notation for the set of functions from S
to F we use the standard notation FS for this.
Probably every vector space we encounter will be a subspace of FS
for an appropriate S. Homework due 9/8:
- 2.1: 5, 6, 7
- 2.2: 1, 2, 5, 7
- Grade distribution (scaled down by a factor of 7): 2, 2.5, 3, 5.4, 6.3, 6.5, 7, 7, 7.1, 7.5, 7.5, 7.7, 8.2, 8.7, 8.7, 9.5, 9.7
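Here is a minimal sketch of the vector space operations on F^S, with functions written as Python callables; the function names are mine and the operations are the usual pointwise ones:

```python
# Elements of F^S are functions from S to F; addition and scalar
# multiplication are defined pointwise.
def add(f, g):
    return lambda s: f(s) + g(s)

def scale(c, f):
    return lambda s: c * f(s)

# Example with S = the real numbers and F = the real numbers.
f = lambda s: s ** 2
g = lambda s: 3 * s
h = add(f, scale(2.0, g))      # the function s -> s^2 + 6s
print(h(1.0))                  # 7.0
```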
- 9/6: We covered more material from 2.1, 2.2. We showed that c0 = 0, 0a = 0, and (-1)a = -a. We talked about linear combinations of vectors, showed that the solutions to AX=0 form a subspace of F^{n x k}, and talked about the subspace spanned by a set of vectors.
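A numerical sketch of that example: one common way to get a basis of the solution space of AX = 0 is from the singular value decomposition. The helper name and the tolerance are my own choices, not anything from class or H&K.

```python
import numpy as np

def null_space_basis(A, tol=1e-12):
    """Columns of the returned matrix form an orthonormal basis of the
    solution space of AX = 0, computed from the SVD of A."""
    U, s, Vh = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vh[rank:].conj().T

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])      # rank 1, so the null space has dimension 2
N = null_space_basis(A)
print(N.shape[1])                 # 2
print(np.allclose(A @ N, 0))      # True: every column solves AX = 0
```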
- 9/8: We talked a little about the homework. We looked at
the sum of subspaces W1+W2+...+Wk. We started on 2.3, linear
dependence and independence. I handed back the diagnostic exam.
The score distribution was:
- 90-100: 11
- 80-89: 4
- 70-79: 3
- 50-69: 1
- 40-49: 4
- 9/11: We continued with section 2.3. We defined basis and
gave some examples. We showed that any two bases have the same
number of elements. Homework due 9/15:
- 9/13: We finished 2.3, showing that any linearly independent
subset of a subspace W of a finite dimensional vector space V can be
completed to a (finite) basis of W. In particular, any subspace
of a finite dimensional vector space has a finite basis. We
showed that if W1 and W2 are two subspaces of a finite dimensional
vector space then dim(W1)+dim(W2) = dim(W1+W2) + dim(W1 intersect W2).
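A small numerical check of the dimension formula for two planes in R^3 (a numpy sketch; the particular bases and the rank/nullity bookkeeping are my own):

```python
import numpy as np

# W1 and W2 are planes in R^3, given by bases stored as columns.
B1 = np.array([[1., 0.],
               [0., 1.],
               [0., 0.]])          # W1 = span{e1, e2}
B2 = np.array([[0., 0.],
               [1., 0.],
               [0., 1.]])          # W2 = span{e2, e3}

dim_W1 = np.linalg.matrix_rank(B1)                      # 2
dim_W2 = np.linalg.matrix_rank(B2)                      # 2
dim_sum = np.linalg.matrix_rank(np.hstack([B1, B2]))    # dim(W1 + W2) = 3

# Since the columns of B1 and of B2 are each linearly independent, pairs
# (u, v) with B1 u = B2 v correspond exactly to vectors in W1 intersect W2,
# so the intersection has dimension nullity([B1 | -B2]).
M = np.hstack([B1, -B2])
dim_int = M.shape[1] - np.linalg.matrix_rank(M)         # 1

print(dim_W1 + dim_W2 == dim_sum + dim_int)             # True: 2 + 2 = 3 + 1
```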
- 9/15: We covered 2.4, coordinates with respect to a basis, and talked some about the ideas in 2.5. Our goal in 2.5 was to show that the row reduced echelon form of a matrix is unique, so if you do two different sequences of row operations you will still end up with the same rref. In fact we sketched why the rref of A is determined by the row space of A, so the rref depends only on the row space.
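A tiny sympy check of the uniqueness statement (the specific matrices are just an example): two matrices with the same row space have the same rref.

```python
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 3]])
# B is obtained from A by row operations, so it has the same row space.
B = Matrix([[3, 6, 4],     # row1 + row2
            [1, 2, 2]])    # row2 - row1
print(A.rref()[0] == B.rref()[0])   # True: same row space, same rref
```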
- 9/18: We started talking about 3.1, linear transformations, giving some examples, showing that a linear transformation is uniquely determined by its values on a finite basis, and showing that the range and kernel (or null space, as H&K calls it) are subspaces. Homework due 9/22:
- 3.1: 1(and say why), 2, 3, 4 (give reason), 8, 9, 12
- 3.2: 1, 6
- 9/20: We talked about most of 3.2. We showed that the set
of linear transformations from V to W forms a vector space L(V,W). We
showed the composition of linear transformations is a linear
transformation. We showed that the inverse of a linear
transformation, if it exists, is a linear transformation. We gave
an example of an onto linear transformation from V to V which is not
one to one, and a one to one linear transformation from V to V which is
not onto. This is not possible in finite dimensions.
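One standard way to realize such examples, not necessarily the ones used in class, is with shift operators on the space of sequences (functions from the nonnegative integers to F); a sketch:

```python
# Sequences are functions n -> F; both shifts below are linear.
def left_shift(f):
    """L(f)(n) = f(n+1): onto, but not one to one (it kills (1, 0, 0, ...))."""
    return lambda n: f(n + 1)

def right_shift(f):
    """R(f)(n) = f(n-1) for n >= 1 and 0 for n = 0: one to one, but not onto."""
    return lambda n: f(n - 1) if n >= 1 else 0

delta0 = lambda n: 1 if n == 0 else 0    # the nonzero sequence (1, 0, 0, ...)
print([left_shift(delta0)(n) for n in range(4)])    # [0, 0, 0, 0]: L(delta0) = 0
print([right_shift(delta0)(n) for n in range(4)])   # [0, 1, 0, 0]
```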
- 9/22: We proved Thm 2 of 3.1 and talked about isomorphisms,
showing that finite dimensional vector spaces are isomorphic if and
only if they have the same dimension.
- 9/25: We talked about representing a linear transformation as a matrix, 3.4; a small sketch follows the homework list below. HW due Monday, 10/2:
- 3.3: 3, 6, 7
- 3.4: 1, 8, 10
- 3.5: 1, 2, 7
- 9/27: We talked about linear functionals, dual bases, and the annihilator, 3.5.
- 9/29: We talked about the double dual and showed there is a
natural isomorphism between V and V** if V is finite dimensional.
In infinite dimensions, we showed the natural map is one to one
(assuming the axiom of choice) but is not an isomorphism. 3.6.
- 10/2: We talked about the transpose of a linear transformation, 3.7.
- 10/4: No new material, we did some problems from 3.6 and
3.7.
- 10/6: Exam #1
- 10/9: Ideals of polynomials, 4.4. An ideal in F[x] is a
subspace J of F[x] which is closed under multiplication by x.
Equivalently, it is closed under multiplication by any polynomial in
F[x]. It turns out all ideals in F[x] are principal, that is, any ideal is the set of all multiples of some fixed polynomial p(x); a small illustration follows the homework list below. Homework due 10/16:
- 4.2: 1
- 4.4: 1, 4
- 6.2: 1, 5, 11, 15
- 10/11: 6.1, 6.2, 6.3 We talked about the annihilating ideal
of a linear transformation T:V->V, the set of all polynomials p so
that p(T)=0. The minimal polynomial of T is the monic polynomial
which generates this annihilationg ideal. We then started talking
about characteristic values and characteristic vectors, which are just
other names for what you probably learned as eigenvalues and
eigenvectors. The due date for homework has been changed to
Monday.
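A quick numerical illustration of characteristic values and vectors (eigenvalues and eigenvectors) with numpy; the matrix is just an example:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
vals, vecs = np.linalg.eig(A)     # characteristic values 3 and 1 (in some order)
print(vals)
for c, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, c * v))   # True: Av = cv for each pair
```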
- 10/13: More 6.2 and 6.3. We showed that any characteristic value of T is a root of its minimal polynomial and vice versa. We then defined the characteristic polynomial of T and stated the Cayley-Hamilton theorem, that the characteristic polynomial annihilates T. I find the proof of this given in H&K 6.3 rather obscure, so let's wait until later to prove it. Meanwhile I gave a sort of proof, showing that it is true for diagonalizable operators and then noting that the map sending a matrix A to its characteristic polynomial evaluated at A is a continuous function on C^{n x n} which is 0 on the diagonalizable matrices, and any matrix may be arbitrarily closely approximated by a diagonalizable matrix, hence by continuity this function is 0 on all of C^{n x n}.
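Here is a numerical sanity check of Cayley-Hamilton on a random matrix (this is only a check, not the continuity argument; np.poly gives the characteristic polynomial coefficients and Horner's rule evaluates it at A):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

coeffs = np.poly(A)            # coefficients of det(xI - A), highest degree first

# Evaluate the characteristic polynomial at the matrix A (Horner's rule,
# with matrix multiplication in place of scalar multiplication).
p_of_A = np.zeros_like(A)
for c in coeffs:
    p_of_A = p_of_A @ A + c * np.eye(4)

print(np.allclose(p_of_A, 0))  # True, up to roundoff: char poly annihilates A
```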
- 10/16: We started talking about invariant subspaces, 6.4. HW due
10/23:
- 6.3: 2, 4 (you may assume prob 3), 6
- 6.4: 1, 3, prove 5, 9
- 10/18: We talked about the T-conductor. We then proved that a linear operator is triangulable if and only if its minimal polynomial is a product of linear factors. We started proving that a linear operator is diagonalizable if and only if its minimal polynomial is a product of distinct linear factors.
- 10/20: We finished the proof that a linear operator is
diagonalizable if and only if its minimal polynomial is a product of
distinct linear factors. We stated the result that a family of
commuting diagonalizable operators is simultaneously diagonalizable and
that a family of commuting triangulable operators is simultaneously
triangulable. We got off on a tangent, though, and didn't prove it; the proof is in the book in section 6.5.
- 10/23: 6.6, 6.7, We talked about the (internal) direct sum of
subspaces. If V is the direct sum of T invariant subspaces, then
equivalently we may find a basis which puts T in block diagonal form.
HW due 10/30:
- 6.5: 1a, 2
- 6.6: 1, 4, 5
- 6.7: 2, 6 (but he means example 5 of section 6.3)
- 6.8: 1
- 10/25: 6.7. We talked about the projection operators E_i
associated to a direct sum decomposition. I also gave a proof of
the Cayley-Hamilton theorem if the field F is the complex numbers (and
in fact the proof works for any algebraically closed field if you know
what that is). It used the fact that any operator on a finite
dimensional vector space over C is triangulable.
- 10/27: 6.8 We proved an important special case of the Primary
Decomposition Theorem, simplifying things a little by assuming that the
minimal polynomial is a product of linear factors (as would be true if
the field were the complex numbers). This simplification does not
change the flavor of the proof at all, but does mean we don't need to
spend time talking about prime polynomials. Anyway, when reading Theorem 12 in the text you can assume for our purposes that p_i = x - c_i.
- 10/30: I talked about a simplified version of 7.1, 7.2 which
covers what we need for the Jordan form. Notes on this are here. HW due 11/8:
- 7.1: 1, 5
- 7.2: 1
- 7.3: 1, 3, 4, 5, 8, 10, 14
- 11/1: I talked a little more on the simplified 7.1, 7.2.
- 11/3: 7.3. I talked about getting a matrix into Jordan form.
After class I postponed the homework due date to Wednesday.
- 11/6: 7.3, more on the handout from 10/30. Here are a few notes on Jordan form; they may help with the homework.
- 11/8: We did a number of homework problems. One was on
square roots and I made a few comments on finding
the square root of a linear operator.
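One way to find a square root of a diagonalizable operator, sketched numerically (this may or may not match the comments from class): diagonalize, take square roots of the characteristic values, and change back.

```python
import numpy as np

A = np.array([[5., 4.],
              [4., 5.]])                  # diagonalizable, characteristic values 9 and 1

vals, P = np.linalg.eig(A)                # A = P D P^{-1}
S = P @ np.diag(np.sqrt(vals)) @ np.linalg.inv(P)   # take square roots on the diagonal
print(np.allclose(S @ S, A))              # True: S is a square root of A
print(S)                                  # approximately [[2, 1], [1, 2]]
```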
- 11/10: We began talking about inner products. Note that we
are specializing to the real or complex fields only. After class
someone floated the idea of delaying the exam until Monday 11/20.
We'll talk about this next time.
- 11/13: We discussed the Gram-Schmidt process and did some problems; a short sketch of the process follows the homework list below. Exam #2 is definitely delayed until Monday, 11/20. But anyone wishing to take an alternate exam on Friday 11/17 may do so. It will be given after class. HW due 11/20:
- 8.1: 1, 2, 6, 9, 12
- 8.2: 3, 6, 7, 10, 17
- 11/15: We showed the Cauchy-Schwarz inequality, the triangle inequality, and the Pythagorean theorem. We also looked at the orthogonal complement of a set and showed that in finite dimensions the orthogonal complement of the orthogonal complement of S is the span of S. This is not true in infinite dimensions; for example, if V is the space of complex valued continuous functions on [0,1] and S is the set of e^{2k\pi i t} for k = 0, 1, -1, 2, -2, ..., then the orthogonal complement of S is 0. Thus S perp perp = V, which is not the span of S.
- 11/17: We looked at the many uses of *, as the conjugate
transpose, as the adjoint, and as the dual. These are all related
to each other and any ambiguities can be resolved.
- First the dual. If T:V->W is linear then there is an induced linear transformation T*:W*->V* which is given by T*f = fT; in other words, for each a in V, (T*f)(a) = f(Ta).
- Next the adjoint. If T:V->W where V and W are inner
product spaces, then the adjoint of T is a linear transformation
T*:W->V so that (Ta | b) = (a | T*b) for all a in V and b in
W. (H&K only considered the case W=V but this is not needed).
- Finally the conjugate transpose. If A is an nxm matrix
over R or C then A* is an mxn matrix.
These three uses of * are all related. We showed that if V is finite dimensional with an inner product ( | ) then there is a one to one correspondence V<->V* which identifies b in V with the functional f_b:V->F given by f_b(a) = (a | b). This correspondence relates the dual T* to the adjoint T*: the adjoint T* is the composition of W<->W*, the dual T*:W*->V*, and V*<->V. The conjugate transpose is related to the adjoint as follows. Suppose V = C^{m x 1} and W = C^{n x 1}; then, as usual, an n x m matrix A gives a linear transformation T:V->W defined by Ta = Aa. Consider the standard inner products on V and W, (a | b) = b*a. Then the conjugate transpose A* is the matrix of the adjoint T*, since (a | T*b) = (A*b)*a = b*A**a = b*Aa = (Ta | b).
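A quick numerical check of that last identity, assuming the convention (a | b) = b*a above (note that np.vdot conjugates its first argument, so (a | b) is np.vdot(b, a)):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2))   # n x m, here 3 x 2
a = rng.standard_normal(2) + 1j * rng.standard_normal(2)             # a in V = C^{2 x 1}
b = rng.standard_normal(3) + 1j * rng.standard_normal(3)             # b in W = C^{3 x 1}

inner = lambda x, y: np.vdot(y, x)     # (x | y) = y* x, matching the convention above

A_star = A.conj().T                    # conjugate transpose, m x n
print(np.allclose(inner(A @ a, b), inner(a, A_star @ b)))   # True: (Ta|b) = (a|T*b)
```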
- 11/20: Exam #2
- 11/22: 8.4 We talked about isomorphisms of inner product spaces
and unitary matrices.
- 11/27: More 8.4, but I'm doing some things differently. First, using the Gram-Schmidt process on its columns, any matrix A can be written A = QR where Q is unitary and R is upper triangular. H&K does a variation on this in Theorem 14 by essentially working with the transpose, but the QR decomposition is more standard. Now if B is any triangulable matrix, there is a matrix P so that T = P^{-1}BP is upper triangular. Let P = QR be the QR decomposition of P. Then Q^{-1}BQ = RTR^{-1}. But R, T, and R^{-1} are all upper triangular, so Q^{-1}BQ is upper triangular. So any triangulable matrix is triangulable using a unitary matrix Q. The corresponding statement for operators is that if T:V->V is an operator on a finite dimensional inner product space, then there is an orthonormal basis B = {b1, b2, ..., bn} so that [T]_B is upper triangular. The proof is to take any basis A so that [T]_A is upper triangular, then do Gram-Schmidt on A to get B, and then [T]_B will automatically be upper triangular also. A numerical sketch follows the homework list below. Last HW due 12/4:
- 8.3: 1, 4, 6, 9
- 8.4: 1, 2, 6, 10
- 8.5: 1, 4, 8, 9
- 11/29: 8.5: We showed that any normal operator T:V->V on a complex inner product space is orthonormally diagonalizable, i.e., there is an orthonormal basis B of V so that [T]_B is diagonal. The proof I gave was different from that in H&K. As we did on 11/27, you can pick an orthonormal basis B so that [T]_B is upper triangular. But then we showed that any upper triangular matrix which is also normal must be diagonal, so [T]_B is diagonal. In the real case we showed that an operator is orthonormally diagonalizable if and only if it is symmetric. We showed a symmetric operator is orthonormally diagonalizable. The trick is to represent it by a symmetric matrix, which we then temporarily think of as being a complex Hermitian matrix; we can then conclude that all its characteristic values are real, so its minimal polynomial is a product of linear factors, so it is triangulable, so it is orthonormally triangulable, but any upper triangular symmetric matrix must be diagonal, so it is orthonormally diagonalizable. We started looking at what you do with normal real matrices which are not symmetric, such as skew symmetric and orthogonal matrices, but didn't get far.
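A numerical illustration of the real symmetric case (a sketch; np.linalg.eigh is the standard routine for orthonormally diagonalizing a symmetric or Hermitian matrix):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = M + M.T                          # a real symmetric matrix

vals, Q = np.linalg.eigh(A)          # columns of Q are orthonormal characteristic vectors
print(np.allclose(Q.T @ Q, np.eye(4)))             # True: orthonormal basis
print(np.allclose(Q.T @ A @ Q, np.diag(vals)))     # True: [T]_B is diagonal
print(np.isrealobj(vals))                          # True: real characteristic values
```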
- 12/1: We looked at real skew symmetric operators and showed that
there is an orthonormal basis so that the matrix of the operator is
block diagonal with either 1x1 blocks of 0 or 2x2 blocks [0 b; -b
0]. I'll write out some notes on this stuff since it is not in
the book.
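A numerical sketch of that block decomposition, using the real Schur form from scipy rather than the argument from class: for a real skew symmetric matrix the quasi-upper-triangular real Schur form is forced to be block diagonal of exactly this shape (up to roundoff).

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(4)
M = rng.standard_normal((5, 5))
A = M - M.T                          # a real skew symmetric matrix

T, Q = schur(A, output='real')       # Q orthogonal, T = Q^T A Q quasi-upper-triangular
print(np.allclose(Q.T @ A @ Q, T))   # True
print(np.allclose(T, -T.T))          # True: T is still skew symmetric, so it is block
print(np.round(T, 6))                #   diagonal with 1x1 zero and 2x2 [0 b; -b 0] blocks
```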
- 12/4: We talked more about real normal matrices.
- 12/6: We did a MATLAB demo on real normal matrices and talked about a real Jordan form for any real n x n matrix. Preliminary writeup is here.
- 12/8: Review
- 12/11: Review
- 12/13: I'll drop by our classroom at 1:00 just in case anyone has
any questions.