Linear Algebra I
MATH-GA.63.2110-001
Fall 2011, Tuesdays, 5:10 - 7:00 pm, Warren Weaver Hall 517
Instructor:
Olof Widlund
Coordinates of Olof Widlund
Office: WWH 612
Telephone: 998-3110
Office Hours: Tuesdays and Thursdays, 4:00 - 5:00 pm. You can also try to drop in
or send email or call for an appointment.
Email: widlund at cims.nyu.edu
URL for this course: http://math.nyu.edu/courses/fall11/G63_2110_001/index.html
Textbook: Linear Algebra by Friedberg, Insel, and Spence. Prentice
Hall. Homework assignments will often be from the Fourth Edition of this
book.
Homework:
There will be regular homework assignments. Scores will be available
on Blackboard.
It is important that you do the homework yourself, but when you get stuck,
I encourage you to consult other students or me. However, when you get help,
it is important to acknowledge it in writing.
Please staple everything together and arrange the problems in the
order in which they were assigned. The best way of turning in homework is to give it to me
personally, in class or in my office. If I am not in my office, you can
slide it under my door. If you put your homework in my mailbox, it is at
your own risk.
Homework Assignments:
Set 1, due September 27 at midnight: #12 and 18 on p. 15; #10 and 12 on p. 21; #28 and 30 on p. 23; #12 on p. 34; #15 on p. 35; # 10 on p. 41; #17 on p. 42; #11 on p. 55; #16 on p. 56; #29 and 31 on p. 57.
Set 2, due October 4 at midnight: #9 and 11 on p. 75; #20 on p. 76;
#26 on 77; #35 on p. 78; #7 on p. 85 and #15 on p. 86.
Set 3, due October 18 at 5:10pm: #9 and 13 on p. 97; #17 on p. 98; #6 and 9
on p. 107; #16 on p. 108 and #22 on p. 109.
Set 4, due October 25 at midnight: #20.2, 20.3, 21.2, 21.5, and 21.6, all
from the handout given out on October 4.
Set 5, due November 9 at 10:00am: #4, 6, and 8 on pp. 124-125;
#3b and 3d on p. 141; #2b, 2d, 2f, 6b, and 6d on pp. 165-167; #3b, 3d, 5, and
8 on pp.180-181.
Set 6, due November 16 at 10:00am: #4 on p. 221; #22 and 25 on p. 222;
#4 on p. 228; #12 and 15 on p. 229; #23 on p. 230; #9 on p. 258; #14 and 18
on p. 259.
Set 7, due November 23 at 10:00am: #2b, 2f, 8, 14a, and 18 on pp. 279-282;
#21 and 23 on p. 312.
Set 8, due December 7 at 10:00am: #3, 5, 17, and 19 on pp. 322-324;
#8, 11, and 24 on pp. 337-339.
Blackboard: That system will be used primarily for the
homework and exam scores. A grader was appointed on September 19.
Exams: There will be a midterm exam on October 18 and a final
exam on December 20. You may bring to each exam two standard sheets of paper
with notes written by yourself on both sides, for your own private use
during the exam.
Lectures
September 6: Vector spaces. Examples of vector spaces which satisfy
the required properties. Subspaces and how to verify that we have a
subspace. Linear combinations and the span of a finite set.
Linear dependence and linear independence. The span
of a set of elements of a vector space V is itself a vector space; it is a subspace of V.
September 13: Bases for spaces of polynomials. Lagrange and Hermite
interpolation. These procedures provide alternative bases, which are convenient
when solving interpolation problems. Sums and direct sums of sets and vector
spaces. Generating sets and bases for vector spaces; a basis provides a minimal
generating set and all elements of the generated space are represented uniquely
as a linear combination of the basis elements. Finite dimensional vector spaces;
any two bases of such a space have the same number of elements. The replacement
theorem and a variety of applications thereof. Examples of linear spaces of
matrices and explicit bases for these spaces which allow us to compute their
dimensions. (This brings us to the end of our discussion of Chapter 1 of the
textbook.)
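For illustration only (this is not part of the course materials): a minimal Python sketch
of the Lagrange basis just mentioned, with made-up nodes and data. Each basis polynomial
equals 1 at its own node and 0 at the others, so the interpolant is simply the
data-weighted sum of the basis polynomials.

    import numpy as np

    def lagrange_basis(nodes, j, x):
        # Evaluate the j-th Lagrange basis polynomial for the given nodes at x.
        factors = [(x - nodes[m]) / (nodes[j] - nodes[m])
                   for m in range(len(nodes)) if m != j]
        return np.prod(factors, axis=0)

    nodes = np.array([0.0, 1.0, 2.0])    # hypothetical interpolation nodes
    values = np.array([1.0, 3.0, 2.0])   # hypothetical data to interpolate

    x = np.linspace(0.0, 2.0, 5)
    p = sum(values[j] * lagrange_basis(nodes, j, x) for j in range(len(nodes)))
    print(p)   # the interpolant reproduces the data: p(0)=1, p(1)=3, p(2)=2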
September 20: Linear transformations; functions on vector spaces preserving
the linear structure. Examples: differentiation, integration, rotations,
reflections, and projections. Range and null spaces of linear transformations;
nullity and rank. We have nullity(T) + rank(T) = dim(V) for V of finite dimension.
One-to-one and onto functions and linear transformations. Matrix representations
of linear transformations relative to ordered bases. The set of all linear
transformations between a pair of vector spaces is a vector space, and so is
the set of m-by-n matrices. Composition of linear transformations and matrix
multiplication.
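As a small numerical illustration of the dimension formula above (not from the textbook;
the matrix is made up), one can check nullity + rank against the dimension of the domain:

    import numpy as np

    # A made-up 3-by-4 matrix, viewed as a linear map T from R^4 to R^3.
    A = np.array([[1., 2., 0., 1.],
                  [0., 1., 1., 0.],
                  [1., 3., 1., 1.]])   # third row = first + second, so the rank is 2

    rank = np.linalg.matrix_rank(A)
    nullity = A.shape[1] - rank        # dimension of the null space of T
    print(rank, nullity, rank + nullity)   # 2 2 4, and 4 = dim(R^4)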
September 27: More on linear transformations. Isomorphisms and
invertibility of linear transformations and matrices. The isomorphism of
families of linear transformations and matrices. A first introduction to
Gaussian elimination; a handout will be available next week.
October 4: Handout on Gaussian Elimination.
Solving linear systems of algebraic equations. Straight lines, planes,
or hyperplanes that may or may not intersect in a single point. Comments on the
column and row ranks of matrices; we will establish that they are the same.
Gaussian elimination in terms of special unit lower triangular matrices.
The inverses of these matrices and products of these matrices can easily
be found. The need for pivoting and the partial pivoting strategy; if the
matrix is invertible, this strategy will always work. What happens if the
matrix is not invertible. Cost of Gaussian factorization in the general
case and in the case when the matrix is tridiagonal. The cost of solving
the system of equations once the triangular factors are known.
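Since the handout itself is not reproduced here, the following is only a generic Python
sketch of Gaussian elimination with partial pivoting, written as a factorization
P A = L U with L unit lower triangular; the example matrix is made up and there is
no error handling.

    import numpy as np

    def lu_partial_pivoting(A):
        # Return P, L, U with P A = L U; a bare-bones sketch.
        A = A.astype(float).copy()
        n = A.shape[0]
        P = np.eye(n)
        L = np.eye(n)
        for k in range(n - 1):
            # Partial pivoting: bring the largest entry in column k to the diagonal.
            p = k + np.argmax(np.abs(A[k:, k]))
            A[[k, p], :] = A[[p, k], :]
            P[[k, p], :] = P[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]
            # Eliminate below the pivot with a unit lower triangular transformation.
            L[k+1:, k] = A[k+1:, k] / A[k, k]
            A[k+1:, k:] -= np.outer(L[k+1:, k], A[k, k:])
        return P, L, np.triu(A)

    A = np.array([[2., 1., 1.], [4., 3., 3.], [8., 7., 9.]])
    P, L, U = lu_partial_pivoting(A)
    print(np.allclose(P @ A, L @ U))   # True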
October 11: NYU holiday.
October 18: Midterm exam. The exam, with solutions, has been posted.
October 25: Material from Chapter 3, in particular a proof and a discussion
of theorems 3.4 and 3.6. Homogeneous linear differential equations with constant
coefficients. Note that there is a complication since the relevant vector space
is not finite dimensional; see section 2.7. A few words on linear functionals;
see section 2.6.
November 1: More about dual spaces and linear functionals. The second dual
of a finite dimensional linear space V is isomorphic to V. Determinants of
2-by-2 matrices and the recursive definition of determinants in terms of
determinants of smaller matrices. Cofactors and cofactor expansions with respect
to an arbitrary row of the matrix; this requires proofs by induction.
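The recursive cofactor definition can be transcribed directly into code; a minimal
(and deliberately naive, exponential-cost) Python sketch, expanding here along the
first row, for illustration only:

    import numpy as np

    def det_cofactor(A):
        # Determinant by cofactor expansion along the first row.
        n = A.shape[0]
        if n == 1:
            return A[0, 0]
        total = 0.0
        for j in range(n):
            minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
            total += (-1) ** j * A[0, j] * det_cofactor(minor)
        return total

    A = np.array([[1., 2.], [3., 4.]])
    print(det_cofactor(A), np.linalg.det(A))   # both equal -2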
November 8: Assorted results on determinants: the effect of interchanging
rows, the determinant of a product of two matrices, the determinant of the
transpose of a matrix, among them. Determinants and the area or volume of
parallelograms and parallelepipeds. Eigenvalues and eigenvectors of matrices.
Transforming matrices into diagonal form by using their eigenvectors; it does
not always work because a full set of eigenvectors is not available for all
matrices.
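A small numerical illustration, with a made-up matrix, of the similarity
transformation to diagonal form via an eigenvector basis; when a full set of
eigenvectors does not exist, the eigenvector matrix below is singular and the
approach breaks down.

    import numpy as np

    A = np.array([[2., 1.],
                  [1., 2.]])           # a made-up diagonalizable matrix
    eigvals, Q = np.linalg.eig(A)      # columns of Q are eigenvectors of A
    D = np.linalg.inv(Q) @ A @ Q       # similarity transformation with the eigenvector basis
    print(np.round(D, 10))             # diagonal; the eigenvalues 3 and 1 appear on the diagonal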
November 15: Alternatives for matrices that cannot be transformed into
diagonal form: a brief discussion of the Jordan normal form and the Schur
factorization of matrices. Limits of sequences of matrices and sequences
obtained by powers of a given matrix. Conditions on the eigenvalues of a
matrix in order for its powers to converge to a limit. Transition matrices.
Vector norms, and matrix norms associated with vector norms. How to compute the
ell_1 and ell_infinity norms of matrices. Gerschgorin's circle theorem.
Invariant subspaces and invariant subspaces defined by an element in the
vector space. The eigenvalues of the linear operator defined by restricting
the given operator to such a subspace; these eigenvalues form a subset of
the eigenvalues of the given linear operator.
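For illustration, with a made-up matrix: the ell_1 and ell_infinity norms of a matrix
are the largest absolute column sum and the largest absolute row sum, respectively,
and the Gerschgorin discs are read off from the rows. A short NumPy sketch:

    import numpy as np

    A = np.array([[4., 1., 0.],
                  [1., 3., 1.],
                  [0., 1., 2.]])

    # ell_1 norm: maximum absolute column sum; ell_infinity norm: maximum absolute row sum.
    norm_1   = np.abs(A).sum(axis=0).max()
    norm_inf = np.abs(A).sum(axis=1).max()
    print(norm_1, norm_inf)   # 5.0 and 5.0 for this symmetric example

    # Gerschgorin discs: centers on the diagonal, radii = off-diagonal absolute row sums.
    centers = np.diag(A)
    radii = np.abs(A).sum(axis=1) - np.abs(centers)
    for c, r in zip(centers, radii):
        print(f"disc centered at {c} with radius {r}")
    # Every eigenvalue of A lies in the union of these discs.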
November 22: The Cayley-Hamilton theorem with a proof for diagonalizable
matrices and another for the general case. Inner product spaces with several
examples. Orthonormal bases for inner product spaces and the Gram-Schmidt
algorithm. Legendre, Hermite, and Laguerre polynomials. Fourier
series. Projections onto subspaces spanned by a subset of an orthonormal
basis and a few words on linear least squares.
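A minimal Python sketch of the Gram-Schmidt algorithm with the standard inner product
on R^3 and made-up input vectors (the classical variant, not the numerically
preferable modified variant):

    import numpy as np

    def gram_schmidt(vectors):
        # Orthonormalize a list of linearly independent vectors (classical Gram-Schmidt).
        basis = []
        for v in vectors:
            w = v - sum(np.dot(q, v) * q for q in basis)   # subtract projections onto earlier vectors
            basis.append(w / np.linalg.norm(w))            # normalize
        return basis

    vs = [np.array([1., 1., 0.]), np.array([1., 0., 1.]), np.array([0., 1., 1.])]
    Q = gram_schmidt(vs)
    print(np.round([[np.dot(a, b) for b in Q] for a in Q], 10))   # the identity matrix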
November 29: There were two handouts on Householder transformations
and symmetric tridiagonal matrices, respectively. Further results on
systems of orthogonal polynomials: they can be generated by three-term
recursions and all their roots are simple and lie in the interior of
the interval of integration. Linear least squares problem and the normal
equations derived in two different ways. Factoring matrices in terms of
a product of an orthogonal matrix Q and a matrix R with an upper triangular
matrix on top. How to compute QR factorizations by Householder transformations
and by Gram-Schmidt. How to use Householder transformations to construct
an upper Hessenberg matrix which is similar to a given square matrix. The
special case of symmetric matrices; the Hessenberg matrix is then symmetric
and tridiagonal. How to compute the characteristic polynomial of a symmetric
tridiagonal matrix by a three-term recursion.
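As an illustration of the last point: with diagonal entries a_1, ..., a_n and
off-diagonal entries b_1, ..., b_{n-1}, the characteristic polynomials of the leading
principal submatrices satisfy p_0 = 1, p_1(x) = a_1 - x, and
p_k(x) = (a_k - x) p_{k-1}(x) - b_{k-1}^2 p_{k-2}(x), in the convention det(T - xI).
A minimal Python sketch with a made-up matrix:

    import numpy as np

    def char_poly_tridiag(diag, off, x):
        # Evaluate det(T - x I) for the symmetric tridiagonal matrix T with
        # diagonal entries diag[0..n-1] and off-diagonal entries off[0..n-2].
        p_prev, p = 1.0, diag[0] - x
        for k in range(1, len(diag)):
            p_prev, p = p, (diag[k] - x) * p - off[k-1] ** 2 * p_prev
        return p

    diag = np.array([2., 2., 2.])
    off  = np.array([1., 1.])
    T = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
    x = 0.5
    print(char_poly_tridiag(diag, off, x), np.linalg.det(T - x * np.eye(3)))   # both 0.375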
December 6: Cauchy's interlace theorem and Sturm sequences. Schur
factorization and normal matrices. A different proof of the fact that
two eigenvectors associated with two different eigenvalues of a symmetric
matrix are orthogonal. Rayleigh quotient. Formulation of Sylvester's inertia
theorem and using it to determine the inertia of a Schur complement of a
symmetric matrix. First comments on Courant-Fischer's theorem.
December 13: Courant-Fischer's theorem. Sylvester's inertia theorem.
The singular value decomposition and a few applications. Solving underdetermined
linear systems of equations. A few words on conditioning of linear systems
of algebraic equations.
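To illustrate the last two topics, with a made-up underdetermined system: the SVD gives
both the minimum-norm solution and the 2-norm condition number as the ratio of the
extreme singular values. A short NumPy sketch:

    import numpy as np

    # A made-up underdetermined system: 2 equations, 3 unknowns.
    A = np.array([[1., 2., 3.],
                  [4., 5., 6.]])
    b = np.array([1., 1.])

    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    x = Vt.T @ ((U.T @ b) / s)        # minimum-norm solution via the pseudoinverse
    print(np.allclose(A @ x, b))      # True: x solves the system

    # The 2-norm condition number is the ratio of the extreme singular values.
    print(s[0] / s[-1])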