The exam will cover chapters 5 and 6 of the book. Let me give you an outline:

We have introduced the notion of a vector space, which can refer to any collection of objects for which operations of addition and of multiplication by a real number can be defined in such a way that the usual rules known from geometric vectors and vectors in R^n apply. The precise list of rules can be found in the book and in the glossary. If there were no further examples of vector spaces beyond R^n, the whole wisdom of linear algebra would probably not be worth an entire course of its own. Particular examples of vector spaces other than R^n are:

- (for any size choice mxn) the vector space of mxn matrices
- (for any m) the vector space of symmetric mxm matrices
- various vector spaces of functions, e.g. the vector space of all continuous functions defined for all x in the interval [0,1]

The fundamental notions that apply to any vector space are: linear combination, spanning set, span, linearly dependent versus linearly independent sets, basis, dimension, and coordinates with respect to a basis. For instance, an *example* of a linear combination of the matrices

      [ 1 2 1 ]       [-3 1 7 ]       [ 1 1 6 ]
      [ 2 3 5 ]  and  [ 1 1 1 ]  and  [-1 2 2 ]
      [ 1 0 2 ]       [ 0 2 1 ]       [ 2 1 6 ]

would be

         [ 1 2 1 ]     [-3 1 7 ]     [ 1 1 6 ]
      -1 [ 2 3 5 ] + 3 [ 1 1 1 ] - 4 [-1 2 2 ]
         [ 1 0 2 ]     [ 0 2 1 ]     [ 2 1 6 ]

An example of a linear combination of the functions f and g (where f(x) = cos x and g(x) = x^2) would be the function 3f - 7g, given by the formula 3 cos x - 7 x^2; and of course similarly for linear combinations of vectors in R^n. Make sure you understand the meanings of the other words in the above list and study the pertinent hwk problems that illustrate them.

Checking whether a given set of vectors (e.g. in R^n) is a spanning set for this vector space, or is linearly independent, amounts to answering questions about linear systems of equations. If you have n vectors in R^n, these questions can be answered by forming a square matrix out of these vectors and checking its determinant. However, if you are given, say, two vectors in R^3, or four vectors in R^3, determinants do not apply, and you have to decide by row reduction whether the system of equations has a solution for every right hand side, or has only the trivial solution in case the right hand side is 0. In the case of a vector space of functions you had two problems in the hwk, which you should make sure to understand. The book mentions deciding linear independence of functions by means of Wronskians, but we did not cover this in class.

Given a vector space, finding a basis may be extremely easy, or extremely tricky. For instance, it is very easy to give a basis for the vector space R^n. In contrast, I never asked you (and will never ask you) to give a basis for the vector space of all continuous functions defined on R. Nobody can do this in practice. Manageable problems of finding a basis were #14 and #15. They used insight into the nature of the particular vector space at hand, rather than a routine method.

We studied the row, column, and null spaces of a given matrix. For those, a routine method to find a basis was explained, and of course you need to know it. As a consequence of this method, we learned the notion of the rank of a matrix, and the crucial result that

      rank A + nullity A = # of columns of A

Obviously you should be able to perform the calculations involved in finding such bases, as well as understand the meaning of the notions of `row space', `column space', and `null space'.
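(Not part of the course material, but if you want to double-check hand computations: below is a minimal numerical sketch of the determinant and rank tests just described, assuming Python with the numpy library; the matrices are made up for illustration.)

    import numpy as np

    # Square case: n vectors in R^n, written as the columns of a matrix.
    # A nonzero determinant means the columns are linearly independent,
    # span R^n, and hence form a basis.
    A = np.array([[1.0, -3.0,  1.0],
                  [2.0,  1.0, -1.0],
                  [1.0,  0.0,  2.0]])
    print(np.linalg.det(A))              # nonzero here, so a basis of R^3

    # Non-square case (e.g. four vectors in R^3): determinants do not
    # apply; row reduction, i.e. the rank, decides instead.
    B = np.array([[1.0, 0.0, 2.0, 1.0],
                  [0.0, 1.0, 1.0, 3.0],
                  [1.0, 1.0, 3.0, 4.0]])
    rank = np.linalg.matrix_rank(B)
    nullity = B.shape[1] - rank          # rank B + nullity B = # of columns
    print(rank, nullity)                 # columns span R^3 iff rank == 3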
We finally studied the notion of an inner product on vector spaces (Ch. 6). You have seen examples and non-examples in R^n, and also an inner product on the vector space of mxn matrices, and the inner product on the space of continuous functions on [0,1] given by the formula

      <f,g> = integral from 0 to 1 of f(x) g(x) dx

This is an important inner product which most of you will encounter in later classes. Make sure you understand all these examples. Also make sure you can use the calculational trick of completing the square in simple examples where it needs to be determined whether a certain expression satisfies the last requirement for an inner product (Ch. 6, hwk 1b). Problems #8 and #9 in Ch. 6 are of paramount importance, but given that you may find them not so easy and that the time until the exam is short, they will not be required on exam 3. Give these two hwk problems due attention first thing after the exam.

An important algorithm is Gram-Schmidt orthogonalization. Given a basis, it allows you to convert this basis into a different one that consists of mutually orthogonal vectors. This is important because an orthogonal basis is much more convenient in practice. (A small numerical sketch of Gram-Schmidt follows at the end of this outline.) Gram-Schmidt is the last material that still goes on exam 3. Everything beyond will have to wait for inclusion in the comprehensive final.

NOTE: Often the last classes before the finals squeeze in material that doesn't get digested too well. Take the 3 lessons after exam 3 seriously. They contain highlights of the course, and stuff you are most likely to need later, in particular more about eigenvalues and possibly the QR decomposition of a matrix.
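(Again not course material, just for self-checking: a minimal sketch of the Gram-Schmidt step in Python with numpy; the function name and the example basis are made up for illustration.)

    import numpy as np

    def gram_schmidt(vectors):
        # Orthogonalize linearly independent vectors in R^n by subtracting
        # from each vector its projections onto the orthogonal vectors
        # produced so far (the textbook Gram-Schmidt step).
        orthogonal = []
        for v in vectors:
            w = np.array(v, dtype=float)
            for u in orthogonal:
                w = w - (w @ u) / (u @ u) * u   # remove the component along u
            orthogonal.append(w)
        return orthogonal

    # Example: convert a basis of R^3 into an orthogonal basis, then verify
    # that dot products of distinct vectors vanish (up to rounding).
    q1, q2, q3 = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
    print(q1 @ q2, q1 @ q3, q2 @ q3)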