MAT1341C, Winter 2017, University of Ottawa


The final exams are graded. Your calculated numerical grade appears in Blackboard and your official final letter grade will appear in uoZone. You are welcome to view your exam during my office hours (math department, KED, office 305D) on Thursday, May 4, from 1 to 3 pm and on Friday, May 5, from 9 to 11 am. If these times are not convenient for you, you can email me for an appointment (just provide a list of some times you are available). Have a great summer!

Office hours to see your exam: Tuesday, May 9 from 2 to 3 pm, or by appointment.

Daily log

(Apr 6) Some other things I could have covered in the last lecture. This term, we covered the entire textbook except for Chapters 20 and 24; this alternate lecture is about Chapter 24. Enjoy!

(Apr 6) Last class. My lecture notes: recap of the diagonalization and orthogonalization algorithms, Google PageRank, and an overview of the course plus a description of the final exam.
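
The PageRank idea can be sketched in a few lines of Python. This is a toy example of my own (a four-page web with made-up links), not code from the lecture: each page shares its score evenly among the pages it links to, and power iteration finds the steady state.

```python
# Toy 4-page web (my own example): page -> list of pages it links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n = 4
damping = 0.85  # the usual "random surfer" damping factor

# Column-stochastic link matrix: M[i][j] = chance of stepping from page j to i.
M = [[0.0] * n for _ in range(n)]
for j, outs in links.items():
    for i in outs:
        M[i][j] = 1.0 / len(outs)

# Power iteration: apply the PageRank update repeatedly to a uniform start.
rank = [1.0 / n] * n
for _ in range(100):
    rank = [(1 - damping) / n
            + damping * sum(M[i][j] * rank[j] for j in range(n))
            for i in range(n)]

print([round(r, 3) for r in rank])  # the scores sum to 1
```

Page 3, which nothing links to, ends up with the lowest score; heavily linked-to pages rise to the top.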

(Apr 4) My lecture notes: Chapter 23: Diagonalization. Why we dream of diagonalizability: applications, calculating \(A^k\). Distinguishing diagonalizable and non-diagonalizable matrices. Exercises: 23.1, 23.2, 23.3. (Preview of MAT2384: 23.4)
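
To see why we dream of diagonalizability, here is a small example of my own in Python (exact arithmetic via `fractions`): the matrix \(A\) below has eigenvalues 5 and 2 with eigenvectors \((1,1)\) and \((1,-2)\), so \(A = PDP^{-1}\) and therefore \(A^k = PD^kP^{-1}\).

```python
from fractions import Fraction as F

# My own 2x2 example: A has eigenvalues 5 and 2, eigenvectors (1,1) and (1,-2).
A = [[F(4), F(1)], [F(2), F(3)]]
P = [[F(1), F(1)], [F(1), F(-2)]]                  # eigenvectors as columns
Pinv = [[F(2, 3), F(1, 3)], [F(1, 3), F(-1, 3)]]  # inverse of P

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

k = 5
# Slow way: multiply A by itself k times.
Ak = [[F(1), F(0)], [F(0), F(1)]]
for _ in range(k):
    Ak = matmul(Ak, A)

# Fast way: A^k = P D^k P^{-1}, where D^k just powers the diagonal entries.
Dk = [[F(5) ** k, F(0)], [F(0), F(2) ** k]]
Ak_diag = matmul(matmul(P, Dk), Pinv)

print(Ak == Ak_diag)  # True
```

The payoff: \(D^k\) costs two exponentiations no matter how large \(k\) is, while repeated multiplication costs \(k\) matrix products.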

(Mar 30) My lecture notes: Chapter 22: eigenvalues and eigenvectors, characteristic polynomial, eigenspace, algebraic multiplicity. What we dream of: diagonalizability. Exercises: 22.1, 22.2.
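
For a \(2 \times 2\) matrix the characteristic polynomial works out to \(x^2 - (\mathrm{trace})x + \det\). A quick numerical check on an example matrix of my own (not one from lecture):

```python
# A = [[4, 1], [2, 3]], an example of my own.
a, b, c, d = 4, 1, 2, 3
trace = a + d        # 7
det = a * d - b * c  # 10

# Eigenvalues = roots of x^2 - 7x + 10, by the quadratic formula.
disc = trace ** 2 - 4 * det
lam1 = (trace + disc ** 0.5) / 2
lam2 = (trace - disc ** 0.5) / 2
print(lam1, lam2)  # 5.0 2.0

# Sanity check: v = (1, 1) is an eigenvector for 5, since Av = 5v.
v = (1, 1)
Av = (a * v[0] + b * v[1], c * v[0] + d * v[1])
print(Av)  # (5, 5)
```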

(Mar 28) Determinants (Chapter 21): My lecture notes: ways to compute determinants, plus shortcuts and tricks. Exercises: 21.1, 21.2, 21.3, 21.4 (and, for those who want to see cool stuff with determinants, 21.5).
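
One of the ways to compute: cofactor expansion along the first row. Here is a minimal recursive sketch in Python (my own code, not from the notes); remember that for large matrices row reduction is much faster.

```python
# Determinant by cofactor expansion along the first row (recursive).
def det(M):
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j; signs alternate along the row.
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                    # 1*4 - 2*3 = -2
print(det([[2, 0, 0], [5, 3, 0], [1, 4, 7]]))  # triangular: 2*3*7 = 42
```

The second example illustrates one of the shortcuts: the determinant of a triangular matrix is the product of its diagonal entries.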

(Mar 23) My lecture notes: an example of Gram-Schmidt, then on to complex numbers (Chapter 1). Exercises: 1.1, 1.2, 1.3, 1.4 (and 1.5 for fun).
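
The Gram-Schmidt step can also be written out in a few lines of Python (a sketch of my own, using exact `Fraction` arithmetic): subtract from each new vector its projections onto the vectors already orthogonalized.

```python
from fractions import Fraction as F

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    ortho = []
    for v in vectors:
        w = list(v)
        for u in ortho:
            c = dot(v, u) / dot(u, u)  # projection coefficient
            w = [wi - c * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    return ortho

vs = [[F(1), F(1), F(0)], [F(1), F(0), F(1)]]
q1, q2 = gram_schmidt(vs)
print(dot(q1, q2))  # 0 -- the resulting pair is orthogonal
```

Here \((1,0,1) - \frac{1}{2}(1,1,0) = (\frac{1}{2}, -\frac{1}{2}, 1)\), which you can check is orthogonal to \((1,1,0)\).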

(Mar 21) My lecture notes. We covered the rest of Chapter 19 (orthogonality). The test covers to the end of Section 19.2. Exercises: 19.2, 19.3, 19.4, 19.5, 19.6.

(Mar 16) My lecture notes. We finished up Chapters 17 and 18, and started on Chapter 19: Orthogonality. We mentioned what the "dot product" looks like in more interesting vector spaces, but for our calculations here we will focus on the usual dot product in \(R^n\). Orthogonal sets of vectors: their definition, their linear independence, and how to find coordinates with respect to an orthogonal basis. Exercises: 14.7, 16.3, 16.4, 17.1, 19.1
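
Finding coordinates with respect to an orthogonal basis needs no system-solving: the \(i\)-th coordinate is just \((v \cdot u_i)/(u_i \cdot u_i)\). A tiny Python check, with an example of my own:

```python
from fractions import Fraction as F

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

basis = [[F(1), F(1)], [F(1), F(-1)]]  # an orthogonal basis of R^2 (my example)
v = [F(3), F(5)]

# Coordinate formula for an orthogonal basis: c_i = (v . u_i) / (u_i . u_i).
coords = [dot(v, u) / dot(u, u) for u in basis]
print([int(c) for c in coords])  # [4, -1]

# Check: 4*(1,1) + (-1)*(1,-1) rebuilds v.
rebuilt = [sum(c * u[i] for c, u in zip(coords, basis)) for i in range(2)]
print(rebuilt == v)  # True
```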

(\(\pi\) day) Chapter 15: All three spaces: row space, column space and null space, with interesting properties and more algorithms. My lecture notes. Exercises: 15.1.

(Mar 9) Chapter 16: efficient row space and column space algorithms for finding bases. My lecture notes. If everything up to this point feels muddled in your head, here is where we straighten it out into a step-by-step process. Exercises: 16.1 and 16.2.

(Mar 7) Finishing Chapter 18 (matrix inverses) and then starting on the column space of a matrix. My lecture notes.

(Mar 2) Continuing in Chapter 14, then Chapter 18: matrix inverses, which we only sometimes have, but which are the next best thing to being able to divide (which we can't). My lecture notes (4.8MB). Exercises: 14.3, 18.1, 18.2 (a-d). (For fun, try 14.4.)
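
For the \(2 \times 2\) case there is a closed-form inverse: swap the diagonal entries, negate the off-diagonal ones, divide by the determinant; it exists exactly when the determinant is nonzero. A sketch in Python, with an example matrix of my own:

```python
from fractions import Fraction as F

def inverse_2x2(A):
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        return None  # not invertible -- "division" by A is impossible
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[F(2), F(1)], [F(5), F(3)]]  # det = 2*3 - 1*5 = 1
Ainv = inverse_2x2(A)
print([[int(x) for x in row] for row in Ainv])  # [[3, -1], [-5, 2]]

# Check that A * Ainv = I.
prod = [[sum(A[i][k] * Ainv[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
print(prod == [[1, 0], [0, 1]])  # True
```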

(Feb 28) My lecture notes (4.7MB). We recapped the last few weeks, just highlighting major themes, and then (after a brief diversion) we proceeded to matrix multiplication, which is built out of the dot product, and is fairly weird (and incredibly powerful). Exercises: anything you haven't done from Chapters 7 through 10, plus 14.1, 14.5, 14.6.
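
"Built out of the dot product" means: entry \((i,j)\) of \(AB\) is the dot product of row \(i\) of \(A\) with column \(j\) of \(B\). A minimal Python sketch (my own toy example), which also shows some of the weirdness: \(AB \neq BA\) in general.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Entry (i, j) of AB = (row i of A) . (column j of B).
def matmul(A, B):
    cols = list(zip(*B))  # transpose B to read off its columns
    return [[dot(row, col) for col in cols] for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]  # multiplying by B on the right swaps columns

print(matmul(A, B))  # [[2, 1], [4, 3]]
print(matmul(B, A))  # [[3, 4], [1, 2]] -- order matters!
```

Note what happened: \(AB\) is \(A\) with its columns swapped, while \(BA\) is \(A\) with its rows swapped.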

(Feb 16) My lecture notes (7.5MB). Today was about dimension, bases and coordinates, and their applications. Exercises: 10.1 (and 10.2 for fun).

(Feb 14) My lecture notes (5MB). Today was about linearly independent spanning sets, also known as bases, of vector spaces. The dimension of a vector space is the number of vectors in a basis. Exercises: 9.1 to 9.4. (9.5 and 9.6 are also fun, for real puzzle lovers)

(Feb 9) A draft of my lecture notes (5MB). Today is about how spanning and linear independence relate. We didn't cover the Theorem on Enlarging LI sets; that will be next time. Exercises: 8.1 and 8.2.

(Feb 7)' Think of span and linear independence as competing notions: to span a subspace you want lots of vectors; to be linearly independent you want few vectors. Where they balance out is going to be very interesting.

(Feb 7) More on spanning sets, and starting on linear independence. My lecture notes, part 1 and part 2. I simplified some examples during the lecture. Exercises: 7.1, 7.2, 7.3 and 7.5.

(Feb 4) Just to note: Test 2 covers to the end of Chapter 6, which we will wrap up on Tuesday, before going on to Chapter 7. The DGD on Monday Feb 6 will go over material relevant for Test 2; see Blackboard for the exercises to be discussed, together with some explanatory material.

(Feb 2) A bit from Chapter 5 (subspaces), then Chapter 6 (spans of sets). Here are Dr. Johannes Cuno's exquisite lecture notes. Exercises: 6.1 to 6.4.

(Jan 31) Some of my lecture notes. We did more of Chapter 4 (vector spaces) and also defined and did examples of subspaces, with the subspace test. A subspace is a vector space sitting inside another vector space: like a line through the origin sitting in \(R^2\). Exercises: Do as many from 5.1 to 5.4 as you can --- the only thing that makes this concept really sink in is just seeing so many examples (and non-examples) that you can picture what a subspace actually is.
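
As one worked instance of the subspace test (an example of my own, not from the notes): take \(W = \{(x, 2x) : x \in \mathbb{R}\}\), a line through the origin in \(R^2\). It is nonempty, since \((0,0) \in W\); it is closed under addition, since \((a, 2a) + (b, 2b) = (a+b, 2(a+b)) \in W\); and it is closed under scalar multiplication, since \(c(a, 2a) = (ca, 2(ca)) \in W\). So \(W\) passes the test and is a subspace. Compare with the line \(y = 2x + 1\): it misses the origin, so it fails the test immediately and is a non-example.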

(Jan 26') About adding sine waves --- which in this class, we understand means adding "vectors" in the vector space of all waveforms.

(Jan 26) Chapter 4: the BIG ONE. Vector spaces, in all their glory: mind-bending, abstract, and sometimes really weird ideas --- that are the reason that linear algebra shows up EVERYWHERE. Exercises are the only way to get used to these. Exercises: do at least one of each of the multi-part questions 4.1 through 4.11. (These are in fact old test questions; the ones with * have answers at the back.) For the mathematically curious: 4.12 through 4.14 are really mind-blowing (not hard, once you decipher them, just really, really cool and out there).

(Jan 24) Chapter 13: Applications Day! We saw how linear systems (and their solutions) can be used in the analysis of networks, in certain types of nonlinear systems, and in production-resource modeling. The point: once we see how solving the problem is equivalent to solving a system of linear equations, doing row reduction gives an answer that is convenient to work with --- but don't forget to think about real-world constraints on your parameters! Exercises: 13.2, 13.4, 13.5 (or more!)

(Jan 19) We finished the material in Chapters 11 and 12 (row reduction, RREF, different kinds of systems) and did some examples, including starting to see where linear systems show up in other contexts (besides the intersection of planes!). My last example got completely botched: not only did I make a calculation mistake (that a student caught for me, thank you!) but then when I wrote down the solution I wrote it BACKWARDS. Please delete the last 2 minutes of the lecture and get the correct RREF, and the correct solution, on your own. (You will know it is correct because plugging it into the equation for the linear combination will make it an equality.) Exercises: Go to the Linear Algebra Test Bank (a link in Blackboard) and choose Linear Systems to get lots of different kinds of examples of questions.

(Jan 17') Curious about the geometry behind the dot and cross products? Looking for more practice questions with row reduction?

(Jan 17) Chapter 11/12 (to be continued Thursday): row reduction = Gaussian elimination = solving linear systems using (augmented) matrices. This is the fundamental calculation technique we use for answering practically all our questions in this course; some choices about how we do things may seem a bit odd right now, but they will become clearer as the term progresses! Exercises: 11.1(b), 11.2(b), 12.1(b,d,f), 12.2(b,d).
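
The whole algorithm fits in a short Python sketch (my own bare-bones version, using exact `Fraction` arithmetic): find a pivot, scale its row so the pivot is 1, clear the rest of its column, repeat.

```python
from fractions import Fraction as F

def rref(M):
    M = [[F(x) for x in row] for row in M]  # exact arithmetic, no rounding
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(cols):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        for r in range(pivot_row, rows):
            if M[r][col] != 0:
                M[pivot_row], M[r] = M[r], M[pivot_row]  # swap it up
                break
        else:
            continue  # no pivot in this column
        # Scale so the pivot becomes 1, then clear the rest of the column.
        p = M[pivot_row][col]
        M[pivot_row] = [x / p for x in M[pivot_row]]
        for r in range(rows):
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return M

# Augmented matrix for the system x + y = 3, 2x - y = 0.
R = rref([[1, 1, 3], [2, -1, 0]])
print([[int(x) for x in row] for row in R])  # [[1, 0, 1], [0, 1, 2]]: x = 1, y = 2
```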

(Jan 12) Chapter 3, lines in \(\mathbb{R}^n\) and planes in \(\mathbb{R}^3\). Test 1 will include examples not done in class; go to the DGD, see the old tests and review your high school material. Next class we'll finish up Chapter 3 then start on Chapter 11 (matrices and row reduction), before coming back to Chapter 4. Exercises: 3.1(b), 3.3(a,b,d), 3.4(b,d), 3.5(b,c), 3.6(b,d,f,h), 3.7(b). For those with a mathematical inclination (or just: you like to know why things are true) 3.8, 3.9 and 3.10 are very cool.

(Jan 10) Chapter 2, review of vectors, with an eye towards defining \(\mathbb{R}^n\). Exercises: Check out the tests of years past (link below) to get into the right mindset for Thursday's class (Chapter 3). The relevant test is the diagnostic (i.e., Test 1), but we will not have questions on complex numbers (\(a+bi\)) on our Test 1. For today's class: do problems 2.3, 2.4, 2.5 (on page 23 in the textbook "Vector Spaces First" on Blackboard Learn).

(Jan 8) Our first class is Tuesday, January 10th. We will cover Chapters 2 and 3 of the textbook this week (Vector Geometry, and Lines and Planes). While this will include a brief review of your high school Vectors material, the focus will be on generalizing to \(n\) dimensions. Therefore, if you don't know what a vector is, or how to write the equation of a line or plane in two and three dimensions, please review the MAT1339 material which is posted in Blackboard Learn, and come to office hours on Thursday with your questions.

(Dec 21) The syllabus has been posted.

Course info

Additional resources

Come to office hours! I am happy to answer any and all questions.

The Math Help Centre is open virtually all day, every day, for you to drop in and get answers to your questions (see website for details).

See Blackboard Learn for links to the Linear Algebra Test Bank and the applications page Linear Algebra Close to Earth, both written by Barry Jessup and Joseph Khoury.