__Things I will remember until 3:30__

1) To get a basis for Col(A), take the pivot columns of A itself. To get a basis for Row(A), take the nonzero rows of an echelon form of A. To get a basis for Nul(A), solve Ax = 0.
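A quick sketch of fact 1 using sympy (assumed available; the matrix is made up for illustration):

```python
# Extract the three bases from a sample matrix with sympy.
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 1, 1]])

rref, pivots = A.rref()                  # pivots = indices of pivot columns
col_basis = [A.col(i) for i in pivots]   # pivot columns of the ORIGINAL A
row_basis = [rref.row(i) for i in range(len(pivots))]  # nonzero rows of the echelon form
nul_basis = A.nullspace()                # solves Ax = 0

print(col_basis, row_basis, nul_basis)
```

Note the columns come from A itself, but the rows come from the reduced form.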

2) A linear transformation satisfies T(u+v) = T(u) + T(v), and T(cu) = cT(u). A subspace is closed under addition and scalar multiplication in the same way, and MUST contain the zero vector.

3) To get the steady-state vector for a stochastic matrix A, solve (A-I)x=0, then scale the solution so its entries sum to 1.
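A minimal numpy sketch of fact 3, using a made-up column-stochastic matrix. Here the null space of A - I is found via the eigenvector for eigenvalue 1:

```python
# Steady state of a column-stochastic matrix A.
import numpy as np

A = np.array([[0.9, 0.5],
              [0.1, 0.5]])   # columns sum to 1

# Solving (A - I)x = 0 is the same as finding the eigenvector for
# eigenvalue 1; then normalize so the entries sum to 1.
w, v = np.linalg.eig(A)
x = v[:, np.argmin(np.abs(w - 1))].real
q = x / x.sum()
print(q)   # steady-state probability vector, satisfies A q = q
```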

4) To get the change-of-basis matrix P(C<-B), row reduce the huge augmented matrix [C|B] to [I|M]. M is P(C<-B).
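A numpy sketch of fact 4 (bases are made up). Row reducing [C | B] to [I | M] computes C⁻¹B, so a linear solve gives the same M:

```python
# Change-of-basis matrix P(C<-B) via a solve instead of hand row reduction.
import numpy as np

B = np.array([[1., 1.],
              [0., 1.]])    # columns are the B basis vectors
C = np.array([[1., 0.],
              [1., 1.]])    # columns are the C basis vectors

# [C | B] -> [I | M] is equivalent to M = C^{-1} B.
P_C_from_B = np.linalg.solve(C, B)
print(P_C_from_B)
```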

5) Polynomial space: P_n consists of polynomials p(x) = a_0 + (a_1)x + (a_2)x^2 + ... + (a_n)x^n.

6) Col(A) is orthogonal to Nul(A^T)

7) Row(A) is orthogonal to Nul(A)
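A quick numerical check of facts 6 and 7 on a sample matrix (sympy assumed; every dot product across the orthogonal pairs comes out zero):

```python
# Verify Row(A) ⊥ Nul(A) and Col(A) ⊥ Nul(A^T).
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])

# Row(A) ⊥ Nul(A): each row of A dots to zero with each null-space vector.
for n in A.nullspace():
    for i in range(A.rows):
        assert A.row(i).dot(n) == 0

# Col(A) ⊥ Nul(A^T): each column of A dots to zero with each left-null vector.
for n in A.T.nullspace():
    for j in range(A.cols):
        assert A.col(j).dot(n) == 0
```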

8) QR factorization of A = QR: Q = orthonormal basis for Col(A) in a matrix, R = upper triangular, Q^T * A = R
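Fact 8 in numpy (sample matrix made up; note numpy's Q may differ by signs from a hand Gram-Schmidt answer):

```python
# A = QR, so R = Q^T A.
import numpy as np

A = np.array([[1., 1.],
              [1., 0.],
              [0., 1.]])

Q, R = np.linalg.qr(A)          # Q: orthonormal columns, R: upper triangular
print(np.allclose(A, Q @ R))    # A = QR
print(np.allclose(R, Q.T @ A))  # R = Q^T A
```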

9) Gram-Schmidt:

v1 = x1

v2 = x2 - (x2 dot v1)/(v1 dot v1) * v1

v3 = x3 - (x3 dot v1)/(v1 dot v1) * v1 - (x3 dot v2)/(v2 dot v2) * v2

...
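The Gram-Schmidt recipe above, sketched in numpy (the input vectors are made up; no normalization, matching the notes):

```python
# Orthogonalize the columns x1, x2, ... of X into v1, v2, ...
import numpy as np

def gram_schmidt(X):
    """Columns of the result are the orthogonal v1, v2, ... built from
    the columns of X by subtracting projections onto earlier v's."""
    V = np.zeros_like(X, dtype=float)
    for k in range(X.shape[1]):
        v = X[:, k].astype(float)
        for j in range(k):
            vj = V[:, j]
            v = v - (X[:, k] @ vj) / (vj @ vj) * vj  # subtract proj of xk onto vj
        V[:, k] = v
    return V

X = np.array([[1., 1.],
              [1., 0.],
              [0., 1.]])
V = gram_schmidt(X)
print(V[:, 0] @ V[:, 1])   # ~0: the columns are orthogonal
```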

10) Given basis {b1,b2...} and coordinates [x]_b, to get x, calculate B*[x]_b, where B is the matrix with the basis vectors as columns

11) Given basis {b1,b2...} and vector x, to get [x]_b do [b1 b2 | I ] -> [ I | P ] and calculate P*x (here P = B^-1)
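Facts 10 and 11 together in numpy (basis and coordinates made up); the row reduction in fact 11 computes B⁻¹, so a solve does the same job:

```python
# Convert between a vector x and its B-coordinates [x]_B.
import numpy as np

B = np.array([[1., 1.],
              [0., 1.]])       # columns are b1, b2
xB = np.array([2., 3.])        # coordinates [x]_B

x = B @ xB                     # fact 10: x = B [x]_B
xB_back = np.linalg.solve(B, x)  # fact 11: [x]_B = B^{-1} x
print(x, xB_back)
```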

12) To solve a least squares problem Ax=b, solve the normal equations (A^T)Ax=(A^T)b
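Fact 12 in numpy on a made-up overdetermined system (fitting a line through three points):

```python
# Least squares via the normal equations (A^T A) x = A^T b.
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])
b = np.array([1., 2., 2.])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # least-squares solution
print(x_hat)
```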

13) To fit a least squares model, for example (b1)x + (b2)x^2 = y, construct a design matrix X = [(x) (x^2)] whose columns hold the x data values and a vector [y] of the corresponding y values. Solve (X^T)X(b) = (X^T)y. The b1, b2 values are the answers, in the b vector.
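Fact 13 sketched in numpy (the data points are made up for illustration):

```python
# Fit y ≈ b1*x + b2*x^2 by least squares.
import numpy as np

x = np.array([1., 2., 3., 4.])
y = np.array([1.5, 5.0, 10.4, 18.1])

X = np.column_stack([x, x**2])              # design matrix: columns x and x^2
beta = np.linalg.solve(X.T @ X, X.T @ y)    # normal equations (X^T X) b = X^T y
print(beta)                                 # [b1, b2]
```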