Complexity

The time complexity of an operation is just the number of multiplications it requires (we treat additions as free).

Computing the dot product of two vectors of length n is O(n).
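As a minimal sketch of why (written in Python; the function name dot is chosen just for illustration), the loop below performs exactly one multiplication per coordinate, which is where the O(n) count comes from:

    def dot(u, v):
        # u and v are length-n sequences of numbers.
        # One multiplication per coordinate, n multiplications in total: O(n).
        total = 0.0
        for ui, vi in zip(u, v):
            total += ui * vi
        return total

For example, dot([1, 2, 3], [4, 5, 6]) performs 3 multiplications and returns 32.0.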

Computing X'X, where X is an n x p matrix, requires p^2 dot products, each requiring n multiplications, so it is O(p^2 n). But note that if X is sparse, this can effectively be cheaper.
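Here is a rough pure-Python sketch that makes those p^2 dot products explicit (the function name cross_product is just illustrative; in practice you would call an optimized linear algebra routine, but the multiplication count is the same):

    def cross_product(X):
        # X is an n x p matrix stored as a list of n rows.
        # Entry (j, k) of X'X is the dot product of columns j and k of X,
        # so we do p^2 dot products of length n: O(p^2 n) multiplications.
        n, p = len(X), len(X[0])
        XtX = [[0.0] * p for _ in range(p)]
        for j in range(p):
            for k in range(p):
                XtX[j][k] = sum(X[i][j] * X[i][k] for i in range(n))
        return XtX

By symmetry only about half of those dot products are really needed, but dropping that constant factor does not change the O(p^2 n) order.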

Exactly inverting a p x p matrix requires O(p^3) operations. (We'll later learn much faster approximate methods that find pseudo-inverses using the SVD).
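As a rough empirical check (a sketch assuming NumPy is installed; exact times depend on your machine and its BLAS/LAPACK build), each doubling of p should multiply the inversion time by roughly 2^3 = 8:

    import time
    import numpy as np

    for p in (200, 400, 800):
        A = np.random.rand(p, p) + p * np.eye(p)  # a well-conditioned p x p matrix
        start = time.perf_counter()
        np.linalg.inv(A)
        print(f"p = {p}: {time.perf_counter() - start:.4f} seconds")
    # Each doubling of p should take roughly 8 times longer, consistent with O(p^3).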

Note that in "big O" notation we drop constant factors and lower-order terms, so O(n^2 + n) = O(n^2).

Note that O(n + p) cannot be simplified unless you know that n << p or p << n.
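Putting these rules together: forming X'X costs O(p^2 n) and inverting it costs O(p^3), so doing both costs O(p^2 n + p^3). If n >> p the first term dominates and this simplifies to O(p^2 n); without such an assumption it cannot be simplified further.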

For more details, see

https://www.cs.cmu.edu/~adamchik/15-121/lectures/Algorithmic%20Complexity/complexity.html

or for even more details (more than you need for this class), see: http://www.cs.dartmouth.edu/~ac/Teach/CS19-Winter06/SlidesAndNotes/CLRS-3.1.pdf
