Lecture #5 looked at the training of neural networks, in particular the details of backpropagation.
Notes are available in four formats:
The lecture covered material from Chapter 3 of the textbook, from pages 44-54.
You should go through this material in conjunction with the notes, making sure that, at a minimum, you understand why backpropagation works, even if you cannot follow the calculus behind it.
Then make sure that you can actually apply backpropagation by continuing the training of the even-parity network from the example. At the very least, run through the rest of the training data once (as the book suggests, a spreadsheet is probably the easiest way to do this).
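If you would rather script the updates than use a spreadsheet, the following is a minimal sketch of batch backpropagation on the even-parity task. The architecture (3 inputs, 8 hidden sigmoid units, 1 sigmoid output), learning rate, and initial weights here are illustrative assumptions, not the textbook's exact example, so the numbers will differ from the worked example in the notes.

```python
import numpy as np

rng = np.random.default_rng(0)

# All 3-bit inputs; target is 1 when the number of 1-bits is even.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)],
             dtype=float)
y = (X.sum(axis=1) % 2 == 0).astype(float).reshape(-1, 1)

# Hypothetical architecture: 3 inputs -> 8 hidden sigmoid units -> 1 output.
W1 = rng.normal(0, 1, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)          # hidden activations
    return h, sigmoid(h @ W2 + b2)    # network output

_, out0 = forward(X)
initial_loss = float(((out0 - y) ** 2).mean())

lr = 2.0
for epoch in range(10000):
    h, out = forward(X)
    # Backward pass: error deltas for sigmoid units under squared error.
    d_out = (out - y) * out * (1 - out)   # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # delta propagated back to hidden
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

final_loss = float(((forward(X)[1] - y) ** 2).mean())
print(f"MSE: {initial_loss:.3f} -> {final_loss:.3f}")
```

Running this should show the mean squared error falling over training; stepping through `d_out` and `d_h` for a single input is a good way to check your hand calculations.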
If you want more detail on backpropagation, read:
Rumelhart D.E., Hinton G.E. and Williams R.J. (1986). Learning Internal Representations by Error Back-propagation. In Rumelhart D.E. and McClelland J.L. (eds.), Parallel Distributed Processing, vol. 1, ch. 8, MIT Press.