Lecture #4 looked at the basics of neural networks and paved the way for seeing how learning works in them.
Notes are available in two formats:
The lecture covered material from Chapters 2 and 3 of the textbook, pages 29-32 and 37-39. If you have the textbook, go through this material in conjunction with the notes, making sure that you at least understand why gradient descent works, even if you don't follow the calculus involved.
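As a minimal illustration of why gradient descent works (a sketch for intuition, not an example from the lecture or textbook), consider minimising a simple one-dimensional function: repeatedly stepping opposite the gradient moves you downhill toward the minimum, no calculus beyond the derivative required.

```python
# Minimise f(x) = (x - 3)^2 by gradient descent.
# The derivative is f'(x) = 2 * (x - 3); stepping against it moves x toward 3.

def gradient_descent(start, lr=0.1, steps=100):
    x = start
    for _ in range(steps):
        grad = 2 * (x - 3)   # derivative of (x - 3)^2 at the current x
        x = x - lr * grad    # step downhill, opposite the gradient
    return x

x_min = gradient_descent(start=0.0)
print(round(x_min, 4))  # converges to 3.0, the minimum of f
```

The same idea scales up to neural networks: the loss plays the role of f, the weights play the role of x, and backpropagation supplies the gradient.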
Also worth reading is David Medler's Brief history of neural network approaches.