Training Neural Networks: Crash Course AI #4

Today, we’re going to talk about how the neurons in a neural network learn by having their weights adjusted, a process called backpropagation, and how we can optimize networks by finding the combination of weights that minimizes error. Then, we’ll send John Green Bot into a metaphorical jungle to find where this error is smallest, known as the global optimal solution, as opposed to where it’s only relatively small, called a local optimal solution, and we’ll discuss some strategies we can use to help neural networks find these optimized solutions more quickly.
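To make the jungle metaphor concrete, here’s a minimal sketch in Python: gradient descent walking downhill on a made-up one-dimensional error landscape. The function and the starting points are our own illustrative choices, not from the episode. The landscape has two valleys, and depending on where the search starts, it can settle into either the shallower local minimum or the deeper global minimum.

```python
# A hypothetical 1-D "error landscape" with two valleys:
# error(w) = w^4 - 3w^2 + w has a shallow local minimum near w ≈ 1.13
# and a deeper, global minimum near w ≈ -1.30.
def error(w):
    return w**4 - 3 * w**2 + w

def gradient(w):
    # Derivative of error(w); tells us which way is "downhill".
    return 4 * w**3 - 6 * w + 1

def gradient_descent(w, learning_rate=0.01, steps=1000):
    # Repeatedly take a small step against the gradient.
    for _ in range(steps):
        w -= learning_rate * gradient(w)
    return w

# Starting on different sides of the hill leads to different minima:
w_right = gradient_descent(2.0)    # settles in the local minimum (~1.13)
w_left = gradient_descent(-2.0)    # settles in the global minimum (~-1.30)
print(w_right, w_left)
```

Both runs stop where the ground is flat (the gradient is zero), but only the second one finds the lowest point overall; the first gets stuck in a smaller valley. That’s exactly the local-versus-global problem the episode explores.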

Crash Course AI is produced in association with PBS Digital Studios.