Deep learning systems learn extremely complex patterns by adjusting their weights. How exactly are the weights of a deep neural network adjusted? Through a process called backpropagation. Without backpropagation, deep neural networks couldn't carry out tasks like recognizing images or interpreting natural language. Understanding how backpropagation works is critical to understanding deep neural networks in general, so let's delve into backpropagation and see how the process is used to adjust a network's weights.
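To make the idea concrete before any theory, here is a minimal sketch of a single sigmoid neuron whose weights are adjusted by backpropagation, i.e., by following the gradient of a loss backward through the chain rule. The specific names and values (x, w, target, lr) are illustrative choices, not anything prescribed by this article.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.2])   # input features (illustrative)
w = np.array([0.1, 0.4])    # weights the network will adjust
target = 1.0                # desired output
lr = 0.1                    # learning rate

for step in range(100):
    # Forward pass: compute the neuron's output and the squared error.
    z = np.dot(w, x)
    y = sigmoid(z)
    loss = 0.5 * (y - target) ** 2

    # Backward pass: chain rule gives dLoss/dw, and each weight is
    # nudged in the direction that reduces the loss.
    dloss_dy = y - target
    dy_dz = y * (1.0 - y)
    dz_dw = x
    grad_w = dloss_dy * dy_dz * dz_dw
    w -= lr * grad_w

print("adjusted weights:", w, "final loss:", loss)
```

Real networks have many layers and millions of weights, but every update follows this same pattern: run the input forward, measure the error, and propagate gradients backward to adjust each weight.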
Backpropagation can be difficult to understand, and the calculations behind it can be quite complex. This article will endeavor to give you an intuitive understanding of backpropagation, using little in the way of complex math. However, some discussion of the math behind backpropagation is necessary.