Deep Learning Fundamentals — Part Four
This is the fourth part of the Deep Learning Fundamentals series. If you haven’t read the earlier parts yet, read them before this one; here are the links:
So far we have learned about neural networks, their structure, and how pattern complexity is a major factor in choosing the right net. You might be wondering why deep learning networks (deep nets) didn’t become popular sooner, given how well they work. One reason is that when we train them with a method called backpropagation, a problem called the “vanishing gradient” (or its counterpart, the “exploding gradient”) occurs. This makes training take much longer and produces worse results.
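To make the vanishing gradient concrete, here is a minimal sketch (not from the article) of why it happens: the sigmoid’s derivative is at most 0.25, so backpropagating through many sigmoid layers multiplies the gradient by a small factor at every step, and the gradient shrinks toward zero in the early layers. The layer sizes, weight scale, and loss gradient below are arbitrary choices for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_layers = 10

# Forward pass through 10 sigmoid layers with small random weights,
# keeping each layer's activation for the backward pass.
x = rng.standard_normal(5)
weights = [rng.standard_normal((5, 5)) * 0.5 for _ in range(n_layers)]
activations = [x]
for W in weights:
    activations.append(sigmoid(W @ activations[-1]))

# Backward pass: at each layer the chain rule multiplies the gradient
# by the sigmoid derivative a * (1 - a), which is at most 0.25,
# so the gradient norm tends to shrink layer by layer.
grad = np.ones(5)  # stand-in for the gradient of some loss at the output
norms = []
for W, a in zip(reversed(weights), reversed(activations[1:])):
    grad = W.T @ (grad * a * (1 - a))
    norms.append(np.linalg.norm(grad))
    print(f"gradient norm: {norms[-1]:.2e}")
```

Running this prints the gradient norm at each layer going backward; it typically drops by orders of magnitude by the time it reaches the first layer, which is why the early layers learn so slowly.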