Deep Learning Fundamentals — Part Four

Emad Dehnavi
Dec 29, 2024

This is the fourth part of the Deep Learning Fundamentals series. If you haven't read the previous parts, read them before this one; here are the links:

So far we have learned about neural networks, their structure, and how pattern complexity can be a big factor in choosing the right net. You might be wondering why deep learning networks (deep nets) didn't become popular sooner, since they work so well. One reason is that when we try to train them with a method called backpropagation, a problem called the "vanishing gradient" (or its opposite, the "exploding gradient") occurs. This makes training take much longer, and the results aren't as good.

Vanishing Gradient
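To make this concrete, here is a minimal NumPy sketch (my own illustration, not code from this series) of a hypothetical 10-layer sigmoid network. During the backward pass, the gradient is repeatedly multiplied by each layer's sigmoid derivative, which is never larger than 0.25, so its norm collapses as it travels toward the early layers:

```python
# A toy illustration of the vanishing gradient (a sketch, not code
# from this series): a deep stack of sigmoid layers with random weights.
import numpy as np

rng = np.random.default_rng(0)

n_layers = 10   # depth of the hypothetical network
width = 32      # units per layer

# Random weights with a common 1/sqrt(width) initialization.
weights = [rng.normal(0.0, 1.0 / np.sqrt(width), size=(width, width))
           for _ in range(n_layers)]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Forward pass, keeping each layer's activation for the backward pass.
a = rng.normal(size=width)
activations = []
for W in weights:
    a = sigmoid(W @ a)
    activations.append(a)

# Backward pass: each step multiplies the gradient by the layer's
# sigmoid derivative a * (1 - a), which is at most 0.25, so the
# gradient norm shrinks as it flows toward the earlier layers.
grad = np.ones(width)
for step, (W, a) in enumerate(zip(reversed(weights), reversed(activations))):
    grad = W.T @ (grad * a * (1.0 - a))
    print(f"layer {n_layers - step:2d} -> gradient norm {np.linalg.norm(grad):.3e}")
```

Running this, the printed norms fall by roughly an order of magnitude every couple of layers, so by the time the gradient reaches the first layers there is almost no signal left to learn from.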

