Deep Learning

MIT Introduction to Deep Learning | 6.S191: Lecture 1 (New 2020 Edition)
Foundations of Deep Learning
Lecturer: Alexander Amini
January 2020

For all lectures, slides, and lab materials: http://introtodeeplearning.com

Lecture Outline
0:00 - Introduction
4:14 - Course information
8:10 - Why deep learning?
11:01 - The perceptron
13:07 - Activation functions
15:32 - Perceptron example
18:54 - From perceptrons to neural networks
25:23 - Applying neural networks
28:16 - Loss functions
31:14 - Training and gradient descent
35:13 - Backpropagation
39:25 - Setting the learning rate
43:43 - Batched gradient descent
46:46 - Regularization: dropout and early stopping
51:58 - Summary
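The outline above runs from the perceptron and activation functions through loss functions, gradient descent, and backpropagation. As a rough illustration of those building blocks only (this is not code from the lecture or its labs; the toy OR dataset, sigmoid activation, and learning rate are assumptions made for the sketch), here is a minimal NumPy example of training a single perceptron with gradient descent:

```python
# Minimal sketch of the ideas named in the outline: a single perceptron with a
# sigmoid activation, a cross-entropy loss, and plain gradient descent.
# The dataset and hyperparameters are illustrative assumptions, not course material.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: learn the logical OR of two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights
b = 0.0                  # bias
lr = 0.5                 # learning rate (chosen arbitrarily)

for step in range(1000):
    # Forward pass: perceptron output y_hat = sigmoid(w.x + b)
    y_hat = sigmoid(X @ w + b)

    # Binary cross-entropy loss, averaged over the dataset
    loss = -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

    # Gradients w.r.t. w and b (for sigmoid + cross-entropy, d(loss)/dz
    # simplifies to (y_hat - y) per example)
    dz = (y_hat - y) / len(y)
    dw = X.T @ dz
    db = dz.sum()

    # Gradient descent update
    w -= lr * dw
    b -= lr * db

print("final loss:", loss)
print("predictions:", np.round(sigmoid(X @ w + b), 3))
```

Note that the lecture's later topics (batched gradient descent, dropout, early stopping) are refinements of this same loop applied to multi-layer networks; the sketch keeps only the core forward pass, loss, and update step.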
