Project author: ooleksyuk
Project description: Improving-Deep-Neural-Networks
Primary language: Jupyter Notebook
Project URL: git://github.com/ooleksyuk/Improving-Deep-Neural-Networks.git


The files below are the lecture slides from deeplearning.ai's "Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization" course, grouped here by week (the original listing was unordered); a short NumPy sketch of the key techniques follows each week's list.

Week 1 - Practical Aspects of Deep Learning:

02-Bias : Variance _1647692714072.pptx
03-Basic Recipe for Machine Learning _1647692719344.pdf
04-Regularization_1647692725104.pptx
06-Dropout Regularization _1647692735518.pptx
08-Other regularization methods _1647692749861.pptx
09-Normalizing inputs _1647692756796.pptx
10-Vanishing : Exploding gradients _1647692763020.pptx
12-Numerical approximation of gradients _1647692771852.pptx
13-Gradient checking _1647692776808.pptx
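
The dropout deck describes inverted dropout: units are zeroed at random during training and the survivors are rescaled by 1/keep_prob, so no rescaling is needed at test time. A minimal NumPy sketch of that idea (the function name and the keep_prob value are illustrative, not taken from the repo's notebooks):

```python
import numpy as np

def dropout_forward(a, keep_prob=0.8, train=True):
    """Inverted dropout on an activation array `a`."""
    if not train:
        return a                                  # identity at test time
    mask = np.random.rand(*a.shape) < keep_prob   # keep each unit with prob keep_prob
    return a * mask / keep_prob                   # rescale so E[output] == E[input]
```

The last two decks pair a centered-difference approximation of each partial derivative, (J(theta+eps) - J(theta-eps)) / (2*eps), with a relative-error comparison against the analytic gradient. A sketch assuming the cost J takes and returns NumPy values; the toy quadratic stands in for a real network's cost:

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-7):
    """Centered-difference estimate of dJ/dtheta, one component at a time."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus.flat[i] += eps
        minus.flat[i] -= eps
        grad.flat[i] = (J(plus) - J(minus)) / (2 * eps)
    return grad

def gradient_check(J, grad_J, theta, eps=1e-7):
    """Relative error between numerical and analytic gradients.
    Roughly 1e-7 suggests a correct backprop; 1e-3 or worse signals a bug."""
    num = numerical_gradient(J, theta, eps)
    ana = grad_J(theta)
    return np.linalg.norm(num - ana) / (np.linalg.norm(num) + np.linalg.norm(ana))

# Toy check: J(theta) = ||theta||^2 has gradient 2*theta.
theta = np.random.randn(5)
print(gradient_check(lambda t: np.sum(t**2), lambda t: 2 * t, theta))
```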

Week 2 - Optimization Algorithms:

01-Mini-batch gradient descent _1647692821171.pptx
02-Understanding mini-batch gradient descent _1647692828318.pptx
03-Exponentially weighted averages _1647692832698.pptx
04-Understanding exponentially weighted averages _1647692838580.pptx
05-Bias correction in exponentially weighted averages _1647692843039.pptx
06-Gradient descent with momentum _1647692849646.pptx
07-RMSprop_1647692855227.pptx
08-Adam optimization algorithm _1647692861582.pptx
09-Learning rate decay _1647692866670.pptx
10-The problem of local optima _1647692870816.pptx
1412.6980_1647692811216.pdf (Kingma & Ba, "Adam: A Method for Stochastic Optimization", arXiv:1412.6980)
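
Decks 03-08 build one idea in stages: an exponentially weighted average v_t = beta * v_{t-1} + (1 - beta) * x_t, its bias correction v_t / (1 - beta^t), a momentum average of gradients, an RMSprop average of squared gradients, and finally Adam, which combines both (the accompanying PDF is the Adam paper). A sketch of one Adam update; the function name is illustrative, while the hyperparameter defaults are the paper's:

```python
import numpy as np

def adam_step(grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update combining momentum- and RMSprop-style EWAs."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad**2     # second moment (RMSprop)
    m_hat = m / (1 - beta1**t)                # bias correction: fixes the
    v_hat = v / (1 - beta2**t)                # zero-initialized warm-up phase
    return lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy run: minimize J(theta) = ||theta||^2, whose gradient is 2*theta.
# Deck 09's decay would shrink lr over epochs, e.g. lr0 / (1 + decay_rate * epoch).
theta = np.random.randn(3)
m, v = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 501):
    step, m, v = adam_step(2 * theta, m, v, t, lr=0.05)
    theta -= step
print(theta)  # close to the minimum at 0
```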

Week 3 - Hyperparameter Tuning, Batch Normalization and Programming Frameworks:

01-Tuning process _1647692898707.pptx
02-Using an appropriate scale to pick hyperparameters _1647692905116.pptx
03-Hyperparameters tuning in practice- Pandas vs. Caviar _1647692916410.pptx
04-Normalizing activations in a network _1647692925157.pptx
05-Fitting Batch Norm into a neural network _1647692934448.pptx
07-Batch Norm at test time _1647692972611.pptx
08-Softmax Regression _1647693000685.pptx
09-Training a softmax classifier _1647693007347.pptx
10-Deep learning frameworks _1647693011100.pptx
11-TensorFlow_1647693020473.pptx
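
Decks 04, 05, and 07 cover batch normalization: normalize each feature over the mini-batch during training while accumulating running statistics, then reuse those running averages at test time. A sketch for a (batch, features) array; the function name, the momentum value, and the gamma/beta initializations are illustrative:

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, running_mean, running_var,
                      train=True, momentum=0.9, eps=1e-5):
    """Batch norm over axis 0 of a (batch, features) array."""
    if train:
        mu, var = x.mean(axis=0), x.var(axis=0)
        running_mean = momentum * running_mean + (1 - momentum) * mu
        running_var = momentum * running_var + (1 - momentum) * var
    else:
        mu, var = running_mean, running_var   # stored averages at test time
    x_hat = (x - mu) / np.sqrt(var + eps)     # zero mean, unit variance per feature
    return gamma * x_hat + beta, running_mean, running_var

# Toy usage: 4 examples, 2 features.
x = np.random.randn(4, 2) * 3 + 1
out, rm, rv = batchnorm_forward(x, np.ones(2), np.zeros(2), np.zeros(2), np.ones(2))
print(out.mean(axis=0), out.var(axis=0))  # roughly 0 and 1 per feature
```

Decks 08 and 09 cover softmax regression: exponentiate the logits, normalize them into class probabilities, and train with the cross-entropy loss. A sketch of the numerically stable form, which subtracts the row maximum before exponentiating (this leaves the result unchanged):

```python
import numpy as np

def softmax(z):
    """Stable softmax over the last axis of a logits array."""
    z = z - z.max(axis=-1, keepdims=True)     # shift so exp() cannot overflow
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, y):
    """Mean negative log-likelihood of integer labels y."""
    return -np.mean(np.log(probs[np.arange(len(y)), y] + 1e-12))

print(softmax(np.array([[2.0, 1.0, 0.1]])))   # ~[[0.659, 0.242, 0.099]]
```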