Project author: mmahesh

Project description:
SC-Adagrad, SC-RMSProp and RMSProp algorithms for training deep networks proposed in
Primary language: Python
Repository URL: git://github.com/mmahesh/variants-of-rmsprop-and-adagrad.git
Created: 2017-07-14T15:25:01Z
Project community: https://github.com/mmahesh/variants-of-rmsprop-and-adagrad

License: Other


Variants of RMSProp and Adagrad

Keras implementation of the SC-Adagrad, SC-RMSProp, and RMSProp algorithms proposed here.

A short version, accepted at ICML 2017, can be found here.

I wrote a blog/tutorial here describing Adagrad, RMSProp, Adam, SC-Adagrad, and SC-RMSProp in simple terms, so that it is easy to grasp the gist of each algorithm.

Usage

Suppose you have created a deep network with Keras and want to train it with the above algorithms. Copy the file “new_optimizers.py” into your repository. Then, in the file where the model is created (and compiled), add the following:

  from new_optimizers import *

  # For example, to use SC-Adagrad,
  # create the optimizer object as follows:
  sc_adagrad = SC_Adagrad()

  # similarly for SC-RMSProp and RMSProp (Ours):
  sc_rmsprop = SC_RMSProp()
  rmsprop_variant = RMSProp_variant()

Then, in the code where you compile your Keras model, set optimizer=sc_adagrad. You can do the same for the SC-RMSProp and RMSProp optimizers; a minimal end-to-end sketch is shown below.
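The following is a minimal sketch of the full workflow, assuming a toy Sequential model; the layer sizes, input dimension, loss, and metrics are illustrative assumptions and not taken from this repository. Only the import from new_optimizers.py and the optimizer classes come from the code above.

  from keras.models import Sequential
  from keras.layers import Dense
  from new_optimizers import *

  # Hypothetical toy model; layer sizes and input_dim are illustrative assumptions.
  model = Sequential()
  model.add(Dense(64, activation='relu', input_dim=20))
  model.add(Dense(10, activation='softmax'))

  # Create one of the optimizers provided in new_optimizers.py
  # (SC_RMSProp() or RMSProp_variant() can be substituted here).
  sc_adagrad = SC_Adagrad()

  # Pass the optimizer object to model.compile; loss and metrics are placeholders.
  model.compile(optimizer=sc_adagrad,
                loss='categorical_crossentropy',
                metrics=['accuracy'])

After compiling, training proceeds with the usual model.fit call on your own data.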

Overview of Algorithms