Project author: mmarouen

Project description:
Flexible deep learning neural network. Implements a multilayer perceptron and autoencoders.
Language: R
Repository: git://github.com/mmarouen/Deep-Neural-Net.git
Created: 2017-08-18T12:52:38Z
Project page: https://github.com/mmarouen/Deep-Neural-Net

License: GNU General Public License v3.0

Deep-Neural-Net

Implements:

- neural network regression
- binary and multinomial classification
- autoencoders

Several features are available for more efficient gradient descent:

  1. mini-batch GD
  2. optimization algorithms: "Adam", "RMSProp", "Momentum"
  3. Batch normalization
  4. Ridge regularization
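
The idea behind mini-batch gradient descent with a momentum-style update can be sketched in base R on a toy linear-regression problem. This is an illustration of the technique only, not the package's internal code:

```r
# Mini-batch gradient descent with momentum on a toy linear regression.
# Illustrative sketch -- not taken from the Deep-Neural-Net sources.
set.seed(42)
n <- 200
X <- cbind(1, rnorm(n))            # design matrix with an intercept column
beta_true <- c(2, -3)
y <- X %*% beta_true + rnorm(n, sd = 0.1)

w <- c(0, 0)                       # weights to learn
v <- c(0, 0)                       # momentum (velocity) buffer
rate <- 0.05; mu <- 0.9; batch <- 32

for (epoch in 1:100) {
  idx <- sample(n)                 # reshuffle the data each epoch
  for (s in seq(1, n, by = batch)) {
    b <- idx[s:min(s + batch - 1, n)]
    Xb <- X[b, , drop = FALSE]
    # gradient of the residual sum of squares on this mini-batch
    grad <- -2 * t(Xb) %*% (y[b] - Xb %*% w) / length(b)
    v <- mu * v - rate * grad      # momentum update
    w <- w + v
  }
}
round(as.vector(w), 2)             # estimates should approach c(2, -3)
```

The momentum buffer `v` accumulates past gradients, which smooths the noisy mini-batch directions; Adam and RMSProp refine the same idea with per-parameter adaptive scaling.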

Calling function: "neuralNet"

Graphics: The library also comes with a visualizer (neuralNet_visualizer sub-folder)

  1. visualize model summary (error rate, gradient check values, …)
  2. plot objective function for validation and training
  3. plot weights evolution through network layers
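
A training-versus-validation objective plot like the one the visualizer produces can be sketched in base R graphics. The curves below are synthetic placeholders, since the sketch does not call the package itself:

```r
# Sketch of an objective-per-epoch plot in the spirit of the
# neuralNet_visualizer sub-folder; the curve data here is synthetic.
epochs <- 1:50
train_obj <- exp(-epochs / 10) + 0.05                    # decaying training loss
valid_obj <- exp(-epochs / 10) + 0.10 + 0.002 * epochs   # mild overfitting

plot(epochs, train_obj, type = "l", col = "blue",
     ylim = range(c(train_obj, valid_obj)),
     xlab = "Epoch", ylab = "Objective",
     main = "Objective function: training vs validation")
lines(epochs, valid_obj, col = "red")
legend("topright", legend = c("training", "validation"),
       col = c("blue", "red"), lty = 1)
```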

Multilayer Perceptron

NN parameters are very flexible:

  1. Type = "Classification" or "Regression"
  2. DW (depth/width of the network) = vector where each value gives the number of neurons in the indexed layer
  3. loss (loss function) = "RSS" or "Deviance" (residual sum of squares or logistic loss, respectively)
  4. outputFunc (output function) = "Sigmoid", "Softmax", or "Identity"
  5. activationFunc (activation function) = "tanh", "sigmoid", or "linear" (relu on the way)
  6. rate = learning rate
  7. weightDecay = TRUE/FALSE
  8. lambda = coefficient of the weight decay
  9. a host of NN checking tools such as a gradient checker, traceObjective function, trace loss function, …
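
Putting the parameters above together, a call to `neuralNet` might look like the following. The exact signature lives in the repository, so the call itself is left commented as a hedged sketch; only the toy data set-up runs:

```r
# Toy binary-classification data for a hypothetical neuralNet() call.
set.seed(1)
X <- matrix(rnorm(100 * 2), ncol = 2)
y <- as.integer(X[, 1] + X[, 2] > 0)   # labels in {0, 1}

# source("neuralNet.R")  # load the package's functions from the repo first
# The argument names below follow the README's parameter list; the exact
# signature is an assumption -- check the repository before running.
# fit <- neuralNet(X, y,
#                  Type           = "Classification",
#                  DW             = c(5, 3),    # two hidden layers: 5 and 3 neurons
#                  loss           = "Deviance", # logistic loss
#                  outputFunc     = "Sigmoid",
#                  activationFunc = "tanh",
#                  rate           = 0.01,
#                  weightDecay    = TRUE,
#                  lambda         = 1e-4)
```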

    Autoencoder:

  1. set type = "Regression"
  2. set loss = "RSS"
  3. if the input lies in [0, 1], set the output function to "Softmax" when there is more than one output unit, or "Sigmoid" when there is exactly one
  4. other parameters are tuned identically to other neural networks
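
Following the rules above, an autoencoder setup might look like this sketch. As before, the `neuralNet` call is commented out because its exact signature is defined in the repository; only the data preparation runs:

```r
# Hedged autoencoder setup: regression type, RSS loss, sigmoid output
# for inputs scaled to [0, 1]. Argument names are assumptions from the
# README's parameter list.
set.seed(7)
X <- matrix(runif(50 * 8), ncol = 8)   # inputs already lie in [0, 1]

# fit <- neuralNet(X, X,               # target equals input for an autoencoder
#                  Type           = "Regression",
#                  DW             = c(4),       # bottleneck layer of 4 units
#                  loss           = "RSS",
#                  outputFunc     = "Sigmoid",  # inputs are in [0, 1]
#                  activationFunc = "sigmoid",
#                  rate           = 0.01)
```
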
    Remark: this implementation is intended for experimentation; its runtime is not optimized, so it is not suited to production environments.
    Please send any remarks to azzouz.marouen@gmail.com