Project author: ml4j

Project description:
Simple demos of ml4j NeuralNets...AutoEncoders...RBMs....ConvNets...DBNs....coming soon
Language: Java
Repository: git://github.com/ml4j/ml4j-neuralnets-demo.git
Created: 2017-09-29T06:38:38Z
Project community: https://github.com/ml4j/ml4j-neuralnets-demo

License: Apache License 2.0

ml4j-neuralnets-demo

Creating a supervised NeuralNetwork

    // Create a simple 2-layer SupervisedFeedForwardNeuralNetwork
    // First, construct a MatrixFactory implementation.
    MatrixFactory matrixFactory = new JBlasMatrixFactory();
    // First Layer: takes a 28 * 28 grey-scale image as input (plus bias), with every input Neuron connected to all 100 hidden Neurons.
    // This Layer applies a sigmoid non-linearity and does not use batch-norm.
    FeedForwardLayer<?, ?> firstLayer = new FullyConnectedFeedForwardLayerImpl(
        new Neurons3D(28, 28, 1, true), new Neurons(100, false),
        new SigmoidActivationFunction(), matrixFactory, false);
    // Second Layer: takes the activations of the 100 hidden Neurons as input and produces activations of the 10 softmax output Neurons.
    FeedForwardLayer<?, ?> secondLayer =
        new FullyConnectedFeedForwardLayerImpl(new Neurons(100, true),
            new Neurons(10, false), new SoftmaxActivationFunction(), matrixFactory, false);
    // Neural Network
    SupervisedFeedForwardNeuralNetwork neuralNetwork =
        new SupervisedFeedForwardNeuralNetworkImpl(firstLayer, secondLayer);
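
For orientation, the constructor arguments above imply a 784-input, 100-hidden, 10-output architecture. The lines below are purely illustrative arithmetic (not ml4j API calls) spelling out the shapes and weight counts, including the bias Neurons enabled by the boolean flags:

    // Shapes implied by the Layer constructors above (illustrative only):
    // firstLayer  : 28 * 28 = 784 input Neurons (+ bias) -> 100 hidden Neurons
    // secondLayer : 100 hidden Neurons (+ bias)          -> 10 softmax output Neurons
    int inputNeurons = 28 * 28;                        // 784 grey-scale pixels
    int firstLayerWeights = (inputNeurons + 1) * 100;  // 78,500 weights, including bias weights
    int secondLayerWeights = (100 + 1) * 10;           // 1,010 weights, including bias weights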

Training the NeuralNetwork

    double[][] trainingData = ... ; // Our training data - one row per training example.
    double[][] trainingLabels = ... ; // Our training label activations - one row per training example.
    // Create NeuronsActivation instances for the inputs and desired outputs.
    NeuronsActivation trainingDataActivations = new NeuronsActivation(matrixFactory.createMatrix(trainingData),
        NeuronsActivationFeatureOrientation.COLUMNS_SPAN_FEATURE_SET);
    NeuronsActivation desiredOutputActivations = new NeuronsActivation(matrixFactory.createMatrix(trainingLabels),
        NeuronsActivationFeatureOrientation.COLUMNS_SPAN_FEATURE_SET);
    // Create a context to train the network from Layer 0 to the final Layer.
    FeedForwardNeuralNetworkContext context =
        new FeedForwardNeuralNetworkContextImpl(matrixFactory, 0, null);
    // Configure the context to train in mini-batches of 32, to run for 100 epochs,
    // and to use a learning rate of 0.05.
    context.setTrainingMiniBatchSize(32);
    context.setTrainingEpochs(100);
    context.setTrainingLearningRate(0.05);
    // Train the NeuralNetwork
    neuralNetwork.train(trainingDataActivations, desiredOutputActivations, context);
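
The trainingData and trainingLabels arrays above are left to the reader. Below is a minimal, purely illustrative sketch of populating them with random 28 * 28 grey-scale rows and one-hot label rows matching the 10 output Neurons; in practice these would be loaded from a real dataset such as MNIST:

    // Illustrative only: random pixel data and one-hot labels shaped for the
    // 784-input / 10-output network defined above.
    int numExamples = 1000;
    java.util.Random random = new java.util.Random(123);
    double[][] trainingData = new double[numExamples][28 * 28];
    double[][] trainingLabels = new double[numExamples][10];
    for (int i = 0; i < numExamples; i++) {
        for (int j = 0; j < 28 * 28; j++) {
            trainingData[i][j] = random.nextDouble(); // grey-scale intensity in [0, 1]
        }
        trainingLabels[i][random.nextInt(10)] = 1d;   // one-hot desired output
    }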

Using the NeuralNetwork

    // Use the NeuralNetwork to obtain output activations for test set data activations.
    double[][] testSetData = ... ; // Our test set data - one row per test example.
    // Create a NeuronsActivation instance from this test set data.
    NeuronsActivation testSetDataActivations = new NeuronsActivation(matrixFactory.createMatrix(testSetData),
        NeuronsActivationFeatureOrientation.COLUMNS_SPAN_FEATURE_SET);
    // Obtain the output NeuronsActivation by forward propagating the input activations through the Network.
    NeuronsActivation outputActivations =
        neuralNetwork.forwardPropagate(testSetDataActivations, context).getOutputs();
    // Use the output activations (e.g. to classify, by taking the argmax of each row).
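
As a concrete example of the final comment above, the sketch below takes the argmax of each output row to obtain a predicted class per test example. It assumes the output activations have first been copied into a plain double[][] (one row per example); the exact NeuronsActivation accessor for doing so is not shown here:

    // Assumes outputRows holds the output activations as a double[][],
    // one row per test example and one column per output Neuron.
    double[][] outputRows = ... ; // e.g. extracted from outputActivations
    int[] predictedClasses = new int[outputRows.length];
    for (int i = 0; i < outputRows.length; i++) {
        int argMax = 0;
        for (int j = 1; j < outputRows[i].length; j++) {
            if (outputRows[i][j] > outputRows[i][argMax]) {
                argMax = j;
            }
        }
        predictedClasses[i] = argMax; // predicted class (e.g. digit) for example i
    }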