Project author: HalfInner

Project description:
MLP Approximator. Research into how the number of perceptrons influences learning quality.
Language: Python
Repository: git://github.com/HalfInner/MLPApproximator.git
Created: 2020-03-13T01:17:00Z
Project page: https://github.com/HalfInner/MLPApproximator

License: MIT License


MLPApproximator

MLP Neural Network Learning

An educational project on function approximation. The main goal is to study how
increasing the number of perceptrons in the hidden layer influences the accuracy
and quality of the approximation.

Three functions are mapped over three inputs, giving three outputs.
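The idea above can be sketched as a toy experiment. This is a minimal illustration, not the project's actual implementation: a one-hidden-layer MLP with 3 inputs and 3 outputs, trained by plain gradient descent on three made-up target functions, so that the hidden-layer width (the quantity under study) can be varied. All names and hyperparameters here are assumptions for the sketch.

```python
import numpy as np

def train_mlp(x, y, hidden=3, epochs=200, lr=0.1, seed=0):
    """x: (n, 3) inputs, y: (n, 3) targets; returns final mean-squared error."""
    rng = np.random.default_rng(seed)
    w1 = rng.normal(scale=0.5, size=(3, hidden))   # input -> hidden weights
    w2 = rng.normal(scale=0.5, size=(hidden, 3))   # hidden -> output weights
    for _ in range(epochs):
        h = np.tanh(x @ w1)                        # hidden activations
        err = h @ w2 - y                           # linear output layer error
        # Backpropagate the squared-error gradient through both layers
        g2 = h.T @ err / len(x)
        g1 = x.T @ ((err @ w2.T) * (1 - h ** 2)) / len(x)
        w2 -= lr * g2
        w1 -= lr * g1
    return float(np.mean((np.tanh(x @ w1) @ w2 - y) ** 2))

# Three toy target functions of the three inputs (82 samples, as in the
# example training run below)
x = np.random.default_rng(1).uniform(-1, 1, size=(82, 3))
y = np.column_stack([np.sin(x[:, 0]), x[:, 1] ** 2, x[:, 2]])

# Vary the hidden-layer width and compare the resulting training error
losses = {n: train_mlp(x, y, hidden=n) for n in (1, 3, 10)}
```

With more hidden perceptrons the network has more capacity, so the training error for `hidden=10` should come out below that for `hidden=1`; measuring that effect systematically is what the project automates.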

Getting Started

Installing

  • python3.7
  • numpy
  • matplotlib - optional
  1. python -m pip install -r requirements.txt
  2. python MLPApproximatorConsoleUI.py -h

Example of usage

  $> python MLPApproximatorConsoleUI.py -ds Examples/DataSetM5.txt -e 10
  Approximator: MLP Function Approximator
  Approximator: input number=3
  Approximator: output number=3
  Approximator: hidden number=3
  Approximator: Train on 82 samples
  Approximator: Epoch: 1/10
  Approximator: Epoch Time=0.0684s GlobalTime=0.0684s Loss=15.6%
  Approximator: Epoch: 2/10
  Approximator: Epoch Time=0.0408s GlobalTime=0.109s Loss=15.6%
  (...)
  Approximator: Epoch: 9/10
  Approximator: Epoch Time=0.0731s GlobalTime=0.498s Loss=15.6%
  Approximator: Epoch: 10/10
  Approximator: Epoch Time=0.057s GlobalTime=0.555s Loss=15.6%
  Approximator: Training Time=0.555s
  Approximator: Testing:
  Approximator: Loss=15.8%

Running the tests

The whole research is included in an integration test. The results are saved into the 'TestResults' folder.
It takes around 1 hour per group; 3 groups exist.

  1. python -m unittest MLPApproximatorTest.test_integration.TestIntegration

Author

  • HalfInner
License

This project is licensed under the MIT License - see the LICENSE.md file for details