Project author: atanasster

Project description: ES6 hyperparameters search for tfjs
Primary language: JavaScript
Repository: git://github.com/atanasster/hyperparameters.git
Created: 2018-05-28T18:29:33Z
Project community: https://github.com/atanasster/hyperparameters

License: Apache License 2.0


ES6 hyperparameters optimization


:warning: Early version, subject to change.

Features

  • written in javascript - use with tensorflow.js as a replacement for your Python hyperparameters library
  • use from cdn or npm - link hpjs in your HTML file from a cdn, or install it in your project with npm
  • versatile - use multiple parameters and multiple search algorithms (grid search, random search, Bayesian)

Installation

```
$ npm install hyperparameters
```

Parameter Expressions

```
import * as hpjs from 'hyperparameters';
```

hpjs.choice(options)

  • Randomly returns one of the options

hpjs.randint(upper)

  • Returns a random integer in the range [0, upper)

hpjs.uniform(low, high)

  • Returns a single value uniformly distributed between low and high, i.e. any value in that range has an equal probability of being selected

hpjs.quniform(low, high, q)

  • Returns a quantized value of hpjs.uniform, calculated as round(uniform(low, high) / q) * q

hpjs.loguniform(low, high)

  • Returns a value exp(uniform(low, high)) so the logarithm of the return value is uniformly distributed.

hpjs.qloguniform(low, high, q)

  • Returns a value round(exp(uniform(low, high)) / q) * q

hpjs.normal(mu, sigma)

  • Returns a real number that’s normally-distributed with mean mu and standard deviation sigma

hpjs.qnormal(mu, sigma, q)

  • Returns a value round(normal(mu, sigma) / q) * q

hpjs.lognormal(mu, sigma)

  • Returns a value exp(normal(mu, sigma))

hpjs.qlognormal(mu, sigma, q)

  • Returns a value round(exp(normal(mu, sigma)) / q) * q
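As a minimal sketch (not the library's implementation), the quantization rule documented for hpjs.quniform can be written out in plain JavaScript; the other q* expressions apply the same round(... / q) * q step to their underlying distributions:

```
// Sketch of round(uniform(low, high) / q) * q, using Math.random as a
// stand-in for the library's random source.
const uniform = (low, high) => low + Math.random() * (high - low);
const quniform = (low, high, q) => Math.round(uniform(low, high) / q) * q;

// quniform(50, 250, 50) can only yield 50, 100, 150, 200 or 250 - the same
// quantization used for the `epochs` parameter in the tensorflow.js examples.
console.log(quniform(50, 250, 50));
```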

Random numbers generator

```
import { RandomState } from 'hyperparameters';
```

example:

```
const rng = new RandomState(12345);
console.log(rng.randrange(0, 5, 0.5));
```
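The seed makes runs repeatable. As a hedged sketch (assumed behaviour of a seeded generator, not verified against the library), two generators built with the same seed should produce the same sequence, which is why the fmin examples below pass an rng option for reproducible searches:

```
// Identically seeded generators are expected to produce identical sequences.
const a = new RandomState(12345);
const b = new RandomState(12345);
console.log(a.randrange(0, 5, 0.5) === b.randrange(0, 5, 0.5)); // expected: true
```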

Spaces

```
import { sample } from 'hyperparameters';
```

example:

```
import * as hpjs from 'hyperparameters';

const space = {
  x: hpjs.normal(0, 2),
  y: hpjs.uniform(0, 1),
  choice: hpjs.choice([
    undefined, hpjs.uniform(0, 1),
  ]),
  array: [
    hpjs.normal(0, 2), hpjs.uniform(0, 3), hpjs.choice([false, true]),
  ],
  obj: {
    u: hpjs.uniform(0, 3),
    v: hpjs.uniform(0, 3),
    w: hpjs.uniform(-3, 0)
  }
};
console.log(hpjs.sample.randomSample(space));
```
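Every call to hpjs.sample.randomSample draws fresh values, so repeated calls will generally print different results. A small illustrative loop (assuming, as the example above suggests, that the sample mirrors the shape of the space):

```
// Draw a few independent samples from the space defined above.
for (let i = 0; i < 3; i += 1) {
  const s = hpjs.sample.randomSample(space);
  console.log(s.x, s.y, s.choice, s.obj.u);
}
```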

fmin - find the best value of a function over its arguments

```
import * as hpjs from 'hyperparameters';

const trials = hpjs.fmin(optimizationFunction, space, estimator, max_estimates, options);
```

example:

```
import * as hpjs from 'hyperparameters';

const fn = x => ((x ** 2) - (x + 1));
const space = hpjs.uniform(-5, 5);

hpjs.fmin(fn, space, hpjs.search.randomSearch, 1000, { rng: new hpjs.RandomState(123456) })
  .then(trials => console.log(trials.argmin));
```
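Since hpjs.fmin returns a promise, the same search can be written with async/await, as done in the tensorflow.js examples below:

```
// Inside an async function; equivalent to the .then form above.
const trials = await hpjs.fmin(fn, space, hpjs.search.randomSearch, 1000,
  { rng: new hpjs.RandomState(123456) });
console.log(trials.argmin);
```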

Getting started with tensorflow.js

1. include javascript file

  • include (latest) version from cdn

```
<script src="https://cdn.jsdelivr.net/npm/hyperparameters@latest/dist/hyperparameters.min.js"></script>
```
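The training snippets below also use the tf global, so TensorFlow.js itself must be loaded as well; one way to do that (an assumption here, any TensorFlow.js bundle will do) is its own CDN build:

```
<!-- TensorFlow.js is required by the training example below. -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@latest/dist/tf.min.js"></script>
```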

  • create search space

```
const space = {
  optimizer: hpjs.choice(['sgd', 'adam', 'adagrad', 'rmsprop']),
  epochs: hpjs.quniform(50, 250, 50),
};
```

  • create the tensorflow.js train function. Parameters are optimizer and epochs; input and output data are passed as the second argument

```
const trainModel = async ({ optimizer, epochs }, { xs, ys }) => {
  // Create a simple model.
  const model = tf.sequential();
  model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
  // Prepare the model for training: specify the loss and the optimizer.
  model.compile({
    loss: 'meanSquaredError',
    optimizer
  });
  // Train the model using the data.
  const h = await model.fit(xs, ys, { epochs });
  return { model, loss: h.history.loss[h.history.loss.length - 1] };
};
```

  • create optimization function

```
const modelOpt = async ({ optimizer, epochs }, { xs, ys }) => {
  const { loss } = await trainModel({ optimizer, epochs }, { xs, ys });
  return { loss, status: hpjs.STATUS_OK };
};
```

  • find optimal hyperparameters

```
const trials = await hpjs.fmin(
  modelOpt, space, hpjs.search.randomSearch, 10,
  { rng: new hpjs.RandomState(654321), xs, ys }
);
const opt = trials.argmin;
console.log('best optimizer', opt.optimizer);
console.log('best no of epochs', opt.epochs);
```

2. [install with npm](https://github.com/atanasster/hyperparameters/tree/master/examples/react-sample)

  • install hyperparameters in your package.json

```
$ npm install hyperparameters
```

  • import hyperparameters

```
import * as tf from '@tensorflow/tfjs';
import * as hpjs from 'hyperparameters';
```

  • create search space

```
const space = {
  optimizer: hpjs.choice(['sgd', 'adam', 'adagrad', 'rmsprop']),
  epochs: hpjs.quniform(50, 250, 50),
};
```

  • create the tensorflow.js train function. Parameters are optimizer and epochs; input and output data are passed as the second argument

```
const trainModel = async ({ optimizer, epochs }, { xs, ys }) => {
  // Create a simple model.
  const model = tf.sequential();
  model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
  // Prepare the model for training: specify the loss and the optimizer.
  model.compile({
    loss: 'meanSquaredError',
    optimizer
  });
  // Train the model using the data.
  const h = await model.fit(xs, ys, { epochs });
  return { model, loss: h.history.loss[h.history.loss.length - 1] };
};
```

  • create optimization function

```
const modelOpt = async ({ optimizer, epochs }, { xs, ys }) => {
  const { loss } = await trainModel({ optimizer, epochs }, { xs, ys });
  return { loss, status: hpjs.STATUS_OK };
};
```

  • find optimal hyperparameters

```
const trials = await hpjs.fmin(
  modelOpt, space, hpjs.search.randomSearch, 10,
  { rng: new hpjs.RandomState(654321), xs, ys }
);
const opt = trials.argmin;
console.log('best optimizer', opt.optimizer);
console.log('best no of epochs', opt.epochs);
```
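Both setups above assume that the training tensors xs and ys already exist. A minimal sketch of how they might be created with TensorFlow.js (the values are purely illustrative):

```
// Toy 1-D training data for the single-unit dense model above.
const xs = tf.tensor2d([-1, 0, 1, 2, 3, 4], [6, 1]);
const ys = tf.tensor2d([-3, -1, 1, 3, 5, 7], [6, 1]);
```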

License

MIT © Atanas Stoyanov & Martin Stoyanov