Background

I’ve been pretty busy this semester, but now that I’m on winter break, I thought I’d do another HPJS example. Since my last article, there have been some nice updates in the JS machine learning community. For example, TensorFlow recently released tfjs-vis, which lets you visualize info about your models. In my opinion it works really well with existing tfjs models, and I’ve used it in this example.

For this example, I’ll go over how to use HPJS to optimize some hyperparameters (in this case, the optimizer and the number of layers) in a tfjs model for the iris dataset. If you’re unfamiliar with HPJS, I’d recommend reading this article for some background. On to the example.

Example

You can run the example and view the code here. However, I’ve pasted the code below as well. If the trainModel and modelOpt functions are confusing, I’d recommend this article, where I go step by step through creating a simple example with hpjs.

This example is based on an iris example by the TensorFlow team; I’ve just added hpjs hyperparameter optimization on top of it. One thing to keep in mind: any part of the code that mentions “callbacks” (lines 29 and 62) is only for the user interface in our example and can be ignored.

As we can see in the defined search space (line 52), the hyperparameters we’re optimizing are the number of layers and the optimizer:

const space = {
  optimizer: hpjs.choice(['sgd', 'adam', 'adagrad', 'rmsprop']),
  numLayers: hpjs.quniform(2, 5, 1),
};
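For context, hpjs.quniform(2, 5, 1) draws quantized uniform values, so numLayers will come out as an integer between 2 and 5. Here’s a minimal, standalone sketch of what that sampling looks like — a hypothetical re-implementation following the hyperopt-style definition, not hpjs’s actual internals:

```javascript
// Hypothetical quantized-uniform sampler: round(uniform(low, high) / q) * q.
// This is an illustration of the distribution, not hpjs's real code.
function quniformSample(low, high, q) {
  const u = low + Math.random() * (high - low); // uniform draw in [low, high)
  return Math.round(u / q) * q;                 // snap to the nearest multiple of q
}

// With low = 2, high = 5, q = 1, each draw is an integer layer count in {2, 3, 4, 5}.
const sampledNumLayers = quniformSample(2, 5, 1);
```

So every trial gets its own integer layer count, which the optimization function then uses to build the model.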

We implement the random number of layers in the optimization function like this (line 12):

// adding a random number of layers
for (let i = 0; i < numLayers; i += 1) {
  model.add(tf.layers.dense({
    inputShape: i === 0 ? [4] : [10], // input shape is 4 for the 1st layer
    activation: i === numLayers - 1 ? 'softmax' : 'sigmoid',
    units: i === numLayers - 1 ? 3 : 10, // the last layer has 3 units
  }));
}

Since we’re passing in the random 'numLayers' variable from the search space, the for loop will run that many times.

The input shape is 4 for the first layer since we’re given four measurements for each flower (sepal length and width, and petal length and width).

For the activation function, we’re using softmax for the last layer, as in the TensorFlow.js team’s example.

The units for the last layer are set to 3 since we’re classifying the three iris species.
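To make those shapes concrete, here’s a small standalone sketch (no tfjs needed) that computes the same layer configs the loop above would pass to tf.layers.dense, for a trial where numLayers is 3:

```javascript
// Standalone illustration: the configs the loop would produce,
// mirroring the ternaries in the model-building code above.
function layerConfigs(numLayers) {
  const configs = [];
  for (let i = 0; i < numLayers; i += 1) {
    configs.push({
      inputShape: i === 0 ? [4] : [10],                        // 4 features feed the 1st layer
      activation: i === numLayers - 1 ? 'softmax' : 'sigmoid', // softmax only on the last layer
      units: i === numLayers - 1 ? 3 : 10,                     // 3 output classes
    });
  }
  return configs;
}

const configs = layerConfigs(3);
// configs[0] → { inputShape: [4],  activation: 'sigmoid', units: 10 }
// configs[2] → { inputShape: [10], activation: 'softmax', units: 3 }
```

Only the first and last layers are special-cased; every hidden layer in between is a 10-unit sigmoid layer.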

For the optimizer, we’re choosing from our list of optimizers when compiling (line 22), with a learning rate of 0.01:

model.compile({
  loss: 'categoricalCrossentropy',
  metrics: ['accuracy'],
  optimizer: optimizers[optimizer](0.01), // line 22
});

In tfjs you can set the optimizer as a string, or create an object in order to pass parameters such as the learning rate. We’re doing the latter to stay consistent with the TensorFlow team’s example. Since in tfjs you set an optimizer like “tf.train.<optimizer>”, I defined an object on line 2 that does the appropriate translation when we call it on line 22.

const optimizers = {
  sgd: tf.train.sgd,
  adagrad: tf.train.adagrad,
  adam: tf.train.adam,
  adamax: tf.train.adamax,
  rmsprop: tf.train.rmsprop,
};
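This is just a plain dispatch table: indexing the object with the sampled string gives back the matching factory function, which we then call with the learning rate. Here’s a tfjs-free sketch of the same pattern — the mock factory below is a stand-in for the real tf.train.* functions:

```javascript
// Mock factory standing in for tf.train.sgd, tf.train.adam, etc.
// Each entry maps a hyperparameter string to a function that builds an optimizer.
const makeOptimizer = name => learningRate => ({ name, learningRate });

const optimizerTable = {
  sgd: makeOptimizer('sgd'),
  adam: makeOptimizer('adam'),
  adagrad: makeOptimizer('adagrad'),
  rmsprop: makeOptimizer('rmsprop'),
};

// Same call shape as optimizers[optimizer](0.01) in the compile step:
const opt = optimizerTable['adam'](0.01);
// opt → { name: 'adam', learningRate: 0.01 }
```

Whatever string hpjs samples from the search space, the table turns it into a ready-to-use optimizer instance.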

That’s pretty much it for the iris-specific stuff.

You can now run our iris example and see that we’ve used tfjs-vis to display info like the loss, accuracy, and a confusion matrix for each trial. As far as we know, ours is the first implementation of tfjs-vis in a React app, so if you’re working on a similar project, check out our code here.