I want to plot the learning curve and loss in Chainer.

Asked 2 years ago, Updated 2 years ago, 33 views

I just implemented a neural network using the MNIST data.
I'd like to plot a learning curve like the one shown here, but I don't know how to do it.
There seems to be something for this among Chainer's extensions, but I don't know how to use it.

Also, I'd like to compare learning curves for Adam, AdaGrad, and SGD,
as well as with and without BatchNormalization (BN).

Finally,
I also want to look at the gradient values when comparing the models with and without BN.

Thank you for your cooperation!

import matplotlib.pyplot as plt
import numpy as np
import chainer
from chainer import cuda
from chainer import serializers
from chainer import functions as F
from chainer import links as L
from chainer import Variable
from chainer import optimizers
from chainer import training


train_full, test_full = chainer.datasets.get_mnist()
train = chainer.datasets.SubDataset(train_full, 0, 1000)
test = chainer.datasets.SubDataset(test_full, 0, 1000)


batchsize = 30
train_iter = chainer.iterators.SerialIterator(train, batchsize)
test_iter = chainer.iterators.SerialIterator(test, batchsize,
                                             repeat=False, shuffle=False)


class MultilayerPerceptron(chainer.Chain):

    def __init__(self, n_units, n_out):
        super(MultilayerPerceptron, self).__init__()
        with self.init_scope():
            # fully connected layers; passing None as n_in lets Chainer
            # infer the input size and create the weight matrix (n_in, n_units)
            self.l1 = L.Linear(None, n_units)  # n_in -> n_units
            self.l2 = L.Linear(None, n_units)  # n_units -> n_units
            self.l3 = L.Linear(None, n_out)    # n_units -> n_out
            self.bn = L.BatchNormalization(n_units)

    def __call__(self, x):
        h1 = F.relu(self.l1(x))
        h2 = F.relu(self.l2(h1))
        y = self.l3(h2)
        return y


class MultilayerPerceptronV2(MultilayerPerceptron):

    def __call__(self, x):
        # apply BatchNormalization before the ReLU activation
        # (note: the single self.bn link is shared between both layers)
        h1 = F.relu(self.bn(self.l1(x)))
        h2 = F.relu(self.bn(self.l2(h1)))
        y = self.l3(h2)
        return y


model = L.Classifier(MultilayerPerceptron(784, 10))
# choose an optimizer:
# AdaDelta, AdaGrad, Adam, MomentumSGD, NesterovAG, RMSprop, RMSpropGraves, SGD, SMORMS3
opt = optimizers.SGD()
# opt.setup(Link or Chain)
opt.setup(model)

# device=-1 means use the CPU
updater = training.StandardUpdater(train_iter, opt, device=-1)

epoch = 10
trainer = training.Trainer(updater, (epoch, 'epoch'), out='/tmp/result')
trainer.extend(training.extensions.Evaluator(test_iter, model, device=-1))
trainer.extend(training.extensions.LogReport(trigger=(1, 'epoch')))
trainer.extend(training.extensions.PrintReport(
    ['epoch', 'main/loss', 'main/accuracy',
     'validation/main/loss', 'validation/main/accuracy', 'elapsed_time']))
trainer.run()

python python3

2022-09-30 16:16

1 Answer

In short, PlotReport cannot display the results of multiple experiments (with different optimizers or models) in a single image.

To put it a little more concretely, you can achieve this by:

  • running multiple experiments while changing the optimizer and model (and changing the output directory each time), and
  • reading each [execution directory]/result/log and drawing the graph yourself.

In more detail: the loss information is dumped in JSON format by LogReport to a log file (typically [execution directory]/result/log).
The recorded losses can therefore be plotted with any JSON parser and plotting library.

The internal implementation of PlotReport uses matplotlib as its plotting library, and Python provides a JSON parser in the standard library's json module.

We recommend opening a separate question about how to use these individual libraries.
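To illustrate the approach described above, here is a minimal sketch that reads LogReport's JSON logs from several result directories and overlays the training-loss curves with matplotlib. The directory names and labels in the commented example are assumptions; point them at whatever `out=` paths you used for each run:

```python
import json
import matplotlib.pyplot as plt

def load_log(path):
    """Parse a LogReport JSON log: a list of dicts, one per report trigger."""
    with open(path) as f:
        log = json.load(f)
    epochs = [entry['epoch'] for entry in log]
    train_loss = [entry['main/loss'] for entry in log]
    val_loss = [entry['validation/main/loss'] for entry in log]
    return epochs, train_loss, val_loss

def plot_comparison(experiments, out='comparison.png'):
    """Overlay one training-loss curve per experiment.

    experiments: dict mapping a label (e.g. 'SGD') to a log-file path.
    """
    for label, path in experiments.items():
        epochs, train_loss, _ = load_log(path)
        plt.plot(epochs, train_loss, label=label)
    plt.xlabel('epoch')
    plt.ylabel('main/loss')
    plt.legend()
    plt.savefig(out)

# Example (paths are hypothetical -- one result directory per run):
# plot_comparison({'SGD': '/tmp/result_sgd/log',
#                  'Adam': '/tmp/result_adam/log',
#                  'AdaGrad': '/tmp/result_adagrad/log'})
```

The same functions work for the BN comparison: train `MultilayerPerceptron` and `MultilayerPerceptronV2` with different `out=` directories and pass both logs to `plot_comparison`.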


2022-09-30 16:16


