Understanding the Batch Size of SGD in tf.keras

Asked 1 year ago, Updated 1 year ago, 99 views

There is no batch size argument to SGD, so how should I specify it?

optimizer = SGD(learning_rate=0.02)

I checked the SGD source code, but I could not find any part that handles the batch size:
https://github.com/tensorflow/tensorflow/blob/v2.4.0/tensorflow/python/keras/optimizer_v2/gradient_descent.py#L30-L194

The processing there just looks like a plain gradient-descent update.

Is the batch size automatically taken from what is defined here?

train_ds = tf.data.Dataset.from_tensor_slices((x_train, y_train)).shuffle(10000).batch(32)
test_ds = tf.data.Dataset.from_tensor_slices((x_test, y_test)).batch(32)
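For context, my understanding is that `.batch(32)` does the chunking itself, before the optimizer ever sees any data. A minimal pure-Python sketch of that behavior (the `batch` helper below is hypothetical, standing in for `tf.data.Dataset.batch`; it is not the actual implementation):

```python
def batch(dataset, batch_size):
    """Yield consecutive chunks of the dataset, like Dataset.batch:
    every chunk has batch_size elements except possibly the last one."""
    for i in range(0, len(dataset), batch_size):
        yield dataset[i:i + batch_size]

samples = list(range(100))
batch_sizes = [len(b) for b in batch(samples, 32)]
print(batch_sizes)  # [32, 32, 32, 4]
```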

tensorflow keras

2022-09-30 19:15

1 Answer

As for the batch size, I think the one you defined there with .batch() is used.
If you look at the entry for tf.keras.optimizers.SGD in the TensorFlow documentation, it does not appear to take any such argument.
https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/SGD
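To see why the optimizer itself needs no batch-size argument, here is a minimal pure-Python sketch (the names `sgd_update` and `per_example_grads` are illustrative, not taken from the Keras source): the SGD update rule only receives a gradient that the loss computation has already averaged over whatever batch the data pipeline supplied.

```python
def sgd_update(weights, grads, learning_rate=0.02):
    """Plain SGD is just w <- w - lr * grad. Note there is no
    batch size anywhere in the update; batching happened earlier,
    when the mean gradient over the batch was computed."""
    return [w - learning_rate * g for w, g in zip(weights, grads)]

# A batch of 4 per-example gradients for a single weight...
per_example_grads = [1.0, 2.0, 3.0, 4.0]
mean_grad = sum(per_example_grads) / len(per_example_grads)  # 2.5

print(sgd_update([1.0], [mean_grad]))  # [0.95]
```

So changing the batch size (via Dataset.batch or the batch_size argument of model.fit) changes how the gradient is averaged, not anything inside the optimizer object.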


2022-09-30 19:15


© 2024 OneMinuteCode. All rights reserved.