Difference between steps_per_epoch and epochs in model.fit_generator
I have a doubt about the parameters steps_per_epoch and epochs in model.fit_generator.
I found that the two calls below work the same way.
model.fit_generator(generate_arrays_from_file, steps_per_epoch=10, epochs=1)
model.fit_generator(generate_arrays_from_file, steps_per_epoch=1, epochs=10)
Am I right? Or can I do something between epochs?
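One way to see what actually differs between the two calls is a pure-Python sketch of fit_generator's looping (no Keras required; infinite_batches and run are hypothetical stand-ins for the generator and the training loop, not the real Keras internals):

```python
def infinite_batches():
    """Toy stand-in for generate_arrays_from_file: yields batch ids forever."""
    i = 0
    while True:
        yield i
        i += 1

def run(gen, steps_per_epoch, epochs):
    """Mimic fit_generator's looping: draw steps_per_epoch batches per epoch.

    Returns the total number of batches consumed and the step counts at
    which each epoch boundary occurred (where callbacks would fire).
    """
    total, boundaries = 0, []
    for _ in range(epochs):
        for _ in range(steps_per_epoch):
            next(gen)          # one training step on one batch
            total += 1
        boundaries.append(total)  # epoch-end callbacks would run here
    return total, boundaries

# Both configurations consume the same 10 batches in total...
t1, b1 = run(infinite_batches(), steps_per_epoch=10, epochs=1)
t2, b2 = run(infinite_batches(), steps_per_epoch=1, epochs=10)
# ...but the first has one epoch boundary (b1 == [10]) while the
# second has ten (b2 == [1, 2, ..., 10]).
```

So the total amount of training is the same, but anything tied to epoch boundaries (callbacks, validation, logging) runs once in the first call and ten times in the second.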
Issue Analytics
- Created: 6 years ago
- Reactions: 1
- Comments: 5 (2 by maintainers)
model.fit_generator requires the input dataset generator to run indefinitely. steps_per_epoch is the number of batches drawn from the generator per epoch, so the generator yields the entire dataset once per epoch when steps_per_epoch equals the number of samples divided by the batch size, whereas epochs gives the number of times the model is trained over the entire dataset. The two calls above therefore consume the same total number of batches, but they are not interchangeable: as @ISosnovik pointed out, callbacks can be used to perform certain operations, such as TensorBoard logging (specific to the TensorFlow backend) or model checkpointing, at the end of each epoch. With callbacks, those operations are performed whenever an epoch starts or ends, so they run once in the first call and ten times in the second.
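As a sketch of how those epoch hooks fire (EpochPrinter and fit_with_callbacks are hypothetical stand-ins, not the real Keras API, though keras.callbacks.Callback does expose on_epoch_begin/on_epoch_end hooks with this shape):

```python
class EpochPrinter:
    """Hypothetical stand-in for a keras.callbacks.Callback subclass."""
    def __init__(self):
        self.log = []

    def on_epoch_begin(self, epoch, logs=None):
        self.log.append(f"begin {epoch}")

    def on_epoch_end(self, epoch, logs=None):
        # Real callbacks would checkpoint the model, log metrics, etc. here.
        self.log.append(f"end {epoch}")

def fit_with_callbacks(steps_per_epoch, epochs, callbacks):
    """Simplified fit_generator loop: hooks fire at each epoch boundary."""
    for epoch in range(epochs):
        for cb in callbacks:
            cb.on_epoch_begin(epoch)
        for _ in range(steps_per_epoch):
            pass  # one gradient step on one batch would run here
        for cb in callbacks:
            cb.on_epoch_end(epoch)

cb = EpochPrinter()
fit_with_callbacks(steps_per_epoch=1, epochs=3, callbacks=[cb])
# cb.log now holds three begin/end pairs, one per epoch.
```

With steps_per_epoch=10, epochs=1 the hooks would fire only once, which is exactly the practical difference between the two calls in the question.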