Epoch
An epoch refers to one complete pass through the full training dataset. Training a skill usually requires multiple epochs; for example, a discrete state detection skill is trained for 20 epochs. Each epoch improves the skill's ability to correctly predict the classification or detect an anomaly.
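For illustration, the minimal sketch below shows how an epoch count is typically specified when training a classifier with Keras. The model architecture, the synthetic data, and the batch size are assumptions made for this example; they are not the product's actual training configuration.

```python
import numpy as np
import tensorflow as tf

# Hypothetical training data: 1000 samples, 16 features, 3 discrete states.
x_train = np.random.rand(1000, 16).astype("float32")
y_train = np.random.randint(0, 3, size=(1000,))

# A small example classifier (assumed architecture, not the product's model).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(16,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# epochs=20: the model makes 20 complete passes over the full training
# dataset, one pass per epoch, mirroring the example above.
model.fit(x_train, y_train, epochs=20, batch_size=32)
```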