
Underfitting

When a model is too simple or insufficiently trained to capture meaningful patterns in the data.

Underfitting occurs when a model fails to learn the structure of the data well enough to perform the task. It performs poorly on both training data and new data because it never developed strong predictive ability in the first place.

This can happen when the model is too small, training ends too early, the features are poor, or the optimization setup is ineffective. In practical terms, underfitting means the system is leaving performance on the table.

Rule of thumb: If the model is bad everywhere, you may have an underfitting problem. If it is great on training data but bad elsewhere, that points to overfitting.
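That rule of thumb can be checked directly by comparing training and held-out scores. A minimal sketch, assuming scikit-learn and a synthetic nonlinear target (the sine data and all names here are illustrative, not from any particular system):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Illustrative data: a nonlinear target that a straight line cannot capture.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(2 * X).ravel() + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A deliberately low-capacity model.
model = LinearRegression().fit(X_train, y_train)

train_score = model.score(X_train, y_train)  # R^2 on training data
test_score = model.score(X_test, y_test)     # R^2 on held-out data

# Underfitting signature: both scores are poor and close together.
print(f"train R^2: {train_score:.2f}, test R^2: {test_score:.2f}")
```

If the training score were high while the test score collapsed, the same comparison would instead point to overfitting.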

How Teams Fix Underfitting

  • Increase model capacity: use a stronger architecture or more parameters
  • Train longer: allow optimization to converge further
  • Improve features or data: better signals lead to better learning
  • Tune optimization: adjust learning rate, batch size, or optimizer
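The first fix, increasing capacity, can be sketched with scikit-learn by giving the same linear model richer features. The synthetic data and the chosen polynomial degree are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Illustrative data: a nonlinear target a plain line underfits.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(2 * X).ravel() + rng.normal(scale=0.1, size=500)

# Low capacity: plain linear regression on the raw feature.
simple = LinearRegression().fit(X, y)

# Higher capacity: the same model over degree-9 polynomial features.
flexible = make_pipeline(
    PolynomialFeatures(degree=9), LinearRegression()
).fit(X, y)

print(f"linear fit R^2:     {simple.score(X, y):.2f}")
print(f"polynomial fit R^2: {flexible.score(X, y):.2f}")
```

The point is not that polynomial features are the right fix in general, but that raising capacity lets the model express structure it previously could not; the same effect can come from longer training, better features, or a tuned optimizer.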

Model development is often about navigating between underfitting and overfitting. The sweet spot is where the model is expressive enough to learn the task but still generalizes well.
