There can be several reasons why the loss increases at each epoch when training a simple linear regression model in PyTorch. Some common ones are:

  1. Learning rate: If the learning rate is too high, each update overshoots the minimum, so the loss grows instead of shrinking and the model diverges epoch after epoch. This is the most common cause, so tune the learning rate first (see the first sketch after this list).

  2. Poor initialization of weights: Initial weights far from the solution produce very large gradients at the start of training, and combined with a high learning rate the first updates can push the loss upward. (Note that plain linear regression with an MSE loss is convex, so getting stuck in a local minimum is not the issue here; bad initialization mainly slows or destabilizes convergence.)

  3. Data normalization: If the input features are not normalized before training (e.g., standardized to zero mean and unit variance), features on a large scale produce huge gradients, and gradient descent can oscillate or diverge (also shown in the first sketch below).

  4. Model complexity: A model that is too complex can overfit the training data, but that shows up as a decreasing training loss with an increasing validation loss. If the training loss itself is increasing, overfitting is unlikely to be the cause.

  5. Insufficient training: The model may not have been trained for enough epochs to reach the minimum loss; early in training the loss can also fluctuate before settling into a downward trend.

  6. Outliers: Outliers in the data produce very large squared errors that dominate the gradients, which can destabilize training and make the loss jump upward as the model chases them (see the second sketch below).
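
To make points 1 and 3 concrete, here is a minimal sketch of a linear regression training loop. The synthetic dataset, the learning rate of 0.1, and the epoch count are made-up illustrations, not values from the question:

```python
# A minimal sketch, assuming a 1-D regression task; the synthetic data,
# learning rate, and epoch count are made up for illustration.
import torch

torch.manual_seed(0)

# Synthetic data on a deliberately large scale: y = 3x + 2 plus noise.
x = torch.rand(100, 1) * 1000.0
y = 3.0 * x + 2.0 + torch.randn(100, 1)

# Point 3: standardize the inputs. Remove this line and the loss below
# blows up, because the raw feature scale makes the gradients enormous.
x = (x - x.mean()) / x.std()

model = torch.nn.Linear(1, 1)
loss_fn = torch.nn.MSELoss()

# Point 1: lr=0.1 converges here; something like lr=100 overshoots the
# minimum on every step and the printed loss grows each epoch.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(200):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = loss_fn(model(x), y)    # mean squared error on the whole set
    loss.backward()                # gradients of the loss w.r.t. W and b
    optimizer.step()               # one gradient-descent update
    if epoch % 20 == 0:
        print(f"epoch {epoch:3d}  loss {loss.item():.2f}")
```

With the settings above the printed loss falls steadily; deleting the standardization line or raising the learning rate to 100 reproduces the "loss increases every epoch" symptom.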
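
And a small sketch of point 6, again with made-up data: a single extreme target value dominates the mean squared error, while a robust alternative such as torch.nn.HuberLoss is far less affected.

```python
# A minimal sketch of point 6: one extreme outlier dominates the MSE,
# while a robust loss (Huber) clips its influence. Data is illustrative.
import torch

x = torch.linspace(0, 1, 50).unsqueeze(1)
y = 3.0 * x + 2.0
y[25] = 100.0  # inject one extreme outlier

model = torch.nn.Linear(1, 1)
pred = model(x)

mse = torch.nn.MSELoss()(pred, y)
huber = torch.nn.HuberLoss()(pred, y)  # linear (not squared) beyond delta
print(mse.item(), huber.item())  # the MSE value is dominated by the outlier
```

If removing or down-weighting outliers is not an option, switching to a robust loss like this is one common way to keep training stable.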