
What is the reason for the increasing loss in each epoch during simple linear regression in PyTorch?

asked 2021-04-24 11:00:00 +0000 by nofretete


1 Answer


answered 2021-10-20 10:00:00 +0000 by lalupa

There are several possible reasons why the loss can increase from epoch to epoch in a simple PyTorch linear regression. The most common are:

  1. Learning rate: If the learning rate is too high, each gradient step overshoots the minimum, and the loss diverges instead of converging. Lowering the learning rate is the usual fix (a runnable sketch follows this list).

  2. Poor initialization of weights: Extreme initial weights produce large gradients early in training, which can make the first updates unstable. (For plain linear regression the MSE loss is convex, so there are no bad local minima; initialization mainly affects how stable and how long training is.)

  3. Data normalization: If the inputs and targets span very different scales, the gradients become large and ill-conditioned, and a learning rate that would otherwise be safe can cause divergence. Standardizing the data before training usually helps.

  4. Model complexity: An overly complex model can overfit, in which case the validation loss rises even though the training loss keeps falling. (This is unlikely with a single linear layer, but worth checking if you measure loss on held-out data.)

  5. Insufficient training: Too few epochs leave the loss high, though this explains a loss that has not yet reached its minimum rather than one that is rising.

  6. Outliers: Squared-error loss is very sensitive to outliers, so a few extreme points can dominate the gradient and destabilize training.
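To make reason 1 concrete, here is a minimal, self-contained sketch. This is not the original poster's code; the synthetic data and the train helper are assumptions made for the example. It fits y = 3x + 2 with the same model twice, once with a safe learning rate and once with one that is too high:

    import torch

    torch.manual_seed(0)

    # Synthetic data: y = 3x + 2 plus a little noise.
    x = torch.linspace(-1, 1, 100).unsqueeze(1)
    y = 3 * x + 2 + 0.1 * torch.randn_like(x)

    def train(lr, epochs=200):
        model = torch.nn.Linear(1, 1)
        optimizer = torch.optim.SGD(model.parameters(), lr=lr)
        loss_fn = torch.nn.MSELoss()
        for _ in range(epochs):
            optimizer.zero_grad()  # forgetting this makes gradients accumulate
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
        return loss.item()

    print("lr=0.1  final loss:", train(0.1))  # shrinks toward the noise floor
    print("lr=5.0  final loss:", train(5.0))  # overshoots every step; loss explodes to inf/nan

With plain gradient descent on a mean-squared-error loss, any learning rate larger than 2 divided by the loss's largest curvature makes every step overshoot the minimum by more than the step before, which is exactly a loss that grows each epoch. The zero_grad() call in the loop also matters: omitting it makes PyTorch accumulate gradients across iterations, which can produce the same symptom.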



