"You want your training loss to be lower than your validation loss" - said Jeremy.

When you are underfitting, you can:

  • train for longer,
  • train the last bit at a lower learning rate (see the sketch after this list).
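
A minimal sketch of both remedies in fastai v1 (the library version used in the 2019 course); the `data` object and the `resnet34` architecture are placeholder assumptions, not details from the lesson:

```python
from fastai.vision import *

# assumes `data` is an ImageDataBunch built with the data block API
learn = cnn_learner(data, models.resnet34, metrics=accuracy)

# remedy 1: train for longer (more epochs)
learn.fit_one_cycle(8)

# remedy 2: finish with a few epochs at a lower learning rate
learn.unfreeze()
learn.fit_one_cycle(4, max_lr=slice(1e-5, 1e-4))
```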

If you are still underfitting after that, you will have to decrease regularization, such as weight decay or dropout, a topic Jeremy said would be covered in later lessons of the course.
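
If it comes to that, these are the main regularization knobs fastai v1 exposes on a vision learner; the values below (`wd`, `ps`) are illustrative assumptions, not numbers from the lecture:

```python
from fastai.vision import *

# decrease regularization: weaken weight decay (fastai v1 default is wd=0.01)
# and reduce dropout in the model head (default ps=0.5)
learn = cnn_learner(data, models.resnet34, metrics=accuracy,
                    wd=1e-3,   # weaker weight decay
                    ps=0.25)   # less dropout
```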

Source: Lesson 3: Deep Learning 2019 - Data blocks; Multi-label classification; Segmentation