fix bug in tuning/utils.py: add optimizer.zero_grad() before loss.backward()

This commit is contained in:
Hongji Wang
2025-10-13 20:50:29 +08:00
parent 55ba6e2825
commit 64fb49e1c8

@@ -240,6 +240,7 @@ def train(config,
     loss = criterion(stacked, targets)
     loss = (loss * masks).mean()
+    optimizer.zero_grad()
     loss.backward()
     optimizer.step()
     losses.update(loss.item(), masks.numel())
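
The fix matters because PyTorch accumulates gradients into `.grad` across successive `backward()` calls; without `zero_grad()`, each training step adds its gradients onto the previous step's. A minimal standalone sketch (not the project's `train()` function) illustrating the difference:

```python
import torch

# A single scalar parameter with a trivially differentiable loss:
# loss = 2 * w, so d(loss)/dw = 2 at every step.
w = torch.ones(1, requires_grad=True)
opt = torch.optim.SGD([w], lr=0.1)

# Without zero_grad(): gradients from both steps accumulate.
for step in range(2):
    loss = (w * 2).sum()
    loss.backward()
assert w.grad.item() == 4.0  # 2 + 2, stale gradient leaked into step 2

# With zero_grad() before backward(): each step sees only its own gradient.
w.grad = None
for step in range(2):
    opt.zero_grad()          # reset accumulated gradients, as in the fix
    loss = (w * 2).sum()
    loss.backward()
assert w.grad.item() == 2.0
```

In the patched `train()` loop the same reasoning applies: placing `optimizer.zero_grad()` before `loss.backward()` ensures `optimizer.step()` applies only the current batch's gradient.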