optimizer.zero_grad()


# zero the parameter gradients
optimizer.zero_grad()
# forward + backward + optimize
outputs = net(inputs)
loss = criterion(outputs, labels)
loss.backward()
optimizer.step()
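
For context, here is a minimal self-contained sketch of the training step this snippet comes from. The model, loss function, optimizer, and data below are illustrative assumptions, not part of the original snippet.

import torch
import torch.nn as nn

# Hypothetical setup: a tiny linear model, a loss, an optimizer, and one fake batch.
net = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)
inputs = torch.randn(4, 10)          # batch of 4 samples, 10 features each
labels = torch.randint(0, 2, (4,))   # class indices in {0, 1}

for step in range(2):
    # zero the parameter gradients left over from the previous step
    optimizer.zero_grad()
    # forward + backward + optimize
    outputs = net(inputs)
    loss = criterion(outputs, labels)
    loss.backward()
    optimizer.step()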

optimizer.zero_grad() sets the gradients to zero, i.e., it resets the derivative of the loss with respect to each weight to 0. This matters because PyTorch accumulates gradients by default: without zeroing, the gradients from the previous batch would be added to those of the current one.
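
A minimal sketch of that accumulation behavior, using a hypothetical single parameter w rather than a full model:

import torch

w = torch.tensor([1.0], requires_grad=True)

loss = (w * 2).sum()
loss.backward()
print(w.grad)   # tensor([2.]): gradient of 2*w with respect to w

loss = (w * 2).sum()
loss.backward()
print(w.grad)   # tensor([4.]): accumulated into .grad, not replaced

# Roughly what optimizer.zero_grad() does for each parameter
# (recent PyTorch versions set .grad to None by default instead).
w.grad.zero_()
print(w.grad)   # tensor([0.])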
