
Commit 53b3c25

EnayatUllah authored and facebook-github-bot committed
Fixing Ghost Clipping with Batch Memory Manager
Summary: Ghost Clipping with the Batch Memory Manager had an error that resulted in a major accuracy loss. The issue was in the `accumulate` function: the statement `p.summed_grad += p.grad` did not behave as expected, because `p.grad` is modified in every iteration. The fix is to copy the gradient and accumulate in place.

Reviewed By: HuanyuZhang

Differential Revision: D67778159

fbshipit-source-id: b103cf95905c0b1feb9745249ec1669c95c11979
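To make the aliasing failure concrete, here is a minimal standalone sketch (a hypothetical repro, not Opacus code; `grad` stands in for `p.grad` and `summed_grad` for `p.summed_grad`):

```python
import copy

import torch

# Buggy pattern: the first accumulate() call stores a reference, not a copy.
grad = torch.ones(3)
summed_grad = grad                 # alias: both names share one storage
grad.copy_(torch.full((3,), 5.0))  # next iteration rewrites p.grad in place
summed_grad += grad                # intended 1 + 5, but this is grad += grad
print(summed_grad)                 # tensor([10., 10., 10.]) -- wrong sum

# Fixed pattern: copy first, then accumulate in place.
grad = torch.ones(3)
summed_grad = copy.deepcopy(grad.data)  # independent storage
grad.copy_(torch.full((3,), 5.0))
summed_grad.add_(grad.data)
print(summed_grad)                 # tensor([6., 6., 6.]) -- correct
```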
1 parent 6b756a7 · commit 53b3c25

File tree

1 file changed: +3 −2 lines changed


opacus/optimizers/optimizer_fast_gradient_clipping.py (+3 −2)
```diff
@@ -14,6 +14,7 @@
 
 from __future__ import annotations
 
+import copy
 import logging
 from typing import Callable, Optional
 
@@ -112,9 +113,9 @@ def accumulate(self):
         """
         for p in self.params:
             if p.summed_grad is not None:
-                p.summed_grad += p.grad
+                p.summed_grad.add_(p.grad.data)
             else:
-                p.summed_grad = p.grad
+                p.summed_grad = copy.deepcopy(p.grad.data)
 
     def zero_grad(self, set_to_none: bool = False):
         """
```
