Commit f4dc430

HuanyuZhang authored and facebook-github-bot committed
Fixing the corner case when the optimizer has no trainable parameters (#619)
Summary:
Pull Request resolved: #619

We made the following changes:
1. We fixed the corner case in which the optimizer has no trainable parameters. This can happen when there is more than one optimizer and some of them are frozen during fine-tuning (see the sketch below).
2. We changed the "closure" logic in the "step" function of "ddpoptimizer.py" to make it consistent with "optimizer.py".

Differential Revision: D53055273
fbshipit-source-id: 4e8e1e6184f1c9d380da862f585bdad2d6c2bf55
1 parent d0290d7 commit f4dc430
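For context, here is a minimal sketch of the fine-tuning setup that used to trigger the corner case in change 1. The module and optimizer names are hypothetical and only plain PyTorch is used; in an Opacus run, the optimizer covering the frozen parameters would be the one wrapped as a DPOptimizer.

```python
# Hypothetical fine-tuning setup: two optimizers, one of which only covers
# frozen parameters. When such an optimizer is wrapped by Opacus, it collects
# no per-sample gradients, which previously broke pre_step().
import torch
import torch.nn as nn

backbone = nn.Linear(16, 16)  # frozen during fine-tuning
head = nn.Linear(16, 2)       # trainable

for p in backbone.parameters():
    p.requires_grad = False

# One optimizer per sub-module; the backbone optimizer has no trainable params.
opt_backbone = torch.optim.SGD(backbone.parameters(), lr=0.1)
opt_head = torch.optim.SGD(head.parameters(), lr=0.1)

x = torch.randn(8, 16)
loss = head(backbone(x)).sum()
loss.backward()

opt_head.step()      # has gradients, steps normally
opt_backbone.step()  # no gradients; after this fix, the DP-wrapped version
                     # likewise degenerates to a harmless no-op step
```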

File tree: 2 files changed (+10 lines, -1 line)

opacus/optimizers/ddpoptimizer.py

Lines changed: 5 additions & 1 deletion
@@ -70,8 +70,12 @@ def reduce_gradients(self):
     def step(
         self, closure: Optional[Callable[[], float]] = None
     ) -> Optional[torch.Tensor]:
+        if closure is not None:
+            with torch.enable_grad():
+                closure()
+
         if self.pre_step():
             self.reduce_gradients()
-            return self.original_optimizer.step(closure)
+            return self.original_optimizer.step()
         else:
             return None
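The closure handling above follows the same pattern as "optimizer.py": the closure is evaluated once under torch.enable_grad() to populate gradients, and it is not forwarded to the wrapped optimizer's step(). Below is a minimal, self-contained sketch of that pattern with a toy model and plain SGD; the privacy-specific pre_step()/reduce_gradients() calls are omitted.

```python
# Sketch of the new closure handling, outside of Opacus: run the closure once
# under torch.enable_grad() to compute gradients, then step without it.
import torch
import torch.nn as nn

model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
inputs, targets = torch.randn(8, 4), torch.randn(8, 1)


def closure() -> torch.Tensor:
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(inputs), targets)
    loss.backward()
    return loss


# Mirrors DistributedDPOptimizer.step after this change:
with torch.enable_grad():
    closure()        # gradients are computed here, exactly once
optimizer.step()     # the closure is NOT passed through to the inner step
```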

opacus/optimizers/optimizer.py

Lines changed: 5 additions & 0 deletions
@@ -491,6 +491,11 @@ def pre_step(
             closure: A closure that reevaluates the model and
                 returns the loss. Optional for most optimizers.
         """
+        # The corner case when the optimizer has no trainable parameters.
+        # Essentially, the DPOptimizer acts as a normal optimizer.
+        if self.grad_samples is None or len(self.grad_samples) == 0:
+            return True
+
         self.clip_and_accumulate()
         if self._check_skip_next_step():
             self._is_last_step_skipped = True
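A simplified, free-standing sketch of what the new guard amounts to. The real check lives inside DPOptimizer.pre_step, grad_samples is the list of per-sample gradients collected by Opacus's hooks, and the function name here is hypothetical.

```python
# Sketch of the new early return: with no per-sample gradients collected
# (no trainable parameters), the clipping/noising machinery is skipped and
# the step proceeds as for a normal optimizer.
from typing import List, Optional

import torch


def pre_step_sketch(grad_samples: Optional[List[torch.Tensor]]) -> bool:
    if grad_samples is None or len(grad_samples) == 0:
        # Nothing to clip or noise; the DPOptimizer acts as a normal optimizer.
        return True
    # ... clip_and_accumulate(), noise addition, skip-step bookkeeping ...
    return True


assert pre_step_sketch(None) is True
assert pre_step_sketch([]) is True
```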
