Summary:
Two updates:
1. GitHub page: Added a line noting that the latest version supports Fast Gradient Clipping and Ghost Clipping.
2. Website: Removed the line about passing custom alphas to the privacy accountant from the FAQ section of the website.
Differential Revision: D63790553
README.md (+6)
@@ -10,6 +10,11 @@
 [Opacus](https://opacus.ai) is a library that enables training PyTorch models with differential privacy.
 It supports training with minimal code changes required on the client, has little impact on training performance, and allows the client to online track the privacy budget expended at any given moment.
 
+## News
+**August, 2024**: The latest release supports Fast Gradient Clipping and Ghost Clipping (details in the [blogpost](https://pytorch.org/blog/clipping-in-opacus/)) to enable memory-efficient differentially private training of models. Feel free to try and share your [feedback](https://github.com/pytorch/opacus/issues).
+
+
+
 ## Target audience
 This code release is aimed at two target audiences:
 1. ML practitioners will find this to be a gentle introduction to training a model with differential privacy as it requires minimal code changes.
@@ -99,6 +104,7 @@ If you want to learn more about DP-SGD and related topics, check out our series
 - [PriCon 2020 Tutorial: Differentially Private Model Training with Opacus](https://www.youtube.com/watch?v=MWPwofiQMdE&list=PLUNOsx6Az_ZGKQd_p4StdZRFQkCBwnaY6&index=52)
 - [Differential Privacy on PyTorch | PyTorch Developer Day 2020](https://www.youtube.com/watch?v=l6fbl2CBnq0)
 - [Opacus v1.0 Highlights | PyTorch Developer Day 2021](https://www.youtube.com/watch?v=U1mszp8lzUI)
+- [Enabling Fast Gradient Clipping and Ghost Clipping in Opacus](https://pytorch.org/blog/clipping-in-opacus/)
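For context on the feature announced in the News entry above, here is a minimal sketch of how Ghost Clipping can be enabled, following the example in the linked blog post. The model, optimizer, data loader, and hyperparameter values below are illustrative placeholders and are not part of this diff.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

from opacus import PrivacyEngine

# Placeholder model, optimizer, and data; any standard PyTorch setup works.
model = nn.Sequential(nn.Linear(20, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
dataset = TensorDataset(torch.randn(256, 20), torch.randint(0, 2, (256,)))
train_loader = DataLoader(dataset, batch_size=32)
criterion = nn.CrossEntropyLoss()

# Ghost Clipping: pass the loss criterion and grad_sample_mode="ghost" so that
# per-sample gradient norms are computed without materializing per-sample
# gradients, reducing peak memory use.
privacy_engine = PrivacyEngine()
model, optimizer, criterion, train_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=train_loader,
    criterion=criterion,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
    grad_sample_mode="ghost",
)

# The training loop is unchanged; loss.backward() internally performs the
# extra pass needed to form the clipped, noised gradients.
for features, labels in train_loader:
    optimizer.zero_grad()
    loss = criterion(model(features), labels)
    loss.backward()
    optimizer.step()
```

The only differences from the usual PrivacyEngine workflow in this sketch are the criterion argument and grad_sample_mode="ghost"; see the blog post for supported layers and details.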