Commit 99a6fac

EnayatUllah authored and facebook-github-bot committed
Website and Github update (#677)
Summary: Two updates:
1. GitHub page: added a line noting that the latest version supports fast gradient and ghost clipping.
2. Website: removed the line about passing in custom alphas to the privacy accountant from the FAQ section of the website.

Differential Revision: D63790553
1 parent 53b3c25 commit 99a6fac

File tree

4 files changed: +21 −15 lines changed

README.md (+2)

@@ -10,6 +10,7 @@
 [Opacus](https://opacus.ai) is a library that enables training PyTorch models with differential privacy.
 It supports training with minimal code changes required on the client, has little impact on training performance, and allows the client to online track the privacy budget expended at any given moment.
 
+[added line not captured on this page; per the commit summary, it notes that the latest version supports fast gradient and ghost clipping]
 ## Target audience
 This code release is aimed at two target audiences:
 1. ML practitioners will find this to be a gentle introduction to training a model with differential privacy as it requires minimal code changes.

@@ -99,6 +100,7 @@ If you want to learn more about DP-SGD and related topics, check out our series
 - [PriCon 2020 Tutorial: Differentially Private Model Training with Opacus](https://www.youtube.com/watch?v=MWPwofiQMdE&list=PLUNOsx6Az_ZGKQd_p4StdZRFQkCBwnaY6&index=52)
 - [Differential Privacy on PyTorch | PyTorch Developer Day 2020](https://www.youtube.com/watch?v=l6fbl2CBnq0)
 - [Opacus v1.0 Highlights | PyTorch Developer Day 2021](https://www.youtube.com/watch?v=U1mszp8lzUI)
+- [Enabling Fast Gradient Clipping and Ghost Clipping in Opacus](https://pytorch.org/blog/clipping-in-opacus/)
 
 
 ## FAQ
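The "minimal code changes" the README promises come down to one call that wraps the model, optimizer, and data loader. A minimal sketch of that workflow, assuming the standard Opacus 1.x `PrivacyEngine.make_private` API; the model and data below are toy stand-ins:

```python
# Toy illustration of the Opacus workflow the README describes:
# wrap an ordinary PyTorch model/optimizer/loader in a PrivacyEngine.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

model = nn.Linear(16, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
data_loader = DataLoader(
    TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,))),
    batch_size=8,
)

privacy_engine = PrivacyEngine()
model, optimizer, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    noise_multiplier=1.1,  # scale of Gaussian noise added to clipped gradients
    max_grad_norm=1.0,     # per-sample gradient clipping threshold
)
# From here, the training loop is unchanged from plain PyTorch.
```

Per the blog post linked in the new bullet, the fast gradient clipping / ghost clipping support this commit advertises is enabled through an extra argument to the same call (the post describes a `grad_sample_mode="ghost"` option).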

docs/faq.md (+2 −1)

@@ -108,7 +108,8 @@ Opacus computes and stores *per-sample* gradients under the hood. What this mean
 
 Although we report expended privacy budget using the (epsilon, delta) language, internally, we track it using Rényi Differential Privacy (RDP) [[Mironov 2017](https://arxiv.org/abs/1702.07476), [Mironov et al. 2019](https://arxiv.org/abs/1908.10530)]. In short, (alpha, epsilon)-RDP bounds the [Rényi divergence](https://en.wikipedia.org/wiki/R%C3%A9nyi_entropy#R%C3%A9nyi_divergence) of order alpha between the distribution of the mechanism’s outputs on any two datasets that differ in a single element. An (alpha, epsilon)-RDP statement is a relaxation of epsilon-DP but retains many of its important properties that make RDP particularly well-suited for privacy analysis of DP-SGD. The `alphas` parameter instructs the privacy engine what RDP orders to use for tracking privacy expenditure.
 
-When the privacy engine needs to bound the privacy loss of a training run using (epsilon, delta)-DP for a given delta, it searches for the optimal order from among `alphas`. There’s very little additional cost in expanding the list of orders. We suggest using a list `[1 + x / 10.0 for x in range(1, 100)] + list(range(12, 64))`. You can pass your own alphas by passing `alphas=custom_alphas` when calling `privacy_engine.make_private_with_epsilon`.
+When the privacy engine needs to bound the privacy loss of a training run using (epsilon, delta)-DP for a given delta, it searches for the optimal order from among `alphas`. There’s very little additional cost in expanding the list of orders. We suggest using a list `[1 + x / 10.0 for x in range(1, 100)] + list(range(12, 64))`.
+<!-- You can pass your own alphas by passing `alphas=custom_alphas` when calling `privacy_engine.make_private_with_epsilon`. -->
 
 A call to `privacy_engine.get_epsilon(delta=delta)` returns a pair: an epsilon such that the training run satisfies (epsilon, delta)-DP and an optimal order alpha. An easy diagnostic to determine whether the list of `alphas` ought to be expanded is whether the returned value alpha is one of the two boundary values of `alphas`.
 
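The alphas search and the boundary diagnostic described in this FAQ answer can be exercised directly. A minimal sketch, assuming Opacus 1.x's `RDPAccountant`; the noise multiplier, sample rate, and step count are illustrative only:

```python
# Sketch: track RDP over a grid of orders and apply the boundary
# diagnostic from the FAQ. Training parameters below are illustrative.
from opacus.accountants import RDPAccountant

# The list of orders suggested in the FAQ above.
alphas = [1 + x / 10.0 for x in range(1, 100)] + list(range(12, 64))

accountant = RDPAccountant()
for _ in range(1000):  # pretend we ran 1000 DP-SGD steps
    accountant.step(noise_multiplier=1.1, sample_rate=0.01)

# Returns (epsilon, optimal alpha) for the requested delta.
eps, best_alpha = accountant.get_privacy_spent(delta=1e-5, alphas=alphas)
print(f"(eps={eps:.2f}, delta=1e-5)-DP at optimal order alpha={best_alpha}")

# FAQ diagnostic: an optimal alpha on the boundary of the grid means
# the list of orders should probably be expanded.
if best_alpha in (min(alphas), max(alphas)):
    print("Consider expanding the list of alphas.")
```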

website/package.json (+14 −12)

@@ -9,21 +9,23 @@
     "rename-version": "docusaurus-rename-version"
   },
   "devDependencies": {
-    "docusaurus": "^1.9.0"
+    "docusaurus": "^1.14.7"
   },
   "dependencies": {
-    "prismjs": "^1.23.0",
-    "bl": "^1.2.3"
+    "@babel/helper-compilation-targets": "^8.0.0-alpha.14",
+    "bl": "^5.0.0",
+    "browserslist": "^4.21.4",
+    "prismjs": "^1.29.0"
   },
   "resolutions": {
-    "trim-newlines": "3.0.1",
-    "normalize-url": "4.5.1",
-    "highlight.js" : "10.5.0",
-    "react-dev-utils": "11.0.4",
-    "immer": "8.0.1",
-    "prismjs": "1.23.0",
-    "bl": "1.2.3",
-    "glob-parent": "5.1.2",
-    "browserslist": "4.16.5"
+    "trim-newlines": "^4.0.2",
+    "normalize-url": "^6.1.0",
+    "highlight.js": "^11.8.0",
+    "react-dev-utils": "^12.0.0",
+    "immer": "^10.0.0",
+    "prismjs": "^1.29.0",
+    "bl": "^5.0.0",
+    "glob-parent": "^6.0.2",
+    "browserslist": "^4.21.4"
   }
 }

website/scripts/build_website.sh (+3 −2)

@@ -61,7 +61,7 @@ cd ..
 mkdir -p "website/pages/api/"
 
 cwd=$(pwd)
-python website/scripts/parse_sphinx.py -i "${cwd}/website/sphinx/build/html/" -o "${cwd}/website/pages/api/" || exit 1
+# python3 website/scripts/parse_sphinx.py -i "${cwd}/website/sphinx/build/html/" -o "${cwd}/website/pages/api/" || exit 1
 
 SPHINX_JS_DIR='website/sphinx/build/html/_static/'
 DOCUSAURUS_JS_DIR='website/static/js/'

@@ -87,7 +87,8 @@ echo "Generating tutorials"
 echo "-----------------------------------"
 mkdir -p "website/static/files"
 mkdir "website/tutorials"
-python website/scripts/parse_tutorials.py -w "${cwd}" || exit 1
+pip install --upgrade nbconvert
+python3 website/scripts/parse_tutorials.py -w "${cwd}" || exit 1
 
 cd website || exit
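The new `pip install --upgrade nbconvert` step suggests that `parse_tutorials.py` relies on nbconvert to render the tutorial notebooks for the website. A hedged sketch of that kind of notebook-to-HTML conversion; the file paths are hypothetical, not taken from the script:

```python
# Hypothetical illustration of a notebook -> HTML conversion via nbconvert,
# the library the build script now installs. Paths are made up.
import nbformat
from nbconvert import HTMLExporter

nb = nbformat.read("tutorials/example_tutorial.ipynb", as_version=4)

body, _resources = HTMLExporter().from_notebook_node(nb)

with open("website/tutorials/example_tutorial.html", "w", encoding="utf-8") as f:
    f.write(body)
```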
