1 parent 2d8ad4d commit 6ab013e
modelopt/torch/quantization/algorithms.py
@@ -789,6 +789,7 @@ def register_custom_support(
         """(Optional) Register custom support for `AutoQuantize` score estimation.

         This custom support is used to enable memory/compute efficient backward gradient propagation. This involves:
+
         - `grad_ckpt_context`: backward pass with gradient checkpointing enabled
         - `is_param_grad_enabled`: AutoQuantize only needs activation gradients to be computed (not weight
           gradients). `is_param_grad_enabled` is used to select which parameters should have gradients enabled,
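For context, a minimal sketch of what the two hooks described in this docstring might look like. The names `grad_ckpt_context` and `is_param_grad_enabled` come from the diff above; their signatures, the predicate's arguments, and the usage pattern below are assumptions for illustration, not the confirmed `register_custom_support` API.

import contextlib

import torch
import torch.nn as nn


@contextlib.contextmanager
def grad_ckpt_context(model: nn.Module):
    # Hypothetical hook: run the backward pass with gradient checkpointing
    # enabled so intermediate activations are recomputed instead of stored.
    # A real hook would toggle the model's own checkpointing mechanism here
    # (assumed, model-specific).
    try:
        yield
    finally:
        pass  # restore the previous checkpointing state


def is_param_grad_enabled(name: str, model: nn.Module) -> bool:
    # Hypothetical predicate: AutoQuantize only needs activation gradients,
    # so weight gradients can stay disabled for every parameter.
    return False


# Usage sketch: disable weight grads via the predicate, then run backward
# under the checkpointing context. Activation gradients still propagate.
model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 1))
for name, param in model.named_parameters():
    param.requires_grad_(is_param_grad_enabled(name, model))

x = torch.randn(2, 8, requires_grad=True)
with grad_ckpt_context(model):
    model(x).sum().backward()
print(x.grad.shape)  # torch.Size([2, 8]); no weight grads were computed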