Commit 872c7ec

Two small fixes related to the checkpoint loader (#36)
- Checkpoint loader test: we no longer need to download `config.json`,
  since loading now happens in place.
- Export the checkpoint loader constructor from the `models` module.
danieldk authored Apr 16, 2024
1 parent 1b2e8a2 commit 872c7ec
Showing 2 changed files with 1 addition and 1 deletion.
spacy_curated_transformers/models/__init__.py: 1 addition & 0 deletions
@@ -6,6 +6,7 @@
     build_camembert_transformer_model_v1,
     build_camembert_transformer_model_v2,
     build_pytorch_checkpoint_loader_v1,
+    build_pytorch_checkpoint_loader_v2,
     build_roberta_transformer_model_v1,
     build_roberta_transformer_model_v2,
     build_xlmr_transformer_model_v1,
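
With `build_pytorch_checkpoint_loader_v2` re-exported here, downstream code can import it directly from the `models` package. A minimal sketch, not taken from the repository; the `path` keyword argument mirrors the v1 loader's signature and is an assumption:

from pathlib import Path

# This import is what the change above enables; the call signature is an
# assumption based on the v1 loader and may differ for v2.
from spacy_curated_transformers.models import build_pytorch_checkpoint_loader_v2

encoder_loader = build_pytorch_checkpoint_loader_v2(path=Path("pytorch_model.bin"))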
Checkpoint loader test (second changed file): 0 additions & 1 deletion
@@ -169,7 +169,6 @@ def test_pytorch_checkpoint_loader(test_config):
 
     checkpoint_path = hf_hub_download(repo_id=model_name, filename="pytorch_model.bin")
     # Curated Transformers needs the config to get the model hyperparameters.
-    hf_hub_download(repo_id=model_name, filename="config.json")
     with_spans = build_with_strided_spans_v1(stride=96, window=128)
     model = model_factory(
         piece_encoder=piece_encoder, vocab_size=vocab_size, with_spans=with_spans
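
The test-side fix reflects that the checkpoint loader no longer needs `config.json` for hyperparameters, since loading now happens in place; only the checkpoint file itself has to be fetched. A hedged sketch of that setup, with a placeholder repo id and the same assumed `path` argument as above:

from pathlib import Path

from huggingface_hub import hf_hub_download
from spacy_curated_transformers.models import build_pytorch_checkpoint_loader_v2

model_name = "some-org/some-test-model"  # placeholder, not a real repo id

# Only the weights are downloaded; per the commit message, config.json is no
# longer required because loading happens in place.
checkpoint_path = hf_hub_download(repo_id=model_name, filename="pytorch_model.bin")
encoder_loader = build_pytorch_checkpoint_loader_v2(path=Path(checkpoint_path))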
