Commit dbff3be
[LLM] Transformers int4 example small typo fixes (intel#8550)
Parent: 8e5608b

3 files changed, +5 -4 lines

python/llm/example/transformers/transformers_int4/falcon/README.md (+2 -2)
@@ -29,8 +29,8 @@ from huggingface_hub import snapshot_download
 
 # for tiiuae/falcon-7b-instruct
 model_path = snapshot_download(repo_id='tiiuae/falcon-7b-instruct',
-revision="c7f670a03d987254220f343c6b026ea0c5147185",
-cache_dir="dir/path/where/model/files/are/downloaded")
+revision="c7f670a03d987254220f343c6b026ea0c5147185",
+cache_dir="dir/path/where/model/files/are/downloaded")
 print(f'tiiuae/falcon-7b-instruct checkpoint is downloaded to {model_path}')
 
 # for tiiuae/falcon-40b-instruct
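The falcon README hunk above pins a specific `revision` in its `huggingface_hub.snapshot_download` call, which keeps the downloaded checkpoint reproducible even if the repository on the Hub later changes. A minimal sketch of assembling those arguments, assuming the repo id and revision hash shown in the diff (the `pinned_snapshot_kwargs` helper is hypothetical, and the actual download call needs network access, so it is left commented out):

```python
# Hypothetical helper: collect the argument set the README passes to
# snapshot_download(); values are the ones shown in the diff above.
def pinned_snapshot_kwargs(repo_id, revision, cache_dir):
    """Return keyword arguments for a reproducible, revision-pinned download."""
    return {"repo_id": repo_id, "revision": revision, "cache_dir": cache_dir}

kwargs = pinned_snapshot_kwargs(
    repo_id="tiiuae/falcon-7b-instruct",
    revision="c7f670a03d987254220f343c6b026ea0c5147185",
    cache_dir="dir/path/where/model/files/are/downloaded",
)

# With huggingface_hub installed and network access available:
# from huggingface_hub import snapshot_download
# model_path = snapshot_download(**kwargs)
```

Pinning a revision matters most for models that ship custom modeling code, since an unpinned download always tracks the repo's latest commit.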

python/llm/example/transformers/transformers_int4/mpt/README.md (+2 -1)
@@ -22,7 +22,7 @@ python ./generate.py --repo-id-or-model-path REPO_ID_OR_MODEL_PATH --prompt PROM
 ```
 
 Arguments info:
-- `--repo-id-or-model-path REPO_ID_OR_MODEL_PATH`: argument defining the huggingface repo id for the MPT model (e.g. `mosaicml/mpt-7b-chat` and `mosaicml/mpt-7b-chat`) to be downloaded, or the path to the huggingface checkpoint folder. It is default to be `'mosaicml/mpt-7b-chat'`.
+- `--repo-id-or-model-path REPO_ID_OR_MODEL_PATH`: argument defining the huggingface repo id for the MPT model (e.g. `mosaicml/mpt-7b-chat` and `mosaicml/mpt-30b-chat`) to be downloaded, or the path to the huggingface checkpoint folder. It is default to be `'mosaicml/mpt-7b-chat'`.
 - `--prompt PROMPT`: argument defining the prompt to be infered (with integrated prompt format for chat). It is default to be `'What is AI?'`.
 - `--n-predict N_PREDICT`: argument defining the max number of tokens to predict. It is default to be `32`.
 

@@ -67,6 +67,7 @@ AI, or artificial intelligence, is the simulation of human intelligence in machi
 
 #### [mosaicml/mpt-30b-chat](https://huggingface.co/mosaicml/mpt-30b-chat)
 ```log
+Inference time: xxxx s
 -------------------- Prompt --------------------
 <|im_start|>user
 What is AI?<|im_end|>
python/llm/example/transformers/transformers_int4/mpt/generate.py (+1 -1)
@@ -29,7 +29,7 @@
 parser = argparse.ArgumentParser(description='Predict Tokens using `generate()` API for MPT model')
 parser.add_argument('--repo-id-or-model-path', type=str, default="mosaicml/mpt-7b-chat",
                     help='The huggingface repo id for the MPT models'
-                         '(e.g. `mosaicml/mpt-7b-chat` and `mosaicml/mpt-7b-chat`) to be downloaded'
+                         '(e.g. `mosaicml/mpt-7b-chat` and `mosaicml/mpt-30b-chat`) to be downloaded'
                          ', or the path to the huggingface checkpoint folder')
 parser.add_argument('--prompt', type=str, default="What is AI?",
                     help='Prompt to infer')
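The `generate.py` hunk above only touches a help string, but the surrounding argparse setup can be sketched in self-contained form. The `--repo-id-or-model-path` and `--prompt` defaults come from the diff itself; the `--n-predict` default of 32 is taken from the README's arguments list:

```python
import argparse

# Sketch of the parser shown in the diff; defaults mirror the README's
# arguments list. argparse maps dashed flags to underscored attributes
# (e.g. --repo-id-or-model-path -> args.repo_id_or_model_path).
parser = argparse.ArgumentParser(
    description='Predict Tokens using `generate()` API for MPT model')
parser.add_argument('--repo-id-or-model-path', type=str,
                    default="mosaicml/mpt-7b-chat",
                    help='The huggingface repo id for the MPT model '
                         '(e.g. `mosaicml/mpt-7b-chat` and `mosaicml/mpt-30b-chat`) '
                         'to be downloaded, or the path to the huggingface '
                         'checkpoint folder')
parser.add_argument('--prompt', type=str, default="What is AI?",
                    help='Prompt to infer')
parser.add_argument('--n-predict', type=int, default=32,
                    help='Max number of tokens to predict')

# Parsing an empty argument list yields the documented defaults.
args = parser.parse_args([])
print(args.repo_id_or_model_path)  # mosaicml/mpt-7b-chat
```

Parsing `[]` instead of `sys.argv` makes the defaults easy to verify without running the full example.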
