
raise get_last_ffi_error() #17565

Open
zmtttt opened this issue Dec 20, 2024 · 0 comments

Labels: needs-triage (PRs or issues that need to be investigated by maintainers to find the right assignees), type:ci (Relates to TVM CI infrastructure)

Comments


zmtttt commented Dec 20, 2024

I ran the llama engine with the following command (the prompt asks "Which is the second-highest mountain in the world?"):

python3 ../run.py \
    --input_text "世界上第二高的山峰是哪座?" \
    --max_output_len=50 \
    --log_level info \
    --tokenizer_dir ./Llama-2-7b-chat-hf \
    --engine_dir ./tmp/Llama-2-7b-chat-hf/trt_engines/fp16/1-npu/

but it fails with the error below. Has anyone hit the same problem? Thanks!

Error:

raise get_last_ffi_error()
tvm._ffi.base.TVMError: basic_string::substr: __pos (which is 31) > this->size() (which is 24)
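For context, that message looks like the standard out-of-range error libstdc++ produces when std::string::substr is asked for a start position past the end of the string, so somewhere in the engine's C++ code a substring is being taken at offset 31 of a 24-character string. I have not located the exact call site; the sketch below only reproduces the same exception with a placeholder string:

    #include <iostream>
    #include <stdexcept>
    #include <string>

    int main() {
        // Placeholder 24-character string; the real string comes from inside the engine.
        std::string s(24, 'x');
        try {
            // Requesting a substring at offset 31 of a 24-character string throws
            // std::out_of_range with the same message as the TVM error above:
            // "basic_string::substr: __pos (which is 31) > this->size() (which is 24)"
            std::string t = s.substr(31);
            std::cout << t << "\n";
        } catch (const std::out_of_range& e) {
            std::cerr << e.what() << "\n";
        }
        return 0;
    }

This only illustrates the exception itself; the actual failing substr call is somewhere inside TVM or the engine code, not in the run.py arguments.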

zmtttt added the needs-triage and type:ci labels on Dec 20, 2024