
CogVideoX-5B-T2V black screen running colab #626

Open
kazakovaanastasia opened this issue Dec 28, 2024 · 3 comments
@kazakovaanastasia

System Info / 系統信息

Ran inference in the notebook twice on a T4 GPU. Each time the output was a 10-second all-black video. Changing the fps did not help. Should I install requirements.txt from this repo before running, or do something different?

CogVideoX-5B-T2V-Int8-Colab.ipynb: CogVideoX-5B Quantized Text-to-Video Inference Colab code, which takes about 30 minutes per run.
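For context, an all-black output video usually means the denoised latents contained inf/NaN values before decoding. A tiny NumPy sketch (with invented values, not taken from the real pipeline) of how a half-precision overflow can end up as black frames:

```python
import numpy as np

# Hypothetical illustration: FP16 overflows above 65504, the resulting
# inf/NaN values propagate, and the usual nan_to_num + clip step then
# emits all-zero, i.e. black, uint8 frames.
latents = np.float16([60000.0, 70000.0])  # 70000 > FP16 max (65504) -> inf
assert np.isinf(latents[1])

frame = latents[1] * np.ones((4, 4), np.float32)  # the bad value spreads
frame = frame - frame                             # inf - inf -> NaN
pixels = np.clip(np.nan_to_num(frame), 0.0, 1.0)  # NaN -> 0.0
img = (pixels * 255).astype(np.uint8)
assert (img == 0).all()                           # a fully black frame
```

This is only one plausible failure mode; quantization bugs can produce the same symptom.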

Information / 问题信息

  • The official example scripts / 官方的示例脚本
  • My own modified scripts / 我自己修改的脚本和任务

Reproduction / 复现过程

1. Download CogVideoX-5B-T2V-Int8-Colab.ipynb (the CogVideoX-5B quantized text-to-video inference Colab, which takes about 30 minutes per run).

2. Run it on a Kaggle T4 GPU.

3. The output video is black.

Expected behavior / 期待表现

1. Download CogVideoX-5B-T2V-Int8-Colab.ipynb (the CogVideoX-5B quantized text-to-video inference Colab, which takes about 30 minutes per run).

2. Run it on a Kaggle T4 GPU.

3. The output video is normal.

zRzRzRzRzRzRzR self-assigned this Dec 29, 2024
@zRzRzRzRzRzRzR (Member)

Where is the INT8 implementation of this model? We have only tested the performance of BF16; the black video most likely means the model did not generate normally. With the same configuration you can simply use the BF16 version instead. We recommend modifying and testing according to the link provided in our README. Thank you for your understanding.
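A minimal sketch of what the BF16 setup could look like, assuming the `CogVideoXPipeline` API from diffusers (>= 0.30); the helper name and the memory-saving calls are illustrative, not the repo's official script:

```python
def load_cogvideox_bf16(model_id: str = "THUDM/CogVideoX-5b"):
    """Load the text-to-video pipeline in BF16, the precision the
    maintainers say they have tested. Imports are deferred so the file
    can be inspected without torch/diffusers installed."""
    import torch
    from diffusers import CogVideoXPipeline  # assumes diffusers >= 0.30

    pipe = CogVideoXPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
    pipe.enable_sequential_cpu_offload()  # stream weights to the GPU piece by piece
    pipe.vae.enable_tiling()              # decode video latents in tiles to save VRAM
    return pipe


if __name__ == "__main__":
    from diffusers.utils import export_to_video

    pipe = load_cogvideox_bf16()
    video = pipe("a cat playing with a ball of yarn", num_frames=49).frames[0]
    export_to_video(video, "output.mp4", fps=8)
```

Note the offload and VAE-tiling calls trade speed for memory, which matters on a 16 GB T4-class card.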

@kazakovaanastasia (Author)

> Where is the INT8 implementation of this model? We have only tested the performance of BF16; the black video most likely means the model did not generate normally. With the same configuration you can simply use the BF16 version instead. We recommend modifying and testing according to the link provided in our README. Thank you for your understanding.

Thanks for answering!
I found the link to this code in your README; it is the official example script.


In this script I should change this to BF16, right?

> it is recommended to modify and test according to the link provided in our README

You mean this one?

cli_demo_quantization: Quantized model inference code that can run on devices with lower memory. You can also modify this code to support running CogVideoX models in FP8 precision.

@kazakovaanastasia (Author)

Maybe there is a problem with versions; I also want to try installing requirements.txt before running the CogVideoX-5B-T2V-Int8-Colab.ipynb notebook.
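One way to check whether the Colab environment matches the repo's pins before re-running. The package names below are typical for CogVideoX setups, not read from the actual requirements.txt:

```python
from importlib.metadata import PackageNotFoundError, version


def installed_versions(pkgs=("torch", "diffusers", "transformers", "accelerate")):
    """Return {package: version-or-None} so it can be compared by eye
    against the versions pinned in requirements.txt."""
    out = {}
    for pkg in pkgs:
        try:
            out[pkg] = version(pkg)
        except PackageNotFoundError:
            out[pkg] = None  # not installed in this environment
    return out


print(installed_versions())
```

If any of these differ from the repo's pins, `pip install -r requirements.txt` in the first notebook cell (then restart the runtime) is the usual fix.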
