
Use half precision to reduce memory usage. #74

Status: Open. kexul wants to merge 2 commits into base: master.

Conversation


@kexul kexul commented May 21, 2024

Hi, this is a simple trick to reduce memory usage.
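The trick rests on simple arithmetic: float16 stores each element in 2 bytes instead of float32's 4, so casting weights and activations to half precision roughly halves tensor memory. In PyTorch this is typically done with calls like model.half() and casting the input tensors to match (whether that is exactly what these two commits do is an assumption, not a reading of the diff). A minimal sketch of the memory arithmetic, using NumPy so it runs standalone; the array shape is illustrative:

```python
import numpy as np

# A tensor the size of a 1024x1024 RGB feature map, in single precision.
full = np.zeros((3, 1024, 1024), dtype=np.float32)

# The same tensor cast down to half precision.
half = full.astype(np.float16)

print(full.nbytes)  # 3 * 1024 * 1024 * 4 bytes = 12582912 (12 MiB)
print(half.nbytes)  # 2 bytes per element instead of 4 = 6291456 (6 MiB)

# Half precision uses exactly half the memory per element.
assert half.nbytes * 2 == full.nbytes
```

The usual caveat applies: fp16 has a much smaller exponent and mantissa than fp32, so it can overflow or lose accuracy in some models, and it is generally only fast on GPUs; it is worth checking that segmentation quality is unchanged after the cast.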

kexul added 2 commits on May 21, 2024:
- Use half precision to reduce memory usage.
- Use half precision.
@huchunjun425

Hello, it's a pleasure to read this code. However, when I tried to train on my own dataset, I found that the instruction "python download_training_dataset.py" needs that .py file to run, but I don't see it in the repo. Should I set up my own training dataset instead?

@kexul (Author) commented Sep 25, 2024

Sorry, I'm not the owner of this repo and have never tried training it. You might ask @hkchengrex for more information.

@hkchengrex (Owner)

@huchunjun425 The script is here: https://github.com/hkchengrex/CascadePSP/blob/master/scripts/download_training_dataset.py. I am not sure what you meant by "this.py".

@huchunjun425 commented Sep 29, 2024 via email
