Changes from all commits (27 commits)
393fcf1  dbug scprint (jkobject, Feb 18, 2025)
0bc9c5a  allowing flash attn (jkobject, Feb 18, 2025)
3cb94a3  Update _viash.yaml (jkobject, Feb 20, 2025)
086946b  Update CHANGELOG (lazappi, Feb 21, 2025)
f56905e  adding some debug (jkobject, Mar 4, 2025)
92942b8  better model loading and new model (jkobject, Mar 5, 2025)
9042fba  final debug (jkobject, Mar 6, 2025)
3436648  Merge branch 'main' into main (jkobject, Mar 7, 2025)
6491b5b  better now (jkobject, Mar 7, 2025)
30facd8  finish debug (jkobject, Mar 10, 2025)
93c1833  Merge branch 'main' of https://github.com/jkobject/task_batch_integra… (jkobject, Mar 10, 2025)
0f9bf7b  ending tests successfully (jkobject, Mar 13, 2025)
446f23e  removing flag (jkobject, Mar 14, 2025)
0455229  new dataloader version (jkobject, Mar 14, 2025)
ab0136f  Update CHANGELOG (lazappi, Mar 18, 2025)
f4075f1  solving some issues (jkobject, Apr 23, 2025)
5e91918  Merge branch 'main' into main (jkobject, Aug 12, 2025)
ec2df33  update scprint (jkobject, Aug 12, 2025)
978293c  Merge branch 'main' of https://github.com/jkobject/task_batch_integra… (jkobject, Aug 12, 2025)
e93bd46  Update src/methods/scprint/script.py (jkobject, Aug 29, 2025)
0a80d00  Update src/methods/scprint/script.py (jkobject, Sep 4, 2025)
cbac6ce  improve the scgpt installation (now uses flash attention) (jkobject, Sep 29, 2025)
03abde8  Merge branch 'openproblems-bio:main' into main (jkobject, Sep 29, 2025)
a9b80b9  changing default parameters (jkobject, Oct 13, 2025)
6215ffa  rm scgpt (jkobject, Oct 13, 2025)
ce55a51  adding update (jkobject, Oct 13, 2025)
10dfe3c  Merge branch 'main' into scgpt_update (jkobject, Oct 13, 2025)
8 changes: 0 additions & 8 deletions src/methods/scgpt_finetuned/config.vsh.yaml
@@ -49,15 +49,7 @@ resources:
 engines:
   - type: docker
     image: openproblems/base_pytorch_nvidia:1
-    # TODO: Try to find working installation of flash attention (flash-attn<1.0.5)
     setup:
-      #- type: python
-      #  pypi:
-      #    - gdown
-      #    - scgpt # Install from PyPI to get dependencies
-      #- type: docker
-      #  # Force re-installing from GitHub to get bug fixes
-      #  run: pip install --upgrade --no-deps --force-reinstall git+https://github.com/bowang-lab/scGPT.git
       - type: docker
         run: |
           git clone https://github.com/bowang-lab/scGPT && \
…
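The `run` block above is truncated after the `git clone`, and the commit messages (`cbac6ce improve the scgpt installation (now uses flash attention)`) indicate the image now installs flash attention. As a hedged sketch only, not the PR's actual config: the install commands after the clone are assumptions, and the `flash-attn<1.0.5` pin is taken from the removed TODO comment, a viash docker engine of this shape could look like:

```yaml
# Hypothetical sketch, not the config from this PR. The two pip lines and the
# version pin are assumptions; only the clone line appears in the diff above.
engines:
  - type: docker
    image: openproblems/base_pytorch_nvidia:1
    setup:
      - type: docker
        run: |
          git clone https://github.com/bowang-lab/scGPT && \
          pip install ./scGPT && \
          pip install "flash-attn<1.0.5"
```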
9 changes: 0 additions & 9 deletions src/methods/scgpt_zeroshot/config.vsh.yaml
@@ -51,15 +51,6 @@ resources:
 engines:
   - type: docker
     image: openproblems/base_pytorch_nvidia:1
-    # TODO: Try to find working installation of flash attention (flash-attn<1.0.5)
-    setup:
-      #- type: python
-      #  pypi:
-      #    - gdown
-      #    - scgpt # Install from PyPI to get dependencies
-      #- type: docker
-      #  # Force re-installing from GitHub to get bug fixes
-      #  run: pip install --upgrade --no-deps --force-reinstall git+https://github.com/bowang-lab/scGPT.git
Comment on lines -54 to -62

@rcannood (Member), Oct 19, 2025:

Suggested change:

-# TODO: Try to find working installation of flash attention (flash-attn<1.0.5)
-setup:
-  #- type: python
-  #  pypi:
-  #    - gdown
-  #    - scgpt # Install from PyPI to get dependencies
-  #- type: docker
-  #  # Force re-installing from GitHub to get bug fixes
-  #  run: pip install --upgrade --no-deps --force-reinstall git+https://github.com/bowang-lab/scGPT.git
+setup:

This shouldn't be removed

Contributor Author:

why?

       - type: docker
         run: |
           git clone https://github.com/bowang-lab/scGPT && \
…
4 changes: 2 additions & 2 deletions src/methods/scprint/config.vsh.yaml
@@ -61,7 +61,7 @@ arguments:
   - name: --max_len
     type: integer
     description: The maximum length of the gene sequence.
-    default: 4000
+    default: 2300

@rcannood (Member):

this was already merged, right?

Contributor Author:

then I don't see why it still shows as a modification... :/

 resources:
   - type: python_script
@@ -86,7 +86,7 @@ engines:
       script: from scdataloader.utils import populate_my_ontology; populate_my_ontology()
 runners:
   - type: executable
-    # docker_run_args: --gpus all
+    # docker_run_args: --gpus all
   - type: nextflow
     directives:
       label: [hightime, highmem, midcpu, gpu, highsharedmem]
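The `--max_len` change above lowers the default cap on the gene sequence from 4000 to 2300. Purely as an illustration of what such a cap can mean in practice, and not scPRINT's actual code (the function name and the keep-most-expressed selection rule are assumptions), a per-cell cap could be sketched as:

```python
import numpy as np

def cap_gene_sequence(expr: np.ndarray, max_len: int = 2300) -> np.ndarray:
    """Hypothetical sketch: return sorted indices of at most `max_len` genes,
    keeping the most highly expressed ones. The selection rule is an
    assumption for illustration, not scPRINT's documented behaviour."""
    if expr.shape[0] <= max_len:
        return np.arange(expr.shape[0])
    # Full descending sort of expression values, keep the first max_len genes,
    # then restore ascending index order for downstream slicing.
    top = np.argsort(expr)[::-1][:max_len]
    return np.sort(top)

# Toy usage on a simulated expression vector of 5000 genes.
expr = np.random.default_rng(0).poisson(1.0, size=5000).astype(float)
idx = cap_gene_sequence(expr, max_len=2300)
print(len(idx))  # 2300
```

The point of the sketch is only that the parameter trades sequence length (and hence memory and runtime on the GPU) against how much of each cell's transcriptome the model sees.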