
Can't Run Demo #20

@FullMetalNicky

Description


I followed your instructions for creating the environment with Python 3.8.
Everything builds fine, but when I run the demo (command copied from the README), there is an error.
It seems there is an issue with dinov2.
Checking their repo, it looks like they no longer support Python 3.8.
It also looks like torch.hub fetches whatever is currently on the default branch, so even though your setup script pins specific commits, the hub download is not controlled by them.
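If an older dinov2 commit still supports Python 3.8, one possible workaround is to pass an explicit ref to `torch.hub.load`, which accepts the repository argument in `owner/repo:ref` form. A minimal sketch (the helper names are illustrative and the ref is a placeholder, not a real pinned commit):

```python
def pinned_repo_spec(repo: str, ref: str) -> str:
    # torch.hub.load accepts the repo argument as "owner/repo:ref",
    # where ref may be a branch, tag, or commit hash.
    return f"{repo}:{ref}"

def load_pinned_dinov2(model_name: str, ref: str):
    import torch  # imported lazily so pinned_repo_spec stays standalone
    # force_reload=True ignores the cached checkout of the default branch
    # under ~/.cache/torch/hub and downloads the requested ref instead.
    return torch.hub.load(
        pinned_repo_spec("facebookresearch/dinov2", ref),
        model_name,
        force_reload=True,
    )

# Example (requires network access; the ref below is a placeholder):
# model = load_pinned_dinov2("dinov2_vitb14", "some-pinned-commit")
```

Deleting the stale cached checkout under `~/.cache/torch/hub` would likely also force a fresh download, but without a pinned ref you would just re-fetch the incompatible main branch.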

How can I overcome this? Can I try it with Python 3.10?

I am using Ubuntu 22.04 with CUDA 12; the GPU is an RTX 5090.
The error is:

[10/01 13:46:27 detectron2]: Full config saved to output/coco_examples/config.yaml
[10/01 13:46:27 d2.utils.env]: Using a generated random seed 27248742
Using cache found in /home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main
/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/layers/swiglu_ffn.py:43: UserWarning: xFormers is available (SwiGLU)
warnings.warn("xFormers is available (SwiGLU)")
/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/layers/attention.py:27: UserWarning: xFormers is available (Attention)
warnings.warn("xFormers is available (Attention)")
Traceback (most recent call last):
File "demo/demo.py", line 193, in <module>
launch(
File "/home/nickybones/miniconda3/envs/ovmono3d/lib/python3.8/site-packages/detectron2/engine/launch.py", line 84, in launch
main_func(*args)
File "demo/demo.py", line 145, in main
model = build_model(cfg)
File "/home/nickybones/Code/ovmono3d/cubercnn/modeling/meta_arch/rcnn3d.py", line 257, in build_model
model = META_ARCH_REGISTRY.get(meta_arch)(cfg, priors=priors)
File "/home/nickybones/miniconda3/envs/ovmono3d/lib/python3.8/site-packages/detectron2/config/config.py", line 189, in wrapped
explicit_args = _get_args_from_config(from_config_func, *args, **kwargs)
File "/home/nickybones/miniconda3/envs/ovmono3d/lib/python3.8/site-packages/detectron2/config/config.py", line 245, in _get_args_from_config
ret = from_config_func(*args, **kwargs)
File "/home/nickybones/Code/ovmono3d/cubercnn/modeling/meta_arch/rcnn3d.py", line 30, in from_config
backbone = build_backbone(cfg, priors=priors)
File "/home/nickybones/Code/ovmono3d/cubercnn/modeling/meta_arch/rcnn3d.py", line 274, in build_backbone
backbone = BACKBONE_REGISTRY.get(backbone_name)(cfg, input_shape, priors)
File "/home/nickybones/Code/ovmono3d/cubercnn/modeling/backbone/dino.py", line 99, in build_dino_backbone
bottom_up = DINOBackbone(
File "/home/nickybones/Code/ovmono3d/cubercnn/modeling/backbone/dino.py", line 29, in __init__
dino_vit = torch.hub.load(f"facebookresearch/{dino_name}", self.checkpoint_name)
File "/home/nickybones/miniconda3/envs/ovmono3d/lib/python3.8/site-packages/torch/hub.py", line 570, in load
model = _load_local(repo_or_dir, model, *args, **kwargs)
File "/home/nickybones/miniconda3/envs/ovmono3d/lib/python3.8/site-packages/torch/hub.py", line 599, in _load_local
model = entry(*args, **kwargs)
File "/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/hub/backbones.py", line 75, in dinov2_vitb14
return _make_dinov2_model(arch_name="vit_base", pretrained=pretrained, weights=weights, **kwargs)
File "/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/hub/backbones.py", line 33, in _make_dinov2_model
from ..models import vision_transformer as vits
File "/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/models/__init__.py", line 8, in <module>
from . import vision_transformer as vits
File "/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/models/vision_transformer.py", line 21, in <module>
from dinov2.layers import Mlp, PatchEmbed, SwiGLUFFNFused, MemEffAttention, NestedTensorBlock as Block
File "/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/layers/__init__.py", line 11, in <module>
from .block import NestedTensorBlock, CausalAttentionBlock
File "/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/layers/block.py", line 18, in <module>
from .attention import Attention, MemEffAttention
File "/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/layers/attention.py", line 36, in <module>
class Attention(nn.Module):
File "/home/nickybones/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/layers/attention.py", line 58, in Attention
self, init_attn_std: float | None = None, init_proj_std: float | None = None, factor: float = 1.0
TypeError: unsupported operand type(s) for |: 'type' and 'NoneType'
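For reference, this `TypeError` comes from PEP 604 union syntax (`float | None`), which the current dinov2 main branch uses in its annotations; the `|` operator on types only exists at runtime from Python 3.10 onward, so evaluating that signature on 3.8 fails exactly as shown. A minimal illustration of the two spellings (the function name is illustrative, not from dinov2):

```python
import sys
from typing import Optional, Union

# Pre-3.10 spelling of an optional float annotation, valid on Python 3.8:
def scale_old(factor: Optional[float] = None) -> float:
    return 1.0 if factor is None else factor

# Optional[X] is shorthand for Union[X, None] on every supported version.
assert Optional[float] == Union[float, None]

# On 3.10+, `float | None` evaluates at runtime to an equivalent union;
# on 3.8 the expression itself raises the TypeError from the traceback.
if sys.version_info >= (3, 10):
    assert (float | None) == Optional[float]
```

So moving to Python 3.10 should sidestep this particular error. Patching the cached checkout to use `Optional[float]`, or adding `from __future__ import annotations` to the affected file (which defers annotation evaluation), might also work on 3.8, but would be undone the next time the hub re-downloads the repo.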
