Labels: Potential Bug (user is reporting a bug; this should be tested)
Description
Custom Node Testing
- I have tried disabling custom nodes and the issue persists (see how to disable custom nodes if you need help)
Expected Behavior
A simple generation of an image.
Actual Behavior
During image generation, the following error message appears.
Steps to Reproduce
Install ROCm 7.0.2 and use a default workflow with KSampler -> Ultimate SD Upscale -> FaceDetailer.
After the first image, every subsequent generation fails with the error below.
Debug Logs
# ComfyUI Error Report
## Error Details
- **Node ID:** 116
- **Node Type:** FaceDetailer
- **Exception Type:** torch.AcceleratorError
- **Exception Message:** HIP error: an illegal memory access was encountered
HIP kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing AMD_SERIALIZE_KERNEL=3
Compile with `TORCH_USE_HIP_DSA` to enable device-side assertions.
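For what it's worth, the error message's own hint (AMD_SERIALIZE_KERNEL=3) can be applied without modifying ComfyUI. Below is a minimal, hypothetical launcher sketch; the script name and the (shortened) arguments are taken from the System Information section further down, everything else is an assumption:

```python
# Hypothetical wrapper: set the debug variable suggested by the HIP error
# message before torch is imported, then run ComfyUI's main.py unchanged.
import os
import runpy
import sys

os.environ.setdefault("AMD_SERIALIZE_KERNEL", "3")  # serialize kernel launches for clearer stack traces

# Same arguments as reported under "System Information" below (shortened here).
sys.argv = ["main.py", "--listen", "0.0.0.0", "--lowvram", "--disable-smart-memory"]
runpy.run_path("main.py", run_name="__main__")
```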
## Stack Trace
File "/home/lasse/ComfyUI/execution.py", line 496, in execute
output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/execution.py", line 315, in get_output_data
return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/custom_nodes/comfyui-lora-manager/py/metadata_collector/metadata_hook.py", line 165, in async_map_node_over_list_with_metadata
results = await original_map_node_over_list(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/execution.py", line 289, in _async_map_node_over_list
await process_inputs(input_dict, i)
File "/home/lasse/ComfyUI/execution.py", line 277, in process_inputs
result = f(**inputs)
^^^^^^^^^^^
File "/home/lasse/ComfyUI/custom_nodes/comfyui-impact-pack/modules/impact/impact_pack.py", line 876, in doit
enhanced_img, cropped_enhanced, cropped_enhanced_alpha, mask, cnet_pil_list = FaceDetailer.enhance_face(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/custom_nodes/comfyui-impact-pack/modules/impact/impact_pack.py", line 830, in enhance_face
DetailerForEach.do_detail(image, segs, model, clip, vae, guide_size, guide_size_for_bbox, max_size, seed, steps, cfg,
File "/home/lasse/ComfyUI/custom_nodes/comfyui-impact-pack/modules/impact/impact_pack.py", line 362, in do_detail
enhanced_image, cnet_pils = core.enhance_detail(cropped_image, model, clip, vae, guide_size, guide_size_for_bbox, max_size,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/custom_nodes/comfyui-impact-pack/modules/impact/core.py", line 383, in enhance_detail
refined_latent = impact_sampling.ksampler_wrapper(model2, seed2, steps2, cfg2, sampler_name2, scheduler2, positive2, negative2,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/custom_nodes/comfyui-impact-pack/modules/impact/impact_sampling.py", line 209, in ksampler_wrapper
refined_latent = separated_sample(model, True, seed, advanced_steps, cfg, sampler_name, scheduler,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/custom_nodes/comfyui-impact-pack/modules/impact/impact_sampling.py", line 182, in separated_sample
res = sample_with_custom_noise(model, add_noise, seed, cfg, positive, negative, impact_sampler, sigmas, latent_image, noise=noise, callback=callback)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/custom_nodes/comfyui-impact-pack/modules/impact/impact_sampling.py", line 126, in sample_with_custom_noise
samples = comfy.sample.sample_custom(model, noise, cfg, sampler, sigmas, positive, negative, latent_image,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/sample.py", line 50, in sample_custom
samples = comfy.samplers.sample(model, noise, positive, negative, cfg, model.load_device, sampler, sigmas, model_options=model.model_options, latent_image=latent_image, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 1044, in sample
return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 1029, in sample
output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/patcher_extension.py", line 112, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 997, in outer_sample
output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 980, in inner_sample
samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/patcher_extension.py", line 112, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 752, in sample
samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 120, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/k_diffusion/sampling.py", line 795, in sample_dpmpp_2m
denoised = model(x, sigmas[i] * s_in, **extra_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 401, in __call__
out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 953, in __call__
return self.outer_predict_noise(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 960, in outer_predict_noise
).execute(x, timestep, model_options, seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/patcher_extension.py", line 112, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 963, in predict_noise
return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 381, in sampling_function
out = calc_cond_batch(model, conds, x, timestep, model_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 206, in calc_cond_batch
return _calc_cond_batch_outer(model, conds, x_in, timestep, model_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 214, in _calc_cond_batch_outer
return executor.execute(model, conds, x_in, timestep, model_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/patcher_extension.py", line 112, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/samplers.py", line 326, in _calc_cond_batch
output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/model_base.py", line 161, in apply_model
return comfy.patcher_extension.WrapperExecutor.new_class_executor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/patcher_extension.py", line 112, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/model_base.py", line 200, in _apply_model
model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 831, in forward
return comfy.patcher_extension.WrapperExecutor.new_class_executor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/patcher_extension.py", line 112, in execute
return self.original(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 873, in _forward
h = forward_timestep_embed(module, h, emb, context, transformer_options, time_context=time_context, num_video_frames=num_video_frames, image_only_indicator=image_only_indicator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 38, in forward_timestep_embed
x = layer(x, emb)
^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 239, in forward
return checkpoint(
^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/ldm/modules/diffusionmodules/util.py", line 191, in checkpoint
return func(*inputs)
^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 252, in _forward
h = self.in_layers(x)
^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/container.py", line 244, in forward
input = module(input)
^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/ops.py", line 146, in forward
return self.forward_comfy_cast_weights(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/comfy/ops.py", line 141, in forward_comfy_cast_weights
return self._conv_forward(input, weight, bias)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/conv.py", line 543, in _conv_forward
return F.conv2d(
^^^^^^^^^
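Since the trace bottoms out in `F.conv2d`, a standalone check can help tell whether the illegal memory access reproduces outside ComfyUI on this ROCm build. This is a sketch, not part of the report; the shapes and dtype are placeholders:

```python
# Hypothetical isolation test: run a bare conv2d on the GPU and force a sync
# so any asynchronous HIP error surfaces at this call instead of later.
import torch
import torch.nn.functional as F

device = torch.device("cuda")  # ROCm builds of PyTorch expose the GPU as "cuda"
x = torch.randn(1, 4, 128, 128, device=device, dtype=torch.float16)
w = torch.randn(320, 4, 3, 3, device=device, dtype=torch.float16)

y = F.conv2d(x, w, padding=1)
torch.cuda.synchronize()  # errors from the kernel launch are raised here
print(y.shape)
```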
## System Information
- **ComfyUI Version:** 0.3.65
- **Arguments:** main.py --listen 0.0.0.0 --output-directory /home/lasse/MEGA/ComfyUI --use-pytorch-cross-attention --reserve-vram 1 --lowvram --fast --disable-smart-memory
- **OS:** posix
- **Python Version:** 3.12.3 (main, Aug 14 2025, 17:47:21) [GCC 13.3.0]
- **Embedded Python:** false
- **PyTorch Version:** 2.8.0+rocm7.0.2.git245bf6ed
## Devices
- **Name:** cuda:0 AMD Radeon Graphics : native
- **Type:** cuda
- **VRAM Total:** 17095983104
- **VRAM Free:** 15807266816
- **Torch VRAM Total:** 161480704
- **Torch VRAM Free:** 17809408
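The VRAM figures above are raw byte counts (about 17.1 GB total and 15.8 GB free). As a hedged aside, comparable numbers can be queried by hand with PyTorch's device memory API; this is not necessarily how ComfyUI computes them:

```python
# Hypothetical sanity check of the reported byte counts.
import torch

free_bytes, total_bytes = torch.cuda.mem_get_info()  # device-level free/total memory in bytes
print(f"VRAM Total: {total_bytes}")
print(f"VRAM Free:  {free_bytes}")
```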
## Logs
File "/home/lasse/ComfyUI/main.py", line 195, in prompt_worker
e.execute(item[2], prompt_id, item[3], item[4])
File "/home/lasse/ComfyUI/execution.py", line 649, in execute
asyncio.run(self.execute_async(prompt, prompt_id, extra_data, execute_outputs))
File "/usr/lib/python3.12/asyncio/runners.py", line 194, in run
return runner.run(main)
File "/usr/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
File "/usr/lib/python3.12/asyncio/base_events.py", line 687, in run_until_complete
return future.result()
File "/home/lasse/ComfyUI/execution.py", line 722, in execute_async
comfy.model_management.unload_all_models()
File "/home/lasse/ComfyUI/comfy/model_management.py", line 1399, in unload_all_models
free_memory(1e30, get_torch_device())
File "/home/lasse/ComfyUI/comfy/model_management.py", line 187, in get_torch_device
return torch.device(torch.cuda.current_device())
File "/home/lasse/ComfyUI/.venv/lib/python3.12/site-packages/torch/cuda/__init__.py", line 1072, in current_device
return torch._C._cuda_getDevice()
torch.AcceleratorError: HIP error: an illegal memory access was encountered
HIP kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing AMD_SERIALIZE_KERNEL=3
Compile with `TORCH_USE_HIP_DSA` to enable device-side assertions.
## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
Workflow too large. Please manually upload the workflow from local file system.
## Additional Context
No response