
Commit 022bcc7

[Bugfix] Fix 'ModuleNotFoundError: No module named 'intel_extension_for_pytorch'' for --tensor-parallel-size more than 1 (#12546)
1 parent c53dc46 commit 022bcc7

File tree

1 file changed

+11
-3
lines changed


Diff for: vllm/distributed/parallel_state.py

@@ -329,9 +329,17 @@ def all_reduce(self, input_: torch.Tensor) -> torch.Tensor:
             return input_
 
         if input_.is_cpu:
-            import intel_extension_for_pytorch as ipex
-            ipex.distributed.all_reduce(input_, group=self.device_group)
-            return input_
+            try:
+                import intel_extension_for_pytorch as ipex
+                ipex.distributed.all_reduce(input_, group=self.device_group)
+                return input_
+            except ImportError:
+                """
+                Intel IPEX not found. Falling back to PyTorch native
+                all_reduce for CPU
+                """
+                torch.distributed.all_reduce(input_, group=self.device_group)
+                return input_
 
         if self.tpu_communicator is not None and \
                 not self.tpu_communicator.disabled:
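
The fix follows a standard optional-dependency pattern: attempt to import the accelerated backend (here, intel_extension_for_pytorch), and on ImportError fall back to the portable implementation (torch.distributed.all_reduce). A minimal self-contained sketch of the same pattern, where the backend module and function names are purely illustrative and not part of vLLM or any real package:

```python
def all_reduce_cpu(values):
    """Sum-reduce per-rank values, preferring a hypothetical accelerated
    backend and falling back to plain Python when it is not installed."""
    try:
        # Hypothetical accelerated backend; not a real package, so this
        # import is expected to fail and exercise the fallback path.
        import fast_reduce_backend as frb
        return frb.all_reduce(values)
    except ImportError:
        # Portable fallback, mirroring the torch.distributed.all_reduce
        # fallback in the patch above.
        return sum(values)

print(all_reduce_cpu([1, 2, 3]))  # backend missing, falls back; prints 6
```

Catching ImportError (rather than probing for the package up front) keeps the fast path as a plain import and makes the fallback explicit at the single call site that needs it.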
