
error after installing gpt4all - and a fix #651

Open
@couhajjou

Description

Error

Traceback (most recent call last):
  File "/usr/local/bin/llm", line 5, in <module>
    from llm.cli import cli
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/llm/cli.py", line 1852, in <module>
    load_plugins()
    ~~~~~~~~~~~~^^
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/llm/plugins.py", line 25, in load_plugins
    pm.load_setuptools_entrypoints("llm")
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/pluggy/_manager.py", line 421, in load_setuptools_entrypoints
    plugin = ep.load()
  File "/usr/local/Cellar/[email protected]/3.13.0_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/importlib/metadata/__init__.py", line 179, in load
    module = import_module(match.group('module'))
  File "/usr/local/Cellar/[email protected]/3.13.0_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/importlib/__init__.py", line 88, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/llm_gpt4all.py", line 1, in <module>
    from gpt4all import GPT4All as _GPT4All
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/gpt4all/__init__.py", line 1, in <module>
    from .gpt4all import CancellationError as CancellationError, Embed4All as Embed4All, GPT4All as GPT4All
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/gpt4all/gpt4all.py", line 23, in <module>
    from ._pyllmodel import (CancellationError as CancellationError, EmbCancelCallbackType, EmbedResult as EmbedResult,
                             LLModel, ResponseCallbackType, empty_response_callback)
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/gpt4all/_pyllmodel.py", line 34, in <module>
    if subprocess.run(
       ~~~~~~~~~~~~~~^
        "sysctl -n sysctl.proc_translated".split(), check=True, capture_output=True, text=True,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ).stdout.strip() == "1":
    ^
  File "/usr/local/Cellar/[email protected]/3.13.0_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/subprocess.py", line 577, in run
    raise CalledProcessError(retcode, process.args,
                             output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['sysctl', '-n', 'sysctl.proc_translated']' returned non-zero exit status 1.
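The root cause: on machines where the `sysctl.proc_translated` key does not exist (Intel Macs that never ran a translated process, for example), `sysctl -n` exits with a non-zero status, and because the call uses `check=True`, Python raises `CalledProcessError` at import time. A minimal sketch of the difference, using `false` as a stand-in for the failing `sysctl` call:

```python
import subprocess

# check=True turns any non-zero exit status into a CalledProcessError --
# exactly what happens at import time when the sysctl key is missing.
# (`false` is used here as a stand-in for the failing sysctl call.)
try:
    subprocess.run(["false"], check=True, capture_output=True, text=True)
except subprocess.CalledProcessError as exc:
    print("check=True raised, exit status:", exc.returncode)

# check=False reports the same failure as a plain return value instead:
result = subprocess.run(["false"], check=False, capture_output=True, text=True)
print("check=False returned, exit status:", result.returncode)
```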

Fix

sudo nano /usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/gpt4all/_pyllmodel.py

Once the file is open, modify the Rosetta 2 detection logic as follows:

  1. Find this problematic code section (it should be around line 34, before the load_llmodel_library() function):
if subprocess.run(
    "sysctl -n sysctl.proc_translated".split(), check=True, capture_output=True, text=True,
).stdout.strip() == "1":
  2. Replace it with this more robust code:
def is_rosetta():
    # Returns True only when running under Rosetta 2 translation on macOS.
    # Note: this requires `import platform` at the top of _pyllmodel.py
    # (subprocess is already imported there); add it if it is missing.
    if platform.system() != "Darwin":
        return False
    try:
        result = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            check=False,  # a missing sysctl key no longer raises CalledProcessError
            capture_output=True,
            text=True,
        )
        return result.stdout.strip() == "1"
    except OSError:  # e.g. sysctl binary not found; avoid a bare except
        return False

# Use the function instead of the direct check:
if is_rosetta():
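To sanity-check the patched guard before touching the installed file, the function can be exercised as a standalone snippet (a hypothetical copy of the fix above; on Linux or Windows it short-circuits to False):

```python
import platform
import subprocess

# Standalone copy of the patched is_rosetta() guard, for testing
# outside of _pyllmodel.py.
def is_rosetta():
    if platform.system() != "Darwin":
        return False
    try:
        result = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            check=False,
            capture_output=True,
            text=True,
        )
        return result.stdout.strip() == "1"
    except OSError:
        return False

print(is_rosetta())  # False unless the process is translated by Rosetta 2
```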
