error after installing gpt4all - and a fix #651

Open
couhajjou opened this issue Nov 26, 2024 · 0 comments
couhajjou commented Nov 26, 2024

Error

```
Traceback (most recent call last):
  File "/usr/local/bin/llm", line 5, in <module>
    from llm.cli import cli
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/llm/cli.py", line 1852, in <module>
    load_plugins()
    ~~~~~~~~~~~~^^
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/llm/plugins.py", line 25, in load_plugins
    pm.load_setuptools_entrypoints("llm")
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/pluggy/_manager.py", line 421, in load_setuptools_entrypoints
    plugin = ep.load()
  File "/usr/local/Cellar/python@3.13/3.13.0_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/importlib/metadata/__init__.py", line 179, in load
    module = import_module(match.group('module'))
  File "/usr/local/Cellar/python@3.13/3.13.0_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/importlib/__init__.py", line 88, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/llm_gpt4all.py", line 1, in <module>
    from gpt4all import GPT4All as _GPT4All
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/gpt4all/__init__.py", line 1, in <module>
    from .gpt4all import CancellationError as CancellationError, Embed4All as Embed4All, GPT4All as GPT4All
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/gpt4all/gpt4all.py", line 23, in <module>
    from ._pyllmodel import (CancellationError as CancellationError, EmbCancelCallbackType, EmbedResult as EmbedResult,
                             LLModel, ResponseCallbackType, empty_response_callback)
  File "/usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/gpt4all/_pyllmodel.py", line 34, in <module>
    if subprocess.run(
       ~~~~~~~~~~~~~~^
        "sysctl -n sysctl.proc_translated".split(), check=True, capture_output=True, text=True,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ).stdout.strip() == "1":
    ^
  File "/usr/local/Cellar/python@3.13/3.13.0_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/subprocess.py", line 577, in run
    raise CalledProcessError(retcode, process.args,
                             output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['sysctl', '-n', 'sysctl.proc_translated']' returned non-zero exit status 1.
```
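The root cause: on machines where the `sysctl.proc_translated` key does not exist (for example Intel Macs, where there is no Rosetta 2 translation to report), `sysctl -n` exits non-zero, and because the call passes `check=True`, `subprocess.run` raises `CalledProcessError` at import time. A minimal sketch of the difference, using `false` as a stand-in for any command that exits 1:

```python
import subprocess

# With check=True, a non-zero exit status raises instead of returning.
try:
    subprocess.run(["false"], check=True, capture_output=True, text=True)
except subprocess.CalledProcessError as e:
    print("raised, exit status", e.returncode)  # prints "raised, exit status 1"

# With check=False the same command returns a CompletedProcess,
# which the caller can inspect without a try/except.
result = subprocess.run(["false"], check=False, capture_output=True, text=True)
print(result.returncode)  # prints "1"
```

This is why the fix below switches to `check=False` and inspects the result instead of letting the exception escape.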

Fix

```shell
sudo nano /usr/local/Cellar/llm/0.18/libexec/lib/python3.13/site-packages/gpt4all/_pyllmodel.py
```

The Rosetta 2 detection logic in this file needs to be modified. Here is exactly what to change:

1. Find the problematic code (around line 34, before the `load_llmodel_library()` function):

```python
if subprocess.run(
    "sysctl -n sysctl.proc_translated".split(), check=True, capture_output=True, text=True,
).stdout.strip() == "1":
```
2. Replace it with this more robust version (it uses `platform`, so add `import platform` at the top of the file if it is not already there):

```python
def is_rosetta():
    # Rosetta 2 only exists on macOS.
    if platform.system() != "Darwin":
        return False
    try:
        result = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            check=False,  # a missing key exits non-zero; don't raise
            capture_output=True,
            text=True,
        )
        return result.stdout.strip() == "1"
    except OSError:  # sysctl binary missing or not executable
        return False


# Use the function instead of the direct check:
if is_rosetta():
```
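For a quick sanity check, the patched helper can be run standalone. This sketch simply bundles the replacement code above with the two stdlib imports it needs; it returns `True` only when the interpreter is running under Rosetta 2 translation on macOS, and `False` everywhere else:

```python
import platform
import subprocess


def is_rosetta():
    """Return True only when running under Rosetta 2 on macOS.

    check=False plus the OSError handler means a missing sysctl key
    (Intel Macs) or a missing sysctl binary (Linux) degrades to False
    instead of crashing the import, as the check=True version did.
    """
    if platform.system() != "Darwin":
        return False
    try:
        result = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            check=False,
            capture_output=True,
            text=True,
        )
        return result.stdout.strip() == "1"
    except OSError:
        return False


print(is_rosetta())
```

After saving the edit, `llm` should start cleanly again; the plugin still detects Rosetta correctly on Apple Silicon, where `sysctl -n sysctl.proc_translated` succeeds and prints `0` or `1`.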