
It literally just doesn't work on Windows #28

Open
GrumpyBud opened this issue Dec 5, 2024 · 8 comments

Comments

@GrumpyBud

Nothing more than what the description says. It describes every step in detail but then simply does nothing.

@AmberSahdev
Owner

Are you using multiple monitors by chance? If so make sure the app is on the primary monitor and the cursor is clicked onto the primary screen as well.

Windows 10 seems to be working for me.

@Rathna-K

Rathna-K commented Jan 14, 2025

Same here. Trying with Ollama: the same URL works in Open WebUI but not in this tool. Error: Exception Unable to execute the request - 404 page not found

@Rathna-K

Ha! Managed to run it (the API base URL was missing '/v1'), but it looks like the context is set up for macOS and it looks for Spotlight?
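For anyone else hitting the same 404 with Ollama: Ollama exposes its OpenAI-compatible API under the /v1 path, so the base_url in settings.json needs to include it. A minimal sketch (host and port are the Ollama defaults, your values may differ):

```json
{
  "base_url": "http://localhost:11434/v1",
  "model": "ollama/llava"
}
```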

@AmberSahdev
Owner

Glad you were able to get it to work. If it's not recognizing that you're on Windows, could you please emphasize in the Custom LLM Instructions section that you are on Windows, not macOS? Something like "You are running on a Windows system, do not use MacOS keyboard commands" should go far.
I'll try to fix the miscategorization and hallucinations in the next release.
Also, are you on 0.8.0?

@Rathna-K

Yes, on the current release. I am trying to update the context, but the execution function seems to fail because parameters sometimes also come back as nested function calls in the LLM response. Maybe it's the quality of the LLM; I am using Llama 3.2.
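One way to guard against this failure mode is to validate each step the model returns before executing it. The sketch below is a hypothetical illustration, not Open Interface's actual code: the step shape (a dict with "function" and "parameters" keys) and the allowed function names are assumptions.

```python
# Hypothetical defensive validation of LLM-returned execution steps.
# The step schema and ALLOWED_FUNCTIONS are assumptions for illustration.
ALLOWED_FUNCTIONS = {"press", "write", "sleep"}

def sanitize_step(step):
    """Return the step if it is safe to execute, else None."""
    if not isinstance(step, dict):
        return None
    if step.get("function") not in ALLOWED_FUNCTIONS:
        return None
    params = step.get("parameters")
    if not isinstance(params, dict):
        return None
    # Reject nested objects in parameter values - the failure mode above,
    # where the model emits another function call instead of a plain value.
    if any(isinstance(v, (dict, list)) for v in params.values()):
        return None
    return step
```

Dropping malformed steps (or asking the model to retry) avoids crashing the execution loop on a single bad response from a smaller model.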

@scatterp2

> "You are running on a Windows system, do not use MacOS keyboard commands" should go far.

This had basically no effect, due to the critical instructions in the context.
I attempted to recompile the code, but it fails with an error that the model is not in the model folder.
Please move the context to the settings file and make a new release.

[Image: error screenshot attached]

@AmberSahdev
Owner

@scatterp2 could you please share your ~/.open-interface/settings.json? I had fixed the error you're showing in 0.8.0, but it seems like it has resurfaced.

@scatterp2

Sure. Do you have Discord? Or another messenger might go faster.
I have some other progress that might be useful as well.
File:
{
  "base_url": "https://muskox-central-collie.ngrok-free.app",
  "model": "ollama/llava",
  "api_key": "cWZvanNveA==",
  "default_browser": "Firefox",
  "play_ding_on_completion": false,
  "custom_llm_instructions": "You are running on a Windows system, do not use MacOS keyboard commands important windows does not have spotlight",
  "theme": "superhero"
}
