Describe the bug
Contradictory information is shown when loading the ollama/phi3:medium model. With this llm config:
"llm": {
"model": "ollama/phi3:medium",
"temperature": 0,
"format": "json", # Ollama needs the format to be specified explicitly
"base_url": "http://localhost:11434", # set Ollama URL
},
the run still prints the warning:
Model ollama/phi3:medium not found, using default token size (8192)
However, the program appears to work and returns the web page content using the model.
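The warning and the successful run are not actually in conflict: the model-name lookup only decides which context-window size the library assumes, not whether the model can be called. A minimal sketch of that lookup-and-fallback pattern (an assumption about the library's internals, not its actual code):

# Hypothetical provider -> model -> context-size table, mirroring models_tokens.py
models_tokens = {
    "ollama": {
        "llama3": 8192,
        # "phi3:medium" is missing here, which is what triggers the fallback
    },
}

def resolve_context_size(provider: str, model: str, default: int = 8192) -> int:
    # Unknown models fall back to the default size, producing the warning above;
    # the model itself is still used for generation.
    try:
        return models_tokens[provider][model]
    except KeyError:
        print(f"Model {provider}/{model} not found, using default token size ({default})")
        return default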
To Reproduce
Steps to reproduce the behavior:
1: use ollama/phi3:medium in the llm config
2: run it (a minimal repro sketch follows)
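A minimal, self-contained repro sketch built around the config above; the prompt and source URL are placeholders rather than values from the original report:

from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        "model": "ollama/phi3:medium",
        "temperature": 0,
        "format": "json",  # Ollama needs the format to be specified explicitly
        "base_url": "http://localhost:11434",  # set Ollama URL
    },
}

# Placeholder prompt and URL, for illustration only
scraper = SmartScraperGraph(
    prompt="List the main headings on the page",
    source="https://example.com",
    config=graph_config,
)
print(scraper.run())  # works, but the token-size warning is printed at startup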
Expected behavior
phi3:medium should be recognized from the config. The likely cause is that this model is not listed in
scrapegraphai/helpers/models_tokens.py
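A likely fix is to add an entry for the model under the "ollama" section of that file. The context sizes below are assumptions based on phi3's published variants (4k and 128k) and should be verified against the model card before committing:

# In scrapegraphai/helpers/models_tokens.py, under the "ollama" key:
"ollama": {
    # ... existing entries ...
    "phi3:medium": 4096,         # assumed: default tag is the 4k-context variant
    "phi3:medium-128k": 131072,  # assumed: long-context variant
},

Until such an entry lands, the warning is harmless as long as prompts fit in the assumed 8192-token window; if the installed version supports a model_tokens override in the llm config (worth checking in the docs), setting it explicitly would also avoid the mismatch.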
Desktop:
OS: Win11
Browser: Opera
Version: 114.0.5282.235