
Contradictory Information when using Ollama phi3:medium #844

Closed
@zwei2016

Description


Describe the bug
Contradictory information is printed when loading the ollama/phi3:medium model. With the following llm configuration:

    "llm": {
        "model": "ollama/phi3:medium",
        "temperature": 0,
        "format": "json",  # Ollama needs the format to be specified explicitly
        "base_url": "http://localhost:11434",  # set Ollama URL
    },
the following warning is printed:

Model ollama/phi3:medium not found, using default token size (8192)

However, the program seems to work and returns the web page content using the model.
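
For context, this behavior is consistent with a token-table lookup that falls back to a default size when the model key is missing. A minimal sketch of that logic (an assumption paraphrased from the warning above, not ScrapeGraphAI's actual code):

    # Sketch of the suspected fallback; DEFAULT_TOKENS matches the warning above.
    DEFAULT_TOKENS = 8192

    def lookup_model_tokens(provider: str, model: str, table: dict) -> int:
        """Return the context size for provider/model, or a default if unlisted."""
        try:
            return table[provider][model]
        except KeyError:
            print(f"Model {provider}/{model} not found, "
                  f"using default token size ({DEFAULT_TOKENS})")
            return DEFAULT_TOKENS

This would explain why scraping still succeeds: the graph simply runs with the default context size instead of failing.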

To Reproduce
Steps to reproduce the behavior:
1: use ollama/phi3:medium in the llm config, as above
2: run it; a minimal reproduction is sketched below
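
A minimal reproduction, assuming the standard SmartScraperGraph entry point (the prompt and source URL are placeholders, and depending on the ScrapeGraphAI version an embeddings entry may also be needed for Ollama):

    from scrapegraphai.graphs import SmartScraperGraph

    graph_config = {
        "llm": {
            "model": "ollama/phi3:medium",
            "temperature": 0,
            "format": "json",  # Ollama needs the format to be specified explicitly
            "base_url": "http://localhost:11434",  # set Ollama URL
        },
    }

    smart_scraper_graph = SmartScraperGraph(
        prompt="List the main headings on the page",  # placeholder prompt
        source="https://example.com",                 # placeholder URL
        config=graph_config,
    )

    # Running the graph prints the "not found" warning but still returns content.
    print(smart_scraper_graph.run())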

Expected behavior
phi3:medium should be recognized by the config. The most likely cause is that this model is not listed in
scrapegraphai/helpers/models_tokens.py; a possible workaround is sketched below.
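
Until the model is added upstream, patching the token table at runtime may work. This is a hypothetical workaround: the import path mirrors scrapegraphai/helpers/models_tokens.py, and the 4096 context size is an assumption to verify against the phi3:medium model card:

    # Hypothetical workaround: register the missing model before building the graph.
    from scrapegraphai.helpers.models_tokens import models_tokens

    # 4096 is assumed, not confirmed; check the model card for the real context size.
    models_tokens["ollama"]["phi3:medium"] = 4096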

Screenshots
If applicable, add screenshots to help explain your problem.

[Screenshot of the warning shown above]

Desktop (please complete the following information):

  • OS: Win11
  • Browser: Opera
  • Version: 114.0.5282.235
