Web-LLM #10

@pacwoodson

Description

What about using Web-LLM instead of running an Ollama server?

https://github.com/mlc-ai/web-llm

It runs models in the browser via WASM/WebGPU.
