
# llm.nvim

A Neovim plugin for no-frills LLM-assisted programming.

(Showcase video: llm.nvim-showcase.mp4)

## Installation

Before using the plugin, set at least one of the `GROQ_API_KEY`, `OPENAI_API_KEY`, or `ANTHROPIC_API_KEY` environment variables to your API key for that service.
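For example, you might export a key in your shell profile (the key value shown is a placeholder):

```shell
# Placeholder value; substitute the real key for whichever provider you use.
export OPENAI_API_KEY="sk-your-key-here"
```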

### lazy.nvim

```lua
{
    "melbaldove/llm.nvim",
    dependencies = { "nvim-neotest/nvim-nio" }
}
```

## Usage

### setup()

Configures the plugin. The call can be omitted entirely to use the default configuration.

```lua
require('llm').setup({
    -- How long to wait for the request to start returning data.
    timeout_ms = 10000,
    services = {
        -- Supported services configured by default
        -- groq = {
        --     url = "https://api.groq.com/openai/v1/chat/completions",
        --     model = "llama3-70b-8192",
        --     api_key_name = "GROQ_API_KEY",
        -- },
        -- openai = {
        --     url = "https://api.openai.com/v1/chat/completions",
        --     model = "gpt-4o",
        --     api_key_name = "OPENAI_API_KEY",
        -- },
        -- anthropic = {
        --     url = "https://api.anthropic.com/v1/messages",
        --     model = "claude-3-5-sonnet-20240620",
        --     api_key_name = "ANTHROPIC_API_KEY",
        -- },

        -- Extra OpenAI-compatible services to add (optional)
        other_provider = {
            url = "https://example.com/other-provider/v1/chat/completions",
            model = "llama3",
            api_key_name = "OTHER_PROVIDER_API_KEY",
        }
    }
})
```
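If the built-in groq, openai, and anthropic defaults suit you, a minimal sketch of the call is:

```lua
-- Keep all defaults; no custom services or timeout override.
require('llm').setup({})
```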

### prompt()

Triggers the LLM assistant. Pass the optional `replace` flag to replace the current selection with the LLM's response. The prompt is the visually selected text, or, if there is no selection, the file content up to the cursor.
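As a sketch, the function can be called directly; the `service` name must match a key under `services` in `setup()`:

```lua
-- Send the current selection (or the buffer up to the cursor) to Anthropic,
-- appending the response instead of replacing the selection.
require("llm").prompt({ replace = false, service = "anthropic" })
```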

### create_llm_md()

Creates a new `llm.md` file in the current working directory, where you can write questions or prompts for the LLM. You can also call this function with the `:LLM` command.

### prompt_operatorfunc()

Lets you use text objects and motions to build the LLM prompt: the mapping enters operator-pending mode, after which motions such as `w`, `b`, `e`, `i{`, `a{`, `i"`, and `a"` select the text to send to the LLM.

## Example Bindings

```lua
vim.keymap.set("n", "<leader>m", function() require("llm").create_llm_md() end, { desc = "Create llm.md" })

-- keybinds for prompting with groq
vim.keymap.set("n", "<leader>,", function() require("llm").prompt({ replace = false, service = "groq" }) end, { desc = "Prompt with groq" })
vim.keymap.set("v", "<leader>,", function() require("llm").prompt({ replace = false, service = "groq" }) end, { desc = "Prompt with groq" })
vim.keymap.set("v", "<leader>.", function() require("llm").prompt({ replace = true, service = "groq" }) end, { desc = "Prompt while replacing with groq" })

-- keybinds for prompting with openai
vim.keymap.set("n", "<leader>g,", function() require("llm").prompt({ replace = false, service = "openai" }) end, { desc = "Prompt with openai" })
vim.keymap.set("v", "<leader>g,", function() require("llm").prompt({ replace = false, service = "openai" }) end, { desc = "Prompt with openai" })
vim.keymap.set("v", "<leader>g.", function() require("llm").prompt({ replace = true, service = "openai" }) end, { desc = "Prompt while replacing with openai" })

-- keybinds to support vim motions
vim.keymap.set("n", "g,", function() require("llm").prompt_operatorfunc({ replace = false, service = "groq" }) end, { desc = "Prompt with groq" })
vim.keymap.set("n", "g.", function() require("llm").prompt_operatorfunc({ replace = true, service = "groq" }) end, { desc = "Prompt while replacing with groq" })
```

## File tagging

You can tag files to include in the context by prefixing a path with `@`:

```
@foo/bar.md
@./foo/bar.md
```
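For instance, an `llm.md` prompt that pulls a file into the context might look like this (the path and question are illustrative):

```
@./src/main.lua

Explain what this module does and suggest improvements.
```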

## Roadmap

## Credits

- Special thanks to yacine and his ask.md vscode plugin for inspiration!