# A Bash Library for Ollama

Run LLM prompts straight from your shell, and more.

Quickstart - Usage - Demos - Documentation - Functions - License - More from the Attogram Project - Discord - Repo - Download
## Quickstart

```shell
git clone https://github.com/attogram/ollama-bash-lib.git
cd ollama-bash-lib
source ollama_bash_lib.sh

ollama_generate -m mistral:7b -p "Describe Bash in 3 words"
# Powerful, Flexible, Scripting.
```
```shell
# Tab completion to view available library functions
ollama_<TAB>
# ollama_api_get           ollama_generate_json         ollama_model_random
# ollama_api_ping          ollama_generate_stream       ollama_model_unload
# ollama_api_post          ollama_generate_stream_json  ollama_ps
# ollama_app_installed     ollama_lib_about             ollama_ps_json
# ollama_app_turbo         ollama_lib_version           ollama_show
# ollama_app_vars          ollama_list                  ollama_show_json
# ollama_app_version       ollama_list_array            ollama_thinking
# ollama_app_version_cli   ollama_list_json
# ollama_app_version_json  ollama_messages
# ollama_chat              ollama_messages_add
# ollama_chat_json         ollama_messages_clear
# ollama_chat_stream       ollama_messages_count
# ollama_chat_stream_json  ollama_messages_last
# ollama_generate          ollama_messages_last_json
```
## Usage

Include in your shell or script:

```shell
source ./ollama_bash_lib.sh
```

Include in your script, with error checking:

```shell
ollama_bash_lib="path/to/ollama_bash_lib.sh"
if [[ ! -f "$ollama_bash_lib" ]]; then
  echo "ERROR: Ollama Bash Lib Not Found: $ollama_bash_lib" >&2
  exit 1
fi
source "$ollama_bash_lib"
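The error-checking pattern above can be wrapped in a small function so that several scripts can share it. A minimal sketch; `ollama_lib_require` is a hypothetical helper name, not part of the library:

```shell
# Hypothetical helper (not part of Ollama Bash Lib): source the library
# from a given path, failing loudly if the file is missing.
ollama_lib_require() {
  local lib="${1:-./ollama_bash_lib.sh}"
  if [[ ! -f "$lib" ]]; then
    echo "ERROR: Ollama Bash Lib Not Found: $lib" >&2
    return 1
  fi
  # shellcheck source=/dev/null
  source "$lib"
}
```

A script can then call `ollama_lib_require "path/to/ollama_bash_lib.sh" || exit 1` once, before using any `ollama_*` functions.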
## Documentation

For detailed information on how to use Ollama Bash Lib, including how-to guides, function references, and dependency information, please see the documentation.
## Demos

Here are a few of the most popular demos:

| Demo | About |
|---|---|
| ollama_generate.md | Generate completion with `ollama_generate` |
| ollama_generate_json.md | Generate completion, JSON output, with `ollama_generate_json` |
| ollama_generate_stream.md | Generate completion, streaming, with `ollama_generate_stream` |
| ollama_chat.md | Chat completion with `ollama_messages_add` and `ollama_chat` |
| list.md | Ollama model list with `ollama_list`, `ollama_list_array`, and `ollama_list_json` |
| show.md | Model info with `ollama_show` and `ollama_show_json` |
| prompt.all.models.md | Send a prompt to all models |
| prompts.model.md | Send many prompts to a model |
| about.md | Ollama Bash Lib info with `ollama_lib_about` and `ollama_lib_version` |

See the full list of demos.
| Demo | About |
|---|---|
| review.lib.md | Code review of ollama_bash_lib.sh |
| review.lib-security.md | Security review of ollama_bash_lib.sh |
| review.readme.md | Marketing review of README.md |
| review.funny.md | Humorous project descriptions |
| review.snarky.md | Snarky, sarcastic project review |
## Functions

The library provides a rich set of functions to interact with Ollama, including:

- Generate functions: `ollama_generate`, `ollama_generate_stream`
- Chat functions: `ollama_chat`, `ollama_messages_add`
- Model functions: `ollama_list`, `ollama_show`

For a full list of all functions and their detailed documentation, see the Function Reference.
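As a sketch of how the chat functions compose into a conversation: the argument forms of `ollama_messages_add` and `ollama_chat` below are assumptions (only `ollama_generate -m MODEL -p PROMPT` is shown in the Quickstart), so check the Function Reference for the real signatures. The snippet skips gracefully when the library is not sourced:

```shell
# Sketch of a chat round-trip. The argument forms here are assumptions,
# not the documented signatures -- see the Function Reference.
source ./ollama_bash_lib.sh 2>/dev/null || true

if command -v ollama_chat >/dev/null 2>&1; then
  ollama_messages_add "user" "Describe Bash in 3 words"  # queue a user message
  ollama_chat -m mistral:7b                              # send the conversation
  ollama_messages_clear                                  # reset message history
else
  echo "Ollama Bash Lib not sourced; skipping chat sketch"
fi
```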
## License

Ollama Bash Lib is licensed under the terms of the MIT License.
## More from the Attogram Project

| Project | About |
|---|---|
| Attogram Project Discord Channel | Join the Attogram Project Discord Channel for announcements, technical support, and general chat about Attogram projects |
| Ollama Multirun | Run a prompt against all, or some, of your models running on Ollama. Creates web pages with the output, performance statistics, and model info. All in a single Bash shell script. |
| Ollama Bash Lib | A Bash Library for Ollama |
| Ollama Bash Toolshed | Chat with tool-calling models. Sample tools included. Add new tools to your shed with ease. Runs on Ollama. All via Bash shell scripts. |
| LLM Council | Start a chat room between all, or some, of your models running on Ollama. All in a single Bash shell script. |
| Small Models | Comparison of small open-source LLMs (8B parameters or less) |
| AI Test Zone | AI testing reports, hosted on https://attogram.github.io/ai_test_zone/ |