Add benchmark iterations, phi3 model, and help command
metaspartan committed Apr 30, 2024
1 parent ffce54a commit 52a2bfa
Showing 2 changed files with 225 additions and 120 deletions.
51 changes: 31 additions & 20 deletions README.md
@@ -18,43 +18,54 @@
Run the Ollamark CLI using the following flags to customize the benchmarking process:
- `-m`: Model name to benchmark. Default is `"llama3"`.
- `-s`: Submit benchmark results. It accepts a boolean value. Default is `false`.
- `-o`: Ollama API endpoint. Default is `"http://localhost:11434/api/generate"`.
- `-i`: Number of iterations to run the benchmark. Default is `2`.
- `-h` or `-help`: Display the help message below.

```
Usage: ollamark [options]
Options:
-i int
Number of benchmark iterations (default 2)
-m string
Model name to benchmark (default "llama3")
-o string
Ollama API endpoint (default "http://localhost:11434/api/generate")
-s Submit benchmark results to Ollamark (default false)
Examples:
For Ollamark GUI mode:
ollamark (no flags)
For Ollamark CLI mode:
ollamark -m llama3 -i 10
ollamark -m phi3
ollamark -m phi3 -s -o http://localhost:11434/api/generate
```

### Example
```bash
./ollamark -m llama3 -s -i 5 -o "http://localhost:11434/api/generate"
```

This command will benchmark the model "llama3" for 5 iterations, submit the results, and use the specified API endpoint to interface with Ollama.
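Each benchmark iteration amounts to one request against the Ollama generate endpoint. The sketch below builds such a request body; the struct fields follow the public Ollama API (`model`, `prompt`, `stream`), and the `buildPayload` helper and fixed prompt are illustrative assumptions, not the Ollamark source:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// GenerateRequest mirrors the JSON body the Ollama /api/generate
// endpoint accepts. Field names follow the public Ollama API docs;
// the struct itself is not taken from the Ollamark source.
type GenerateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

// buildPayload marshals one benchmark request for the given model.
func buildPayload(model string) ([]byte, error) {
	req := GenerateRequest{
		Model:  model,
		Prompt: "Why is the sky blue?", // any fixed prompt works for timing
		Stream: false,                  // one complete response per iteration
	}
	return json.Marshal(req)
}

func main() {
	body, err := buildPayload("llama3")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))
}
```

POSTing this body to the endpoint given by `-o` once per `-i` iteration is the essence of the benchmark loop.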

## Configuration
If any command-line arguments are provided, Ollamark runs in CLI mode; with no arguments it launches the Ollamark GUI application.

Refer to the code snippet for parsing CLI arguments:

```55:59:main.go
// Parse command-line arguments (Ollamark CLI)
modelPtr := flag.String("m", "llama3", "Model name to benchmark")
submitPtr := flag.Bool("s", false, "Submit benchmark results")
ollamaPtr := flag.String("o", "http://localhost:11434/api/generate", "Ollama API endpoint")
flag.Parse()
```
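The CLI-versus-GUI dispatch can be sketched as below. The `runMode` helper and the exact argument check are illustrative assumptions based on the behavior the README describes, not code from the Ollamark source:

```go
package main

import (
	"flag"
	"fmt"
	"os"
)

// runMode picks the mode the way the README describes: any
// command-line argument selects CLI mode, none selects the GUI.
// args[0] is the binary name, so more than one entry means flags
// (or other arguments) were passed.
func runMode(args []string) string {
	if len(args) > 1 {
		return "cli"
	}
	return "gui"
}

func main() {
	modelPtr := flag.String("m", "llama3", "Model name to benchmark")
	flag.Parse()
	fmt.Printf("mode=%s model=%s\n", runMode(os.Args), *modelPtr)
}
```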


## Additional Information
- Ensure the `.env` file is correctly configured; the application loads crucial environment variables from it at startup.
- The application can also be run as a Fyne GUI application if no CLI flags are provided.


Example .env file:
```
API_ENDPOINT=http://localhost:11434/api/generate
API_KEY=
```
## Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

For more details on the implementation, refer to the main function in the Go source file:

```48:276:main.go
func main() {
	// Load environment variables from .env file
	err := godotenv.Load()
```

## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- [Ollama](https://ollama.com/) for providing the Ollama API.
- [Fyne](https://fyne.io/) for providing the Fyne GUI framework.