2 changes: 1 addition & 1 deletion chapters/en/chapter11/5.mdx
@@ -92,7 +92,7 @@ Example: `"mmlu|abstract_algebra|0|0"` evaluates on MMLU's abstract algebra task

Let's set up an evaluation pipeline for our finetuned model. We will evaluate the model on a set of sub-tasks that relate to the domain of medicine.

-Here's a complete example of evaluating on automatic benchmarks relevant to one specific domain using Lighteval with the VLLM backend:
+Here's a complete example of evaluating on automatic benchmarks relevant to one specific domain using Lighteval with the accelerate backend:

```bash
lighteval accelerate \
```
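The diff collapses the rest of this command after its first line. For orientation only (this is not the collapsed content), a domain-specific run with the accelerate backend might look roughly like the sketch below; the `--model_args`, `--tasks`, and `--output_dir` flags follow an older Lighteval CLI, the model name and output directory are placeholders, and exact flag names and task strings vary between Lighteval versions:

```bash
# Hedged sketch, not the collapsed diff content.
# Task strings reuse the "mmlu|<subset>|0|0" format shown in the context line above,
# pointing at medicine-related MMLU subsets; adjust names to your Lighteval version.
lighteval accelerate \
    --model_args "pretrained=your-username/your-finetuned-model" \
    --tasks "mmlu|anatomy|0|0,mmlu|clinical_knowledge|0|0,mmlu|professional_medicine|0|0" \
    --output_dir "./medical_eval_results"
```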
2 changes: 1 addition & 1 deletion chapters/ro/chapter11/5.mdx
@@ -92,7 +92,7 @@ Example: `"mmlu|abstract_algebra|0|0"` evaluates on MMLU's abstract algebra task

Let's set up an evaluation pipeline for our fine-tuned model. We will evaluate the model on a set of sub-tasks related to the field of medicine.

-Here is a complete example of evaluating on automatic benchmarks relevant to a specific domain using Lighteval with the VLLM backend:
+Here is a complete example of evaluating on automatic benchmarks relevant to a specific domain using Lighteval with the accelerate backend:

```bash
lighteval accelerate \
```