docs: Update performance dashboard docs #8857
base: develop
Conversation
✅ Deploy Preview for label-studio-storybook canceled.
✅ Deploy Preview for heartex-docs ready!
✅ Deploy Preview for label-studio-docs-new-theme ready!
✅ Deploy Preview for label-studio-playground canceled.
shereezhang
left a comment
left some small comments! overall looks really good thank you!
| Column | Description |
| --- | --- |
| **Assigned** | The number of annotations that have been manually assigned to the user + the number of submitted annotations in projects using automatic assignment. |
I'd suggest here:
The number of tasks that have either been manually assigned to the user or have received a submitted annotation from the user.
| Column | Description |
| --- | --- |
| **Assigned** | The number of annotations that have been manually assigned to the user + the number of submitted annotations in projects using automatic assignment. |
| **Pending** | The number of annotations that have been manually assigned to the user and have not yet been submitted or skipped. |
I'd suggest something like this (note the count is about tasks, instead of annotations):
The number of tasks manually assigned to the user which have neither received their submitted annotation nor have been skipped.
Ah, the tasks vs. annotation trap strikes again!
| **Time Reviewing** | The total time spent reviewing. This is calculated from when a user opens an annotation to when they either take action on it (such as approving it) or close it by moving away from the task. This also includes idle time when the window is in focus. |

!!! note
    Data collection for review time began on September 25, 2025. If you filter for earlier dates, review time will not be calculated.
do we wanna call out on-prem here?
e.g.
Data collection for review time began on September 25, 2025 for LSE Cloud, or v 2.30 for LSE On-prem deployments. If you filter for earlier dates, review time will not be calculated.
| Column | Description |
| --- | --- |
| **Annotations Reviewed** | The number of annotations that the user has reviewed. This includes all review actions (accept, reject, fix + accept). |
| **Pending** | The number of annotations that the user has been manually assigned as a reviewer and which have not yet been completed. |
instead of "not yet been completed"
can we say
"... not yet received their review"
| **Fix + Accepted** | Number of annotations the user has updated and then accepted. |
| **Rejected** | Number of annotations the user has rejected. |
| **Total Time** | The total time this user spent reviewing. See [the table above](#Review-performance-summaries) for a complete description. |
| **Average Time** | The average time spent on each annotation. |
can we just specify the review part here? e.g.:
The average time spent reviewing each annotation.
The median time spent reviewing each annotation.
shereezhang
left a comment
thank you!
Update to reflect the new analytics UI