Conversation

**@caitlinwheeless** (Contributor)

Update to reflect the new analytics UI


netlify bot commented Nov 20, 2025

Deploy Preview for label-studio-storybook canceled.

| Name | Link |
| --- | --- |
| 🔨 Latest commit | d443fb8 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/label-studio-storybook/deploys/6926063faee82100080fb14b |


netlify bot commented Nov 20, 2025

Deploy Preview for heartex-docs ready!

| Name | Link |
| --- | --- |
| 🔨 Latest commit | d443fb8 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/heartex-docs/deploys/6926063f37597d00085a2983 |
| 😎 Deploy Preview | https://deploy-preview-8857--heartex-docs.netlify.app |


netlify bot commented Nov 20, 2025

Deploy Preview for label-studio-docs-new-theme ready!

| Name | Link |
| --- | --- |
| 🔨 Latest commit | d443fb8 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/label-studio-docs-new-theme/deploys/6926063fef711000087934a5 |
| 😎 Deploy Preview | https://deploy-preview-8857--label-studio-docs-new-theme.netlify.app |

@github-actions bot added the `docs` label Nov 20, 2025

netlify bot commented Nov 20, 2025

Deploy Preview for label-studio-playground canceled.

| Name | Link |
| --- | --- |
| 🔨 Latest commit | d443fb8 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/label-studio-playground/deploys/6926063f4590250008e5be1c |

**@shereezhang** (Contributor) left a comment


left some small comments! overall looks really good thank you!


| Column | Description |
| --- | --- |
| **Assigned** | The number of annotations that have been manually assigned to the user + the number of submitted annotations in projects using automatic assignment. |
**@shereezhang** (Contributor):


I'd suggest here:

> The number of tasks that have either been manually assigned to the user or have received a submitted annotation from the user.
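The distinction the suggestion draws (counting tasks, not annotations, with an "either/or" condition) can be sketched in a few lines. This is a hypothetical illustration only: the `Task` fields and `assigned_count` helper are invented for the example and are not the actual Label Studio data model.

```python
# Hypothetical sketch of the suggested "Assigned" definition: tasks that were
# either manually assigned to the user or received a submitted annotation from
# the user. Field names are illustrative, not the real Label Studio schema.
from dataclasses import dataclass, field


@dataclass
class Task:
    id: int
    assignees: set = field(default_factory=set)     # user IDs manually assigned
    submitted_by: set = field(default_factory=set)  # user IDs with a submitted annotation


def assigned_count(tasks, user_id):
    """Count tasks manually assigned to the user OR annotated by them."""
    return sum(
        1 for t in tasks
        if user_id in t.assignees or user_id in t.submitted_by
    )


tasks = [
    Task(1, assignees={"alice"}),                          # assigned, not yet annotated
    Task(2, submitted_by={"alice"}),                       # auto-assigned, annotated
    Task(3, assignees={"alice"}, submitted_by={"alice"}),  # both -> counted once
    Task(4, assignees={"bob"}),
]
print(assigned_count(tasks, "alice"))  # → 3
```

Note that a task satisfying both conditions is counted once, which is exactly why the per-task framing avoids the double counting an annotation-based sum could introduce.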

| Column | Description |
| --- | --- |
| **Assigned** | The number of annotations that have been manually assigned to the user + the number of submitted annotations in projects using automatic assignment. |
| **Pending** | The number of annotations that have been manually assigned to the user and have not yet been submitted or skipped. |
**@shereezhang** (Contributor):


I'd suggest something like this (note the count is about tasks, instead of annotations):

> The number of tasks manually assigned to the user which have neither received their submitted annotation nor have been skipped.

**@caitlinwheeless** (Contributor, Author):


Ah, the tasks vs. annotation trap strikes again!

| **Time Reviewing** | The total time spent reviewing. This is calculated from when a user opens an annotation to when they either take action on it (such as approving it) or close it by moving away from the task. This also includes idle time when the window is in focus. |

!!! note
Data collection for review time began on September 25, 2025. If you filter for earlier dates, review time will not be calculated.
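The "Time Reviewing" description above amounts to summing per-annotation intervals: each one opens when the reviewer opens the annotation and closes when they act on it or navigate away, with focused idle time inside an interval still counted. A minimal sketch, with invented event names that are not the actual LSE telemetry format:

```python
# Hypothetical sketch of the review-time calculation described above.
# Each interval runs from "open" to the next action ("accept", "reject",
# "close", etc.); idle time within an interval counts, time between
# intervals does not. Event names and shapes are illustrative only.
from datetime import datetime

events = [
    ("open",   datetime(2025, 9, 26, 10, 0, 0)),
    ("accept", datetime(2025, 9, 26, 10, 2, 30)),  # 150 s reviewing
    ("open",   datetime(2025, 9, 26, 10, 3, 0)),
    ("close",  datetime(2025, 9, 26, 10, 3, 45)),  # 45 s, moved away without acting
]


def total_review_seconds(events):
    """Sum (open -> action/close) intervals from an ordered event stream."""
    total, opened_at = 0.0, None
    for kind, ts in events:
        if kind == "open":
            opened_at = ts
        elif opened_at is not None:  # any action or close ends the interval
            total += (ts - opened_at).total_seconds()
            opened_at = None
    return total


print(total_review_seconds(events))  # → 195.0
```

The second interval shows why "close it by moving away from the task" matters: time still accrues even when no review action was taken.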
**@shereezhang** (Contributor):


do we wanna call out on-prem here?

e.g.

> Data collection for review time began on September 25, 2025 for LSE Cloud, or v2.30 for LSE On-prem deployments. If you filter for earlier dates, review time will not be calculated.

| Column | Description |
| --- | --- |
| **Annotations Reviewed** | The number of annotations that the user has reviewed. This includes all review actions (accept, reject, fix + accept). |
| **Pending** | The number of annotations that the user has been manually assigned as a reviewer and which have not yet been completed. |
**@shereezhang** (Contributor):


instead of "not yet been completed"

can we say

"... not yet received their review"

| **Fix + Accepted** | Number of annotations the user has updated and then accepted. |
| **Rejected** | Number of annotations the user has rejected. |
| **Total Time** | The total time this user spent reviewing. See [the table above](#Review-performance-summaries) for a complete description. |
| **Average Time** | The average time spent on each annotation. |
**@shereezhang** (Contributor):


can we just specify the review part here, e.g.:

> The average time spent reviewing each annotation.
>
> The median time spent reviewing each annotation.

**@shereezhang** (Contributor) left a comment


thank you!
