Commit 3e43dce

Authored by caitlinwheeless
docs: DOC-267: Update the Annotator Performance Report page (#6654)
Co-authored-by: caitlinwheeless <[email protected]>
1 parent 0638481 commit 3e43dce

2 files changed: +15 -6 lines changed

docs/source/guide/dashboard_annotator.md (+15 -6)
```diff
@@ -1,10 +1,10 @@
 ---
-title: Annotator performance dashboard - Beta 🧪
+title: Annotator performance dashboard
 short: Annotator dashboard
 tier: enterprise
 type: guide
 order: 0
-order_enterprise: 70
+order_enterprise: 72
 meta_title: Annotator dashboard
 meta_description: Use annotator dashboards to track annotator work and progress.
 section: "Project Management"
@@ -28,10 +28,20 @@ The dashboard is available from the Organization page, meaning that your user ro
 
 From the organization members list, select the user you want to view. Annotator performance reports are available for users in all roles, not just the Annotator role.
 
-With the user selected, click **Performance Report** on the right.
+With the user selected, click **Annotator Performance Report** on the right.
 
 ![Screenshot of Performance Report button](/images/project/user_report.png)
 
+## Export data
+
+You can use the **Export** drop-down to export the following:
+
+* **Report** - Download the information in the dashboard as CSV or JSON.
+
+* **Timeline** - Download a detailed timeline of all the user's annotation actions within the time frame, including when they began and submitted each annotation.
+
+* **Comments Received** - Download a CSV file with all of the comments that other users have left on the user's annotations.
+
 ## Metrics
 
 ### Data used
@@ -47,15 +57,14 @@ The metrics are calculated from the following data:
 
 ### Performance summaries
 
-![Screenshot of annotator dashboard summaries](/images/project/annotator_dashboard_summary.png)
-
 | Metric | Calculation | Description |
 | --- | --- | --- |
-| **Total Time** | Sum of `lead_times` | The total time spent annotating during the selected time frame. This is calculated based on annotations that meet the criteria for **Submitted Annotations** (see below). <br /><br />The total time does not include time spent on annotations that have not been submitted and/or updated. For example, it does not include time spent on drafts or time spent on skipped annotations. <br /><br />However, if they return to an annotation draft or a previously skipped annotation, then their earlier time spent on the annotation is included when calculating their total annotation time. |
+| **Total Time** | Sum of `lead_times` | The total time spent annotating during the selected time frame. This is calculated based on annotations that meet the criteria for **Submitted Annotations** (see below). <br /><br />All annotations have a `lead_time`. The lead time reflects how much time a user spent labeling from the moment the task was opened until they click **Submit** or **Update**. This includes idle time. <br /><br />The total time does not include time spent on annotations that have not been submitted and/or updated. For example, it does not include time spent on drafts or time spent on skipped annotations. <br /><br />However, if they return to an annotation draft or a previously skipped annotation, then their earlier time spent on the annotation is included when calculating their total annotation time. |
 | **Submitted Annotations** | Sum of `submitted_or_reviewed` | The total number of annotations the user submitted during the selected time frame. <br /><br />This includes annotations that have been submitted and updated. <br /><br />It does not include annotations that have been skipped. It also does not include annotations that were submitted and have since been rejected by a reviewer. However, if the annotator updates a rejected annotation and that fix is then accepted by a reviewer, the corrected annotation is included in their Submitted Annotations count. <br /><br />Note that each annotation is only included in the submitted count once. Label Studio does not count the same annotation twice if it is later updated. |
 | **Total Time (Median)** | Sum of `submitted_or_reviewed` * the median of `lead_times` | The number of submitted annotations multiplied by their median annotation time. |
 | **Time per Annotation (Median)** | Median of `lead_times` | The median time they spent on each submitted annotation. |
 | **Time per Annotation (Average)** | Average of `lead_times` | The average time they spent on each submitted annotation. |
+| **Performance Score** | Calculated from reviewer actions | The Performance Score reflects the overall performance of annotators in terms of review actions (**Accept**, **Reject**, **Fix+Accept**). <br /><br />The calculation is as follows:<ul><li>Each annotation review action (**Accept**, **Reject**, **Fix+Accept**) contributes to the score.</li><li>The score is calculated by summing the scores of all review actions and dividing by the total number of review actions. For example:<ul><li>If an annotation is rejected twice and then accepted once, the Performance Score would be (0 + 0 + 1) / 3 = 33%.</li><li>If an annotation is rejected once and then fixed+accepted with a score of 42%, the Performance Score would be (0 + 0.42) / 2 = 21%.</li></ul></li></ul> |
 
 ### Graphs
 
```
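The summary metrics in the table above (Total Time, Total Time (Median), and the per-annotation medians and averages) can be sketched in a few lines. This is an illustrative sketch only, not Label Studio's implementation: the `summarize` function name and the sample lead times are hypothetical, while the `lead_times` field and the formulas come from the table.

```python
from statistics import mean, median

def summarize(lead_times):
    """Summaries over the lead times (seconds) of submitted annotations.

    Mirrors the dashboard formulas: Total Time is the sum of lead_times;
    Total Time (Median) is the submitted-annotation count multiplied by
    the median lead time.
    """
    n = len(lead_times)
    return {
        "submitted_annotations": n,
        "total_time": sum(lead_times),
        "total_time_median": n * median(lead_times),
        "time_per_annotation_median": median(lead_times),
        "time_per_annotation_average": mean(lead_times),
    }

# Four submitted annotations with hypothetical lead times in seconds
stats = summarize([30, 45, 60, 120])
print(stats["total_time"])                  # 255
print(stats["time_per_annotation_median"])  # 52.5
```

Note that because a rejected-then-fixed annotation keeps its earlier `lead_time`, the same list of lead times drives every summary metric once the annotation qualifies as submitted.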
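The Performance Score row added in this commit describes a simple mean over review-action scores, which can be reproduced with a small helper. This is an illustrative sketch, not Label Studio code: `performance_score` is a hypothetical name, and the scoring convention (Accept = 1.0, Reject = 0.0, Fix+Accept = the reviewer's fractional score) follows the description in the table.

```python
def performance_score(action_scores):
    """Mean score over all review actions on a user's annotations.

    Assumed scoring, per the dashboard description: Accept = 1.0,
    Reject = 0.0, Fix+Accept = the reviewer's fractional score.
    """
    if not action_scores:
        return None  # no review actions yet; the score is undefined
    return sum(action_scores) / len(action_scores)

# Rejected twice, then accepted once: (0 + 0 + 1) / 3 = 33%
print(round(performance_score([0.0, 0.0, 1.0]) * 100))  # 33
# Rejected once, then fixed+accepted at 42%: (0 + 0.42) / 2 = 21%
print(round(performance_score([0.0, 0.42]) * 100))  # 21
```

Both worked examples match the ones given in the table row: every review action counts equally, so repeated rejections of the same annotation keep pulling the score down until an accept (full or fractional) offsets them.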