docs/source/guide/dashboard_annotator.md (+15 −6)
@@ -1,10 +1,10 @@
 ---
-title: Annotator performance dashboard - Beta 🧪
+title: Annotator performance dashboard
 short: Annotator dashboard
 tier: enterprise
 type: guide
 order: 0
-order_enterprise: 70
+order_enterprise: 72
 meta_title: Annotator dashboard
 meta_description: Use annotator dashboards to track annotator work and progress.
 section: "Project Management"
@@ -28,10 +28,20 @@ The dashboard is available from the Organization page, meaning that your user ro
 
 From the organization members list, select the user you want to view. Annotator performance reports are available for users in all roles, not just the Annotator role.
 
-With the user selected, click **Performance Report** on the right.
+With the user selected, click **Annotator Performance Report** on the right.
 
 
 
+## Export data
+
+You can use the **Export** drop-down to export the following:
+
+* **Report** - Download the information in the dashboard as CSV or JSON.
+
+* **Timeline** - Download a detailed timeline of all the user's annotation actions within the time frame, including when they began and submitted each annotation.
+
+* **Comments Received** - Download a CSV file with all of the comments that other users have left on the user's annotations.
+
 ## Metrics
 
 ### Data used
@@ -47,15 +57,14 @@ The metrics are calculated from the following data:
 
 ### Performance summaries
 
-
-
 | Metric | Calculation | Description |
 | --- | --- | --- |
-|**Total Time**| Sum of `lead_times`| The total time spent annotating during the selected time frame. This is calculated based on annotations that meet the criteria for **Submitted Annotations** (see below). <br /><br />The total time does not include time spent on annotations that have not been submitted and/or updated. For example, it does not include time spent on drafts or time spent on skipped annotations. <br /><br />However, if they return to an annotation draft or a previously skipped annotation, then their earlier time spent on the annotation is included when calculating their total annotation time. |
+|**Total Time**| Sum of `lead_times`| The total time spent annotating during the selected time frame. This is calculated based on annotations that meet the criteria for **Submitted Annotations** (see below). <br /><br />All annotations have a `lead_time`. The lead time reflects how much time a user spent labeling from the moment the task was opened until they click **Submit** or **Update**. This includes idle time. <br /><br />The total time does not include time spent on annotations that have not been submitted and/or updated. For example, it does not include time spent on drafts or time spent on skipped annotations. <br /><br />However, if they return to an annotation draft or a previously skipped annotation, then their earlier time spent on the annotation is included when calculating their total annotation time. |
 |**Submitted Annotations**| Sum of `submitted_or_reviewed`| The total number of annotations the user submitted during the selected time frame. <br /><br />This includes annotations that have been submitted and updated. <br /><br />It does not include annotations that have been skipped. It also does not include annotations that were submitted and have since been rejected by a reviewer. However, if the annotator updates a rejected annotation and that fix is then accepted by a reviewer, the corrected annotation is included within their Submitted Annotations count. <br /><br />Note that each annotation is only included in their submitted count once. Label Studio does not count the same annotation twice if it is later updated. |
 |**Total Time (Median)**| Sum of `submitted_or_reviewed` * the median of `lead_times`| The number of submitted annotations multiplied by their median annotation time. |
 |**Time per Annotation (Median)**| Median of `lead_times`| The median time they spent on each submitted annotation. |
 |**Time per Annotation (Average)**| Average of `lead_times`| The average time they spent on each submitted annotation. |
+|**Performance Score**| Calculated from reviewer actions | The Performance Score reflects the overall performance of annotators in terms of review actions (**Accept**, **Reject**, **Fix+Accept**). <br /><br />The calculation is as follows:<ul><li>Each annotation review action (**Accept**, **Reject**, **Fix+Accept**) contributes to the score.</li><li>The score is calculated by summing the scores of all review actions and dividing by the total number of review actions. For example: </li><ul><li>If an annotation is rejected twice and then accepted once, the Performance Score would be (0 + 0 + 1) / 3 = 33%.</li><li>If an annotation is rejected once and then fixed+accepted with a score of 42%, the Performance Score would be (0 + 0.42) / 2 = 21%.</li></ul></ul> |
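The calculations described in the table above can be sketched in a few lines of Python. This is a minimal illustration of the arithmetic only, not Label Studio's actual implementation; the function names and the flat list inputs are hypothetical, and review scores are assumed to be Accept = 1.0, Reject = 0.0, and Fix+Accept = the fractional score the reviewer assigned.

```python
from statistics import mean, median

def summary_metrics(lead_times):
    """Hypothetical sketch of the Performance summaries table.
    lead_times: seconds spent on each submitted/updated annotation."""
    submitted = len(lead_times)  # Sum of submitted_or_reviewed
    return {
        "total_time": sum(lead_times),                      # Sum of lead_times
        "submitted_annotations": submitted,
        "total_time_median": submitted * median(lead_times),
        "time_per_annotation_median": median(lead_times),
        "time_per_annotation_average": mean(lead_times),
    }

def performance_score(review_actions):
    """Average of per-action scores: Accept=1.0, Reject=0.0,
    Fix+Accept=reviewer-assigned fraction (assumed encoding)."""
    return sum(review_actions) / len(review_actions)

# The two examples from the table:
# rejected twice, then accepted once -> (0 + 0 + 1) / 3
print(round(performance_score([0, 0, 1]) * 100))    # 33
# rejected once, then fixed+accepted at 42% -> (0 + 0.42) / 2
print(round(performance_score([0, 0.42]) * 100))    # 21
```

Note that under this reading every review action is weighted equally, so an annotation that is reviewed three times contributes three terms to the denominator.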