**docs/source/guide/export.md** (+2 −6)

@@ -20,8 +20,7 @@ Image annotations exported in JSON format use percentages of overall image size,

  !!! note
      Some export formats export only the annotations and not the data from the task. For more information, see the [export formats supported by Label Studio](#Export-formats-supported-by-Label-Studio).

- <!-- md annotation_ids.md -->
+ {% insertmd includes/annotation_ids.md %}

  <div class="opensource-only">
@@ -192,12 +191,9 @@ Results are stored in a tab-separated tabular file with column names specified b

  Export object detection annotations in the YOLOv3 and YOLOv4 format. Supports object detection labeling projects that use the `RectangleLabels` tag.

  {% insertmd includes/task_format.md %}

- <!-- md image_units.md -->
+ {% insertmd includes/image_units.md %}

  ## Manually convert JSON annotations to another format

  You can run the [Label Studio converter tool](https://github.com/HumanSignal/label-studio-converter) on a directory or file of completed JSON annotations using the command line or Python to convert the completed annotations from Label Studio JSON format into another format.
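The converter tool handles format conversion for you. As a rough illustration of what a JSON-to-YOLO conversion involves, here is a minimal sketch based on the percent-based rectangle coordinates in the Label Studio JSON export; the class list and sample values are hypothetical, and a real converter handles many more cases:

```python
# Sketch: convert one Label Studio rectangle annotation (percent units,
# top-left origin) into a YOLO bounding-box line:
#   class_id x_center y_center width height   (all coords normalized 0-1)

def rect_to_yolo(value: dict, classes: list[str]) -> str:
    """`value` is the `value` dict of a `rectanglelabels` result."""
    # Label Studio stores x/y as the top-left corner in percent of image size;
    # YOLO wants the box center, normalized to 0-1.
    x_center = (value["x"] + value["width"] / 2) / 100
    y_center = (value["y"] + value["height"] / 2) / 100
    width = value["width"] / 100
    height = value["height"] / 100
    class_id = classes.index(value["rectanglelabels"][0])
    return f"{class_id} {x_center:.6f} {y_center:.6f} {width:.6f} {height:.6f}"

# Hypothetical label set and annotation value:
classes = ["Airplane", "Car"]
value = {"x": 25.0, "y": 10.0, "width": 50.0, "height": 20.0,
         "rectanglelabels": ["Car"]}
print(rect_to_yolo(value, classes))  # → 1 0.500000 0.200000 0.500000 0.200000
```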
**docs/source/guide/install.md** (+1 −3)

@@ -20,7 +20,7 @@ Install Label Studio on premises or in the cloud. Choose the installation method

  Label Studio is also available as an [enterprise product](https://heartex.com/), which you can explore instantly through a [free trial](https://humansignal.com/free-trial).

…

  The most important change to be aware of is the renaming of "completions" to "annotations". See the [updated JSON format for completed tasks](export.html#Raw_JSON_format_of_completed_tasks).

  If you customized the Label Studio Frontend, see the [Frontend reference guide](frontend_reference.html) for required updates to maintain compatibility with version 1.0.0.
**docs/source/guide/install_enterprise_docker.md** (+1 −1)

@@ -19,7 +19,7 @@ See [Secure Label Studio](security.html) for more details about security and har

  To install Label Studio Community Edition, see [Install Label Studio](https://labelstud.io/guide/install). This page is specific to the Enterprise version of Label Studio.
**docs/source/guide/release_notes.md** (+114)

@@ -19,6 +19,120 @@ meta_description: Review new features, enhancements, and bug fixes for on-premis

Before upgrading, review the steps outlined in [Upgrade Label Studio Enterprise](upgrade_enterprise) and ensure that you complete the recommended tests after each upgrade.

<div class="onprem-highlight">Paginated multi-image labeling and a new Task Reservation setting</div>

*Dec 17, 2024*

Helm Chart version: 1.7.3

### New features

#### Paginated multi-image labeling

Paginated multi-image labeling allows you to label an album of images within a single task. When enabled, a page navigation tool is available within the labeling interface.

While you can use paginated multi-image labeling with any series of related images, it is especially useful for document annotation.

For example, you can pre-process a PDF to convert it into image files, and then use the pagination toolbar to navigate the PDF. For more information, see our [Multi-Page Document Annotation template](/templates/multi-page-document-annotation).

To enable this feature, use the `valueList` parameter on the [`<Image>` tag](/tags/image).


#### Set task reservation time

There is a new project setting under **Annotation > Task Reservation**.

You can use this setting to determine how many minutes a task can be reserved by a user. You can also use it for projects that have become stalled due to too many reserved tasks. For more information, see [Project settings - Task Reservation](https://docs.humansignal.com/guide/project_settings_lse#lock-tasks).

By default, the task reservation time is set to one day (1440 minutes). This setting is only available when task distribution is set to **Auto**.



### Enhancements

- When using the **Send Test Request** action for a connected ML backend model, you will now see more descriptive error messages.
- The placeholder text within labeling configuration previews is now more descriptive of what should appear, rather than providing example text strings.
- Improved the inter-annotator agreement API so that it is more performant and can better handle a high number of annotators.
- TextArea elements have been updated to reflect the look and feel of other labeling elements.

### Bug fixes

- Fixed an issue where SSO/SAML users were not being redirected back to the originally requested URL.
- Fixed an issue where a timeout on the inter-annotator agreement API would cause missing data in the Annotator Summary table on the Members page.
- Fixed an issue where the default date format used when exporting to CSV was incompatible with Google Sheets.
- Fixed an issue where commas in comment text were causing errors when exporting to CSV from the Annotator Performance report.
- Fixed an issue that was causing 404 errors in the Activity Log.
- Fixed an issue where users were unable to deselect tools from the toolbar by clicking them a second time.
- Fixed an issue where users were presented with Reviewer actions even if the annotation was still in Draft state.
- Fixed an issue with the Source Storage editor in which some fields were overlapping in the user interface.
- Fixed an issue with the Data Manager filters when the columns are different from those in the labeling config and when `$undefined$` is present in the task data.
- Fixed an issue where filter options in the Data Manager would disappear on hover.
- Fixed an issue where XML comments were incorrectly considered during label config validation.
- Fixed an issue causing an error when marking a comment as read.
- Fixed an issue where an error message would appear when selecting or unselecting the **Get the latest news & tips from Heidi** option on the Account Settings page.
- Fixed an issue where annotators were seeing a tooltip message stating that the project was not ready yet, even though the project had already been completed.
- Fixed an issue where project-level roles did not affect role upgrades performed at the Organization level.
**docs/source/guide/troubleshooting.md** (+2 −2)

@@ -214,7 +214,7 @@ Check that you are using the correct annotation units.

  {% details <b>Image annotation units</b> %}

- <!-- md image_units.md -->
+ {% insertmd includes/image_units.md %}

  {% enddetails %}

@@ -503,4 +503,4 @@ You must ensure that the ML backend can access your Label Studio data. If it can

  * You are unable to see predictions when loading tasks in Label Studio.
  * Your ML backend appears to be connected properly, but cannot seem to complete any auto annotations within tasks.

- To remedy this, ensure you have set the `LABEL_STUDIO_URL` and `LABEL_STUDIO_API_KEY` environment variables. For more information, see [Allow the ML backend to access Label Studio data](ml#Allow-the-ML-backend-to-access-Label-Studio-data).
+ To remedy this, ensure you have set the `LABEL_STUDIO_URL` and `LABEL_STUDIO_API_KEY` environment variables. For more information, see [Allow the ML backend to access Label Studio data](ml#Allow-the-ML-backend-to-access-Label-Studio-data).
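For example, when launching an ML backend from a shell, you might export both variables first. This is a sketch only: the URL, token, and the commented-out start command are placeholders to substitute with your own values.

```shell
# Placeholders: use your own Label Studio URL and API token here.
export LABEL_STUDIO_URL="http://localhost:8080"
export LABEL_STUDIO_API_KEY="your-api-token"

# Then start the ML backend in the same shell so it inherits the variables,
# e.g. (hypothetical backend directory):
# label-studio-ml start ./my_backend
```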
**docs/source/includes/nested-classification.md** (+1 −18)

@@ -1,20 +1,3 @@

- <!-- Unfortunately included md files doesn't support code highlighting, do it manually -->
- <script src="/js/highlight.min.js"></script>
- <script>
-   hljs.highlightAll();
-   $(function() {
-     $('.code-badge-language').each(function (o, v) {
-       console.log(o)
-       if ($(v).html() === 'undefined')
-         $(v).html('')
-       if ($(v).html() === 'bash')
-         $(v).html('shell')
-       if ($(v).html() === 'html')
-         $(v).html('xml')
-     })
-   });
- </script>
-
  ## Enhance classification templates with nested choices

  You can add conditional or nested choices to any classification template. If you want classification options to appear only if certain conditions are met, such as specific choices being selected by annotators, adapt one of these conditional and nested classification examples.
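As one sketch of the conditional pattern, a follow-up `Choices` block can be revealed only after a specific choice is selected, using the `visibleWhen` attributes on the [Choices](/tags/choices.html) tag. The label values here are hypothetical:

```xml
<View>
  <Text name="text" value="$text"/>
  <Choices name="sentiment" toName="text" choice="single">
    <Choice value="Positive"/>
    <Choice value="Negative"/>
  </Choices>
  <!-- Shown only after "Negative" is selected in the block above -->
  <Choices name="reason" toName="text"
           visibleWhen="choice-selected"
           whenTagName="sentiment"
           whenChoiceValue="Negative">
    <Choice value="Too expensive"/>
    <Choice value="Poor quality"/>
  </Choices>
</View>
```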
@@ -132,4 +115,4 @@ Add a third [Choices](/tags/choices.html) control tag to prompt the annotator to
**docs/source/templates/intent_classification.md** (+1 −1)

@@ -63,7 +63,7 @@ Use the [Choices](/tags/choices.html) control tag to classify the intent for eac

  Because of the `perRegion="true"` argument, each choice applies to a different labeled segment. The `required="true"` argument ensures that each labeled audio segment has a choice selected before the annotation can be submitted.
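A configuration along these lines might look like the following sketch; the tag names and intent values are assumptions for illustration, not the template's exact config:

```xml
<View>
  <Labels name="labels" toName="audio">
    <Label value="Segment"/>
  </Labels>
  <Audio name="audio" value="$audio"/>
  <!-- perRegion="true" shows the choices once per labeled segment;
       required="true" forces a selection for each one -->
  <Choices name="intent" toName="audio" perRegion="true" required="true">
    <Choice value="Question"/>
    <Choice value="Request"/>
    <Choice value="Complaint"/>
  </Choices>
</View>
```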