@@ -52,16 +52,6 @@ This is yet another high quality EN-only base model for entity extraction.
It is a 12-layer pretrained [Transformer][7] model optimized for conversation.
Its architecture is pretrained for example-based use ([KNN][3]), thus it can be used out of the box.
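
As a rough illustration of what "example-based use" means here, the sketch below embeds each token with an ONNX Transformer encoder and tags new tokens by their nearest labeled neighbour. This is only a sketch under assumptions: the model path, the tokenizer, the `input_ids`/`attention_mask` input names, and the hidden-state output layout are illustrative and not taken from these releases.

```python
import numpy as np
import onnxruntime as ort
from sklearn.neighbors import KNeighborsClassifier
from transformers import AutoTokenizer

# Assumed tokenizer and model path -- illustrative only, not part of these releases.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
session = ort.InferenceSession("dte_example_ner.onnx")

def token_vectors(words):
    """Return one embedding per word (taken from the first sub-token of each word)."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="np")
    hidden = session.run(None, {
        "input_ids": enc["input_ids"].astype(np.int64),         # assumed input names
        "attention_mask": enc["attention_mask"].astype(np.int64),
    })[0][0]                                                     # assumed output shape: (seq_len, dim)
    first_subtoken = {}
    for idx, wid in enumerate(enc.word_ids(0)):
        if wid is not None and wid not in first_subtoken:
            first_subtoken[wid] = hidden[idx]
    return np.stack([first_subtoken[i] for i in range(len(words))])

# A handful of labeled tokens acts as the "index" the nearest-neighbour lookup runs over.
train_words = ["fly", "to", "Paris", "next", "Monday"]
train_tags  = ["O",   "O",  "B-city", "O",   "B-date"]
knn = KNeighborsClassifier(n_neighbors=1).fit(token_vectors(train_words), train_tags)

test_words = ["book", "a", "trip", "to", "Rome"]
print(list(zip(test_words, knn.predict(token_vectors(test_words)))))
```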
- ### pretrained.20210105.microsoft.dte.00.12.bert_example_ner_multilingual.onnx (experimental)
- This is a high quality multilingual base model for entity extraction.
- It is a 12-layer pretrained [Transformer][7] model optimized for conversation.
- Its architecture is pretrained for example-based use ([KNN][3]), thus it can be used out of the box.
-
- ### pretrained.20210105.microsoft.dte.00.12.tulr_example_ner_multilingual.onnx (experimental)
- This is a high quality multilingual base model for entity extraction.
- It is a 12-layer pretrained [Transformer][7] model optimized for conversation.
- Its architecture is pretrained for example-based use ([KNN][3]), thus it can be used out of the box.
-
### pretrained.20210205.microsoft.dte.00.06.bert_example_ner.en.onnx (experimental)
This is a high quality EN-only base model for entity extraction. It's smaller and faster than its 12-layer alternative.
It is a 6-layer pretrained [Transformer][7] model optimized for conversation.
@@ -72,16 +62,6 @@ This is a high quality EN-only base model for entity extraction. It's smaller an
It is a 6-layer pretrained [Transformer][7] model optimized for conversation.
Its architecture is pretrained for example-based use ([KNN][3]), thus it can be used out of the box.
- ### pretrained.20210205.microsoft.dte.00.06.bert_example_ner_multilingual.onnx (experimental)
- This is a high quality multilingual base model for entity extraction. It's smaller and faster than its 12-layer alternative.
- It is a 6-layer pretrained [Transformer][7] model optimized for conversation.
- Its architecture is pretrained for example-based use ([KNN][3]), thus it can be used out of the box.
-
- ### pretrained.20210205.microsoft.dte.00.06.tulr_example_ner_multilingual.onnx (experimental)
- This is a high quality multilingual base model for entity extraction. It's smaller and faster than its 12-layer alternative.
- It is a 6-layer pretrained [Transformer][7] model optimized for conversation.
- Its architecture is pretrained for example-based use ([KNN][3]), thus it can be used out of the box.
-
## Models Evaluation
For a more quantitative comparison of the different models, see the following performance characteristics.
@@ -136,13 +116,17 @@ For a more quantitative comparison analysis of the different models see the foll
| ------------------------------------------------------------ | ---------- | ------ | ----------------------- | --------------- |
| pretrained.20210205.microsoft.dte.00.06.bert_example_ner.en.onnx | BERT | 6 | ~ 23 ms | 259M |
| pretrained.20210205.microsoft.dte.00.12.bert_example_ner.en.onnx | BERT | 12 | ~ 40 ms | 427M |
+ | pretrained.20210218.microsoft.dte.00.06.bert_example_ner.en.onnx | BERT | 6 | ~ 23 ms | 259M |
+ | pretrained.20210218.microsoft.dte.00.12.bert_example_ner.en.onnx | BERT | 12 | ~ 40 ms | 425M |
- The following table shows how accurate each model is relative to the provided training sample size, using the [Snips NLU][4] system, evaluated by **macro-average-F1**.
| Training samples per entity type | 10 | 20 | 50 | 100 | 200 |
| ------------------------------------------------------------ | ----- | ----- | ----- | ----- | ----- |
- | pretrained.20210205.microsoft.dte.00.06.bert_example_ner.en.onnx | 0.662 | 0.678 | 0.680 | 0.684 | 0.674 |
+ | pretrained.20210205.microsoft.dte.00.06.bert_example_ner.en.onnx | 0.615 | 0.636 | 0.647 | 0.661 | 0.665 |
| pretrained.20210205.microsoft.dte.00.12.bert_example_ner.en.onnx | 0.637 | 0.658 | 0.684 | 0.698 | 0.702 |
+ | pretrained.20210218.microsoft.dte.00.06.bert_example_ner.en.onnx | 0.637 | 0.658 | 0.673 | 0.686 | 0.684 |
+ | pretrained.20210218.microsoft.dte.00.12.bert_example_ner.en.onnx | 0.661 | 0.664 | 0.670 | 0.685 | 0.681 |
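
For reference, the macro-average F1 used above is the unweighted mean of the per-entity-type F1 scores, so rare entity types count as much as frequent ones. A minimal, token-level illustration with scikit-learn follows; the labels are made up, and the actual evaluation may score entity spans rather than individual tokens.

```python
from sklearn.metrics import f1_score

# Macro-averaging gives every entity type equal weight, regardless of how often it occurs.
y_true = ["city", "city", "date", "airline", "O", "O"]
y_pred = ["city", "O",    "date", "airline", "O", "city"]

print(f1_score(y_true, y_pred, average="macro"))  # 0.75: unweighted mean of per-type F1
```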