This is especially helpful when investigating the writing style of authors.

<a name="Export_visualization"></a>
### Export the tree
CTL allows you to export a constituent tree into various **file formats**, which are listed below. Most of these formats result in a visualization of the tree, while the remaining file formats are used for data exchange.
|**TXT**|*Plain-Text*| Pretty-print text visualization|
|**TEX**|*LaTeX-Document*| LaTeX-typesetting |

</details>

The following example shows an export of the tree into a PDF file:
```python
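# Minimal sketch of a PDF export. Assumes the constituent_treelib API
# (ConstituentTree, Language, create_pipeline, export_tree); the sentence
# and the output filename below are placeholders.
from constituent_treelib import ConstituentTree, Language

# Build the NLP pipeline (the required spaCy and benepar models are set up automatically)
nlp = ConstituentTree.create_pipeline(Language.English)

# Parse a sentence into a constituent tree
sentence = "We saw the man with the telescope."
tree = ConstituentTree(sentence, nlp)

# Export the visualization of the tree into a PDF file
tree.export_tree(destination_filepath="my_tree.pdf")
```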
In the case of raster/vector images, CTL automatically removes unnecessary margins.
## Available models and languages
CTL currently supports eight languages: English, German, French, Polish, Hungarian, Swedish, Chinese and Korean. The performance of the respective models can be looked up in the <a href="https://github.com/nikitakit/self-attentive-parser#available-models">benepar repository</a>.
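To work with one of the other supported languages, the same pipeline call can simply be pointed at a different entry of the `Language` enum. The snippet below is a minimal sketch under that assumption; the German sentence is just a placeholder:

```python
from constituent_treelib import ConstituentTree, Language

# Create a pipeline for German; CTL resolves the matching spaCy and benepar models
nlp_de = ConstituentTree.create_pipeline(Language.German)

# Parse a German sentence with the same workflow as for English
tree_de = ConstituentTree("Dies ist ein kurzer Beispielsatz.", nlp_de)
```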
## CTL in the Research Landscape
CTL has been used in several research works published at leading conferences, including EMNLP 2025, ICLR 2024 and ACL 2024:
- Meinan Liu, Yunfang Dong, Xixian Liao, and Bonnie Webber. 2025. **[Multi-token Mask-filling and Implicit Discourse Relations](https://aclanthology.org/2025.findings-emnlp.670/)**. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 12546–12560, Suzhou, China. Association for Computational Linguistics.
- Karl Mulligan and Kyle Rawlins. **[Analyzing naturally-sourced Questions Under Discussion](https://journals.linguisticsociety.org/proceedings/index.php/ELM/article/view/5828)**. Experiments in Linguistic Meaning, vol. 3, 24 Jan 2025.
- Judita Preiss. 2025. **[Hybrid Approach to Literature-Based Discovery: Combining Traditional Methods with LLMs](https://www.mdpi.com/2076-3417/15/16/8785/pdf?version=1754653754)**. Applied Sciences, 15, 8785.
- Yuang Li, Jiaxin Guo, Min Zhang, Ma Miaomiao, Zhiqiang Rao, Weidong Zhang, Xianghui He, Daimeng Wei, and Hao Yang. 2024. **[Pause-Aware Automatic Dubbing using LLM and Voice Cloning](https://aclanthology.org/2024.iwslt-1.2/)**. In Proceedings of the 21st International Conference on Spoken Language Translation (IWSLT 2024), pages 12–16, Bangkok, Thailand (in-person and online). Association for Computational Linguistics.
If you find this repository helpful, please invest a few minutes and cite it in your paper/project:
```bibtex
@software{Halvani_Constituent_Treelib:2024,
author = {Halvani, Oren},
title = {{Constituent Treelib - A Lightweight Python Library for Constructing, Processing, and Visualizing Constituent Trees.}},
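}
```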
Please also give credit to the authors of benepar and <a href="https://github.com/nikitakit/self-attentive-parser#citation">cite their work</a>. In science, the principle is: **give and take**.