Commit 0d22cdd

Updated shared task webpage
1 parent 2fdc805 commit 0d22cdd

1 file changed: +88 −9 lines changed


_pages/mm-argfallacy2025.md

Lines changed: 88 additions & 9 deletions
@@ -132,10 +132,19 @@ For more details, refer to the MAMKit [GitHub repository](https://github.com/nlp
 The test set for **mm-argfallacy-2025** is now available! To use it, please:
 
 1. Create a fresh environment
-2. Install the latest version of `mamkit` (v0.1.2):
+2. Clone the repository and install the requirements:
 
 ```bash
-pip install mamkit
+git clone [email protected]:nlp-unibo/mamkit.git
+cd mamkit
+pip install -r requirements.txt
+pip install --editable .
+```
+
+<ol start="3"> <li>Access MAMKit in your Python code:</li> </ol>
+
+```python
+import mamkit
 ```
 
 Then, retrieve the data using the following code:
@@ -179,22 +188,92 @@ def loading_data_example():
 
 **Note**: By "updated version," we mean that the datasets have undergone a refinement in the alignment process, which has resulted in adjustments to the number of samples included compared to the original versions published in the referenced papers.
 
-# Evaluation
-For argumentative fallacy detection, we will compute the binary F1-score on predicted sentence-level labels.
-For argumentative fallacy classification, we will compute the macro F1-score on predicted sentence-level labels.
-Metrics will be computed on the hidden test set to determine the best system for each sub-task and input mode.
+# Evaluation
+
+For argumentative fallacy detection, we will compute the binary F1-score on predicted sentence-level labels.
+For argumentative fallacy classification, we will compute the macro F1-score on predicted sentence-level labels.
+Metrics will be computed on the hidden test set to determine the best system for each sub-task and input mode.
+
+Evaluation will be performed via the [CodaLab platform](https://codalab.lisn.upsaclay.fr/competitions/22739).
+On CodaLab, participants will find the leaderboard, along with the results of the provided baselines.
+Submission guidelines can be found under the *Evaluation* section of the CodaLab competition page.
+
+🚨 **Important**: On the evaluation website, you will also find a link to a **mandatory participation survey**.
+Filling out this survey is required in order to participate in the task.
+We also provide the survey link here for convenience: [https://tinyurl.com/limesurvey-argfallacy](https://tinyurl.com/limesurvey-argfallacy)
+
+### Baseline Results on Test Set
+
+#### Argumentative Fallacy Classification (AFC) – Macro F1-score
+
+---
+
+| Input Modality | Model                  | F1-Score |
+|----------------|------------------------|----------|
+| Text-only      | BiLSTM w/ GloVe        | 47.21    |
+| Text-only      | RoBERTa                | 39.25    |
+| Audio-only     | BiLSTM w/ MFCCs        | 15.82    |
+| Audio-only     | WavLM                  | 6.43     |
+| Text + Audio   | BiLSTM (GloVe + MFCCs) | 21.91    |
+| Text + Audio   | MM-RoBERTa + WavLM     | 38.16    |
+
+---
+
+#### Argumentative Fallacy Detection (AFD) – Binary F1-score
+
+---
+
+| Input Modality | Model                  | F1-Score |
+|----------------|------------------------|----------|
+| Text-only      | BiLSTM w/ GloVe        | 24.62    |
+| Text-only      | RoBERTa                | 27.70    |
+| Audio-only     | BiLSTM w/ MFCCs        | 0.00     |
+| Audio-only     | WavLM                  | 0.00     |
+| Text + Audio   | BiLSTM (GloVe + MFCCs) | 23.37    |
+| Text + Audio   | MM-RoBERTa + WavLM     | 28.48    |
+
+---
+
+# Submission
+
+All evaluated submissions are required to commit to submitting a system description paper. You can choose between two options:
+
+- **Non-Archival Paper**:
+  A 2-page paper describing your system, with unlimited pages for appendices and bibliography. These papers will *not* be published in the workshop proceedings, but your system will be mentioned in the Overview Paper of the shared task, upon acceptance.
+
+- **Archival Paper**:
+  A 4-page paper describing your system, also with unlimited pages for appendices and bibliography. These papers *will* be published in the official ACL workshop proceedings and must be presented at the workshop (poster or oral session).
+  ⚠️ *In accordance with ACL policy, at least one team member must register for the workshop in order to present an archival paper accepted for publication in the ACL proceedings.*
+
+All papers must use the official [ACL style templates](https://github.com/acl-org/acl-style-files), available in both LaTeX and Word. We strongly recommend using the official [Overleaf template](https://www.overleaf.com/project/5f64f1fb97c4c50001b60549) for convenience.
+
+Submissions will be made via a dedicated submission website, which will be published soon.
+
+- 🗓️ **Submissions open**: May 1st, 2025 (the day after the end of the evaluation period)
+- 🗓️ **Submissions close**: May 15th, 2025
+- 📢 **Notification of acceptance**: May 20th, 2025
+- 📝 **Camera-ready deadline**: May 25th, 2025
+
+**Important notes**:
+- All accepted **archival papers** will be presented during the workshop’s poster session and require at least one registered author.
+- **Non-archival papers** do *not* require registration and are not presented at the workshop, but their systems will be acknowledged in the Overview Paper.
+
+We look forward to receiving your submissions!
 
 # Key Dates (Anywhere on Earth)
 
 - **Release of Training Data**: February 25th
 - **Release of Test Set**: ~~March 24th~~ → April 7th
 - **Evaluation Start**: ~~April 14th~~ → April 21st
 - **Evaluation End**: ~~April 25th~~ → April 30th
-- **Paper Submission Due**: May 15th
+- **Paper Submissions Open**: May 1st
+- **Paper Submissions Close**: May 15th
+- **Notification of acceptance**: May 20th
+- **Camera-ready Due**: May 25th
 - **Workshop**: July 31st
 
-# Submission
-Will be updated soon.
+
 
 
 
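The two metrics this commit documents (binary F1 for fallacy detection, macro F1 for fallacy classification) can be sketched with scikit-learn. The label values and class names below are illustrative assumptions, and the official CodaLab scorer may differ in its input format:

```python
# Illustrative sketch of the two shared-task metrics using scikit-learn.
# Gold/predicted labels here are made-up examples, not official task data.
from sklearn.metrics import f1_score

# AFD: sentence-level binary labels (1 = fallacious, 0 = not fallacious).
# Binary F1 scores only the positive (fallacious) class.
afd_gold = [1, 0, 1, 1, 0]
afd_pred = [1, 0, 0, 1, 1]
afd_f1 = f1_score(afd_gold, afd_pred, average="binary")

# AFC: sentence-level fallacy classes.
# Macro F1 computes per-class F1 and averages them with equal weight,
# so rare fallacy classes count as much as frequent ones.
afc_gold = ["ad hominem", "slogan", "ad hominem", "appeal to emotion"]
afc_pred = ["ad hominem", "slogan", "slogan", "appeal to emotion"]
afc_f1 = f1_score(afc_gold, afc_pred, average="macro")

print(f"AFD binary F1: {afd_f1:.4f}")
print(f"AFC macro F1: {afc_f1:.4f}")
```

Note that macro averaging explains why the audio-only AFD baselines can score 0.00: if a system never predicts the positive class, its binary F1 is zero regardless of accuracy on the negative class.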