Results
PoS-Tagging: Official Ranking
Participants’ results with respect to Tagging Accuracy (TA) and Unknown Words Tagging Accuracy (UWTA).
Participants | EAGLES-like TA | EAGLES-like UWTA | Distrib TA | Distrib UWTA |
---|---|---|---|---|
FBKirst_Zanoli_POS | 98.04 | 95.02 | 97.68 | 94.65 |
ILCcnrUniPi_Lenci_POS | 97.65 | 94.12 | 96.70 | 93.14 |
UniBoCilta_Romagnoli_POS | 96.79 | 91.48 | 94.80 | 90.72 |
UniBoDslo_Tamburini_POS | 97.59 | 92.16 | 97.31 | 92.99 |
UniRoma1_Bos_POS | 96.76 | 87.41 | 96.21 | 88.69 |
UniStuttIMS_Schiehlen_POS | 97.15 | 89.29 | 97.07 | 92.23 |
UniTn_Baroni_POS | 97.89 | 94.34 | 97.37 | 94.12 |
UniVe_Delmonte_POS | 91.85 | 84.46 | 91.42 | 86.80 |
Yahoo_Ciaramita_POS_s1 | 96.78 | 87.78 | 96.61 | 88.24 |
Yahoo_Ciaramita_POS_s2 | 95.27 | 81.83 | 95.11 | 84.16 |
UniPiSynthema_Deha_POS | 88.71 | 79.49 | – | – |
UniTo_Lesmo_POS | 94.69 | 87.33 | – | – |
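For reference, the two accuracy figures are computed in the usual way: TA is the proportion of test tokens that receive the correct tag, while UWTA restricts the count to unknown words. A minimal sketch follows; the exact definition of "unknown word" adopted by the task (here assumed to be a word form unseen in the training data) is not restated in the table.

```latex
% Sketch of the two measures; U is assumed to be the set of test tokens
% whose word form does not occur in the training data.
\[
\mathrm{TA} = \frac{\#\{\, t : \hat{y}_t = y_t \,\}}{\#\{\, t \,\}},
\qquad
\mathrm{UWTA} = \frac{\#\{\, t \in U : \hat{y}_t = y_t \,\}}{\# U}
\]
```

where $y_t$ is the gold tag and $\hat{y}_t$ the predicted tag of token $t$.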
Parsing: Official Ranking for the Dependency Parsing Subtask
Participants’ results with respect to the CoNLL standard metrics: Labeled Attachment Score (LAS), Unlabeled Attachment Score (UAS) and Label Accuracy (LAS2).
Participants | LAS | UAS | LAS2 | Rank (LAS-UAS-LAS2) |
---|---|---|---|---|
UniTo_Lesmo_PAR | 86.94 | 90.90 | 91.59 | 1-1-1 |
UniPi_Attardi_PAR | 77.88 | 88.43 | 83.00 | 2-2-2 |
IIIT_Mannem_PAR | 75.12 | 85.81 | 82.05 | 3-4-3 |
UniStuttIMS_Schiehlen_PAR | 74.85 | 85.88 | 81.59 | 4-3-4 |
UPenn_Champollion_PAR | * | 85.46 | * | *-5-* |
UniRoma2_Zanzotto_PAR | 47.62 | 62.11 | 54.90 | 5-6-5 |
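The three CoNLL metrics are per-token counts: LAS requires both the predicted head and the dependency label to match the gold annotation, UAS only the head, and Label Accuracy only the label. The sketch below is not the official scorer; the (head, label) pair representation, the function name and the toy data are assumptions for illustration, and punctuation handling is ignored.

```python
def conll_scores(gold, pred):
    """LAS, UAS and Label Accuracy (in %) over aligned token lists.

    gold, pred: lists of (head, label) pairs, one entry per scored token,
    in the same token order.
    """
    assert len(gold) == len(pred) and gold
    n = len(gold)
    las = sum(gh == ph and gl == pl for (gh, gl), (ph, pl) in zip(gold, pred))
    uas = sum(gh == ph for (gh, _), (ph, _) in zip(gold, pred))
    la = sum(gl == pl for (_, gl), (_, pl) in zip(gold, pred))
    return 100.0 * las / n, 100.0 * uas / n, 100.0 * la / n

# Toy example: three tokens; the third gets the right label but the wrong head.
gold = [(2, "det"), (0, "root"), (2, "obj")]
pred = [(2, "det"), (0, "root"), (3, "obj")]
print(conll_scores(gold, pred))  # (66.66..., 66.66..., 100.0)
```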
Parsing: Official Ranking for the Constituency Parsing Subtask
Participants’ results with respect to the standard bracketing precision, recall and F-score metrics.
Participants | BR-R | BR-P | BR-F | Errors |
---|---|---|---|---|
UniNa_Corazza_PAR | 70.81 | 65.36 | 67.97 | 26 |
FBKirst_Pianta_PAR | 38.92 | 45.49 | 41.94 | 48 |
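Bracketing precision, recall and F-score follow the usual PARSEVAL-style definition: let C be the set of constituents produced by the parser that also occur in the gold tree, S the constituents in the parser output and G those in the gold tree (whether labels are taken into account in the matching is not restated here, so this is a generic sketch).

```latex
% Generic bracketing scores; C, S, G as defined in the text above.
\[
\mathrm{BR\text{-}P} = \frac{|C|}{|S|}, \qquad
\mathrm{BR\text{-}R} = \frac{|C|}{|G|}, \qquad
\mathrm{BR\text{-}F} = \frac{2 \cdot \mathrm{BR\text{-}P} \cdot \mathrm{BR\text{-}R}}
                            {\mathrm{BR\text{-}P} + \mathrm{BR\text{-}R}}
\]
```

As a sanity check, the first row recomputes to BR-F = 2 · 65.36 · 70.81 / (65.36 + 70.81) ≈ 67.98, i.e. the published value up to rounding.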
Word Sense Disambiguation: Official Ranking
Participants’ results with respect to Precision, Recall and F-measure.
Participants | Run | Prec. | Rec. | F |
---|---|---|---|---|
UniBa_Basile_WSD | r1 | 0.560 | 0.414 | 0.470 |
UniBa_Basile_WSD | r2 | 0.503 | 0.372 | 0.427 |
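Under the usual Senseval-style convention (assumed here, since the scorer is not restated in the table), precision is computed over the instances the system actually attempts, recall over all test instances, and F is their harmonic mean; this is why precision and recall differ when a system abstains on part of the test set.

```latex
% A = attempted instances, N = all test instances, C = correctly disambiguated.
\[
P = \frac{|C|}{|A|}, \qquad
R = \frac{|C|}{|N|}, \qquad
F = \frac{2PR}{P + R}
\]
```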
Temporal Expressions: Official Ranking for the Recognition-only Subtask
Participants’ results with respect to Value, Precision, Recall and F-measure.
Participants | Value | Prec. | Rec. | F. |
---|---|---|---|---|
FBKirst_Negri_TIME | 85.7 | 95.7 | 89.8 | 92.6 |
UniPg_Faina_TIME | 50.1 | 77.7 | 70.3 | 73.8 |
UniAli_Puchol_TIME | 48.8 | 78.4 | 67.4 | 72.5 |
UniAli_Saquete_TIME | 41.9 | 82.5 | 53.2 | 64.7 |
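For the recognition-only subtask, the F. column is the harmonic mean of the Prec. and Rec. columns (how the Value score is computed is not restated here). The snippet below simply recomputes F from the rounded precision/recall figures in the table; the small 0.1 deviation for the first row is a rounding effect.

```python
# Recompute F = 2PR / (P + R) from the rounded Prec./Rec. columns above.
rows = {
    "FBKirst_Negri_TIME": (95.7, 89.8),
    "UniPg_Faina_TIME": (77.7, 70.3),
    "UniAli_Puchol_TIME": (78.4, 67.4),
    "UniAli_Saquete_TIME": (82.5, 53.2),
}
for name, (p, r) in rows.items():
    print(f"{name}: F = {2 * p * r / (p + r):.1f}")
# FBKirst_Negri_TIME: F = 92.7   (published: 92.6)
# UniPg_Faina_TIME: F = 73.8
# UniAli_Puchol_TIME: F = 72.5
# UniAli_Saquete_TIME: F = 64.7
```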
Temporal Expressions: Official Ranking for the Recognition + Normalization Subtask
Participants’ results with respect to Value, Precision, Recall and F-measure.
Participants | Value | Prec. | Rec. | F. |
---|---|---|---|---|
FBKirst_Negri_TIME | 61.9 | 68.5 | 63.3 | 67.4 |
UniAli_Saquete_TIME | 22.1 | 51.5 | 35.6 | 42.1 |
UniPg_Faina_TIME | 11.9 | 24.9 | 19.6 | 21.9 |
Named Entity Recognition: Official Ranking
Participants’ results with respect to F-measure, Precision and Recall.
Participants | Run | F | Prec. | Rec. |
---|---|---|---|---|
FBKirst_Zanoli_NER | r2 | 82.14 | 83.41 | 80.91 |
FBKirst_Zanoli_NER | r1 | 81.28 | 82.97 | 79.65 |
UniDuE_Roessler_NER | r1 | 72.27 | 71.62 | 72.94 |
UniDuE_Roessler_NER | r2 | 71.93 | 73.28 | 70.62 |
Yahoo_Ciaramita_NER | r1 | 68.99 | 71.28 | 66.85 |
Yahoo_Ciaramita_NER | r2 | 68.15 | 70.44 | 66.00 |
UniDo_Jungermann_NER | r2 | 67.90 | 70.93 | 65.12 |
UniDo_Jungermann_NER | r1 | 67.79 | 70.93 | 64.91 |
UniAli_Kozareva_NER | – | 66.59 | 62.73 | 70.95 |
LDC_Walker_NER | r1 | 63.10 | 83.05 | 50.88 |
LDC_Walker_NER | r2 | 62.70 | 82.12 | 50.70 |
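NER results are typically scored with exact mention matching: a predicted entity counts as correct only if both its span and its type coincide with a gold mention, and precision, recall and F are then computed over these matches. This CoNLL-style convention is assumed here; the function below and its (start, end, type) representation are an illustrative sketch, not the official scorer.

```python
def ner_prf(gold, pred):
    """Exact-match precision, recall and F (in %) over entity mentions.

    gold, pred: sets of (start, end, type) triples; a prediction is correct
    only if the identical triple appears in the gold standard.
    """
    correct = len(gold & pred)
    p = correct / len(pred) if pred else 0.0
    r = correct / len(gold) if gold else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return 100 * p, 100 * r, 100 * f

# Toy example: the second prediction has the right span but the wrong type.
gold = {(0, 2, "PER"), (5, 6, "LOC")}
pred = {(0, 2, "PER"), (5, 6, "ORG")}
print(ner_prf(gold, pred))  # (50.0, 50.0, 50.0)
```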