Publications

2024

GoLLIE: Annotation Guidelines improve Zero-Shot Information-Extraction. Sainz, O., García-Ferrero, I., Agerri, R., de Lacalle, O. L., Rigau, G., & Agirre, E. (2024). ICLR 2024.

How Well Can BERT Learn the Grammar of an Agglutinative and Flexible-Order Language? The Case of Basque. Urbizu, G., Zulaika, M., Saralegi, X., & Corral, A. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024). Torino, Italy. 2024.

2023

Strategies for bilingual intent classification for small datasets scenarios. López de Lacalle, M., Saralegi, X., Saizar, A., Urbizu, G., & Corral, A. Procesamiento del Lenguaje Natural, issue 71, September 2023, pp. 137-147.

Scaling Laws for BERT in Low-Resource Settings. Urbizu, G., San Vicente, I., Saralegi, X., Agerri, R., & Soroa, A. In Findings of the Association for Computational Linguistics: ACL 2023, pages 7771-7789, July 9-14, 2023.

Not Enough Data to Pre-train Your Language Model? MT to the Rescue! Urbizu, G., San Vicente, I., Saralegi, X., & Corral, A. In Findings of the Association for Computational Linguistics: ACL 2023, pages 3826-3836, July 9-14, 2023.