I am a fourth- (and final-) year PhD student in the Center for Information and Language Processing at LMU Munich, advised by Alex Fraser. My research is in natural language processing and machine learning.

I am interested in enhancing pretrained models for machine translation, multilinguality, and domain adaptation by exploiting unlabeled data. My most recent research focuses on parameter-efficient methods for transfer learning. I am particularly interested in combining information from different languages, domains, or tasks to enable positive transfer using modular approaches, such as adapters and averaging the weights of pretrained models.

During my PhD, I have done internships as a Research Scientist at the Allen Institute for AI, working with the AllenNLP team (twice, remotely), and at Amazon Web Services (AWS) in Santa Clara, CA, working with the AI human language technology group.

Prior to joining LMU Munich, I graduated with a diploma (Bachelor and MEng equivalent) in Electrical and Computer Engineering from the National Technical University of Athens (NTUA). My thesis advisor was Alex Potamianos.


January 2023: Our paper AdapterSoup: Weight Averaging to Improve Generalization of Pretrained Language Models was accepted at EACL 2023 (Findings)! This is the result of my second internship at Allen AI!

October 2022: Our paper m4 Adapter: Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter was accepted at EMNLP 2022 (Findings).

October 2022: Check out our work on Language-Family Adapters for Multilingual NMT!

September 2022: Invited talk at Lilt AI on our recent NAACL paper! (slides).

August 2022: Happy to share that I started an internship at Amazon AI in Santa Clara, California! I will be working on speech translation with Prashant Mathur and the rest of the Doubtfire team.

June 2022: Invited talk at the DG CNECT workshop on large language models (slides).

May 2022: Excited to start another internship at Allen AI (working with the same team)!

April 2022: One paper accepted at NAACL 2022: Efficient Hierarchical Domain Adaptation for Pretrained Language Models (main conference). I wrote a blog post about it, give it a read!

July 2021: Started an internship at Allen AI, working with Jesse Dodge and Matt Peters!

Selected Publications