About

I am a final-year PhD student at the Center for Information and Language Processing at LMU Munich, where I am advised by Alex Fraser and work on natural language processing & machine learning. My research is in machine translation, multilinguality and domain adaptation. I am currently focusing on parameter-efficient fine-tuning methods; I am particularly excited about combining information from different languages, domains or tasks to enable positive transfer using modular approaches.

I am currently also a research intern at Google DeepMind in Berlin, hosted by Sebastian Ruder and Priyanka Agrawal.

During my PhD, I interned twice at the Allen Institute for AI with Jesse Dodge and Matt Peters, where I was part of the AllenNLP team. I also recently interned at Amazon AI in Santa Clara, CA, with Brian Thompson, Prashant Mathur and Marcello Federico, as part of the Human Language Technology group.

Prior to joining LMU Munich, I graduated with a Diploma (a combined Bachelor's and MEng) in Electrical and Computer Engineering (ECE) from the National Technical University of Athens (NTUA) in Greece.

I will soon finish my PhD and be on the job market for industry research positions!

News

May 2023: Our paper Mitigating Data Imbalance and Representation Degeneration in Multilingual Machine Translation is out.

May 2023: Our paper On the Copying Problem of Unsupervised NMT: A Training Schedule with a Language Discriminator Loss was accepted to IWSLT 2023 and our paper Improving Isochronous Machine Translation with Target Factors and Auxiliary Counters (from my Amazon internship) was accepted to Interspeech 2023!

May 2023: Jesse gave a talk at the LTI CMU Colloquium, discussing our recent papers on efficient domain adaptation of pretrained language models (1, 2); you can check it out here.

April 2023: Very happy to start a research internship at Google in Berlin, as part of Google DeepMind!

March 2023: Our paper Language-Family Adapters for Low-Resource Multilingual NMT was accepted to the LoResMT workshop at EACL 2023!

February 2023: Check out our work on Isochronous Automatic Dubbing (from my internship at Amazon last fall)!

February 2023: I am co-organizing a shared task on dubbing in IWSLT 2023 (co-located with ACL next summer) along with former teammates from Amazon.

January 2023: Our paper AdapterSoup: Weight Averaging to Improve Generalization of Pretrained Language Models from my internship at Allen AI was accepted to EACL 2023 (findings)!

October 2022: Our work m4 Adapter: Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter was accepted to EMNLP 2022 (findings)!

October 2022: Check out our work on Language-Family Adapters for Low-Resource Multilingual NMT!

September 2022: Invited talk at Lilt AI on our recent NAACL paper. (slides)

August 2022: Happy to share that I started an internship at Amazon AI in Santa Clara, California! I will spend this fall working on speech translation with the Doubtfire team.

June 2022: Invited talk at the DG CNECT workshop on large language models (slides).

May 2022: Excited to start another internship at Allen AI, working with Jesse Dodge and Matt Peters!

April 2022: Our paper Efficient Hierarchical Domain Adaptation for Pretrained Language Models from my internship at Allen AI was accepted to NAACL 2022 (main track); I wrote a blog post about it, so give it a read!

Selected Publications
