About me

I am a second-year PhD student at the Center for Information and Language Processing at LMU Munich, supervised by Alex Fraser. I am broadly interested in transfer learning, self-supervised training for low-resource machine translation (in both supervised and unsupervised scenarios), and domain adaptation. My PhD thesis focuses on improving cross-lingual pretraining for low-resource languages. In this line of work, I proposed a curriculum for pretraining masked language models that improves unsupervised translation for high-resource/low-resource language pairs. I have also worked on enhancing cross-lingual pretraining by incorporating lexical information from non-contextualized embeddings. Lately, I have been working on multilingual neural machine translation for low-resource directions.

Before that, I completed my diploma (a combined BEng and MEng) at the School of Electrical and Computer Engineering of the National Technical University of Athens, Greece. In my thesis, I explored ways of improving transfer learning with language modeling for several classification tasks, most notably emotion recognition, under the supervision of Alex Potamianos.


News
July 2021: Excited to share that I will be starting a research internship at AllenAI in July, working with Jesse Dodge and Matt Peters!

March 2021: Our paper “Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation” has been accepted to appear at NAACL 2021!

September 2020: Our paper “Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT” has been accepted to appear at EMNLP 2020! Also, our work “Domain Adversarial Fine-Tuning as an Effective Regularizer” was accepted to Findings of EMNLP.

July 2020: Our system ranked first in the WMT 2020 Unsupervised Translation Shared Task, which required building a purely unsupervised machine translation system that translates between Upper Sorbian and German.