Hi, I’m Alexandra!

I am a third-year PhD student at the Center for Information and Language Processing at LMU Munich, advised by Alex Fraser. My research is in natural language processing and machine learning.

I am interested in self-supervised training for supervised (and unsupervised) machine translation and domain adaptation, mainly using monolingual or cross-lingual language models. My work revolves around using prior knowledge from unsupervised pretraining to improve performance in low-resource scenarios.

Before coming to Germany, I received my Diploma in Electrical and Computer Engineering from the National Technical University of Athens (NTUA). My thesis was advised by Alex Potamianos. During my final undergraduate year, I worked at Behavioral Signals as a machine learning engineer.

Recently, I interned with the AllenNLP team at the Allen Institute for AI, where I was lucky to be advised by Jesse Dodge and Matt Peters.

News

April 2022: Our paper on efficient domain adaptation of language models has been accepted to appear at NAACL 2022 as a long paper 🥳. Take a look at the AI2 blog post about it here!

December 2021: Our pre-print on efficient domain adaptation of language models (work done during my internship at the Allen Institute for AI) is out.

July 2021: Excited to share that I will be starting a research internship at the Allen Institute for AI in July, working with Jesse Dodge and Matt Peters!

March 2021: Our paper “Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation” has been accepted to appear in NAACL 2021!

September 2020: Our paper “Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT” has been accepted to appear at EMNLP 2020! Also, our work “Domain Adversarial Fine-Tuning as an Effective Regularizer” was accepted to the Findings of EMNLP.

July 2020: Our system ranked first in the WMT 2020 Unsupervised Translation Shared Task. The task was to build a purely unsupervised machine translation system that translates between Upper Sorbian and German.

Selected Publications

  • Efficient Hierarchical Domain Adaptation for Pretrained Language Models
    Alexandra Chronopoulou, Matthew E. Peters and Jesse Dodge.
    NAACL 2022.
    [paper]
  • Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation
    Alexandra Chronopoulou, Dario Stojanovski and Alexander Fraser.
    NAACL 2021.
    [paper] [code]
  • Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT
    Alexandra Chronopoulou, Dario Stojanovski and Alexander Fraser.
    EMNLP 2020.
    [paper] [code] [slides]
  • An Embarrassingly Simple Approach for Transfer Learning from Pretrained Language Models
    Alexandra Chronopoulou, Christos Baziotis and Alexandros Potamianos.
    NAACL 2019.
    [paper] [code]