About
I am Alexandra, a Research Scientist at Google DeepMind on the Gemini team, working on post-training of LLMs and, more broadly, on Natural Language Processing and Machine Learning.
I earned my PhD in Computer Science from the University of Munich, where I was advised by Alex Fraser. My PhD research was mostly on combining information from various languages and domains to enable positive transfer during parameter-efficient fine-tuning of language models, especially under resource constraints.
During my PhD, I interned at Google DeepMind in Berlin, hosted by Sebastian Ruder and Priyanka Agrawal. Prior to that, I interned (twice) at the Allen Institute for AI with Jesse Dodge and Matt Peters as part of the AllenNLP team. I also spent a few months at Amazon AI in Santa Clara, CA, working with Brian Thompson, Prashant Mathur, and Marcello Federico as an intern in the AI human language technology group.
Before starting the PhD, I graduated with a diploma (Bachelor and MEng) in Electrical & Computer Engineering from the National Technical University of Athens (NTUA).
News
January 2025: I am co-organizing Repl4NLP 2025. The workshop will be co-located with NAACL 2025 in Albuquerque, New Mexico.
December 2024: My PhD thesis titled “Efficient Multilingual and Domain Adaptation of Language Models under Resource Constraints” is now available online.
November 2024: Excited to share that our paper Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization received the Best Paper Award at the Multilingual Representation Learning workshop at EMNLP 2024!
October 2024: New preprint on Model Merging of Large Language Models.
October 2024: Our paper Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization has been accepted to appear at the MRL workshop in EMNLP 2024! See you in Miami! 🌴
October 2024: It’s a wrap! 🎓 I successfully defended my PhD thesis on Efficient Multilingual and Domain Adaptation of Language Models under Resource Constraints. My thesis will (hopefully) be online soon!
April 2024: Check out the Gemini 1.5 Pro API, a top-tier LLM according to the LMSys Leaderboard (technical report)
January 2024: Excited to share that I have joined Google Bard in NYC as a Research Scientist!
Selected Publications
- Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context
Gemini Team, Google (I was one of the > 1000 authors)
technical report
- Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization
Alexandra Chronopoulou, Jonas Pfeiffer, Joshua Maynez, Xinyi Wang, Sebastian Ruder, Priyanka Agrawal
EMNLP MRL Workshop 2024
- AdapterSoup: Weight Averaging to Improve Generalization of Pretrained Language Models [slides]
Alexandra Chronopoulou, Matthew E. Peters, Alexander Fraser, Jesse Dodge
EACL 2023 (Findings)
- Language-Family Adapters for Low-Resource Multilingual Neural Machine Translation [slides]
Alexandra Chronopoulou, Dario Stojanovski, Alexander Fraser
EACL LoResMT Workshop 2023
- Efficient Hierarchical Domain Adaptation for Pretrained Language Models [code] [blog] [slides]
Alexandra Chronopoulou, Matthew E. Peters, Jesse Dodge
NAACL 2022
More
My undergrad thesis supervisor was Alexandros Potamianos. I spent a good part of 2018 and 2019 in the Speech and Language Processing group of ECE, NTUA. Thesis: Transfer Learning with Deep Neural Networks for Sentiment Analysis and Semantic Modeling. During my final undergrad year, I also worked as a Machine Learning Engineer at Behavioral Signal Technologies.
I am from Athens, Greece (go VVV!) and I enjoy a variety of things including books, good movies, sports (tennis, padel, skiing), concerts, exploring new places, and most activities that are ocean-related.