About

I am Alexandra, a Research Scientist at Google DeepMind on the GenAI team, working on post-training of LLMs (Gemini) and, more broadly, on Natural Language Processing and Machine Learning.

I earned my PhD in Computer Science from the University of Munich, where I was advised by Alex Fraser. My PhD research focused on combining information from multiple languages and domains to enable positive transfer during parameter-efficient fine-tuning of language models, especially under resource constraints.

During my PhD, I interned at Google DeepMind in Berlin, hosted by Sebastian Ruder and Priyanka Agrawal. Before that, I interned twice at the Allen Institute for AI with Jesse Dodge and Matt Peters, where I was part of the AllenNLP team. I also spent a few months at Amazon AI in Santa Clara, CA, working with Brian Thompson, Prashant Mathur, and Marcello Federico as an intern in the AI Human Language Technology group.

Before starting my PhD, I graduated with a Diploma (combined Bachelor's and MEng) in Electrical and Computer Engineering from the National Technical University of Athens (NTUA).

News

August 2025: I will be in San Diego in December to attend NeurIPS and participate in the panel of the Model Merging tutorial.

July 2025: The technical report of our most advanced model, Gemini 2.5 Pro, has just been published!

June 2025: The paper Model Merging of Large Language Models by our intern Prateek Yadav has been accepted to Transactions on Machine Learning Research (TMLR).

January 2025: I am co-organizing RepL4NLP 2025. The workshop will be co-located with NAACL 2025 in Albuquerque, New Mexico.

December 2024: My PhD thesis titled “Efficient Multilingual and Domain Adaptation of Language Models under Resource Constraints” is now available online.

November 2024: Excited to share that our paper Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization received the Best Paper Award at the Multilingual Representation Learning (MRL) workshop at EMNLP 2024!

October 2024: New preprint on Model Merging of Large Language Models.

October 2024: Our paper Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization has been accepted to the MRL workshop at EMNLP 2024! See you in Miami! 🌴

October 2024: It’s a wrap! 🎓 I successfully defended my PhD thesis on Efficient Multilingual and Domain Adaptation of Language Models under Resource Constraints. My thesis will (hopefully) be online soon!

Selected Publications
