About
I am currently a Research Scientist at Google DeepMind on the GenAI team, working on LLM post-training for Gemini. My main focus is (system) instruction following, in both agentic and non-agentic settings.
I earned my PhD in Computer Science from the University of Munich, where I was advised by Alex Fraser. My research focused on combining information from various languages and domains to enable positive transfer during parameter-efficient fine-tuning of language models, especially under resource constraints. During my PhD, I interned at Google DeepMind in Berlin. Prior to that, I interned twice at the Allen Institute for AI on the AllenNLP team, and spent a few months as an intern in the AI human language technology group at Amazon AI in Santa Clara, CA.
Before my doctoral studies, I did my undergrad in Electrical & Computer Engineering at the National Technical University of Athens (NTUA).
News
March 2026: I will give a talk at the AI4Science Summer School back home in Athens, Greece on the 16th and 17th of July!
February 2026: I participated in a panel at Barnard College on deciding between "industry" (product) and "research" (academia), invited by Lauren Beltrone.
December 2025: I attended NeurIPS in San Diego and participated in the panel for the Model Merging tutorial.
October 2025: New paper led by our intern Frederick Zhang: Do LLMs Really Need 10+ Thoughts for "Find the Time 1000 Days Later"? Towards Structural Understanding of LLM Overthinking.
July 2025: The technical report of our most advanced model, Gemini 2.5 Pro, has just been published!
June 2025: The paper Model Merging of Large Language Models, led by our intern Prateek Yadav, has been accepted to Transactions on Machine Learning Research (TMLR).
January 2025: I co-organized RepL4NLP 2025. The workshop was co-located with NAACL 2025 in Albuquerque, New Mexico.
Selected Publications
- Gemini 2.5: Pushing the Frontier with Advanced Reasoning, Multimodality, Long Context, and Next Generation Agentic Capabilities
Gemini Team, Google (I was one of the >1000 authors)
Technical report, 2025
- What Matters for Model Merging at Scale?
Prateek Yadav, Tu Vu, Jonathan Lai, Alexandra Chronopoulou, Manaal Faruqui, Mohit Bansal, Tsendsuren Munkhdalai
Transactions on Machine Learning Research (TMLR), 2025
- Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context
Gemini Team, Google (I was one of the >1000 authors)
Technical report, 2024
- Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization
Alexandra Chronopoulou, Jonas Pfeiffer, Joshua Maynez, Xinyi Wang, Sebastian Ruder, Priyanka Agrawal
EMNLP MRL Workshop, 2024
- AdapterSoup: Weight Averaging to Improve Generalization of Pretrained Language Models [slides]
Alexandra Chronopoulou, Matthew E. Peters, Alexander Fraser, Jesse Dodge
EACL 2023 (Findings)
More
- My undergrad thesis supervisor was Alexandros Potamianos. I spent a good part of 2018 and 2019 in the Speech and Language Processing group of ECE, NTUA. My thesis was titled Transfer Learning with Deep Neural Networks for Sentiment Analysis and Semantic Modeling. During my last undergrad year, I also worked as a Machine Learning Engineer at Behavioral Signal Technologies.
- I am from Athens, Greece (go VVV!) and I enjoy a variety of things including books, good movies, sports (tennis, padel, skiing), concerts, exploring new places, and most activities that are ocean-related.
Contact
Feel free to reach out! You can email me at alexandra.xron@gmail.com.
