Isabel Papadimitriou

(she)


Email: isabelpapadimitriou at g dot harvard dot edu

my Google Scholar page

toizzy on GitHub

isabelpapad on Twitter

Hello!

I am a Kempner Fellow at Harvard, and an incoming assistant professor at UBC Linguistics (September 2025). I did my PhD at Stanford in the Natural Language Processing group, advised by Dan Jurafsky.

I am looking to recruit students this cycle (applying 2024, starting Sept 2025). If you are applying for a PhD in NLP or computational linguistics, apply to UBC and mention that you'd like to work with me!

I work on understanding and defining the capabilities of large language models in relation to the human language system.

I am especially interested in pursuing an interdisciplinary research program, combining empirical machine learning methods with theories of human language. My principal interests include: how language models learn and use generalizable grammatical abstractions, the interaction between structure and meaning representations in high-dimensional vector spaces, and using multilingual settings to test the limits of abstraction in language models.

I did my undergraduate degree at Berkeley, where I got BAs in Computer Science and History. My history thesis was based on research at the archives of the League For Democracy in Greece, a London-based solidarity organisation supporting the left in the Greek Civil War. It received the Kirk Underhill Prize.

Talks and News

Sep 2024: Started at the Kempner Institute at Harvard. If you are in the Boston area I would love to chat!

Mission: Impossible Language Models got best paper at ACL 2024!

Nov 2023: Talk at Georgia Tech Modern Languages Colloquium [slides]

I was selected for the 2023 Rising Stars in EECS

Stanford CS 224N, "Insights between NLP and Linguistics" [slides]

July 2023: Talks at Brown Computer Science, Carnegie Mellon LTI

June 2023: Talk at Decoding Communication in Nonhuman Species Workshop [slides] [recording]

Apr 2023: Talk at NYU CAP Lab

Dec 2022: Talk at Cornell University C.Psyd Group

Oct 2022: Talk at UC Santa Barbara Computational Linguistics Group

July 2022: SIGTYP 2022 keynote [slides] [recording]

April 2022: Talk at UT Austin Computational Linguistics Group

Papers

Mission: Impossible Language Models - Julie Kallini, Isabel Papadimitriou, Richard Futrell, Kyle Mahowald, and Christopher Potts, ACL 2024 Best Paper

Injecting structural hints: Using language models to study inductive biases in language learning - Isabel Papadimitriou and Dan Jurafsky, Findings of EMNLP 2023

Multilingual BERT has an accent: Evaluating English influences on fluency in multilingual models - Isabel Papadimitriou*, Kezia Lopez*, and Dan Jurafsky, Findings of EACL 2023, SIGTYP 2023 [slides]

Oolong: Investigating What Makes Crosslingual Transfer Hard with Controlled Studies - Zhengxuan Wu*, Isabel Papadimitriou*, Alex Tamkin*, EMNLP 2023 [pdf]

The Greek possessive modal eho as a special agentive modality - Isabel Papadimitriou and Cleo Condoravdi, LSA 2023 (poster) [abstract]

When classifying grammatical role, BERT doesn't care about word order... except when it matters - Isabel Papadimitriou, Richard Futrell, and Kyle Mahowald, ACL 2022 (oral presentation) [pdf] [code]

Language, Section 2.1 - Isabel Papadimitriou and Christopher D. Manning, in On the Opportunities and Risks of Foundation Models (full list of co-authors)

Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT - Isabel Papadimitriou, Ethan A. Chi, Richard Futrell, and Kyle Mahowald, EACL 2021 (oral presentation) [pdf] [code]

Learning Music Helps You Read: Using transfer to study linguistic structure in language models - Isabel Papadimitriou and Dan Jurafsky, EMNLP 2020 (oral presentation) [pdf] [code]

Teaching

I was the TA for CS324H, History of Natural Language Processing, taught by Dan Jurafsky and Chris Manning in Winter 2024

I was a TA for CS224N, Natural Language Processing with Deep Learning, taught by Chris Manning in Winter 2023

I was the TA for the Independent Study in Machine Translation seminar taught by Noah Goodman in Winter 2020

At Berkeley, I TAed CS70 (Discrete Math and Probability) in Fall 2015, taught by Satish Rao and Jean Walrand

The template is by Vasilios Mavroudis. Thanks!