I’m a third-year PhD candidate in Computer Science at the University of Washington, advised by Noah Smith and Luke Zettlemoyer. I am supported by the Bloomberg Data Science PhD Fellowship.
I was previously a visiting researcher at Meta AI Research, a Predoctoral Young Investigator at AI2, and a data scientist and software engineer at startups in Boston and Seattle. And in another world, I did research in neuroscience!
These days, I’m excited about developing models that are modular, embarrassingly parallel, and sparse. Much of my research investigates language variation in large datasets, and how the composition of training data affects the overall behavior of language models. I strongly believe that being careful about our data will lead to stronger and more reliable language technologies.
I have co-authored papers recognized with an Outstanding Paper award and an Honorable Mention for Best Paper at ACL 2020 and ACL 2021.
Check out my publications to learn more.
- Jun 2023: Invited talk at Bocconi University
- May 2023: Invited talk at Samaya AI
- May 2023: Invited talk at Google Brain
- May 2023: Invited talk at Stanford NLP
- Apr 2023: Invited talk at USC ISI
- Mar 2023: “Scaling Expert Language Models with Unsupervised Domain Discovery” is live.
- Mar 2023: Passed my Generals Examination!
- Jan 2023: “Editing Models with Task Arithmetic” was accepted to ICLR 2023.
- Oct 2022: Our new paper, “lo-fi: distributed fine-tuning without communication”, is live!
- Oct 2022: Three papers (“Whose Language Counts as High Quality”, “M2D2”, and “Nearest Neighbor Zero-Shot Inference”) accepted to EMNLP 2022!
- Sep 2022: Talk at USC
- Aug 2022: Talk at MosaicML on “Branch-Train-Merge”
- Aug 2022: Our new paper “Branch-Train-Merge” just dropped!
- Jun 2022: Our new paper “Nearest Neighbor Zero-Shot Inference” is live!
- Apr 2022: I’ll be giving a guest lecture in the Data Processing + Values course at UW on our quality filtering paper.
- Mar 2022: Two papers (“DEMix” and “Time Waits for No One!”) accepted to NAACL 2022!
- Mar 2022: Talk at IBM Research Zurich.
- Jan 2022: Our new preprint, “Whose Language Counts as High Quality?”, just dropped!