I’m a third-year PhD candidate in Computer Science at the University of Washington and a visiting researcher at Facebook AI Research. I’m advised by Noah Smith and Luke Zettlemoyer, and I am supported by the Bloomberg Data Science PhD Fellowship.

I was previously a Predoctoral Young Investigator at AI2, and before that a data scientist and software engineer at startups in Boston and Seattle. And in another life, I did research in neuroscience!

These days, I’m excited about developing models that are modular and embarrassingly parallel. Much of my research investigates language variation in large datasets, and how the composition of training data affects the overall behavior of language models. I strongly believe that being careful about our data will lead to stronger and more reliable language technologies.

I have co-authored papers recognized as “outstanding paper” and “honorable mention for best paper” at ACL 2020 and ACL 2021.

Check out my publications to learn more.