I’m a fourth-year PhD candidate in Computer Science at the University of Washington, advised by Noah Smith and Luke Zettlemoyer. I am supported by the Bloomberg Data Science PhD Fellowship.

I was previously a visiting researcher at Meta AI Research, a Predoctoral Young Investigator at AI2, and a data scientist and software engineer at startups in Boston and Seattle. And in another life, I did research in neuroscience!

These days, I’m excited about developing models that are modular, embarrassingly parallel, and sparse. Much of my research investigates language variation in large unlabeled datasets and how the composition of training data shapes the behavior of language models. I strongly believe that being careful about our data will lead to stronger and more reliable language technologies.

I have co-authored papers recognized with an Outstanding Paper award and an Honorable Mention for Best Paper at ACL 2020 and ACL 2021.

Check out my publications to learn more.