I work on the AllenNLP team, where we do research and engineering in natural language processing (NLP), a subfield of artificial intelligence concerned with models of human language. I’m currently excited about a variety of problems, including efficiency, evaluation, and adapting models between distant domains. I will be joining the University of Washington in Fall 2020 as a PhD student in the CSE department.
I completed my master’s in NLP at the University of Washington, advised by Noah Smith. Before graduate school, I was a data scientist and software engineer at several companies in Seattle and Boston. Before that, I did research in computational neuroscience at the University of Chicago, working with Jason MacLean and Nicholas Hatsopoulos.
|Jul 8, 2020||Our paper “Don’t Stop Pretraining” won an Honorable Mention for Best Paper Award at ACL 2020!|
|Apr 9, 2020||Starting my PhD in Computer Science in Fall 2020 at the University of Washington.|
|Apr 8, 2020||Our paper “Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks” will appear at ACL 2020! Code and paper coming soon.|
|Sep 8, 2019||Our paper “Show Your Work: Improved Reporting of Experimental Results” will appear at EMNLP 2019! Check out the paper here, and the code here.|
|Aug 1, 2019||We will be presenting VAMPIRE at WeCNLP 2019!|