Emily M. Bender

Emily Menon Bender (born 1973) is an American language expert and teacher at the University of Washington. She leads a research group called the Computational Linguistics Laboratory. Her work focuses on how computers can understand and process human language.

She has written many articles about possible problems with large language models and how to use language technology responsibly. She also co-wrote the 2025 book The AI Con: How to Fight Big Tech's Hype and Create the Future We Want.

Education

Bender received a bachelor's degree in Linguistics from the University of California, Berkeley in 1995. She earned a master's degree from Stanford University in 1997 and a doctorate from Stanford in 2000 for her work studying how sentence structures vary and how people use language in African American Vernacular English (AAVE). Her research was guided by Tom Wasow and Penelope Eckert.

Career and research

Before working at the University of Washington, Bender worked at Stanford University, UC Berkeley, and in industry at YY Technologies. She holds several roles at the University of Washington, where she has been a faculty member since 2003. These include professor in the Department of Linguistics, adjunct professor in the Department of Computer Science and Engineering, faculty director of the Master of Science in Computational Linguistics, and director of the Computational Linguistics Laboratory. Bender is the Howard and Frances Nostrand Endowed Professor.

Bender was president of the Association for Computational Linguistics in 2024. She was elected a Fellow of the American Association for the Advancement of Science in 2022.

Bender has written research papers about the linguistic structures of Japanese, Chintang, Mandarin, Wambaya, American Sign Language, and English.

Bender created the LinGO Grammar Matrix, an open-source tool that helps develop detailed grammar systems. In 2013, she published Linguistic Fundamentals for Natural Language Processing: 100 Essentials from Morphology and Syntax, and in 2019, she co-authored Linguistic Fundamentals for Natural Language Processing II: 100 Essentials from Semantics and Pragmatics with Alex Lascarides. These books explain basic linguistic ideas in ways that are easy for natural language processing experts to understand.

In 2021, Bender presented a paper titled On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜 with Timnit Gebru and others at the ACM Conference on Fairness, Accountability, and Transparency. The paper discussed ethical issues in building natural language processing systems using very large text collections. Google objected to the paper and asked its employees who co-wrote it to withdraw it or remove their names, a dispute that contributed to Timnit Gebru leaving the company. Since then, Bender has worked to raise awareness about AI ethics and has spoken against overhyping large language models.

The Bender Rule, which came from a question Bender often asked during research talks, advises computational scholars to "always name the language you're working with."

Bender distinguishes between linguistic form and linguistic meaning. Form refers to the structure of language, such as grammar and spelling, while meaning refers to the ideas that language represents. In a 2020 paper, she argued that machine learning models trained only on form, without any connection to meaning, cannot truly understand language. She has also said that tools like ChatGPT cannot meaningfully understand the text they process or generate.

Selected publications

  • Bender, Emily M. (2000). Syntactic Variation and Linguistic Competence: The Case of AAVE Copula Absence. Stanford University. ISBN 978-0493085425.
  • Sag, Ivan; Wasow, Thomas; Bender, Emily M. (2003). Syntactic theory: A formal introduction. Center for the Study of Language and Information. ISBN 978-1575864006.
  • Bender, Emily M. (2013). Linguistic Fundamentals for Natural Language Processing: 100 Essentials from Morphology and Syntax. Synthesis Lectures on Human Language Technologies. Springer. ISBN 978-3031010224.
  • Bender, Emily M.; Lascarides, Alex (2019). Linguistic Fundamentals for Natural Language Processing II: 100 Essentials from Semantics and Pragmatics. Synthesis Lectures on Human Language Technologies. Springer. ISBN 978-3031010446.
  • Bender, Emily M.; Hanna, Alex (2025). The AI Con: How to Fight Big Tech's Hype and Create the Future We Want. Harper. ISBN 9780063418561.
  • Bender, Emily M. (2000). "The Syntax of Mandarin Bă: Reconsidering the Verbal Analysis". Journal of East Asian Linguistics. 9 (2): 105–145. doi: 10.1023/A:1008348224800. S2CID 115999663.
  • Bender, Emily M.; Flickinger, Dan; Oepen, Stephan (2002). The Grammar Matrix: An open-source starter-kit for the rapid development of cross-linguistically consistent broad-coverage precision grammars. Proceedings of the 2002 workshop on Grammar engineering and evaluation. Vol. 15.
  • Siegel, Melanie; Bender, Emily M. (2002). Efficient deep processing of Japanese. Proceedings of the 3rd workshop on Asian language resources and international standardization. Vol. 12.
  • Goodman, M. W.; Crowgey, J.; Xia, F; Bender, E. M. (2015). "Xigt: Extensible interlinear glossed text for natural language processing." Lang Resources & Evaluation. 49 (2): 455–485. doi: 10.1007/s10579-014-9276-1. S2CID 254372685.
  • Xia, Fei; Lewis, William D.; Goodman, Michael Wayne; Slayden, Glenn; Georgi, Ryan; Crowgey, Joshua; Bender, Emily M. (2016). "Enriching A Massively Multilingual database of interlinear glossed text." Lang Resources & Evaluation. 50 (2): 321–349. doi: 10.1007/s10579-015-9325-4. S2CID 254379828.
  • Bender, Emily M.; Gebru, Timnit; McMillan-Major, Angelina; Shmitchell, Shmargaret (2021). "On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜". Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. doi: 10.1145/3442188.3445922.
