Christopher David Manning (born September 18, 1965) is an Australian-American computer scientist and applied linguist whose research spans natural language processing, artificial intelligence, and machine learning. He holds the Thomas M. Siebel Professorship in Machine Learning and is a professor of Linguistics and Computer Science at Stanford University. He served as director of the Stanford Artificial Intelligence Laboratory (SAIL) from 2018 to 2025.
Manning is known as a leading researcher in natural language processing. He helped create GloVe word vectors; a form of attention mechanism used in artificial neural networks such as the transformer; tree-structured recursive neural networks; and systems for analyzing textual entailment. He authored two textbooks, Foundations of Statistical Natural Language Processing (1999) and Introduction to Information Retrieval (2008), and developed the online course CS224N: Natural Language Processing with Deep Learning. Starting in 2002, Manning created open-source software tools for computational linguistics, including CoreNLP, Stanza, and GloVe.
Manning earned a BA (Hons) degree in mathematics, computer science, and linguistics from the Australian National University in 1989 and a PhD in linguistics from Stanford University in 1994, under the supervision of Joan Bresnan. He worked as an assistant professor at Carnegie Mellon University (1994–1996) and as a lecturer at the University of Sydney (1996–1999) before returning to Stanford, where he became an associate professor in 2006 and a full professor in 2012. He was elected an AAAI Fellow in 2010, an ACL Fellow in 2011, and an ACM Fellow in 2013. Manning served as president of the Association for Computational Linguistics in 2015 and received an honorary doctorate from the University of Amsterdam in 2023. In 2024, he was awarded the IEEE John von Neumann Medal for advances in computational representation and analysis of natural language. In 2025, he was elected to the National Academy of Engineering and the American Academy of Arts and Sciences.
Manning’s linguistic research includes his dissertation Ergativity: Argument Structure and Grammatical Relations (1994) and the monograph Complex Predicates and Information Spreading in LFG (1999). He also contributed to the development of Universal Dependencies, a cross-linguistically consistent treebank annotation scheme whose guiding principles are informally known as Manning’s Law.
Manning has mentored several PhD students, including Dan Klein, Sepandar Kamvar, Richard Socher, and Danqi Chen. In 2021, he joined AIX Ventures as an Investing Partner. AIX Ventures is a venture capital fund that invests in artificial intelligence startups.