Ling7800/CSCI 7000: Computational Lexical Semantics

Fall 2018

Martha Palmer
Time and Location: Tue/Thur, 11:00 - 12:15, ECCR 150
Assessment: Four homework assignments, one paper presentation, and a term project.
Office Hours: Martha Palmer, Monday/Tuesday 2-3, Hellems 295

Semantic Role Labeling (eBook), Martha Palmer, Daniel Gildea, and Nianwen Xue,
Synthesis Lectures on Human Language Technologies,
Graeme Hirst, ed., Morgan & Claypool, 2010. ISBN: 9781598298321
Available online on campus through Chinook

Representation and Inference for Natural Language: A First Course in Computational Semantics.
Patrick Blackburn and Johan Bos, 2005, CSLI Publications. ISBN: 1-57586-496-7
Selected chapters, available from the CU bookstore and D2L


Lexical semantics is becoming an increasingly important part of Natural Language Processing (NLP), as the field is beginning to address semantics at a large scale. This graduate seminar covers key issues in computational lexical semantics. We start with an introduction to theoretical models of lexical semantics and events, considering both their adequacy as linguistic models and their place in NLP. We focus particularly on computational lexical resources such as PropBank, VerbNet, FrameNet and the Generative Lexicon, and examine their strengths and limitations with respect to NLP applications. We will introduce approaches to developing automatic classifiers that are intended to make use of these resources and to offer richer representations of sentences in context. These techniques can be fully supervised (requiring hand-labeled training data), semi-supervised, or unsupervised (learning lexical information from unlabeled text). We will also discuss the impact of word embeddings as an approximation of semantic similarity and the resulting implications for future research directions.
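As a small illustration of the embedding-based view of semantic similarity mentioned above, the sketch below compares word vectors by cosine similarity. The 4-dimensional vectors are invented toy values for illustration only; real embedding models (e.g. word2vec or GloVe) learn vectors with hundreds of dimensions from unlabeled text.

```python
# Sketch: cosine similarity between word vectors as a proxy for
# semantic similarity. The vectors below are made-up toy values,
# not taken from any real embedding model.
import math

toy_embeddings = {
    "cat": [0.9, 0.1, 0.3, 0.0],
    "dog": [0.8, 0.2, 0.4, 0.1],
    "car": [0.1, 0.9, 0.0, 0.7],
}

def cosine(u, v):
    """Cosine of the angle between vectors u and v."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Under these toy vectors, "cat" comes out closer to "dog" than to "car",
# mirroring how distributional similarity approximates semantic relatedness.
sim_cat_dog = cosine(toy_embeddings["cat"], toy_embeddings["dog"])
sim_cat_car = cosine(toy_embeddings["cat"], toy_embeddings["car"])
```

Note that cosine similarity captures relatedness of distributional contexts, not the finer-grained sense and role distinctions that resources like VerbNet or FrameNet encode; that gap is one of the themes of the course.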

Suggested Schedule and Readings

Introduction and Module 1: the Lexical Semantics of Verbs - Chap 1

Module 2: Available Computational Lexicons - Chap 2

Module 3: Beyond shallow semantics

Module 4: More Empirical Approaches

Module 5: Future Directions

Module 6: Term Project Paper Presentations

Advanced topics: possible term projects and/or post-class readings of interest:

Aaron Steven White, Drew Reisinger, Keisuke Sakaguchi, Tim Vieira, Sheng Zhang, Rachel Rudinger, Kyle Rawlins, and Benjamin Van Durme. 2016. Universal Decompositional Semantics on Universal Dependencies. In Empirical Methods in Natural Language Processing (EMNLP 2016), Austin, TX.
Travis Wolfe, Mark Dredze, and Benjamin Van Durme. 2017. Pocket Knowledge Base Population. In The Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL 2017), Vancouver, BC.
Aaron Steven White, Pushpendre Rastogi, Kevin Duh, and Benjamin Van Durme. 2017. Inference is Everything: Recasting Semantic Resources into a Unified Evaluation Framework. In The Proceedings of the 8th International Conference on Natural Language Processing (IJCNLP).
Sheng Zhang, Rachel Rudinger, Kevin Duh, and Benjamin Van Durme. 2017. Ordinal Common-sense Inference. Transactions of the Association for Computational Linguistics, 5:379-395.
Ellie Pavlick and Chris Callison-Burch. 2016. Most babies are little and most problems are huge: Compositional Entailment in Adjective-Nouns. In ACL 2016, Berlin, Germany.
Marc Brysbaert, Amy Beth Warriner, and Victor Kuperman. 2013. Concreteness ratings for 40 thousand generally known English word lemmas. Behavior research methods, pages 1-8.
Felix Hill and Anna Korhonen. 2014. Concreteness and subjectivity as dimensions of lexical meaning. In the Proceedings of ACL 2014.
David R. Dowty. 1986. The effects of aspectual class on the temporal structure of discourse: semantics or pragmatics? Linguistics and Philosophy, 9(1):37-61.

Machine Learning Background

Machine Learning links:
Deep Learning Summer School, Montreal 2016

32nd International Conference on Machine Learning, Lille, 2015

Christopher Manning's Videos
Language Vectors
Deep Learning

Yoav Goldberg, primer and tutorial
T1: Practical Neural Networks for NLP: From Theory to Code

Machine Learning Papers

Background in Ontologies

Description Logic, including CLASSIC and OWL