Introduction to Probability and Information Theory

Instructor: Robert Malouf

Linguistics as a field is undergoing a dramatic methodological shift. Quantitative advances in allied fields (such as psycholinguistics, sociolinguistics, and computational linguistics) have led to a new interest in empirical methods among theoretical linguists. This introductory lecture course will provide students with the solid foundation in probability, statistics, and information theory needed to understand and contribute to current research in linguistics. We will start from the beginning with basic random variables, building up to the properties of probability distributions that are important for linguistics and their use in simple statistical tests. Next we will cover Shannon's information theory, with special attention to information entropy and its uses for quantifying linguistic information. Finally, we will conclude with an extended discussion of an interesting linguistic application: the application of the concepts covered in this course to the analysis of morphological complexity.
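To give a concrete taste of the central quantity mentioned above, the sketch below computes Shannon entropy for a discrete distribution. The distributions themselves are made-up illustrations, not data from the course; the point is simply that a more predictable distribution carries less information per outcome.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    Zero-probability outcomes contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over four outcomes maximizes entropy: 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))

# A skewed (more predictable) distribution has lower entropy.
print(entropy([0.7, 0.1, 0.1, 0.1]))
```

Entropy of this kind is what the course applies to linguistic questions, such as quantifying the predictability of morphological paradigms.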

This is an introductory course with no prerequisites.