Gradient Symbolic Computation


Classical, discrete representations (e.g., syntactic trees) have been the foundation for much of modern linguistic theory, providing key insights into the structure of linguistic knowledge and language processing. However, such frameworks fail to capture the gradient computational principles that underlie human cognition and behavior: not simply our performance, but also our competence.

This course will introduce a new formalism, Gradient Symbolic Computation, in which discrete linguistic structures are built of symbols that have continuous activation values: a given structural position can host a gradient blend of multiple symbols, and a given symbol can occupy a gradient blend of multiple structural positions. The course will examine how this formalism can provide new insights into key questions concerning linguistic competence and performance (with a focus on the former). For example, a pervasive problem in linguistic theory is that multiple incompatible structural analyses are simultaneously required to account for different facets of a single linguistic phenomenon. Gradient Symbolic Computation aims to explain these facets from a single representation: a gradient blend of multiple structures.
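To make the idea of a gradient blend concrete, here is a minimal sketch (not from the course materials) using the tensor-product filler/role representations on which Gradient Symbolic Computation builds: symbols (fillers) and structural positions (roles) are vectors, a binding is their outer product weighted by a continuous activation, and a structure is the sum of its bindings. The symbol and position names are purely illustrative.

```python
import numpy as np

# One-hot vectors for fillers (symbols) and roles (structural positions).
fillers = {"A": np.array([1.0, 0.0]), "B": np.array([0.0, 1.0])}
roles = {"pos1": np.array([1.0, 0.0]), "pos2": np.array([0.0, 1.0])}

def bind(blend_per_role):
    """Build a structure as a sum of activation-weighted filler/role bindings.

    blend_per_role: {role: {filler: activation}} -- activations are
    continuous, so a single position can host a blend of several symbols.
    """
    structure = np.zeros((2, 2))
    for role, blend in blend_per_role.items():
        for filler, activation in blend.items():
            structure += activation * np.outer(fillers[filler], roles[role])
    return structure

# pos1 hosts a 0.7/0.3 blend of A and B; pos2 hosts pure B.
structure = bind({"pos1": {"A": 0.7, "B": 0.3}, "pos2": {"B": 1.0}})

# Probing a position recovers each symbol's activation there:
# the vector below holds the activations of A (0.7) and B (0.3) at pos1.
readout = structure @ roles["pos1"]
```

With orthonormal role vectors, the readout exactly recovers the gradient blend at each position, which is what lets a single representation carry several partial analyses at once.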

The basic computational principles underlying this approach will be introduced, followed by readings and lectures that examine its application to topics in phonology/phonetics and syntax/semantics, with a focus on the latter.

Course Status: Closed

This course is currently at capacity.

Course Session:

Four-week Session


3:10 pm-5:00 pm


Prerequisites: None (except open-mindedness and a fondness for suspending disbelief).