Date of Award

Winter 12-2022

Author's School

Graduate School of Arts and Sciences

Author's Department


Degree Name

Master of Arts (AM/MA)

Degree Type



Abstract

To act effectively, humans store event schemas and use them to predict the near future. How are schemas learned, represented in memory, and used in online comprehension? One means to answer these questions is to model event comprehension computationally. What, then, are the computational principles of event comprehension? We proposed three candidate properties: 1) abstract representation of visual features, 2) a predictive mechanism with prediction error as feedback, and 3) contextual cues to guide prediction, and adapted a computational model embodying these properties. The model learned to predict activity dynamics in a single pass through an 18-hour corpus of naturalistic human activity. Evaluated on a further 3.5 hours of activity, it updated at times corresponding to human segmentation and formed human-like event categories, despite receiving no feedback about segmentation or categorization. These results establish that a computational model embodying the three proposed properties can naturally reproduce two important features of human event comprehension.
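The core idea of prediction-error-driven segmentation can be illustrated with a toy sketch. This is not the thesis model (which learns from naturalistic activity features); it uses a hypothetical one-dimensional stream, a simple running-average predictor, and an arbitrary error threshold, purely to show how boundaries can emerge from prediction error without any segmentation feedback.

```python
# Toy illustration (NOT the thesis model): mark candidate event boundaries
# wherever prediction error from a simple running-average predictor spikes.
# The threshold and the smoothing rate are hypothetical choices.

def segment_stream(stream, threshold=1.0):
    """Return time indices where prediction error exceeds the threshold."""
    boundaries = []
    prediction = stream[0]          # current context: predicted next value
    for t in range(1, len(stream)):
        error = abs(stream[t] - prediction)
        if error > threshold:
            boundaries.append(t)    # large error -> event boundary
            prediction = stream[t]  # reset the context at the boundary
        else:
            # within an event: error acts as feedback, nudging the prediction
            prediction = 0.9 * prediction + 0.1 * stream[t]
    return boundaries

# Two "activities" with a sharp shift between them
stream = [0.0] * 20 + [5.0] * 20
print(segment_stream(stream))  # -> [20]
```

A single boundary falls at the shift between the two regimes, mirroring how the model's updates aligned with human segmentation points despite no boundary labels.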


Language

English (en)

Chair and Committee

Professor Jeffrey Zacks, Chair

Committee Members

Professor Todd Braver, Professor Wouter Kool