This item is under embargo and not available online per the author's request. For access information, please visit


Date of Award

Spring 5-15-2021

Author's School

Graduate School of Arts and Sciences

Author's Department

Philosophy-Neuroscience-Psychology

Degree Name

Doctor of Philosophy (PhD)

Degree Type



Abstract

A primary concern for any psychological research project is determining how to measure unobservable mental entities, such as "implicit memory" or "intelligence". Psychologists say that a measurement method has construct validity when it measures the construct they intend it to measure, where a construct is any theoretical term that refers to a mental entity. Construct validation, then, is the process of justifying one's belief that a measure has construct validity. My dissertation seeks to answer three related questions: (1) What is construct validity? (2) What is the best epistemic theory of justification for construct validation? (3) How does construct validity relate to the justification of constructs?

There are two problems that any theory of construct validation must face and that my dissertation solves, both traceable to Cronbach & Meehl's (1955) seminal paper "Construct Validity in Psychological Tests". First, despite the widespread use of the term, there is significant ambiguity among psychologists about what exactly construct validity is. Second, the practice of construct validation is premised on an inference that many philosophers regard as circular (Tal, 2013), a version of the problem of coordination. How we establish the validity of a measure depends on what the construct being measured is. However, determining what a construct is, i.e., building the surrounding theory that defines the construct, requires already having measures of the construct. In other words, the questions "Does the test measure intelligence?" and "What is intelligence?" presuppose answers to one another.

I argue that construct validity (the adequacy of a test as a measure of a particular construct) must be distinguished from construct legitimacy (the adequacy of the construct itself, relative to the theory of which it is a part). This distinction is necessary because, although the two interact, in that increasing one can increase the other, each responds to a distinct type of evidence.

To solve the coordination problem, I evaluate two epistemic theories already utilized by psychologists: operationalism and hypothesis-testing. Both fail because they do not accurately account for the bidirectional relationship between measure and construct: operationalism fails to distinguish between measure and construct at all, while hypothesis-testing cannot accommodate construct revision driven by measures.

Rejecting operationalism and hypothesis-testing, I turn to a more recent contender from the philosophy of measurement: coherentism. Coherentism, as formulated by Chang (2004), is the view that measures and constructs are validated in tandem, through a process of mutual refinement, or epistemic iteration. The coordination problem is solved by shifting focus from the validity of a measure at a single point in time to validation over time. Further, epistemic iteration accommodates the bidirectional relationship between construct and measure: changes in theory cause changes in measurement, but changes in measurement can also drive changes in theory. The primary limitation of epistemic iteration is that it relies on a vague notion of scientific progress: a revision counts as progressive as long as it coheres with the epistemic goals of a research field. To avoid an overly permissive account on which any revision would be progressive, I advocate adopting a coherentist epistemic theory of justification. I show how one such theory, Thagard's (2007) explanatory coherence, can be applied to psychology in order to constrain and evaluate epistemic goals.


Language

English (en)

Chair and Committee

Carl Craver

Committee Members

Casey O'Callaghan, Allan Hazlett, Ron Mallon, Henry Roediger

Available for download on Sunday, May 15, 2022