Bayesian Optimization For Exploratory Experimentation In The Behavioral Sciences

To understand and predict human behavior, scientists typically perform controlled experiments that compare a small, carefully chosen set of experimental conditions. For example, in designing instructional software, a comparison might be made between two techniques for teaching students. The finding that one technique obtains reliably better outcomes has both practical and theoretical implications. However, this result does not answer the question one often wishes to ask: what is the very best possible technique? Scientists often wish to explore a wide range of conditions to understand their functional relationship to behavior and to determine the optimal conditions: those leading to the most robust learning, the fastest performance, the fewest errors, the best decisions and choices. This project will develop an exploratory experimentation methodology that allows researchers to identify optimal conditions efficiently. A key product of the project will be black-box software that researchers across the cognitive sciences can use to apply exploratory experimentation to problems in their own fields. Beyond its value in laboratory experimentation, the project has implications for web-based education platforms, from MOOCs to Khan Academy. In these platforms, where vast quantities of data are collected, experimentation is necessary to determine how best to serve students, a formal version of what classroom teachers do informally as they adapt their teaching style to the needs of a student. Exploratory experimentation facilitates the efficient discovery of individualized strategies that maximize learning gains. Beyond the behavioral sciences, the project applies to a core experimental method on the web, A/B testing, in which two alternative versions of a web page are served to determine which is more effective on some measure of viewer behavior. The proposed techniques expand such testing to many alternatives, which one might call A-Z testing, enabling customization of a user's online experience on a scale not previously achieved.
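The flavor of many-variant "A-Z testing" can be illustrated with Thompson sampling over Bernoulli click-through rates. This is a standard Bayesian technique for the setting, not necessarily the project's own method, and the 26 variants and their click-through rates below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_variants = 26                                   # "A-Z": one arm per page variant
true_rates = rng.uniform(0.02, 0.10, n_variants)  # hypothetical click-through rates
true_rates[7] = 0.15                              # plant one clearly-best variant

successes = np.ones(n_variants)                   # Beta(1, 1) prior on each rate
failures = np.ones(n_variants)
pulls = np.zeros(n_variants, dtype=int)

for _ in range(20_000):                           # each pass = one page view
    theta = rng.beta(successes, failures)         # sample a plausible rate per variant
    arm = int(np.argmax(theta))                   # serve the most promising variant
    click = rng.random() < true_rates[arm]        # simulated viewer behavior
    successes[arm] += click
    failures[arm] += 1 - click
    pulls[arm] += 1

most_served = int(np.argmax(pulls))
print(f"variant {most_served} served {pulls[most_served]} of 20000 views")
```

Because the posterior for a clearly inferior variant quickly concentrates below the leader, traffic shifts toward the best variant automatically, which is what makes testing dozens of alternatives affordable compared with exhaustive pairwise A/B comparisons.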

The project will extend Bayesian optimization methods to human experimental research. Bayesian optimization has long been used in the geostatistics community for inferring unobserved properties (e.g., oil reserves below the earth's surface) from costly measurements (e.g., drilling tests). In the current project, the "landscapes" being explored are defined over possible conditions (e.g., training strategies), the unobserved properties are internal cognitive states of the human observer, and the measurements are obtained via behavioral evaluations (e.g., assessments of learning). To apply Bayesian optimization to a range of human experimental research, mathematical models will be developed for multiple behavioral response measures, including choice, ranking, rating, latency, and free recall. The exploratory nature of the approach requires heuristics for sequentially selecting the experimental conditions that will be maximally informative given prior observations. Various such heuristics have been proposed and will be evaluated in the context of behavioral research. Experimental studies will be conducted to demonstrate the breadth of the approach in domains including concept acquisition, color aesthetics, formal instruction, and the design of usable and engaging software.
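The sequential selection loop described above can be sketched with a Gaussian-process surrogate over the condition "landscape" and the expected-improvement rule, one common acquisition heuristic (the paragraph does not name which heuristics the project will evaluate). The one-dimensional training condition, the hidden learning-outcome function, and all parameter values below are hypothetical:

```python
import numpy as np
from math import erf, sqrt

# Standard normal CDF, vectorized (avoids a scipy dependency).
Phi = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0))))

def rbf_kernel(a, b, length=0.25):
    """Squared-exponential similarity between 1-D condition values."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(x_obs, y_obs, x_cand, noise=0.05):
    """Posterior mean and std of the outcome landscape at candidate conditions."""
    K = rbf_kernel(x_obs, x_obs) + noise ** 2 * np.eye(len(x_obs))
    Ks = rbf_kernel(x_obs, x_cand)
    sol = np.linalg.solve(K, Ks)                     # K^{-1} Ks
    mean = sol.T @ y_obs
    var = 1.0 - np.einsum("ij,ij->j", Ks, sol)       # prior variance is 1 for RBF
    return mean, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mean, std, best):
    """Expected gain over the best outcome observed so far."""
    z = (mean - best) / std
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return std * (z * Phi(z) + pdf)

rng = np.random.default_rng(0)
x_cand = np.linspace(0.0, 1.0, 101)                  # discretized condition space
true_f = lambda x: np.exp(-((x - 0.7) / 0.25) ** 2)  # hidden learning outcome (invented)

# Seed with a few randomly chosen conditions, then query sequentially.
x_obs = list(rng.choice(x_cand, size=3, replace=False))
y_obs = [true_f(x) + rng.normal(0.0, 0.05) for x in x_obs]

for _ in range(25):                                  # each pass = one behavioral evaluation
    mean, std = gp_posterior(np.array(x_obs), np.array(y_obs), x_cand)
    x_next = x_cand[np.argmax(expected_improvement(mean, std, max(y_obs)))]
    x_obs.append(x_next)
    y_obs.append(true_f(x_next) + rng.normal(0.0, 0.05))

mean, _ = gp_posterior(np.array(x_obs), np.array(y_obs), x_cand)
best_x = x_cand[int(np.argmax(mean))]                # estimated optimal condition
print(f"estimated optimum near x = {best_x:.2f}")
```

The loop embodies the cost structure the paragraph describes: each behavioral evaluation is expensive, so the acquisition rule trades off sampling where the surrogate predicts good outcomes against sampling where it is most uncertain.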

Students

Camden Elliott-Williams (Computer Science, Colorado)
Mohammad Khajah (Computer Science, Colorado)
Hunter Liese (Computer Science, Colorado)
Brett Roads (Computer Science, Colorado)

Collaborators

Rob Goldstone (Psychology, Indiana U.)
Yun-En Liu (Computer Science, U Washington)
Robert Lindsey (Imagen Technologies)
Derek Lomas (Design Lab, UCSD)
Hal Pashler (Psychology, UCSD)
Karen Schloss (Psychology, U Wisconsin Madison)
WootMath (Boulder)