Amanda Kay Montoya


I am an Assistant Professor at UCLA in the Department of Psychology - Quantitative Area. I received my PhD in Quantitative Psychology from the Ohio State University in 2018. My primary adviser was Dr. Andrew Hayes. I completed my M.A. in Psychology and my M.S. in Statistics at Ohio State in 2016. I graduated from the University of Washington with a B.S. in Psychology and a minor in Mathematics in 2013. My research interests include mediation, moderation, conditional process models, structural equation modeling, and meta-science.

Mediation, Moderation, and Conditional Process Analysis
Global School of Empirical Research Methods
08/27/18 - 08/31/18


Causal Language in Evaluating Moderation/Interaction Hypotheses
Modern Modeling Methods Conference
June 27, 2023
10:15 - 11:45 AM EST
Mediation, Moderation, and Conditional Process Analysis - I
St. Gallen
June 12 - June 16, 2023



Published in Frontiers in Psychology and led by QRClab graduate student Tristan Tibbe, this paper introduces two bias-corrected bootstrap confidence interval methods for the indirect effect, describes their relation to the bias assumptions made by the current bias-corrected bootstrap confidence interval, and compares their performance to existing methods in mediation analysis.


Under revision at Advances in Methods and Practices in Psychological Science and led by QRClab graduate student Jessica Fossum, this paper compares power estimates from six commonly used tests of the indirect effect in mediation analysis, concluding that estimates from the joint significance test, Monte Carlo confidence interval, and percentile bootstrap confidence interval are similar enough that bootstrapping is not required for power analysis.
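As a rough illustration of the Monte Carlo confidence interval approach compared in this work, the sketch below simulates power for the indirect effect in a simple single-mediator model. All path values, sample sizes, and function names here are illustrative choices of mine, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def ols_slope(X, y, idx):
    """Least-squares fit with an intercept; return coefficient idx (among the
    predictors in X) and its standard error."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    df = len(y) - Xd.shape[1]
    sigma2 = resid @ resid / df
    cov = sigma2 * np.linalg.inv(Xd.T @ Xd)
    return beta[idx + 1], np.sqrt(cov[idx + 1, idx + 1])

def mc_ci_indirect(a, se_a, b, se_b, reps=5000, alpha=0.05):
    """Monte Carlo confidence interval for the indirect effect a*b:
    sample each path from a normal centered on its estimate, multiply,
    and take percentiles of the products."""
    ab = rng.normal(a, se_a, reps) * rng.normal(b, se_b, reps)
    lo, hi = np.quantile(ab, [alpha / 2, 1 - alpha / 2])
    return lo, hi

def simulate_power(n=100, a=0.39, b=0.39, c_prime=0.0, n_sims=300):
    """Estimate power of the Monte Carlo CI test by repeatedly generating
    data from M = a*X + e and Y = c'*X + b*M + e, fitting both regressions,
    and counting how often the CI for a*b excludes zero."""
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = c_prime * x + b * m + rng.normal(size=n)
        a_hat, se_a = ols_slope(x[:, None], m, 0)      # path a from M ~ X
        b_hat, se_b = ols_slope(np.column_stack([x, m]), y, 1)  # path b from Y ~ X + M
        lo, hi = mc_ci_indirect(a_hat, se_a, b_hat, se_b)
        hits += not (lo <= 0 <= hi)
    return hits / n_sims
```

Because the Monte Carlo interval needs only the path estimates and their standard errors, not the raw data, each simulated test is far cheaper than a bootstrap, which is what makes it attractive for power analysis.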


Published in Collabra: Psychology, in collaboration with Dr. William Leo Donald Krenzer (Duke University) and QRClab graduate student Jessica Fossum, this paper explores how registered reports have been implemented at journals, the typical time to publication of a journal's first registered report, and common barriers to adopting registered reports.


Published in Multivariate Behavioral Research, I discuss three factors researchers should consider when selecting a design for mediation analysis: validity, causality, and power. Depending on the circumstance, between-subject designs may have stronger validity than within-subject designs, and there are similar trade-offs with causality. In most cases, within-subject designs have greater power to detect the indirect effect than a between-subject design with the same number of participants, but this is not true in all cases. I provide an R script for conducting power analysis for within-subject designs.
