Meet UConn’s RMME Faculty

Engage with world-class RMME researchers and professors


Kylie Anglin headshot

Dr. Kylie Anglin

Assistant Professor – Research Methods, Measurement, and Evaluation

Dr. Kylie Anglin is an Assistant Professor of Research Methods, Measurement, and Evaluation at the University of Connecticut. She received her PhD in Research, Statistics, and Evaluation from the University of Virginia, where she participated in the Institute of Education Sciences (IES) Pre-doctoral Training Program and received an NAEd/Spencer dissertation fellowship. Dr. Anglin’s research develops methods for efficiently monitoring program implementation in impact evaluations using natural language processing techniques, as well as methods for improving the causal validity and replicability of impact estimates. Her work appears in journals such as the Journal of Research on Educational Effectiveness, Prevention Science, AERA Open, and Evaluation Review. Dr. Anglin’s recent publications include: “A Natural Language Processing Approach to Measuring Treatment Adherence and Consistency Using Semantic Similarity,” in AERA Open; “Gather-Narrow-Extract: A Framework for Studying Local Policy Variation Using Web-Scraping and Natural Language Processing,” in the Journal of Research on Educational Effectiveness; and “Design-Based Approaches to Causal Replication Studies,” in Prevention Science.


Email: kylie.anglin@uconn.edu

Brenna Butler headshot

Brenna Butler

Instructor - Research Methods, Measurement, and Evaluation

Dr. Brenna Butler is an Evaluation Specialist for Penn State Extension, housed within the College of Agricultural Sciences. Dr. Butler’s primary responsibilities include designing, implementing, and analyzing the results of evaluations and assessments for educational programs and grants within Penn State Extension that deliver science-based information to Pennsylvanians. She also coaches Extension faculty and staff on best practices for conducting a wide range of evaluation skills across the evaluation life cycle (e.g., logic model development) to implement into their program development processes for their educational outreach programs. Dr. Butler’s research interests include best practices in data visualization and evaluation report writing for both technical and non-technical audiences. Dr. Butler received her Bachelor of Science from Penn State University in Psychology with a Quantitative Emphasis. She received her Ph.D. from the University of Tennessee in Educational Psychology with a concentration in Evaluation, Statistics, and Measurement.

Email: brenna.butler@uconn.edu

ismael carreras headshot

Ismael Carreras

Instructor - Research Methods, Measurement, and Evaluation

Dr. Ismael Carreras is the Associate Dean for Strategic Analysis in the Faculty of Arts and Sciences at Harvard University. He has over two decades of experience in applied research and statistical analysis for educational and industry clients, with particular interests in data visualization and communication. Dr. Carreras has taught graduate-level coursework at Boston College and Northern Illinois University in areas such as Introductory Statistics, Intermediate Statistics, Design of Experiments, and Attitude and Opinion Measurement. He holds an M.Ed. and a Ph.D. in Educational Research, Measurement and Evaluation from the Lynch School of Education at Boston College, and a B.A. in Psychology from Bates College.

Email: ismael.carreras@uconn.edu

Scott Donaldson headshot

Scott Donaldson

Instructor - Research Methods, Measurement, and Evaluation

Dr. Scott Donaldson is a Senior Research Associate in the Department of Population and Public Health Sciences at the Keck School of Medicine of USC. He completed a postdoctoral scholarship in Evaluation, Statistics, and Measurement at the University of California, San Diego School of Medicine, and holds a PhD in Psychology with a concentration in Evaluation and Applied Research Methods from Claremont Graduate University, an MS in Applied Psychology from the University of Southern California, and a BA in Psychology from the University of California, Los Angeles. His research leverages quantitative methodologies, such as psychometrics, meta-analysis, and other advanced statistical approaches, to design and evaluate health and well-being programs. He currently works in the USC Social Media Analytics Lab on a monitoring and evaluation grant funded by the California Department of Public Health.

Email: forthcoming

Eric Loken headshot

Dr. Eric Loken

Associate Professor

Dr. Eric Loken teaches courses in regression, multilevel modeling, and item response theory. He received his Ph.D. from Harvard University and studies advanced statistical modeling with applications to large scale educational testing.

Research interests: Latent Variable Models, Bayesian Inference, Methods for Reproducible Science

Sharon Loza headshot

Sharon Loza

Instructor - Research Methods, Measurement, and Evaluation

Dr. Sharon Loza has worked on multiple state, national, and global education initiatives focused on improving child outcomes, with an emphasis on children with, and at risk for, various disabilities. She has led federal, state, and privately funded research and evaluation agendas for various programs aimed at improving student achievement. Dr. Loza has served as a research scientist and as a training and technical assistance provider with the Frank Porter Graham Child Development Institute, as a researcher with RTI International and UNICEF, and as a program administrator for programs that use frameworks and principles of developmental and implementation science and evidence-based practices to build the capacity of children, families, teachers, and the human services workforce and to improve student and teacher outcomes. Dr. Loza currently serves as the Director of the North Carolina Early Intervention Program. She brings a wide range of experience in the practical application of evaluation and research methods to efforts to understand the impact of programs and policy on children’s educational attainment and success. She received her PhD in Educational Leadership, Policy, and Human Development with an emphasis in Educational Research and Policy Analysis from North Carolina State University, served as a pre-doctoral fellow at the University of North Carolina at Chapel Hill, and was a post-doctoral fellow with the National Center for Pyramid Model Innovations.

Email: sharon.loza@uconn.edu

 

Betsy McCoach headshot

Dr. D. Betsy McCoach

Professor

Dr. D. Betsy McCoach teaches courses in structural equation modeling, advanced latent variable modeling, instrument design, and quantitative research methods. Betsy has extensive experience in latent variable modeling, longitudinal data analysis, multilevel modeling, factor analysis, and instrument design. She has authored or co-authored over 100 journal articles and 25 book chapters. She has also published several books, including Introduction to Modern Modeling Methods (2021, Sage), Multilevel Modeling of Educational Data (2008, co-edited with Ann O’Connell), and Instrument Development in the Affective Domain (2013, co-authored with Bob Gable). Betsy is the founder and conference chair of the Modern Modeling Methods (M3) conference. Betsy serves as a co-principal investigator and lead research methodologist on several federally funded research grants, including the National Center for Research on Gifted Education.

Research interests: Latent Variable Modeling, Longitudinal Analysis, Multilevel Modeling, Instrument Design, Assessing/Measuring School Effectiveness, Gifted Education, Underachievement

Bianca Montrosse-Moorhead headshot

Dr. Bianca Montrosse-Moorhead

Associate Professor

Dr. Montrosse-Moorhead teaches courses in research methods, assessment, and evaluation. She previously served as an assistant professor of educational research at Western Carolina University, as a research and evaluation specialist at the Southeast Regional Educational Laboratory at the University of North Carolina at Greensboro, and as a doctoral fellow at the University of North Carolina at Chapel Hill. Dr. Montrosse-Moorhead currently conducts research on evaluation as a means to develop stronger evidence-based program evaluation practices, models, and theories. Additionally, her scholarship explores the practical application of evaluation and research methods, both to better understand the impact of K-12 policies, practices, and programs, and to provide credible, relevant, and useful evidence to the policy community. She received her Ph.D. in psychology with an emphasis in evaluation and applied research methods from Claremont Graduate University in 2009, where she worked with and studied under Drs. Tina Christie and Michael Scriven.

Research Interests: Program and Policy Evaluation, Research on Evaluation, Evaluation Specific Methodology, Educational Equity

Avery Newton headshot

Avery Newton

Instructor - Research Methods, Measurement, and Evaluation

Dr. Avery Newton is a Strategic Data Fellow affiliated with Providence Public Schools and Harvard’s Center for Education Policy Research. In this role, she collaborates to improve data governance, research partnerships, and analytic capacity within the district. Her expertise lies in education research and evaluation, strategic data leadership, and career exploration in K-12 settings. Having worked with over 60 different districts, educational programs, and institutions over the last 10 years, she has a proven track record of rigorous and community-centered impact research. Her work has been published in three books, appears in various education policy and research journals, and has been funded by local and institutional grants as well as the National Institutes of Health (NIH). Outside of work, she is an active community volunteer, as well as a drummer, hiker, and mother to a fabulous toddler. Dr. Newton holds a B.A. in Education Policy, Sociology, and Theory (a self-designed major) from the College of William and Mary and a Ph.D. in Measurement, Evaluation, Statistics, & Assessment (MESA) from Boston College.

Email: avery.newton@uconn.edu

 

Sarah Newton Headshot

Sarah D. Newton

Associate Director - Online Programs in Research Methods, Measurement, and Evaluation
Instructor - Research Methods, Measurement, and Evaluation

Dr. Sarah D. Newton is the Associate Director of Online Programs in Research Methods, Measurement, & Evaluation (RMME), as well as a Postdoctoral Research Associate in the Department of Educational Psychology at the University of Connecticut. She provides research design, measurement, data collection/management, statistical analysis/modeling, and methodological support for multiple grant-funded research projects at UConn, including Evaluating the Impact of Integrated Behavior and Reading Multi-Tiered Systems of Support in Elementary Schools (IES), Project BUMP UP (U.S. Department of Education), and Project EAGLE (U.S. Department of Education). Dr. Newton’s methodological work includes publications like “Does the package matter? A comparison of five common multilevel modeling software packages” in the Journal of Educational and Behavioral Statistics, as well as contributions to book chapters such as “Multilevel Modeling” in the Cambridge handbook of research methods and statistics for the social and behavioral sciences (Vol. 1), “Multilevel model selection: Balancing model fit and adequacy” in Methodology for multilevel modeling in educational research: Concepts and applications, “Evaluation of model fit and adequacy” in Multilevel modeling methods with introductory and advanced applications, and “Confirmatory factor analysis” in The BERA/SAGE handbook of educational research (Vol. 2). She also teaches various courses in research methodology and quantitative methods/analysis for RMME Programs. Dr. Newton earned her PhD and MA in Educational Psychology (with an RMME concentration) at the University of Connecticut. In addition, she holds an MS in Criminal Justice and a BA in Criminology, with completed course requirements in Psychology, from Central Connecticut State University. 
Her methodological research interests focus on model/data fit and model adequacy as complementary tools for multilevel model evaluation and selection; information criteria performance in multilevel modeling contexts; latent variable modeling; affective instrument design; and reliability/validity theory.

Email: sarah.newton@uconn.edu

Briana Oshiro

Instructor - Research Methods, Measurement, and Evaluation

Dr. Briana Oshiro is a Post-doctoral Researcher at McGill University in Montreal, Canada. She earned her PhD in Research Methods, Measurement, & Evaluation (RMME) at the University of Connecticut. Her dissertation was entitled, “Factor Retention Criteria as Upper and Lower Bounds on the Estimate of Number of Factors in Exploratory Factor Analysis: A Simulation Study.” Prior to completing her doctorate, she earned a Master of Arts Degree in RMME and a Master of Science in Mathematics—both at the University of Connecticut. Dr. Oshiro’s research interests include factor analysis, measurement invariance, and secondary data analysis.

Email: briana.oshiro@uconn.edu

Chris Rhoads headshot

Dr. Christopher Rhoads

Associate Professor

Dr. Christopher Rhoads teaches courses in statistics and research design. His research interests focus on methods for improving causal inference in educational research, particularly in the areas of experimental design and the analysis of multi-level data structures. He has published in journals such as the Journal of Educational and Behavioral Statistics; the British Journal of Mathematical and Statistical Psychology; and Statistics, Politics and Policy. Rhoads’ current work involves exploring the implications of “contamination” of experimental interventions for the design and analysis of experiments with clustering; using prior information about the correlation structure to improve power and precision in experiments with clustering; determining optimal experimental designs for regression discontinuity studies; generalizing the results of RCTs to other populations in multi-level settings; and developing methods for integrating implementation fidelity variables into the analysis of education RCTs.

Research interests: Multilevel Modeling, Design of Field Experiments in Education Research, Non-experimental Designs for Causal Inference, External Validity of RCT Studies