News & Updates

RMME Community Members Publish Article: Omitted Response Patterns

Merve Sarac (an RMME alumna) and Dr. Eric Loken (a current RMME faculty member) recently published a new article, “Examining Patterns of Omitted Responses in a Large-scale English Language Proficiency Test,” in the International Journal of Testing. Congratulations to Merve and Eric on this excellent accomplishment!

 

Abstract:

This study is an exploratory analysis of examinee behavior in a large-scale language proficiency test. Despite a number-right scoring system with no penalty for guessing, we found that 16% of examinees omitted at least one answer and that women were more likely than men to omit answers. Item-response theory analyses treating the omitted responses as missing rather than wrong showed that examinees had underperformed by skipping the answers, with a greater underperformance among more able participants. An analysis of omitted answer patterns showed that reading passage items were most likely to be omitted, and that native language-translation items were least likely to be omitted. We hypothesized that since reading passage items were most tempting to skip, then among examinees who did answer every question there might be a tendency to guess at these items. Using cluster analyses, we found that underperformance on the reading items was more likely than underperformance on the non-reading passage items. In large-scale operational tests, examinees must know the optimal strategy for taking the test. Test developers must also understand how examinee behavior might impact the validity of score interpretations.
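The abstract’s key analytic move (rescoring omitted responses as missing rather than wrong under an item response model) can be sketched as follows. This is an illustrative sketch only: the 2PL model, item parameters, and response vector below are assumptions for demonstration, not the study’s actual specification.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ml_theta(responses, a, b):
    """Maximum-likelihood ability estimate under a 2PL IRT model.
    Omitted items are coded np.nan and simply dropped from the likelihood."""
    mask = ~np.isnan(responses)
    r, a, b = responses[mask], a[mask], b[mask]
    def nll(theta):
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        return -np.sum(r * np.log(p) + (1 - r) * np.log(1 - p))
    return minimize_scalar(nll, bounds=(-4, 4), method="bounded").x

# Hypothetical examinee: 10 items, answers 8 (7 correct), omits the 2 hardest.
a = np.ones(10)
b = np.linspace(-2, 2, 10)
resp = np.array([1, 1, 1, 1, 1, 1, 1, 0, np.nan, np.nan])

theta_missing = ml_theta(resp, a, b)                        # omits treated as missing
theta_wrong = ml_theta(np.nan_to_num(resp, nan=0.0), a, b)  # omits scored as wrong
```

Scoring the omitted items as wrong drags the ability estimate down relative to treating them as missing, which is the underperformance pattern the authors describe.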

New Program Evaluation Student, Emily Acevedo, Completes Dissertation

One of the newest members of RMME’s Graduate Certificate Program in Program Evaluation has reached an academic milestone! Dr. Emily Acevedo, a current kindergarten teacher in New York, recently completed her dissertation, entitled “Teacher’s Implementation of Play-Based Learning Practices and Barriers Encountered in Kindergarten Classrooms,” at Walden University. Congratulations to Dr. Acevedo on this outstanding accomplishment!

RMME Faculty Member, Dr. D. Betsy McCoach, Releases New Books

Congratulations to RMME faculty member, Dr. D. Betsy McCoach, who recently released two new outstanding statistical modeling books. Both works include contributions from RMME Community members. Be sure to check these out today:

Introduction to Modern Modelling Methods – Co-authored by RMME alumnus, Dr. Dakota Cintron, this book introduces readers to multilevel modeling, structural equation modeling, and longitudinal modeling. A fantastic resource for quantitative researchers!

Multilevel Modeling Methods with Introductory and Advanced Applications – Including contributions from current and former RMME faculty (D. Betsy McCoach, Chris Rhoads, H. Jane Rogers, Aarti P. Bellara), as well as RMME alumni (Sarah D. Newton, Anthony J. Gambino, Eva Yujia Li), this text offers readers a comprehensive introduction to multilevel modeling. It is an excellent resource for aspiring and established multilevel modelers, covering foundational skills through cutting-edge, advanced multilevel techniques. A must-have for every multilevel modeler’s bookshelf!

RMME Community Members Publish Article: Mixture Models & Classification

RMME alumnus, Dr. Dakota W. Cintron, and RMME faculty members, Drs. Eric Loken and D. Betsy McCoach, recently published a new article entitled, “A Cautionary Note about Having the Right Mixture Model but Classifying the Wrong People.” This article will appear in Multivariate Behavioral Research and is currently available online. Congratulations to Dakota, Eric, and Betsy!

 


RMME Instructor, Dr. Leslie Fierro, Serves as Evaluation Panelist

On June 2, 2022, Dr. Leslie Fierro (RMME Instructor and Co-Editor of New Directions for Evaluation) contributed to a panel session entitled, “Issues in Evaluation: Surveying the Evaluation Policy Landscape in 2022”. The Government Accountability Office (GAO) and Data Foundation co-sponsored this webinar in which panelists discussed the state of evaluation policy today. Visit this website and register to watch the recording of this excellent webinar for free! Congratulations on this work, Dr. Fierro!

Drs. D. Betsy McCoach & Sarah D. Newton Offer Spring 2022 Workshops to I-MTSS Research Network Early Career Scholars

RMME Community members, Dr. D. Betsy McCoach and Dr. Sarah D. Newton, collaborated with colleagues this spring to offer several methodological workshops for members of the I-MTSS Research Network’s Early Career Scholars Program. Workshops included:

 

Learning how to “p” (December 2021)
Everyone uses p’s, but very few know how to p. In this session, we will discuss the good, the bad, and the ugly of p-values, and we will provide more nuanced guidance on how to make sense of your research results.
Facilitators: Betsy McCoach and Yaacov Petscher

Hungry for Power (November 2021)
All researchers seek power: statistical power, that is. In this session, we will explore the power game and how to “play” it.
Facilitators: Betsy McCoach and Yaacov Petscher

A Bird’s Eye View of Nesting (January 2022)
Nested data are the norm in educational studies. Some consider nesting a nuisance, but nested data also provide opportunities to ask and answer a wide variety of research questions that are important to educational researchers.
Facilitators: Betsy McCoach and Yaacov Petscher

Data Cleanup in Aisle 2! (Mop and Bucket Not Included) (February 2022)
This workshop will help participants develop a clearer sense of the data cleaning and preparation process: (1) setting up workflows and structures for success; (2) identifying data entry errors; (3) creating, recoding, and naming variables for analysis; (4) conducting preliminary analyses; (5) knowing your software; and (6) understanding your planned analysis and its needs (with special attention given to multilevel modeling).
Facilitators: Sarah D. Newton and Kathleen Lynne Lane

What’s Your Logic? Tell Me–What’s Your Logic? (May 2022)
This hands-on workshop focuses on using logic models to convey the theory of change (TOC) underlying a program or intervention of interest in research and evaluation contexts. Participants will collaborate in groups to build a TOC model for the I-MTSS Research Network project with which they are most familiar, then share and briefly describe their work for the larger group.
Facilitators: Sarah D. Newton and Nathan Clemens

 

Congratulations on your contributions to a successful workshop series!

 

Upcoming RMME/STAT Colloquium (4/29): Luke Keele, “Approximate Balancing Weights for Clustered Observational Study Designs”

RMME/STAT Joint Colloquium

Approximate Balancing Weights for Clustered Observational Study Designs

Dr. Luke Keele
University of Pennsylvania

Friday, April 29, at 3:00PM ET

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m35b82d4dc6d3e77536aa48390a02485b

In a clustered observational study, a treatment is assigned to groups and all units within the group are exposed to the treatment. Clustered observational studies are common in education where treatments are given to all students within some schools but withheld from all students in other schools. Clustered observational studies require specialized methods to adjust for observed confounders. Extant work has developed specialized matching methods that take key elements of clustered treatment assignment into account. Here, we develop a new method for statistical adjustment in clustered observational studies using approximate balancing weights. An approach based on approximate balancing weights improves on extant matching methods in several ways. First, our methods highlight the possible need to account for differential selection into clusters. Second, we can automatically balance interactions between unit level and cluster level covariates. Third, we can also balance high moments on key cluster level covariates. We also outline an overlap weights approach for cases where common support across treated and control clusters is poor. We introduce an augmented estimator that accounts for outcome information. We show that our approach has dual representation as an inverse propensity score weighting estimator based on a hierarchical propensity score model. We apply this algorithm to assess a school-based intervention through which students in treated schools were exposed to a new reading program during summer school. Overall, we find that balancing weights tend to produce superior balance relative to extant matching methods. Moreover, an approximate balancing weight approach tends to require less input from the user to achieve high levels of balance.
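As a rough illustration of the balancing-weights idea (not the authors’ actual estimator), the minimum-variance weights that exactly match a set of covariate means have a simple closed form; the “approximate” version discussed in the talk relaxes this exact-match constraint with a tolerance or penalty. All inputs below are hypothetical.

```python
import numpy as np

def balancing_weights(X, target_means):
    """Minimum-variance weights (summing to 1) on the rows of X whose
    weighted covariate means exactly equal target_means.
    Exact balance is used here for illustration; approximate balancing
    weights replace the hard equality with a penalized constraint."""
    n = X.shape[0]
    xbar = X.mean(axis=0)
    Xc = X - xbar  # centered covariates
    # Coefficients that shift the weighted mean from xbar to target_means
    m = np.linalg.solve(Xc.T @ Xc, target_means - xbar)
    return np.full(n, 1.0 / n) + Xc @ m
```

With X holding (say) control-cluster covariates and target_means the treated-cluster means, the returned weights w satisfy w @ X == target_means while staying as close to uniform as possible.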

 


Upcoming RMME Evaluation Colloquium (4/1): Cassandra Myers & Joan Levine, “UConn’s Institutional Review Board: A Closer Look at Ethics in Research & Evaluation”

RMME Evaluation Colloquium

UConn’s Institutional Review Board: A Closer Look at Ethics in Research & Evaluation

Cassandra Myers, The HRP Consulting Group
Joan Levine, University of Connecticut

Friday, April 1, at 3:00PM ET

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=me8fe20df2d511c754f1bd3f3539991b4

The UConn-Storrs Human Research Protection Program (HRPP) is dedicated to the protection of human subjects in research activities conducted under its auspices. The HRPP reviews human subjects research to ensure appropriate safeguards for the ethical, compliant, and safe conduct of research, as well as the protection of the rights and welfare of the human subjects who volunteer to participate. As the regulatory framework for the protections of human subjects is complex and multi-faceted, this session’s goals are to review the regulatory framework and how it applies to research and evaluation, the requirements for consent, when consent can be waived, and how to navigate the IRB process at UConn. This session will also review historical case studies to understand current requirements and how these events still affect populations, policies, and regulations.

 


Upcoming RMME/STAT Colloquium (3/25): Elizabeth Stuart, “Combining Experimental and Population Data to Estimate Population Treatment Effects”

RMME/STAT Joint Colloquium

Combining Experimental and Population Data to Estimate Population Treatment Effects

Dr. Elizabeth Stuart
Johns Hopkins Bloomberg School of Public Health

Friday, March 25, at 3:00PM ET

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=mb26cc940795502d8ae9ff7e274d435bb

With increasing attention being paid to the relevance of studies for real-world practice (especially in comparative effectiveness research), there is also growing interest in external validity and assessing whether the results seen in randomized trials would hold in target populations. While randomized trials yield unbiased estimates of the effects of interventions in the sample of individuals in the trial, they do not necessarily inform what the effects would be in some other, potentially somewhat different, population. While there has been increasing discussion of this limitation of traditional trials, relatively little statistical work has been done developing methods to assess or enhance the external validity of randomized trial results. In addition, new “big data” resources offer the opportunity to utilize data on broad target populations. This talk will discuss design and analysis methods for assessing and increasing external validity, as well as general issues that need to be considered when thinking about external validity. The primary analysis approach discussed will be a reweighting approach that equates the sample and target population on a set of observed characteristics. Underlying assumptions and methods to assess robustness to violation of those assumptions will be discussed. Implications for how future studies should be designed in order to enhance the ability to assess generalizability will also be discussed.
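A minimal sketch of the reweighting approach the talk describes: estimate each trial unit’s probability of trial participation from combined trial and population data, then weight trial units by the inverse odds of participation so the trial sample resembles the target population. The logistic participation model and the function below are illustrative assumptions, not the speaker’s implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def population_ate(X_trial, y, t, X_pop):
    """Estimate a population average treatment effect by reweighting the
    trial sample (covariates X_trial, outcomes y, treatment t) to match
    a target population sample X_pop on observed covariates."""
    X_all = np.vstack([X_trial, X_pop])
    in_trial = np.r_[np.ones(len(X_trial)), np.zeros(len(X_pop))]
    # P(in trial | covariates), then inverse odds of participation as weights
    p = LogisticRegression().fit(X_all, in_trial).predict_proba(X_trial)[:, 1]
    w = (1.0 - p) / p
    treated, control = (t == 1), (t == 0)
    return (np.average(y[treated], weights=w[treated])
            - np.average(y[control], weights=w[control]))
```

Units overrepresented in the trial relative to the population get downweighted, which is the sense in which the estimate targets the population rather than the trial sample.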

 


RMME Instructor, Ummugul Bezirhan, Earns 2022 Dissertation Prize!

Congratulations to RMME instructor, Ummugul Bezirhan! She recently earned the Psychometric Society’s 2022 Dissertation Prize for her research entitled, “Conditional dependence between response time and accuracy in cognitive diagnostic models”. She will present this work as a keynote speaker at the upcoming International Meeting of the Psychometric Society (IMPS), which will be held from July 11-15, 2022, at the University of Bologna, in Bologna, Italy. See this Psychometric Society announcement for more information.

We are so thrilled to congratulate Dr. Bezirhan on this fantastic accomplishment. Congratulations, Gul!