Author: Newton, Sarah

Upcoming RMME/STAT Colloquium (12/18): Paul De Boeck, “Response Accuracy and Response Time in Cognitive Tests”

RMME/STAT Joint Colloquium:

Response Accuracy and Response Time in Cognitive Tests

Paul De Boeck
The Ohio State University

December 18th at 12:00 EST

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m1b6efd4435a2cd17535d693bd2ac1a14

How much a cognitive test score reflects ability and how much it reflects speed is an old and still unresolved question. The well-known speed-accuracy tradeoff does not make the question any easier to answer. In this presentation I will report the results of my research into the problem. Briefly summarized, the findings are as follows. First, the correlation between ability and speed across persons depends on the test. Second, based on different kinds of modeling and different kinds of data, there appear to be remaining item-wise dependencies (i.e., conditional dependencies) between response accuracy and response time after controlling for the underlying latent variables. Third, these remaining dependencies depend on the difficulties of the test items, and the dependencies are also curvilinear. I will present an explanation for these findings, and a tentative, complex answer to the old question of what is being measured in a cognitive test.

Pamela Peters, RMME Ph.D. Student, Earns Research Award

Pamela Peters recently took 3rd place for her paper entitled, “Development of the Assessment of Teachers’ Attitudes Toward Twice-Exceptionality,” at the National Association for Gifted Children’s Research and Evaluation Network Graduate Student Research Gala (Doctoral-level, Completed Research category). Congratulations on this accomplishment, Pam!

Upcoming RMME/STAT Colloquium (11/20): Bengt Muthen, “Recent Advances in Latent Variable Modeling”

Recent Advances in Latent Variable Modeling

Bengt Muthen

Friday, November 20, 2020

11:30am – 1:00pm

Abstract: This talk gives an overview of some recent and ongoing latent variable research. Borrowing ideas from multilevel factor analysis, longitudinal SEM in a single-level, wide format is formulated in a new way that finds a well-fitting model 45 years after the writing of the classic Wheaton, Muthen, Alwin, and Summers article. This segues into a generalization of latent transition analysis using the multilevel notion of a random intercept while staying in a single-level, wide format. Turning back to multilevel modeling, the talk considers time series analysis of intensive longitudinal data. This is illustrated by intervention data on electricity consumption and a randomized intervention related to positive and negative affect where cycles play a major role. Finally, the new feature in Mplus Version 8.5 of Bayesian analysis of count, nominal, and binary logit models is presented.


This session is jointly sponsored by the Statistics department and the Research Methods, Measurement, and Evaluation program as part of the Statistical Applications and Quantitative Research Methods colloquium series.

RMME’s 11/6 Application Deadline for Online Programs

Looking to enhance your skills in program evaluation, quantitative research, measurement and/or data analysis? Know a colleague who wants to develop this in-demand skill set? Get prepared for the future of Research Methods, Measurement, & Evaluation (RMME) at the University of Connecticut!


With 100% Online or Campus-based options for our 12-credit Graduate Certificate in Program Evaluation and 30-credit RMME Master’s Degree, we offer:

– Flexibility for working professionals—Study anytime, anywhere;

– Courses designed and taught by expert RMME faculty;

– Opportunities for individualized course selection to facilitate your personal career goals;

– And more!


Furthermore, with advanced planning, it is even possible to earn BOTH the Graduate Certificate in Program Evaluation and the RMME Master’s Degree WITH NO ADDITIONAL COURSEWORK (beyond the 30 credits required for the master’s degree).

For more information, please visit:

100% Online Program Evaluation Certificate

100% Online RMME Master’s Degree

Or email: methods@uconn.edu


Start your journey today—Spring 2021 application deadlines for both the Program Evaluation Certificate and RMME Master’s Degree programs are November 6, 2020, 11:59pm EST!

RMME Participates in the UConn Fall 2020 Virtual Graduate School and Law School Fair

Come meet our faculty and learn more about UConn’s Research Methods, Measurement, & Evaluation (RMME) program! On Thursday, October 15th, 2020, from 2:00pm – 5:00pm EDT, several RMME faculty members and graduates will participate in UConn’s Virtual Graduate School Fair. With one-on-one appointments and group information sessions (M.A. from 3:00pm – 3:30pm; Ph.D. from 3:30pm – 4:00pm), this event offers everything you need to decide that RMME is for you!

View our virtual profile at: https://app.joinhandshake.com/employers/572568

RMME Community Members Discuss Research at NERA 2017

RMME community members David Alexandro and Xiaowen Liu (now both RMME PhD alumni) discussed their presented research at NERA 2017. Congratulations on this successful presentation, from the Research Methods, Measurement, & Evaluation Community!


RMME PhD Students David Alexandro and Xiaowen Liu Discuss a Poster Presentation at NERA 2017


Presenter: David Alexandro

Authors: Charles Martie, David Alexandro, William Estepar-Garcia, & Ajit Gopalakrishnan

Poster Presentation Title: Every Target and Milestone Matters: Developing Connecticut’s Evidence-Based Early Indication Tool (EIT)

Poster Abstract: Early warning systems typically focus on students’ dropout risk. The Connecticut State Department of Education extended this model to create the Early Indication Tool (EIT), a K-12 system that predicts student performance, identifies students who are at risk of missing milestones and/or dropping out, and ultimately facilitates more timely interventions.