Month: June 2021

RMME Master’s Student, Daniel Doerr, Secures New Position

Congratulations to Daniel Doerr, a part-time Master’s student in the Research Methods, Measurement, & Evaluation program at UConn! Daniel currently serves as the Director of Student Affairs Planning, Assessment, and Evaluation in the Office of the Vice President for Student Affairs at the University of Connecticut. He recently accepted a new position as an Associate Performance Auditor with the State of Connecticut Auditors of Public Accounts Office and will begin his new role on July 16th.

Please join the RMME community as we congratulate Daniel on this career milestone!

RMME Faculty & Students Publish New Article: “Evaluator Education Curriculum: Which Competencies Ought to Be Prioritized in Master’s and Doctoral Programs?”

Congratulations to Bianca Montrosse-Moorhead, Anthony J. Gambino, Laura M. Yahn, Mindy Fan, and Anne T. Vo on their recent publication: “Evaluator Education Curriculum: Which Competencies Ought to Be Prioritized in Master’s and Doctoral Programs?” This article appears in the American Journal of Evaluation (https://doi.org/10.1177/10982140211020326).

For more information, visit: https://journals.sagepub.com/doi/10.1177/10982140211020326

 

Abstract:

A budding area of research is devoted to studying evaluator curriculum, yet to date, it has focused exclusively on describing the content and emphasis of topics or competencies in university-based programs. This study aims to expand the foci of research efforts and investigates the extent to which evaluators agree on what competencies should guide the development and implementation of evaluator education. This study used the Delphi method with evaluators (n = 11) and included three rounds of online surveys and follow-up interviews between rounds. This article discusses on which competencies evaluators were able to reach consensus. Where consensus was not found, possible reasons are offered. Where consensus was found, the necessity of each competency at both the master’s and doctoral levels is described. Findings are situated in ongoing debates about what is unique about what novice evaluators need to know and be able to do and the purpose of evaluator education.

Upcoming RMME/STAT Colloquium (6/18): Jon Krosnick, “The Collapse of Scientific Standards in the World of High Visibility Survey Research”

RMME/STAT Joint Colloquium

The Collapse of Scientific Standards in the World of High Visibility Survey Research

Dr. Jon Krosnick
Stanford University

Friday, June 18th, at 12:00 PM ET

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m6b0af866c35360de3b7819e6204bc121

In parallel to the explosion of the replication crisis across the sciences, survey research has experienced its own crisis of credibility – and very publicly. Election after election, pre-election polls in recent years in the U.S., Britain, Israel, and elsewhere have been widely viewed as inaccurate. After each failure to accurately predict election outcomes, the survey research profession has implemented a self-study to try to explain its inaccuracies, presumably in order to learn useful lessons for improving practices. And yet inaccuracies have continued unabated. This talk will review the evidence of inaccuracy and propose and test an explanation that has received little attention: that leading survey researchers have all but abandoned well-validated scientific procedures for data collection and data analysis and have misrepresented their procedures as having more scientific integrity than they in fact have. Interestingly, the lessons learned have implications for academic research in the social sciences, in medicine, and in other fields.

 
