Abstract
Background: In a growing number of studies, participants are measured on the same variables of interest, such as task performance and the effort associated with that performance, several times. Three statistical approaches encountered in health professions education research in this context are aggregation, multiplication, and repeated measures analysis. In the aggregation approach, repeated measurements are reduced to average scores (i.e., aggregates) for each measure, and these averages are used in subsequent analysis. In the multiplication approach, repeated measurements are treated as if they came from different respondents rather than from the same respondents. In the repeated measures approach, the repeated measurements are treated as such, so the within-respondent structure of the data is retained. While the aggregation and multiplication approaches are frequently encountered, the repeated measures approach is not.

Method: Through a simulated data example that incorporates features of studies on this kind of data encountered in the literature, this article compares the three aforementioned approaches in terms of information and statistical validity.

Results: The comparison illustrates that, contrary to repeated measures analysis, the aggregation and multiplication approaches fail to capture essential information from repeated measurements data and can result in erroneous implications for future research and practice (i.e., an ecological fallacy).

Discussion: The findings from this comparison have implications for all statistical techniques that are based on correlation and regression: failing to account for repeated measurements structures in data can distort correlations and all statistics based on them.

© 2017 King Saud bin AbdulAziz University for Health Sciences
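The sign reversal summarised above, where a negative within-person relation appears positive once data are aggregated or pooled, can be sketched with a minimal simulation. The Python sketch below is illustrative only: the numbers, variable names (e.g., subj_effort_mean, n_reps), and the simple within-person centering step are assumptions for exposition, not the article's actual simulated data or analysis.

```python
# Minimal sketch (not the article's simulation): within-person correlation
# between effort and performance is negative, but the between-person
# correlation is positive, so aggregating or pooling flips the sign.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_reps = 50, 4

# Hypothetical between-person component: subjects who invest more effort on
# average also perform better on average.
subj_effort_mean = rng.normal(50, 10, n_subjects)
subj_perf_mean = 40 + 0.8 * subj_effort_mean + rng.normal(0, 3, n_subjects)

# Hypothetical within-person component: on occasions where a subject invests
# more effort than usual, performance is lower than usual (negative slope).
effort_dev = rng.normal(0, 5, (n_subjects, n_reps))
perf_dev = -0.8 * effort_dev + rng.normal(0, 2, (n_subjects, n_reps))

effort = subj_effort_mean[:, None] + effort_dev   # shape (subjects, reps)
perf = subj_perf_mean[:, None] + perf_dev

# 1) Aggregation approach: average over repetitions, then correlate.
r_agg = np.corrcoef(effort.mean(axis=1), perf.mean(axis=1))[0, 1]

# 2) Multiplication approach: pool all measurements as if independent.
r_pool = np.corrcoef(effort.ravel(), perf.ravel())[0, 1]

# 3) Repeated measures view: correlate within-person deviations from each
#    subject's own mean (one simple way to respect the data structure).
e_centered = effort - effort.mean(axis=1, keepdims=True)
p_centered = perf - perf.mean(axis=1, keepdims=True)
r_within = np.corrcoef(e_centered.ravel(), p_centered.ravel())[0, 1]

print(f"aggregation r = {r_agg:+.2f}")    # positive (between-person relation)
print(f"pooled r      = {r_pool:+.2f}")   # positive, masks the within-person sign
print(f"within r      = {r_within:+.2f}") # negative (repeated measures view)
```

Centering each measurement on its subject's own mean is only one simple way to respect the repeated measures structure; in practice, a repeated measures or mixed-effects analysis of the kind the article advocates would typically be used.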
Recommended Citation
Leppink, Jimmie (2019). "When Negative Turns Positive and Vice Versa: The Case of Repeated Measurements," Health Professions Education: Vol. 5, Iss. 1, Article 4.
DOI: 10.1016/j.hpe.2017.03.004
Available at: https://hpe.researchcommons.org/journal/vol5/iss1/4