Vol. 20. Núm. 2. - 2014. Páginas 109-115

Empirical recovery of argumentation learning progressions in scenario-based assessments of English language arts

[Replicación empírica de las progresiones de aprendizaje de la capacidad para argumentar en una evaluación basada en escenarios de competencias de lectoescritura]

Peter W. van Rijn1, E. Aurora Graf2, Paul Deane2


1Educational Testing Service Global, Amsterdam, The Netherlands; 2Educational Testing Service, Princeton, New Jersey, USA


https://doi.org/10.1016/j.pse.2014.11.004

Abstract

We investigate methods for studying learning progressions in English language arts using data from scenario-based assessments. In particular, our interest lies in the empirical recovery of learning progressions in argumentation for middle school students. We collected data on three parallel assessment forms consisting of scenario-based task sets with multiple item formats, where each student was randomly assigned two of the three forms. We fitted several item response theory models and used model-based measures to classify students into levels of the argumentation learning progression. Although there were some differences in difficulty between parallel tasks, good agreement was found among the classifications from the parallel forms. Overall, we were able to empirically recover the order of the levels in the argumentation learning progression as assigned to the assessment tasks by the theoretical framework.

 



Copyright © 2024. Colegio Oficial de la Psicología de Madrid
