Innovative Assessments that Support Students’ STEM Learning
The present study aimed to demonstrate innovative assessments that support students’ learning in STEM education by using an integrative framework for Cognitive Diagnostic Modeling (CDM). This framework is based on three components: cognition, observation, and interpretation (National Research Council, 2001). Specifically, this dissertation demonstrates how the framework combines psychometrics and cognitive psychology and draws on the integrative nature of cognition, observation, and interpretation in science and mathematics assessments. At present, STEM assessments do not fully support students’ learning (National Research Council, 2001; Songer & Ruiz-Primo, 2012). We need innovative, well-defined assessments that respond to students’ and educators’ needs, particularly for formative purposes. Using the three components of the assessment triangle (cognition, observation, and interpretation; National Research Council, 2001), this study articulated an integrative framework grounded in both a psychometric model and cognitive theory. The framework can both validate learning theory and provide assessment information with diagnostic and formative implications. Guided by this framework, the CDM approach can uniquely support innovative assessment because it combines the psychology of learning with statistical methods to make inferences about students’ specific knowledge structures and processing skills (Alvers, 2012; de la Torre & Minchen, 2014). Nevertheless, the framework must be applied carefully to ensure that the assessment procedures are cohesive and therefore provide good support for students’ learning. Specifically, the research questions are: What is the integrative framework for CDM? and How can this integrative framework be applied to develop and validate STEM assessments? To answer these questions, this dissertation includes three publications that demonstrate what an integrative framework for CDM is and how it can be applied.
The first paper, “Models for cognitive diagnostic modeling,” focuses on fundamental knowledge about the models used for CDM. Grounded in a systematic review of the literature, it defines the integrative framework for assessments based on the components of cognition, observation, and interpretation. This integrative CDM framework, built on cognitive science and statistical techniques, can serve as a guide for performing CDM analysis. The second and third papers focus on applying CDM to science and math assessments. Specifically, the second paper, “A cognitive diagnostic modeling approach to instructional sensitivity,” applies CDM to elementary science data in relation to instructional sensitivity. To answer the research questions “Are assessments sensitive enough to detect student learning differences due to instruction? If so, do they have formative value for teachers and students?”, we examined the formative value of the instructional sensitivity of assessment items from two elementary science modules. To determine whether items varying in instructional sensitivity yield different formative value for diagnosing student learning, we created booklets with items of varying instructional proximity (from close to far) and administered them in 38 classrooms (824 students) using a pretest-posttest design. Incorporated into the CDM analysis, the item and test fit indices showed that the data fit the specified Q-matrix well. Attributes with higher gains had been addressed more heavily in the intended curriculum (i.e., through a greater number of learning activities) than those with relatively smaller gains. The third paper, “Examining the relationship of characteristics of word problems and item parameters in the context of an online math game,” demonstrates the application of CDM to math word-problem game data. The main research question was: How are the item characteristics of word problems associated with item parameters?
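A Q-matrix, as referenced above, maps each assessment item to the attributes it requires, and a CDM uses that mapping to infer attribute mastery from item responses. The following is a minimal sketch of this idea under the widely used DINA model; the Q-matrix, slip/guess parameters, and item set below are hypothetical illustrations, not the dissertation’s actual assessments or estimates:

```python
import numpy as np

# Hypothetical Q-matrix for a 4-item assessment measuring 3 attributes
# (rows = items, columns = attributes). A 1 marks an attribute an item requires.
Q = np.array([
    [1, 0, 0],  # item 1 requires attribute 1 only
    [1, 1, 0],  # item 2 requires attributes 1 and 2
    [0, 1, 1],  # item 3 requires attributes 2 and 3
    [1, 1, 1],  # item 4 requires all three attributes
])

def dina_ideal_response(alpha, Q):
    """Ideal response under the DINA model: a student with attribute-mastery
    vector `alpha` can answer item j correctly only if they have mastered
    every attribute that item j requires."""
    return np.all(alpha >= Q, axis=1).astype(int)

def dina_prob_correct(alpha, Q, slip, guess):
    """Per-item probability of a correct response, allowing slips (mastery
    but a wrong answer) and guesses (non-mastery but a right answer)."""
    eta = dina_ideal_response(alpha, Q)
    return eta * (1 - slip) + (1 - eta) * guess

alpha = np.array([1, 1, 0])            # student has mastered attributes 1 and 2
slip = np.array([0.1, 0.1, 0.1, 0.1])  # illustrative slip parameters
guess = np.array([0.2, 0.2, 0.2, 0.2]) # illustrative guess parameters

print(dina_ideal_response(alpha, Q))             # [1 1 0 0]
print(dina_prob_correct(alpha, Q, slip, guess))  # [0.9 0.9 0.2 0.2]
```

In a real CDM analysis the slip and guess parameters are estimated from response data rather than fixed, and model-fit indices then indicate how well the data support the specified Q-matrix.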
The sample included 225 Grade 4-6 players and their performance on 22 items across two booklets. We performed a correlation analysis to investigate the relationship between item characteristics and CDM item parameters. Results showed that consistency and model type were significantly correlated with item difficulty. A sequence analysis of students’ action-log data visualized their modeling strategies, further validating the CDM and correlation results. The framework consists of two elements, CDM as a theory-driven model and its statistical procedures, and it aims to improve the capacity of CDM by increasing the formative value of assessments and by validating the cognitive/learning theory that guides assessment design. This dissertation illustrates how such a framework can be applied to develop and validate STEM assessments. Specifically, the framework can be used to generate indices of the instructional sensitivity of assessment items, which allow score interpretations at a finer grain. Moreover, it can be used to examine the relationships between item characteristics and item parameters, which can in turn guide assessment development and interpretation.
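The correlation step described above can be sketched as follows. The coding of the “consistency” characteristic and all numeric values here are hypothetical illustrations, not the dissertation’s data:

```python
import numpy as np

# Hypothetical item data: a binary characteristic of each word problem
# (consistency of relational language: 1 = consistent, 0 = inconsistent)
# alongside a CDM-estimated difficulty parameter for the same items.
consistency = np.array([1, 1, 0, 0, 1, 0, 1, 0, 0, 1])
difficulty  = np.array([0.35, 0.30, 0.62, 0.70, 0.28,
                        0.66, 0.40, 0.58, 0.72, 0.33])

# Point-biserial correlation (Pearson r with one binary variable):
# a negative r would mean consistent items tend to be easier.
r = np.corrcoef(consistency, difficulty)[0, 1]
print(f"r = {r:.2f}")
```

In practice one would also test the correlation for significance and repeat the analysis for each item characteristic (e.g., model type) against each item parameter of interest.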