Assessing Data Modeling and Statistical Reasoning

Abstract

Assessing Data Modeling aims to develop an assessment system to evaluate elementary and middle school students' skills and understandings related to data modeling and statistical reasoning. Data modeling is the recruitment of statistical reasoning for the purpose of investigating questions about the world. The assessments focus on the intersections of mathematics and science. Data modeling is explicitly represented in the mathematics standards, and the assessments will be useful to those who seek ways to tap students' understanding of statistical reasoning. Data modeling is also central to scientific inquiry, so these assessments will contribute to a solution of the thorny problem of how to assess students' coordination of data with inquiry.

Research and development efforts take place in middle schools in Nashville, TN, Phoenix, AZ, and Holyoke, MA, which serve minority youth in financially struggling communities. We work with teachers to design formative and summative assessments that diagnose students' skills and knowledge. The formative assessments feature contexts for instruction as well as observation. Each assessment includes Teacher Notes that suggest ways to leverage the assessment as an opportunity for instruction. Teacher Notes are based on current research-based understandings of student reasoning and on new research that we conduct.

We employ the Berkeley Evaluation and Assessment Research (BEAR) model to develop construct maps (progress variables) for each of five strands of data modeling: (a) measurement, (b) representation, (c) data structures, (d) statistical inference, and (e) chance. Progress variables are hypothetical developmental trajectories of learning that reflect an emerging research base about how students in this age band typically reason about these concepts. The measurement strand reflects the central role that measurement plays in the quantification of natural systems and its close connections to data and statistics in the NCTM standards.

Display is critical to many forms of statistical reasoning. The focus on data structure follows from the observation that data can be structured in ways that either facilitate or impede inquiry. Our focus on inference is at the core of statistical reasoning: making inferences in light of uncertainty distinguishes statistics from other forms of mathematical reasoning. The chance construct suggests a trajectory for reasoning about uncertainty, which is foundational for statistical reasoning. These progress variables guide the development of formative and summative items, which will be tested in grades 5-8 classrooms. We anticipate substantial modification to progress variables and items during the first two years of the project as we conduct small-scale work with teachers to iteratively refine items, drawing on student responses to items (written work, interviews, think-alouds) and on teacher responses to the accompanying scoring guides and Teacher Notes. During the third year, we will transition toward larger sample sizes (500 students at each grade) appropriate for determining psychometric characteristics of item functioning. Outcome variables will be scaled with a multidimensional item response model. The fourth year will feature refinement of items and scales.

The BEAR Center will develop a technical report and an electronic database of items cross-indexed with progress variables in each strand of data modeling. Hence, the project will produce an assessment system that will increase both diagnostic and instructional capacity in an area vital to education in both mathematics and science.
