Sayali Phadke (Penn State Behrend), Jodie Styers (Penn State Behrend)
Abstract
Background. Research in mathematics education informs statistics education research (Ben-Zvi et al. 2007). Principles and Standards for School Mathematics (NCTM 2000) and the Common Core State Standards (2010) highlight the importance of engaging students in opportunities to develop and support their ability to reason, construct arguments, and produce proofs. The Mathematical Association of America’s Instructional Practices Guide (2018) encourages teachers to incorporate assessment opportunities that allow students to practice their communication skills, particularly in building logical arguments. A synthesis of these research-based recommendations is also reflected in the Guidelines for Assessment and Instruction in Statistics Education (GAISE 2016). The proposed recommendations for Statistics and Data Science (2025 GAISE) emphasize effective written and oral communication. Although these recommendations focus on data analysis, they extend readily to essential computational skills in statistics (Nolan & Temple Lang 2010). Coding requires students to construct logical arguments in the language (Hardin et al. 2021). Recent work in statistics education harnesses the value of written justification to better understand students’ thought processes (Woodard & Lee 2020; Theobold 2021) and elicits students’ reasoning through code (Theobold et al. 2023). This project implements homework questions that provide students with opportunities to reason and prove while coding in R.
Methods. This study was conducted in an Introduction to R course. Student participants completed homework assignments that included traditional (procedural) questions alongside task-based questions that engaged them in the practice of reasoning and proof (NCTM 2000). Traditional questions required students to describe the expected output of an R command or to choose an appropriate R command. Task-based questions required students to construct an argument and provide reasoning as to how a given piece of code connected to the output shown.
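To illustrate the distinction, a task-based item of the kind described above might pair a short piece of R code with its output and ask students to argue why that output follows. The sketch below is a hypothetical example constructed for illustration only, not an item taken from the study's assignments.

    # Students see this code and its output, then construct a step-by-step
    # argument connecting the code to the output.
    scores <- c(78, 85, NA, 92)
    mean(scores)                 # [1] NA   (a missing value propagates)
    mean(scores, na.rm = TRUE)   # [1] 85   (the NA is dropped before averaging)
    # Prompt: Explain why the two calls to mean() return different results,
    # justifying each step by referring to how R handles missing values.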
Findings. Data analysis is ongoing. The study features a mixed-methods approach. An existing rubric (Egodawatte 2010) will be used to assign scores to students’ responses on both question types. Quantitative analysis will include a comparison of scores on the two question types and a comparison of performance on the research-related questions with overall course grades. Grounded theory (Glaser 2007) will facilitate the qualitative analysis. Researchers will code students’ responses to the task-based questions and use the constant comparison method (Glaser 1965) to identify themes within the reasoning.
Implications for Teaching and for Research. The outcomes of this study have implications for teachers’ pedagogical practices. Incorporating opportunities for students to reason and prove strengthens their understanding of the content (Boaler 2022), which can positively impact achievement. Analyzing rubric scores and trends in students’ reasoning will identify areas of strength and areas for continued growth. A better understanding of students’ current abilities with reasoning and proof will inform future pedagogical decisions to strengthen students’ arguments. Continued iterations of formative assessments featuring task-based questions aimed at strengthening students’ reasoning will facilitate future research projects.