• The author argues that what makes statistics uniquely difficult for students to learn, and what distinguishes it from other disciplines, is that its fundamental concepts are quintessentially abstract: in his view, they cannot be directly demonstrated, experienced, or drawn. Several factors make the problem worse: 1) introductory statistics courses involve many abstract concepts that are used frequently; 2) students must grasp these truly abstract concepts AND immediately relate and apply them to reality; 3) problems in statistics are always open to interpretation and admit several solutions, none of which can be known with certainty to be correct; 4) statistics differs from mathematics in how its numbers are obtained: in mathematics, numbers come from calculations, whereas in statistics they come from experiments; and 5) statistical notation and terminology are ambiguous and confusing. Watts's proposed solution is for statisticians to improve notation and terminology by making terms more meaningful and removing ambiguities.

  • This book has been written to fill a substantial gap in the current literature on mathematics education. Throughout the world, school mathematics curricula have incorporated probability and statistics as new topics. Many research papers have been written on specific aspects of teaching, presenting novel and unusual approaches to introducing ideas in the classroom; however, no book has given an overview. Here we have decided to focus on probability, making reference to inferential statistics where appropriate; we have deliberately avoided descriptive statistics, as it is a separate area whose inclusion would have made the ideas less coherent and the book excessively long. The following chapters are included:
    - The educational perspective
    - Probabilistic perspective
    - Empirical research in understanding probability
    - Analysis of the probability curriculum
    - The theoretical nature of probability
    - Computers in probability education
    - Psychological research in probabilistic understanding

  • This opening chapter presents the aims and rationale of the book within an appropriate theoretical framework. Initially, we provide the reader with an orientation to what the book intends to achieve. The next section highlights some important issues in mathematics education, establishing a framework against which the ideas in the book have been developed. The research has been inspired in part by the first book in the series on mathematics education, Freudenthal's Didactical Phenomenology of Mathematical Structures. Though he considers many topics in mathematics, he excludes (perhaps surprisingly) probability. Finally, summaries of each of the chapters are related to these didactic approaches.

  • The conceptual development of probability has unusual features in comparison with other mathematical theories such as geometry or arithmetic. A mathematical approach began to emerge only rather late, about three centuries ago, long after man's first experiences of chance occurrences. A large number of paradoxes accompanied the emergence of its concepts, indicating the disparity between intuitions and formal approaches during a sometimes difficult conceptual development. A particular difficulty lay in abandoning the endeavour to formalize one specific interpretation and concentrating instead on studying the structure of probability. A sound mathematical foundation was not published until 1933, but even this has not clarified the nature of probability: there are still a number of quite distinct philosophical approaches which arouse controversy to this day. In this part of the book all these aspects are discussed in order to present a mathematical, or probabilistic, perspective. The scene is set by presenting the philosophical background in conjunction with the historical development; the mathematical framework offers a current viewpoint, while the paradoxes illuminate the probabilistic ideas.

  • The analysis of historical development and philosophical ideas has shown the multifaceted character of probability. Kolmogorov's axiomatic structure does not reflect this complexity of ideas. The abundance of paradoxes is apparent not only in the historical development of the discipline but also in the individual learning process. Misconceptions are obstacles to comprehending and accepting theoretical ideas. Empirical research on probabilistic thinking aims to clarify and classify such misconceptions from both the theoretical and the individual's perspective. We present major research ideas from psychology and the didactics of mathematics from a critical perspective. Our method of interpreting subjects' responses to experimental situations will be a complementarity of intuitions and official mathematics, which is especially helpful for transferring ideas to actual teaching.

  • This chapter presents an epistemological analysis of the nature of stochastic knowledge. In particular, the mutual relationship between the elementary concept of probability (in its empirical form of relative frequency and in its theoretical form of Laplace's approach) and the basic idea of chance is demonstrated. An important consequence for teaching elementary probability is that there can be no purely logical and deductive course that first introduces the basic concepts and then constructs the theory upon them; developing stochastic knowledge in the classroom has to take a holistic and systematic perspective. The concept of a task system is elaborated as an appropriate curricular means for treating the theoretical nature of stochastic knowledge in the classroom.

  • This chapter is concerned with the impact of computers on probability in general secondary education. Mathematics educators have been producing ideas for using computers and calculators in probability education for two decades. Although there are many teaching suggestions, empirical research on this topic is uncommon, and critical reports of practical experience rarely go beyond enthusiastic description. The major objective of this chapter is a critical review of ideas, software, and experience that would be helpful for further research and development. We deal with pedagogical aspects, the subject matter and how it is changing, and the role of changing technology. Various approaches are reviewed: computers used as general mathematical utilities, simulation as a scientific method, and simulation for providing an empirical background for probability. Graphical methods can enhance visualization. The emphasis is on general orientation in the field.
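  The "simulation as empirical background for probability" approach can be sketched in a few lines of Python (an illustrative example of ours, not code from the chapter): relative frequencies from repeatedly throwing two dice approximate the theoretical Laplace value P(sum = 7) = 6/36 = 1/6, and the approximation improves as the number of trials grows.

  ```python
  import random

  def estimate_p_sum7(trials: int, seed: int = 42) -> float:
      """Estimate P(sum of two dice == 7) by relative frequency."""
      rng = random.Random(seed)  # fixed seed makes the classroom demo reproducible
      hits = sum(1 for _ in range(trials)
                 if rng.randint(1, 6) + rng.randint(1, 6) == 7)
      return hits / trials

  # Relative frequency approaches the Laplace value 1/6 ≈ 0.1667 as trials grow.
  for n in (100, 10_000, 1_000_000):
      print(n, estimate_p_sum7(n))
  ```

  In a classroom setting the same loop can be rerun with different seeds, letting students see both the fluctuation of small samples and the stabilisation of relative frequencies in the long run.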

  • General principles on how statistics should be taught, and how those who learn statistics can and should use it in society, are discussed. According to the author, mathematics should be the servant of statistics, not the master. Moreover, the ultimate content must be philosophical: data analysis without a problem is a pure waste.

  • We often forget how science and engineering function. Ideas come from previous exploration more often than from lightning strokes. Important questions can demand the most careful planning for confirmatory analysis. Broad general inquiries are also important. Finding the question is often more important than finding the answer. Exploratory data analysis is an attitude, a flexibility, and a reliance on display, NOT a bundle of techniques, and should be so taught. Confirmatory data analysis, by contrast, is easier to teach and easier to computerize. We need to teach both; to think about science and engineering more broadly; to be prepared to randomize and avoid multiplicity.

  • Piaget worked out his logical theory of cognitive development, Koehler the Gestalt laws of perception, Pavlov the principles of classical conditioning, Skinner those of operant conditioning, and Bartlett his theory of remembering and schemata - all without rejecting null hypotheses. But by the time I took my first course in psychology at the University of Munich in 1969, null hypothesis tests were presented as the indispensable tool, as the sine qua non of scientific research. Post-World War II German psychology mimicked a revolution in research practice that had occurred between 1940 and 1955 in American psychology. What I learned in my courses and textbooks about the logic of scientific inference was not without a touch of morality, a scientific version of the Ten Commandments: Thou shalt not draw inferences from a nonsignificant result. Thou shalt always specify the level of significance before the experiment; those who specify it afterward (by rounding up obtained p values) are cheating. Thou shalt always design thy experiments so that thou canst perform significance testing.