Journal Article

  • In a recent issue of this journal, Hart [2] mentioned that "she made a habit of asking students of all levels what a standard deviation is". She complained that from most students the only answer was "It's a measure of spread", upon which they provided a formula. We are still more pessimistic. We doubt whether most students realize that the standard deviation is a special measure of spread: one that measures how strongly the data depart from the central tendency. Our doubt has been induced by the way in which many textbooks introduce the concept of variability. Most introductions put a stronger emphasis on the heterogeneity among the observations than on their deviations from the central tendency. An example may illustrate our point.
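The distinction the authors draw can be made concrete in code: the standard deviation is computed from the deviations of each observation from the mean, not from the mere heterogeneity among observations. A minimal sketch (the dataset is invented for illustration):

```python
import math

def standard_deviation(data):
    """Population standard deviation: the root mean squared
    deviation of the data from their central tendency (the mean)."""
    mean = sum(data) / len(data)
    deviations = [x - mean for x in data]  # departures from the central tendency
    return math.sqrt(sum(d * d for d in deviations) / len(data))

data = [2, 4, 4, 4, 5, 5, 7, 9]  # mean is 5
print(standard_deviation(data))  # 2.0
```

Writing the computation this way keeps the deviations from the mean explicit, rather than hiding them inside a one-line formula.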

  • The following classroom examples illustrate how to teach statistics in connection with mathematical concepts already present in the curriculum. Each example begins with material for middle school or junior high school students and is extended in ways that are appropriate for students throughout the high school grades.

  • Recent years have witnessed a strong movement away from what might be termed classical statistics to a more empirical, data-oriented approach to statistics, sometimes termed exploratory data analysis, or EDA. This movement has been active among professional statisticians for twenty or twenty-five years but has begun permeating the area of statistical education for non-statisticians only in the past five to ten years. At this point, there seems to be little doubt that EDA approaches to applied statistics will gain support over classical approaches in the years to come. That is not to say that classical statistics will disappear. The two approaches begin with different assumptions and have different objectives, but both are important. These differences will be outlined in this article.

  • The establishment of relationships among variables is basic to prediction and scientific explanation. Correlational reasoning - the reasoning processes one uses in determining the strength of the mutual or reciprocal relationship between variables - is, therefore, a fundamental aspect of scientific reasoning. Suppose, for instance, that a scientist is interested in finding out whether a correlation exists between the body weight of rats and the presence of a substance X in their blood. The establishment of a correlation requires an initial recognition of the four possible associations: (a) = heavy weight and presence of substance X; (b) = heavy weight and absence of substance X; (c) = light weight and presence of substance X; and (d) = light weight and absence of substance X. When variables can be dichotomized in this way, one may construct a 2x2 association table of the sort used to compute simple contingencies. In view of the fundamental role played by correlational reasoning in the investigative process, we asked ourselves the following question: How do high school science and mathematics students approach tasks that require correlational reasoning for successful solution? An answer to this question will indicate how students apply this important aspect of scientific reasoning and might suggest how this reasoning pattern could be enhanced through instruction.
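The four associations (a)-(d) described above can be tallied into a 2x2 association table and summarized by a contingency measure. A sketch using the phi coefficient, one standard measure for 2x2 tables (the cell counts are invented for illustration):

```python
import math

def phi_coefficient(a, b, c, d):
    """Phi coefficient for a 2x2 association table:
    a = heavy weight & substance X present,  b = heavy & absent,
    c = light weight & substance X present,  d = light & absent."""
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom

# Hypothetical tallies for 80 rats
print(phi_coefficient(30, 10, 10, 30))  # 0.5
```

A value near 0 indicates no association between weight and substance X; values near +1 or -1 indicate a strong positive or negative association.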

  • Psychologists interested in such diverse areas as scientific reasoning, attribution theory, depression, and judgment have central to their theories the ability of people to judge the degree of covariation between two variables. We performed seven experiments to help determine what heuristics people use in estimating the contingency between two dichotomous variables. Assume that the two variables are Factor 1 and Factor 2, each of which may be present or absent. In Experiment 1 we hypothesized that people assess contingency based solely on the number of instances in which both Factor 1 and Factor 2 are present. By manipulating the column and row totals of a 2x2 matrix, we were able to place various values in this "present-present" cell, also called Cell A. If subjects do base their contingency estimate on Cell A, we would expect a monotonic relation between Cell A frequency and the contingency estimate. This test of the Cell A heuristic led us to conclude that it could not represent a complete explanation of contingency estimation. Although Experiment 2 resulted in a rejection of one possible explanation of the results of Experiment 1, Experiments 2 and 3 together provided us with an essential finding: Very low cell frequencies are greatly overestimated. In Experiment 4 participants in a contingency estimation task involving no memory demands used rather complex heuristics in judging contingency. When the memory demands were increased in Experiment 5, the comparatively simple Cell A heuristic emerged as the modal strategy. Two factors, the use of simple heuristics by most subjects and the overestimation of small cell frequencies, combined to explain the results of Experiments 2 and 3. In Experiment 6 we showed that in a contingency estimation task, salience can augment the impact of one type of data but not another. In Experiment 7 we learned that the data presented at the end of the data stream can influence the final estimate.
From this group of experiments we concluded that the "framing" of the task affects the contingency estimate; a number of factors that bear no logical relation to the contingency between two factors nevertheless influence one's perception of the contingency. Finally, we related our findings to a variety of analogous findings in the research areas of memory, attribution theory, clinical judgment, and depression.
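The contrast the experiments probe can be sketched numerically. A normative contingency measure uses all four cells of the 2x2 table, while the Cell A heuristic attends only to the present-present count. (Using ΔP as the normative baseline is our assumption, not the authors'; the cell counts are invented.)

```python
def delta_p(a, b, c, d):
    """Normative contingency, delta-P:
    P(Factor 2 present | Factor 1 present) - P(Factor 2 present | Factor 1 absent).
    Cells: a = both present, b = F1 only, c = F2 only, d = both absent."""
    return a / (a + b) - c / (c + d)

def cell_a_heuristic(a, b, c, d):
    """The 'Cell A' heuristic: judge contingency from the
    present-present count alone, ignoring the other three cells."""
    return a

# Two tables with identical Cell A counts but very different true contingency
print(delta_p(20, 5, 5, 20))    # 0.6 -> substantial contingency
print(delta_p(20, 20, 20, 20))  # 0.0 -> no contingency
# The Cell A heuristic returns 20 for both tables and cannot distinguish them.
```

This is exactly the failure mode the monotonic-relation test in Experiment 1 was designed to expose.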

  • No! Statistics is no more a branch of mathematics than is economics, and should no more be taught by mathematicians. It is a separate discipline that makes heavy and essential use of mathematical tools, but has origins, subject matter, foundational questions and standards that are distinct from those of mathematics. It is true that many advanced texts and research papers in statistics use formidable mathematics, but this is misleading. After all, many a graduate microeconomics text cites the Kuhn-Tucker theorem on the first page, and many research papers in physics are intensely mathematical. Statistics is as much a distinct discipline as are economics and physics. Its subject matter is data and inference from data. It is unprofessional for mathematicians who lack training and experience in working with data to teach statistics.

  • The classroom activity described here is a structured problem series developed to let students discover concepts themselves. Among psychology students, introductory statistics is a course which often is less appealing than other courses. As a result, one of the major challenges in teaching it to undergraduates is making the material both interesting and relevant to the student's personal experience. This is particularly true in comparison with other courses in the major, where the self-referential nature of the content ensures at least some degree of relevance. During the past three years, I have taught introductory statistics courses to classes which included not only psychology majors but also education and biology students. Teaching these courses and feedback from students have convinced me that a few key features of the course structure and manner of presentation of the material are primarily responsible for making the courses effective and enjoyable. These features all relate the material to the direct experience of the students. This approach has strong justification both from educational theory (e.g., Dewey, 1938) and from psychological research (e.g., Craik & Lockhart, 1972); material made meaningful in this way is more likely to be assimilated and retained. In particular, the aspect of individual experience to which the statistical material is conceptually related is the manner in which knowledge is gained. This will be elaborated later in the article; the justification of this approach can be made in terms of the nature of the discipline as well as pedagogically. Statistical inference is directly concerned with specifying principles by which scientific knowledge is gained; by relating the content of statistics to one's own experience of gaining knowledge, one sees more clearly the core of the discipline.
This paper first describes the classroom activities which have been features of this approach; it then reviews the manner in which statistical principles have been conceptually related to the students' experience of gaining knowledge.

  • Students in a small experimental design class obtained information about statistical and research applications concerning a variety of products advertised by different companies. The resulting activity was perceived to have several advantages for the students: (a) it made collecting and interpreting data more interesting and less mysterious, (b) it helped them to understand how research design and statistics are used in real-life situations, and (c) it helped them to make more discerning judgments about advertisers' claims for their products.

  • Explaining abstract, theoretical distributions to beginning students is sometimes difficult. This article describes a demonstration that helps to make the central limit theorem for generating sampling distributions concrete and understandable.
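The article's demonstration is classroom-based, but the same idea can be sketched by simulation: the means of repeated samples drawn from even a strongly skewed population cluster around the population mean, with spread that shrinks as the sample size grows. (The exponential population and the sample sizes here are our own choices, not the article's.)

```python
import random
import statistics

random.seed(1)

# A skewed (exponential) population: nothing like a normal curve
population = [random.expovariate(1.0) for _ in range(100_000)]

def sampling_distribution(n, trials=2000):
    """Means of `trials` random samples of size n from the population."""
    return [statistics.fmean(random.sample(population, n)) for _ in range(trials)]

for n in (2, 30):
    means = sampling_distribution(n)
    print(n, round(statistics.fmean(means), 2), round(statistics.stdev(means), 2))
```

Plotting a histogram of `sampling_distribution(30)` makes the point visually: despite the skewed population, the distribution of sample means is nearly symmetric and bell-shaped.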

  • A significant number of students in introductory statistics courses may function at Piaget's concrete operational level of thought. These students may find it difficult to understand the complex correlations and interactions between variables that typify many statistical procedures. A technique for introducing analysis of variance (ANOVA) in a concrete fashion is presented. This technique leads students to an intuitive understanding of the concepts underlying ANOVA and their relationship to each other.
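One concrete route to that intuition is to show the variance partition directly: the total sum of squares splits exactly into a between-group part and a within-group part, and their ratio (adjusted for degrees of freedom) is the F statistic. A minimal one-way sketch with made-up group scores:

```python
import statistics

groups = {"A": [2, 3, 4], "B": [6, 7, 8], "C": [10, 11, 12]}  # hypothetical scores

all_scores = [x for g in groups.values() for x in g]
grand_mean = statistics.fmean(all_scores)

# Total variability, and its two components
ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
ss_between = sum(len(g) * (statistics.fmean(g) - grand_mean) ** 2
                 for g in groups.values())
ss_within = sum((x - statistics.fmean(g)) ** 2
                for g in groups.values() for x in g)

# df_between = 3 groups - 1 = 2; df_within = 9 scores - 3 groups = 6
f_stat = (ss_between / 2) / (ss_within / 6)
print(ss_total, ss_between, ss_within, f_stat)  # ss_total == ss_between + ss_within
```

Students can verify the additive identity by hand with small numbers like these before any formal F table is introduced.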
