Book

  • Statistical literacy, the art of drawing reasonable inferences from the abundance of numbers provided daily by the media, is indispensable for an educated citizenry, as are reading and writing. Unlike reading and writing, however, sound statistical reasoning is rarely taught, and when it has been taught, it has been with little success. This book presents and discusses new empirical and theoretical results about everyday statistical reasoning, that is, how people think and act on probabilistic information. It focuses on how processes of statistical reasoning work in detail and how training programs can exploit natural cognitive capabilities to improve statistical reasoning. (From preface)

  • Over the past decade there has been an increasingly strong call for statistics education to focus on statistical literacy, statistical reasoning, and statistical thinking. Our goal in creating this book is to provide a useful resource for educators and researchers interested in helping students at all educational levels to develop these cognitive processes and learning outcomes. This book includes cutting-edge research on teaching and learning statistics, along with specific pedagogical implications. We designed the book for academic audiences interested in statistics education as well as for teachers, curriculum writers, and technology developers. (From preface)

  • On July 23, 1996, 36 researchers from different countries and 6 continents met in Granada, Spain, for an invitational Round Table conference sponsored by the International Association for Statistical Education (IASE). During the five days of the conference, we listened to presentations, viewed software demonstrations, and discussed challenging issues regarding the use of technology with students who are learning statistics. (From preface.)

  • A computer-assisted instructional (CAI) course in applied statistics has been taught for 15 years in the Faculty of Education at the University of Manitoba. The CAI courseware was originally created to be the primary mode of instruction for the course, and it is very extensive in terms of content and style of presentation. The course includes 14 modules of instruction and 10 examinations, and it takes the average student about 80-90 hours of online instruction to complete. Originally programmed in IBM's Coursewriter II authoring language for use on an IBM 1500 system, the course continues to exist in this language, with some enhancement provided through the development of an in-house interpreter. Under the present CAI system, the course requires about 2.3 megabytes of memory, not counting the memory needed to store the interpreter, run-time system, and graphics. Estimates suggest that it would take approximately 14 megabytes of memory to hold only the course code on a Macintosh microcomputer using Course of Action software. The future of the course in its present mode is not certain for a number of non-technical reasons, including maintenance costs on old hardware, curriculum changes, and the capabilities of microcomputers. Two tables depict the time requirements for students and computer memory requirements for each of the 14 topics covered by the one-semester introductory course. (4 references) (Author/EW)

  • Describes the use of simulation to introduce the bootstrapping technique of creating confidence intervals for the mean, median, and variance. These bootstrap confidence intervals are compared with traditional confidence intervals to assess the accuracy of the technique. A Minitab program to produce confidence intervals is included. (MDH)
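    The percentile bootstrap described in this entry can be sketched in a few lines of stdlib Python (the abstract's Minitab program is not reproduced here; the function name, sample data, and default parameters below are illustrative):

    ```python
    import random
    import statistics

    def bootstrap_ci(data, stat, n_resamples=5000, alpha=0.05, seed=42):
        """Percentile bootstrap confidence interval for an arbitrary statistic.

        Draws n_resamples samples of len(data) with replacement, computes
        the statistic on each, and returns the alpha/2 and 1 - alpha/2
        empirical quantiles of those estimates.
        """
        rng = random.Random(seed)  # fixed seed so the sketch is reproducible
        n = len(data)
        estimates = sorted(
            stat([rng.choice(data) for _ in range(n)]) for _ in range(n_resamples)
        )
        lo = estimates[int((alpha / 2) * n_resamples)]
        hi = estimates[int((1 - alpha / 2) * n_resamples) - 1]
        return lo, hi

    # Illustrative sample: the same routine covers mean, median, and variance.
    sample = [23, 19, 25, 30, 21, 22, 28, 26, 24, 27]
    print("mean:    ", bootstrap_ci(sample, statistics.mean))
    print("median:  ", bootstrap_ci(sample, statistics.median))
    print("variance:", bootstrap_ci(sample, statistics.variance))
    ```

    Because the resampling step makes no distributional assumption, the same code serves all three statistics, which is precisely what makes the bootstrap attractive for the comparison with traditional (normal-theory) intervals that the entry describes.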

  • This document consists of three modules concerned with aspects of statistics. The first provides knowledge of the effect of imperfect correlation and random error on differences between means, and the reasons for the necessity of random allocation of objects to experimental and control conditions in scientific experimentation. The second unit shows how to: 1) Use frequency distributions and histograms to summarize data; 2) Calculate means, medians, and modes as measures of central location; 3) Decide which measures of central location may be most appropriate in a given instance; and 4) Calculate and interpret percentiles. The third module is designed to enable the student to: 1) discuss how approximation is pervasive in statistics; 2) compare "structural" and "mathematical" approximations to probability models; 3) describe and recognize a hypergeometric probability distribution and an experiment in which it holds; 4) recognize when hypergeometric probabilities can be approximated adequately by binomial, normal, or Poisson probabilities; 5) recognize when binomial probabilities can be approximated adequately by normal or Poisson probabilities; 6) recognize when the normal approximation to binomial probabilities requires the continuity correction to be adequate; and 7) calculate with a calculator or computer hypergeometric or binomial probabilities exactly or approximately. Exercises and tests, with answers, are provided in all three units. (MP)
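    The approximation chain in the third module (hypergeometric approximated by binomial when the sampling fraction is small) can be illustrated with a short stdlib Python sketch; the population sizes and function names below are illustrative assumptions, not taken from the module itself:

    ```python
    from math import comb

    def hypergeom_pmf(k, N, K, n):
        """P(X = k): k successes in n draws without replacement
        from a population of N items containing K successes."""
        return comb(K, k) * comb(N - K, n - k) / comb(N, n)

    def binom_pmf(k, n, p):
        """Binomial P(X = k) with n independent trials of success
        probability p; approximates the hypergeometric when n << N."""
        return comb(n, k) * p**k * (1 - p)**(n - k)

    # Population of 10,000 with 2,000 successes; draw only 10,
    # so the sampling fraction n/N = 0.001 is negligible.
    N, K, n = 10_000, 2_000, 10
    p = K / N
    for k in range(4):
        print(k, round(hypergeom_pmf(k, N, K, n), 6), round(binom_pmf(k, n, p), 6))
    ```

    With a sampling fraction this small, the two columns agree to several decimal places; shrinking N toward n makes the discrepancy grow, which is the kind of "when is the approximation adequate" judgment the module's objectives target.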

  • An approach to teaching probability. First the history of probability concepts is outlined, then the concept of probability is introduced. Two further chapters deal with using combinatorics to solve probability problems and with factors affecting probabilistic judgements in children and students. Finally, curricula in probability and statistics for grades 5 to 12 are discussed. An extended bibliography of English, French, and German literature concludes the book.

  • This manual designed for grade 5 is part of a series for a program to integrate the teaching and learning of mathematical and computer concepts and skills in the elementary school. The manual contains 27 lessons. Each lesson includes information on the topic, suggested grade level, mathematics concepts and skills, objective, prerequisite skills needed, and activities. Topics contained in the lessons include: (1) problem solving; (2) geometry; (3) numbers; (4) number concepts; (5) statistics; (6) measurement; and (7) probability, statistics, and graphing. Software programs used for the activities are specified for each lesson. (KR)

  • These proceedings contain lectures concerned with problem solving, applications of undergraduate mathematics, and aspects of current research in mathematics. The four working groups considered: (1) the role of feelings in learning mathematics; (2) the problem of rigor in mathematics teaching; (3) microcomputers in teacher education; and (4) the role of microcomputers in developing statistical thinking. Additionally, the two topic groups considered natural language and mathematics in human evolution and gender differences in learning outcomes on the Second International Mathematics Study. (PK)

  • This paper critically discusses exploratory data analysis (EDA) from the point of view of an empirical descriptive scientific theory. EDA deals mainly with the exploration of data through predominantly graphical representations, i.e., the search for striking elements and structures in data sets and for simple collective descriptions of the phenomena revealed. The analysis of EDA in this work is intended to lay a foundation for further didactic research and development in this field, in particular as to whether EDA can effectively be made available to a wider circle of people.
