
• ### Analysis Tool: One-Way Analysis of Variance for Independent or Correlated Samples

This page will compute the One-Way ANOVA for up to five samples. The design can be either for independent samples or correlated samples (repeated measures or randomized blocks). This page will also perform pairwise comparisons of sample means via the Tukey HSD test.
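For readers who want to check the page's output by hand, the F ratio it reports for the independent-samples case can be reproduced with a short script. This is a minimal sketch (the function name is illustrative, and the Tukey HSD follow-up is not included):

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA on k independent samples."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    # Between-groups sum of squares (df = k - 1)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-groups sum of squares (df = N - k)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n_total - k)
    return ms_between / ms_within

f = one_way_anova_f([4, 5, 6], [7, 8, 9], [10, 11, 12])
```

For the correlated-samples design, the within-groups term would be further partitioned to remove subject variance, which this sketch does not do.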

• ### Analysis Tool: Two-Factor ANOVA with Repeated Measures on Both Factors

This page will perform a two-way factorial analysis of variance for designs in which there are 2-4 levels of each of two variables, A and B, with each subject measured under each of the AxB combinations.

• ### Analysis Tool: Two-Factor ANOVA with Repeated Measures on One Factor

This page will perform a two-way factorial analysis of variance for designs in which there are 2-4 randomized blocks of matched subjects, with 2-4 repeated measures for each subject.

• ### Analysis Tool: Two-Way Analysis of Variance for Independent Samples

This page will compute the Two-Way Factorial ANOVA for Independent Samples, for up to four rows by four columns. This page will also calculate the critical values of Tukey's HSD for purposes of post-ANOVA comparisons.
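The arithmetic behind a balanced two-way factorial ANOVA for independent samples can be sketched as follows. This is an illustrative function assuming equal cell sizes of at least two scores; the page's own input format will differ:

```python
from statistics import mean

def two_way_anova(cells):
    """F ratios for a balanced two-way factorial design, independent samples.

    cells[i][j] is the list of scores in row i, column j; every cell must
    hold the same number n >= 2 of scores.
    Returns (F_rows, F_columns, F_interaction).
    """
    r, c = len(cells), len(cells[0])
    n = len(cells[0][0])
    grand = mean(x for row in cells for cell in row for x in cell)
    row_means = [mean(x for cell in row for x in cell) for row in cells]
    col_means = [mean(x for i in range(r) for x in cells[i][j]) for j in range(c)]
    cell_means = [[mean(cell) for cell in row] for row in cells]
    # Main-effect sums of squares
    ss_rows = c * n * sum((m - grand) ** 2 for m in row_means)
    ss_cols = r * n * sum((m - grand) ** 2 for m in col_means)
    # Interaction: cell-mean deviations not explained by the main effects
    ss_inter = n * sum(
        (cell_means[i][j] - row_means[i] - col_means[j] + grand) ** 2
        for i in range(r) for j in range(c)
    )
    # Within-cells (error) sum of squares, df = r*c*(n - 1)
    ss_within = sum(
        (x - cell_means[i][j]) ** 2
        for i in range(r) for j in range(c) for x in cells[i][j]
    )
    ms_within = ss_within / (r * c * (n - 1))
    f_rows = (ss_rows / (r - 1)) / ms_within
    f_cols = (ss_cols / (c - 1)) / ms_within
    f_inter = (ss_inter / ((r - 1) * (c - 1))) / ms_within
    return f_rows, f_cols, f_inter

# 2 x 2 design with two scores per cell
f_rows, f_cols, f_inter = two_way_anova([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
```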

• ### Analysis Tool: The Power of the Chi-Square Goodness of Fit Test (Monte Carlo Simulation)

This simulation pertains to the questionable common practice of accepting the null hypothesis upon failing to find a significant result in a one-dimensional chi-square test. In the first simulation, random samples of size n are drawn from the population one sample at a time. With df=3, the critical value of chi-square for significance at or beyond the 0.05 level is 7.815; hence, any calculated value of chi-square equal to or greater than 7.815 is recorded as "significant," while any smaller value is noted as "non-significant." The second simulation does the same thing, except that it draws random samples 100 at a time.
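The simulation procedure described above is straightforward to reproduce. The sketch below is a hypothetical re-implementation, not the page's own code: it draws samples one observation at a time, scores each chi-square value against the stated critical value of 7.815, and reports the proportion of significant results:

```python
import random

def chisq_gof_power(probs, n, n_sims=2000, crit=7.815, seed=1):
    """Monte Carlo power of a chi-square goodness-of-fit test.

    probs: the true category probabilities (the alternative hypothesis);
    the null hypothesis is a uniform distribution over len(probs) categories.
    crit: critical value of chi-square (7.815 for df = 3 at the 0.05 level).
    Returns the proportion of simulated samples judged "significant".
    """
    rng = random.Random(seed)
    k = len(probs)
    expected = n / k  # expected count per category under the null
    # Cumulative probabilities, for sampling one observation at a time
    cum, total = [], 0.0
    for p in probs:
        total += p
        cum.append(total)
    cum[-1] = 1.0  # guard against floating-point round-off
    hits = 0
    for _ in range(n_sims):
        counts = [0] * k
        for _ in range(n):
            u = rng.random()
            for i, c in enumerate(cum):
                if u <= c:
                    counts[i] += 1
                    break
        chisq = sum((obs - expected) ** 2 / expected for obs in counts)
        if chisq >= crit:
            hits += 1
    return hits / n_sims

# Four categories (df = 3) with a true distribution that departs from
# uniformity; the returned proportion estimates the test's power.
power = chisq_gof_power([0.4, 0.2, 0.2, 0.2], n=100)
```

Drawing samples 100 at a time, as the second simulation does, changes only the batching of the work, not the resulting estimate. When the true distribution equals the null, the same function returns the empirical Type I error rate, which should sit near 0.05.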

• ### Analysis Tool: Basic Linear Correlation and Regression (Data-Import Version)

The following pages calculate r, r-squared, regression constants, Y residuals, and standard error of estimate for a set of N bivariate values of X and Y, and perform a t-test for the significance of the obtained value of r. Allows for import of raw data from a spreadsheet; for samples of any size, large or small.
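The quantities listed above follow directly from the defining formulas. A minimal sketch (the function name and return order are illustrative):

```python
import math

def linreg(xs, ys):
    """r, r^2, slope, intercept, standard error of estimate, and t for r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    r = sxy / math.sqrt(sxx * syy)
    slope = sxy / sxx
    intercept = my - slope * mx
    # Sum of squared Y residuals and standard error of estimate (df = n - 2)
    sse = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    see = math.sqrt(sse / (n - 2))
    # t statistic for H0: rho = 0, with n - 2 degrees of freedom
    t = r * math.sqrt((n - 2) / (1 - r ** 2))
    return r, r * r, slope, intercept, see, t

r, r2, slope, intercept, see, t = linreg([1, 2, 3, 4, 5], [2, 4, 5, 4, 5])
```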

• ### Analysis Tool: Spearman Rank Order Correlation Coefficient

This page will calculate r_s, the Spearman rank-order correlation coefficient, for a bivariate set of paired XY rankings. As the page opens, you will be prompted to enter the number of items for which there are paired rankings. If you are starting out with raw (unranked) data, the necessary rank-ordering will be performed automatically.
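Since r_s is simply the Pearson correlation applied to the ranks, the computation can be sketched as follows, including the automatic rank-ordering of raw data (tied values share the mean of their ranks):

```python
def ranks(values):
    """Average ranks (1-based), with tied values sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman r_s: the Pearson correlation of the two sets of ranks."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

Because only the ranks matter, any monotonically increasing relation between X and Y yields r_s = 1 even when the raw values are nonlinear.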

• ### Chebyshev's Estimate

For n = 50 to 400, in steps of size 5, this program computes and displays (1) the exact probability P(|A_n - p| >= epsilon), where A_n is the average outcome of n Bernoulli trials with probability p of success, and (2) the Chebyshev estimate p(1-p)/(n(epsilon^2)) for this probability. You can specify p and epsilon.
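Both quantities the program displays can be computed exactly for given n, p, and epsilon, since n*A_n has a binomial distribution. A minimal sketch (function names are illustrative):

```python
import math

def exact_tail(n, p, eps):
    """Exact P(|A_n - p| >= eps), where A_n is the mean of n Bernoulli(p) trials."""
    return sum(
        math.comb(n, k) * p ** k * (1 - p) ** (n - k)
        for k in range(n + 1)
        if abs(k / n - p) >= eps
    )

def chebyshev_bound(n, p, eps):
    """Chebyshev estimate p(1-p)/(n*eps^2) for the same probability."""
    return p * (1 - p) / (n * eps ** 2)

# Compare the exact probability with the Chebyshev estimate for a few n
for n in (50, 100, 200, 400):
    print(n, exact_tail(n, 0.5, 0.1), chebyshev_bound(n, 0.5, 0.1))
```

The comparison illustrates how conservative the Chebyshev estimate is: the exact tail probability falls well below the bound and shrinks much faster as n grows.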

• ### Statistical Methods in Biomedical Imaging

These lecture notes on Biomedical Imaging are composed of nearly 180 PowerPoint slides that have been converted to a PDF file (6 per page). The following topics are outlined: Vocabulary, Displaying Data, Central Tendency and Variability, Normal Z-scores, Standardized Distribution, Probability, Samples & Sampling Error, Type I and Type II Errors, Power of a Test, Hypothesis Testing, One Sample Tests, Two Independent Sample Tests, Two Dependent Sample Tests & Estimation, Correlation and Regression Techniques, Non-Parametric Statistical Tests, Applications of the Central Limit Theorem, Law of Large Numbers, Design of Studies and Experiments, Fisher's F-Test, Analysis of Variance (ANOVA), Principal Component Analysis (PCA), Chi-Square Goodness-of-Fit Test, Multiple Linear Regression, General Linear Model, Bootstrapping and Resampling.

• ### Analysis Tool: Basic Linear Correlation and Regression (Direct-Entry Version)

The following pages calculate r, r-squared, regression constants, Y residuals, and standard error of estimate for a set of N bivariate values of X and Y, and perform a t-test for the significance of the obtained value of r. Values of X and Y are entered directly into individual data cells. This page will also work with samples of any size, though it will be rather unwieldy with samples larger than about N=50. As the page opens, you will be prompted to enter the value of N.