Data Presentation

  • The Neutral Buoyancy Laboratory (NBL) provides astronauts with an environment that simulates zero gravity (weightlessness) so they can train for missions involving spacewalks. In this activity, students will evaluate the pressures experienced by astronauts, and by the scuba divers who assist them, while training in the NBL. This lesson addresses correlation, regression, residuals, interpreting graphs, and making predictions.

    NASA's Math and Science @ Work project provides challenging supplemental problems for students in advanced science, technology, engineering and mathematics, or STEM classes including Physics, Calculus, Biology, Chemistry and Statistics, along with problems for advanced courses in U.S. History and Human Geography.

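    As a minimal sketch of the kind of analysis involved (with hypothetical depth and pressure values standing in for the activity's NASA data), a least-squares line, its residuals, and a prediction can be computed in Python:

      import numpy as np
      from scipy import stats

      # Hypothetical depth (ft) and gauge pressure (psi) readings; the activity supplies NASA's data.
      depth = np.array([0, 5, 10, 15, 20, 25, 30, 35, 40])
      pressure = np.array([0.0, 2.1, 4.5, 6.6, 8.8, 11.2, 13.3, 15.5, 17.8])

      fit = stats.linregress(depth, pressure)            # least-squares regression line
      predicted = fit.intercept + fit.slope * depth      # fitted values
      residuals = pressure - predicted                   # residuals for diagnostic plots

      print(f"r = {fit.rvalue:.3f}; pressure ~ {fit.intercept:.2f} + {fit.slope:.3f} * depth")
      print("predicted pressure at 38 ft:", fit.intercept + fit.slope * 38)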
  • Math and Science @ Work presents an activity for high school AP Statistics students. In this activity, students will examine data from an uncalibrated radar and a calibrated radar and determine whether the error between the two data sets is statistically significant.

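    A minimal sketch of one way to test the difference, assuming the two radars track the same targets (the readings below are hypothetical; the activity provides the actual data sets):

      import numpy as np
      from scipy import stats

      # Hypothetical paired range readings (m) from the two radars
      calibrated   = np.array([102.1, 150.4, 199.8, 251.0, 300.2, 348.9])
      uncalibrated = np.array([104.0, 153.1, 203.5, 255.2, 305.0, 354.1])

      errors = uncalibrated - calibrated
      t_stat, p_value = stats.ttest_rel(uncalibrated, calibrated)   # paired t-test on the errors
      print(f"mean error = {errors.mean():.2f} m, t = {t_stat:.2f}, p = {p_value:.4f}")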
  • NASA's Math and Science @ Work presents an activity focused on correlation coefficients, weighted averages and least squares. Students will analyze the data collected from a NASA experiment, use different approaches to estimate the metabolic rates of astronauts, and compare their own estimates to NASA's estimates.

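    A minimal sketch of the three ideas named above, using hypothetical workload and metabolic-rate values in place of NASA's experimental data:

      import numpy as np

      # Hypothetical workload (W) vs. oxygen uptake (L/min) trial data
      workload = np.array([50, 100, 150, 200, 250])
      vo2      = np.array([0.9, 1.5, 2.1, 2.8, 3.4])

      r = np.corrcoef(workload, vo2)[0, 1]              # correlation coefficient
      slope, intercept = np.polyfit(workload, vo2, 1)   # least-squares fit

      # Inverse-variance weighted average of repeated metabolic-rate estimates (hypothetical values)
      estimates = np.array([455.0, 478.0, 462.0])
      weights   = 1.0 / np.array([12.0, 8.0, 10.0]) ** 2
      print(f"r = {r:.3f}, slope = {slope:.4f} L/min per W, "
            f"weighted mean = {np.average(estimates, weights=weights):.1f} W")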
  • This presentation is a part of a series of lessons on the Analysis of Categorical Data. This lecture covers the following: covariance patterns and generalized estimating equations (GEE).

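    A minimal sketch of fitting a GEE with an exchangeable working covariance structure in Python's statsmodels, using simulated repeated binary measurements (the lecture's own data would replace these):

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # Simulated repeated binary responses: 40 subjects, 4 visits each
      rng = np.random.default_rng(1)
      df = pd.DataFrame({
          "subject": np.repeat(np.arange(40), 4),
          "visit":   np.tile(np.arange(4), 40),
      })
      df["y"] = rng.binomial(1, 0.25 + 0.05 * df["visit"])

      # GEE with a logit link and exchangeable working correlation
      model = smf.gee("y ~ visit", groups="subject", data=df,
                      family=sm.families.Binomial(),
                      cov_struct=sm.cov_struct.Exchangeable())
      print(model.fit().summary())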
  • This presentation is a part of a series of lessons on the Analysis of Categorical Data. This lecture covers the following: odds ratios, dependent proportions, marginal homogeneity, McNemar's Test, marginal homogeneity for more than two levels, measures of agreement, and the kappa coefficient (weighted vs. unweighted).

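    A minimal sketch of McNemar's test on a paired 2x2 table and of unweighted vs. weighted kappa, using hypothetical counts and ratings:

      import numpy as np
      from statsmodels.stats.contingency_tables import mcnemar
      from sklearn.metrics import cohen_kappa_score

      # Hypothetical paired yes/no responses (before vs. after), as a 2x2 table of counts
      table = np.array([[45, 12],
                        [ 5, 38]])
      print(mcnemar(table, exact=False, correction=True))   # chi-square test of marginal homogeneity

      # Hypothetical ordinal ratings from two raters
      rater1 = [1, 2, 2, 3, 1, 3, 2, 1, 3, 2]
      rater2 = [1, 2, 3, 3, 1, 2, 2, 1, 3, 3]
      print("unweighted kappa:      ", cohen_kappa_score(rater1, rater2))
      print("linear-weighted kappa: ", cohen_kappa_score(rater1, rater2, weights="linear"))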
  • This presentation is a part of a series of lessons on the Analysis of Categorical Data. This lecture covers the following: sparse tables, sampling zeros, structural zeros, and log-linear model (and limitations).

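    A minimal sketch of a log-linear (independence) model fit as a Poisson regression on cell counts; one cell is a sampling zero, and the hypothetical counts stand in for the lecture's examples:

      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # Hypothetical 3x3 table flattened to one row per cell; the (b, z) cell is a sampling zero
      cells = pd.DataFrame({
          "row":   ["a", "a", "a", "b", "b", "b", "c", "c", "c"],
          "col":   ["x", "y", "z", "x", "y", "z", "x", "y", "z"],
          "count": [12, 7, 3, 9, 15, 0, 4, 6, 11],
      })

      # Independence model: log(mu_ij) = lambda + lambda_i^row + lambda_j^col
      fit = smf.glm("count ~ C(row) + C(col)", data=cells,
                    family=sm.families.Poisson()).fit()
      print(fit.summary())
      print("deviance (G^2) against the saturated model:", fit.deviance)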
  • This presentation is a part of a series of lessons on the Analysis of Categorical Data. This lecture covers the following: partial/conditional tables, confounding, types of independence (mutual, joint, marginal, and conditional), identifiability constraints, partial odds ratios, hierarchical log-linear model, pairwise interaction log-linear model, conditional independence log-linear model, goodness of fit, and model building.

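    A minimal sketch contrasting the conditional-independence model [XZ][YZ] with the homogeneous-association model [XY][XZ][YZ] on a hypothetical 2x2x2 table:

      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # Hypothetical 2x2x2 table (X, Y, Z), one row per cell count
      cells = pd.DataFrame({
          "x": [0, 0, 0, 0, 1, 1, 1, 1],
          "y": [0, 0, 1, 1, 0, 0, 1, 1],
          "z": [0, 1, 0, 1, 0, 1, 0, 1],
          "count": [30, 10, 15, 20, 12, 25, 8, 40],
      })

      # [XZ][YZ]: X and Y conditionally independent given Z (no x:y term)
      m_ci = smf.glm("count ~ C(x)*C(z) + C(y)*C(z)", data=cells,
                     family=sm.families.Poisson()).fit()
      # [XY][XZ][YZ]: adds the x:y pairwise interaction
      m_ha = smf.glm("count ~ (C(x) + C(y) + C(z))**2", data=cells,
                     family=sm.families.Poisson()).fit()
      print("G^2, conditional independence:", m_ci.deviance)
      print("G^2, homogeneous association: ", m_ha.deviance)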
  • This presentation is a part of a series of lessons on the Analysis of Categorical Data. This lecture covers the following: maximum likelihood estimation for logistic regression, sample size requirements for approximate normality of the MLEs, confidence intervals, the likelihood ratio statistic, the score test statistic, deviance, the Hosmer-Lemeshow goodness-of-fit statistic, parameter estimates, scaled/unscaled estimates, residuals, grouped binomials, and model building strategies.

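    A minimal sketch of a maximum-likelihood logistic fit with confidence intervals, a likelihood-ratio statistic, and Pearson residuals, on simulated data (a Hosmer-Lemeshow statistic would additionally require binning the fitted probabilities):

      import numpy as np
      import statsmodels.api as sm

      # Simulated binary responses with one predictor; the lecture's data would replace this
      rng = np.random.default_rng(7)
      x = rng.normal(size=200)
      y = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + 1.2 * x))))

      X = sm.add_constant(x)
      fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()   # logistic regression via ML
      print(fit.summary())
      print("95% confidence intervals:\n", fit.conf_int())
      print("likelihood-ratio statistic vs. intercept-only:", fit.null_deviance - fit.deviance)
      print("Pearson residuals (first 5):", fit.resid_pearson[:5])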
  • This presentation is a part of a series of lessons on the Analysis of Categorical Data. This lecture covers the following: Mantel-Haenszel estimator of common odds ratio, confounding in logistic regression, univariate/multivariate analysis, bias vs. variance, and simulations.

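    A minimal sketch of the Mantel-Haenszel estimator across strata of a confounder, with hypothetical 2x2 tables:

      import numpy as np
      from statsmodels.stats.contingency_tables import StratifiedTable

      # Hypothetical exposure-by-outcome 2x2 tables within three strata of a confounder
      strata = [
          np.array([[20, 10], [15, 25]]),
          np.array([[18, 12], [10, 30]]),
          np.array([[25, 15], [20, 20]]),
      ]
      st = StratifiedTable(strata)
      print("Mantel-Haenszel common odds ratio:", st.oddsratio_pooled)
      print(st.test_null_odds())    # Cochran-Mantel-Haenszel test of a common OR = 1
      print(st.test_equal_odds())   # test of homogeneity of the odds ratios across strata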
  • This presentation is a part of a series of lessons on the Analysis of Categorical Data. This lecture covers the following: Pearson residuals and rules for partitioning an I x J contingency table as ways to determine association between variables.

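    A minimal sketch of computing Pearson residuals for an I x J table from the chi-square expected counts, using hypothetical data:

      import numpy as np
      from scipy.stats import chi2_contingency

      # Hypothetical I x J contingency table of counts
      observed = np.array([[30, 10, 20],
                           [15, 25, 10],
                           [ 5, 15, 30]])

      chi2, p, dof, expected = chi2_contingency(observed)
      pearson_residuals = (observed - expected) / np.sqrt(expected)  # |r| > 2 flags cells driving association
      print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")
      print(np.round(pearson_residuals, 2))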
