Out-of-class

  • This resource provides an overview of International Space Station (ISS) utilization through the end of March 2013.

  • The purpose of this work is to provide a comprehensive reference for facts about Project Apollo, America’s effort to put humans on the Moon. While there have been many studies recounting the history of Apollo, this new book in the NASA History Series seeks to draw out the statistical information about each of the flights that has long been buried in numerous technical memoranda and historical studies. It recounts the missions, measuring results against the expectations for them.

  • Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA’s objective is to better understand and manage risk, and thus to ensure mission and programmatic success and to achieve and maintain high safety standards. This PRA Procedures Guide, in its present second edition, is neither a textbook nor an exhaustive sourcebook of PRA methods and techniques. It provides a set of recommended procedures, based on the experience of the authors, that are applicable to the different levels and types of PRA performed for aerospace applications.

  • This NASA handbook is published by the National Aeronautics and Space Administration (NASA) to provide a Bayesian foundation for framing probabilistic problems and performing inference on them. It is aimed at scientists and engineers and provides an analytical structure for combining data and information from various sources to generate estimates of the parameters of uncertainty distributions used in risk and reliability models. The overall approach is to give both a broad perspective on data analysis issues and a narrow focus on the methods required to implement a comprehensive database repository.

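    As a small illustration of the kind of Bayesian parameter estimation the handbook covers, here is a minimal sketch of a conjugate Beta-Binomial update for a failure probability; the prior and the failure counts below are hypothetical, not taken from the handbook.

    ```python
    # Conjugate Bayesian update for a binomial failure probability p.
    # Prior: Beta(a, b); data: k failures in n demands (hypothetical values).
    # Posterior: Beta(a + k, b + n - k), a standard result used when
    # estimating parameters of uncertainty distributions in reliability models.

    a_prior, b_prior = 0.5, 0.5   # Jeffreys prior for a binomial proportion
    k, n = 2, 100                 # hypothetical: 2 failures in 100 demands

    a_post = a_prior + k
    b_post = b_prior + (n - k)

    posterior_mean = a_post / (a_post + b_post)
    print(f"posterior: Beta({a_post}, {b_post}), mean = {posterior_mean:.4f}")
    ```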
  • Dr. Kuan-Man Xu from the NASA Langley Research Center writes, "A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance."

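    To make the three distance statistics concrete, here is a minimal sketch in Python; the pooled resampling scheme is a simplified stand-in for the paper's bootstrap procedure, and the Poisson ensembles are synthetic.

    ```python
    import numpy as np

    def normalize(h):
        return h / h.sum()

    def euclidean(p, q):
        return np.sqrt(np.sum((p - q) ** 2))

    def jeffries_matusita(p, q):
        # JM distance between discrete distributions, tied to the
        # Bhattacharyya coefficient sum(sqrt(p * q)).
        return np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

    def kuiper(p, q):
        # Kuiper distance on the cumulative histograms.
        F, G = np.cumsum(p), np.cumsum(q)
        return np.max(F - G) + np.max(G - F)

    def bootstrap_pvalue(ens_a, ens_b, stat, n_boot=2000, seed=0):
        """Significance level for the distance between two summary
        histograms, resampling ensemble members under the null that
        both ensembles come from the same population (simplified)."""
        rng = np.random.default_rng(seed)
        observed = stat(normalize(ens_a.sum(0)), normalize(ens_b.sum(0)))
        pooled = np.vstack([ens_a, ens_b])
        n_a, count = len(ens_a), 0
        for _ in range(n_boot):
            p = normalize(pooled[rng.integers(0, len(pooled), n_a)].sum(0))
            q = normalize(pooled[rng.integers(0, len(pooled),
                                              len(pooled) - n_a)].sum(0))
            count += stat(p, q) >= observed
        return count / n_boot

    # Synthetic ensembles: 50 individual histograms of 10 bins each.
    rng = np.random.default_rng(42)
    ens_a = rng.poisson(lam=5.0, size=(50, 10))
    ens_b = rng.poisson(lam=5.5, size=(50, 10))
    for name, stat in [("Euclidean", euclidean),
                       ("Jeffries-Matusita", jeffries_matusita),
                       ("Kuiper", kuiper)]:
        print(name, bootstrap_pvalue(ens_a, ens_b, stat))
    ```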
  • This paper comes from researchers at the NASA Langley Research Center and the College of William & Mary.

    "The experience of retinex image processing has prompted us to reconsider fundamental aspects of imaging and image processing. Foremost is the idea that a good visual representation requires a non-linear transformation of the recorded (approximately linear) image data. Further, this transformation appears to converge on a specific distribution. Here we investigate the connection between numerical and visual phenomena. Specifically the questions explored are: (1) Is there a well-defined consistent statistical character associated with good visual representations? (2) Does there exist an ideal visual image? And (3) what are its statistical properties?"

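    As a toy illustration of the non-linear transformation the abstract refers to (not the retinex algorithm itself, which is a more elaborate, spatially varying transform), here is a simple gamma encoding of approximately linear sensor data:

    ```python
    import numpy as np

    # Hypothetical example: gamma-encode normalized linear intensities.
    # Display response and human lightness perception are compressive,
    # so a non-linear mapping of linear sensor data gives a better
    # visual representation than the raw values.
    linear = np.linspace(0.0, 1.0, 11)   # normalized linear intensities
    gamma = 1.0 / 2.2                    # conventional display gamma
    encoded = linear ** gamma
    for lin, enc in zip(linear, encoded):
        print(f"linear {lin:.2f} -> encoded {enc:.2f}")
    ```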
  • This is a graduate-level introduction to statistics covering topics such as probability and sampling distributions, confidence intervals, hypothesis testing, ANOVA, and regression. Well suited to students and teachers seeking to learn the subject or to acquire course materials.

  • This course covers methodology, major software tools, and applications in data mining. By introducing principal ideas in statistical learning, the course helps students understand the conceptual underpinnings of methods in data mining. It focuses more on the use of existing software packages (mainly in R) than on having students develop the algorithms themselves. The topics include statistical learning; resampling methods; linear regression; variable selection; regression shrinkage; dimension reduction; non-linear methods; logistic regression; discriminant analysis; nearest neighbors; decision trees; bagging; boosting; support vector machines; principal components analysis; and clustering. Well suited to students and teachers seeking to learn the subject or to acquire course materials.

  • The emphasis in this course is on understanding statistical testing and estimation in the context of "omics" data so that you can appropriately design and analyze a high-throughput study. Since the measurement technologies are evolving rapidly, important objectives of the course are for students to gain a basic understanding of statistical principles and familiarity with flexible software tools, so that they can continue to assess and use new statistical methodology as it is developed for new types of data.

    By the end of the course, you should be able to tailor the analysis of your data to your needs while maintaining statistical validity. You should come away with the insight to assess the validity of new statistical methodologies as they are introduced, as well as to choose appropriate statistical analyses for data types not discussed in the class.

    Well suited to students and teachers seeking to learn the subject or to acquire course materials.

  • The objective of this course is to learn and apply statistical methods for the analysis of data that have been observed over time. The central challenge is to account for the correlation between measurements that are close in time. Well suited to students and teachers seeking to learn the subject or to acquire course materials.

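    To illustrate why that correlation matters, here is a minimal sketch (not taken from the course materials): it simulates a first-order autoregressive series, estimates the lag-1 autocorrelation, and shows how correlation shrinks the effective sample size relative to independent data.

    ```python
    import numpy as np

    # Simulate an AR(1) series x_t = phi * x_{t-1} + e_t (hypothetical phi).
    rng = np.random.default_rng(7)
    phi, n = 0.8, 500
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()

    # Lag-1 sample autocorrelation: the dependence that i.i.d. methods
    # ignore and that time-series methods must model.
    d = x - x.mean()
    r1 = np.sum(d[1:] * d[:-1]) / np.sum(d ** 2)
    print(f"lag-1 autocorrelation = {r1:.2f} (true phi = {phi})")

    # For an AR(1) process the effective sample size for estimating the
    # mean is roughly n * (1 - r1) / (1 + r1).
    n_eff = n * (1 - r1) / (1 + r1)
    print(f"effective sample size ~ {n_eff:.0f} of {n} observations")
    ```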
