Resource Library

  • A cartoon that can be used in discussing how choosing an appropriate sample size must balance budget and logistics along with statistical power. The cartoon was used in the April 2023 CAUSE cartoon caption contest and the winning caption was written by retired AP Statistics teacher Jodene Kissler. The cartoon was drawn by British cartoonist John Landers (www.landers.co.uk) based on an idea by Dennis Pearl from Penn State University. An alternate caption, "The Negative Correlation Moving Company had trouble holding on to their shorter employees," could be used to discuss the difference between positive and negative associations.

  • A cartoon that can be a vehicle to discuss the value of approximations in statistical inference and the need to check the fit of models. The cartoon was used in the October 2022 CAUSE cartoon caption contest and the winning caption was written by Eric Vance, from University of Colorado in Boulder. The cartoon was drawn by British cartoonist John Landers (www.landers.co.uk) based on an idea by Dennis Pearl from Penn State University.

  • A cartoon providing a nice way to introduce the value of data mining for finding patterns in data but not as a gold standard for inference. The cartoon was used in the July 2020 CAUSE cartoon caption contest and the winning caption was written by Charles Eugene Smith from North Carolina State University. The cartoon was drawn by British cartoonist John Landers (www.landers.co.uk) based on an idea by Dennis Pearl from Penn State University.

  • A cartoon that provides a clever way to introduce neural networks and machine learning topics. The cartoon was used in the June 2020 CAUSE cartoon caption contest and the winning caption was written by Luis Rivera-Galicia from Alcala University in Spain. The cartoon was drawn by British cartoonist John Landers (www.landers.co.uk) based on an idea by Dennis Pearl from Penn State University.

  • A cartoon to help discuss both the value and limits of making predictions with large amounts of data. The cartoon was drawn by American cartoonist Jon Carter in 2015.

  • A song designed to assist in teaching the basics of Multi-Armed Bandits, which is a type of machine learning algorithm and is the foundation for many recommender systems. These algorithms spend some part of the time exploiting choices (arms) that they know are good while exploring new choices (see the short code sketch after this list). The song (music and lyrics) was written in 2021 by Cynthia Rudin from Duke University and was part of a set of three data science-oriented songs that won the grand prize in the 2023 A-mu-sing competition. The lyrics are full of double entendres so that the whole song has another meaning where the bandit could be someone who just takes advantage of other people! The composer mentions these examples of lines with important meanings:
    "explore/exploit" - the fundamental topic in MAB!
    "No regrets" - the job of the bandit is to minimize the regret throughout the game for choosing a suboptimal arm
    "I keep score" - I keep track of the regrets for all the turns in the game
    "without thinking too hard,"  - MAB algorithms typically don't require much computation
    "no context, there to use," - This particular bandit isn't a contextual bandit, it doesn't have feature vectors 
    "uncertainty drove this ride." - rewards are probabilistic
    "I always win my game"  - asymptotically the bandit always finds the best arm
    "help you, decide without the AB testing you might do" - Bandits are an alternative to massive AB testing of all pairs of arms
    "Never, keeping anyone, always looking around and around" - There's always some probability of exploration throughout the play of the bandit algorithm

  • A music video designed to assist in teaching the basics of Multi-Armed Bandits, which is a type of machine learning algorithm and is the foundation for many recommender systems. These algorithms spend some part of the time exploiting choices (arms) that they know are good while exploring new choices (think of an ad company choosing an advertisement they know is good, versus exploring how good a new advertisement is; a sketch of one such strategy follows the list below). The music and lyrics were written by Cynthia Rudin of Duke University, and the song was one of three data science songs that won the grand prize and first place in the song category for the 2023 A-mu-sing competition.

    The lyrics are full of double entendres so that the whole song has another meaning where the bandit could be someone who just takes advantage of other people! The author provides these examples of some lines with important meanings:
    "explore/exploit" - the fundamental topic in MAB!
    "No regrets" - the job of the bandit is to minimize the regret throughout the game for choosing a suboptimal arm
    "I keep score" - I keep track of the regrets for all the turns in the game
    "without thinking too hard,"  - MAB algorithms typically don't require much computation
    "no context, there to use," - This particular bandit isn't a contextual bandit, it doesn't have feature vectors 
    "uncertainty drove this ride." - rewards are probabilistic
    "I always win my game"  - asymptotically the bandit always finds the best arm
    "help you, decide without the AB testing you might do" - Bandits are an alternative to massive AB testing of all pairs of arms
    "Never, keeping anyone, always looking around and around" - There's always some probability of exploration throughout the play of the bandit algorithm

  • This song is about overfitting, a central concept in machine learning. It is in the style of mountain music and, when listening, one should think about someone staying up all night trying to get their algorithm to work, but it just won't stop overfitting (a toy numerical illustration follows below). The music and lyrics are by Cynthia Rudin from Duke University, and the song was one of three data science songs by Dr. Rudin that won the grand prize and first place in the song category in the 2023 A-mu-sing competition.
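    As a toy illustration of the overfitting theme (not drawn from the song), this Python sketch fits a simple and an overly flexible polynomial to the same made-up noisy data; the flexible fit typically tracks the training points closely while doing worse on held-out points:

    import numpy as np

    rng = np.random.default_rng(0)

    # Made-up data: a noisy linear relationship, split into train and test sets.
    x_train, x_test = rng.uniform(0, 1, 8), rng.uniform(0, 1, 8)
    y_train = 2 * x_train + rng.normal(0, 0.3, 8)
    y_test = 2 * x_test + rng.normal(0, 0.3, 8)

    for degree in (1, 6):  # a simple fit vs. one flexible enough to chase the noise
        coefs = np.polyfit(x_train, y_train, degree)
        train_mse = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
        test_mse = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
        print(degree, round(train_mse, 3), round(test_mse, 3))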

  • This song is about the k-nearest neighbors algorithm in machine learning. This popular algorithm uses case-based reasoning to make a prediction for a current observation based on nearby observations. The music and lyrics were written by Cynthia Rudin from Duke University, who was accompanied by Dargan Frierson from the University of Washington in the audio recording. The song is one of three data science songs written by Cynthia Rudin that took the grand prize and first place in the song category in the 2023 A-mu-sing competition.
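    As a rough illustration of the case-based reasoning the song describes, here is a minimal k-nearest neighbors sketch in Python; the toy points and the choice of k are invented for this example:

    import math
    from collections import Counter

    def knn_predict(train, query, k=3):
        # Majority vote among the k training points closest to the query
        # in Euclidean distance (case-based reasoning).
        nearest = sorted(train, key=lambda pt: math.dist(pt[0], query))[:k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

    # Made-up toy data: (features, label) pairs.
    train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"), ((4.0, 4.2), "B"), ((3.8, 4.0), "B")]
    print(knn_predict(train, (1.1, 0.9)))  # the nearby observations are all "A"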

  • A cartoon to spark a discussion about the normal equations in the matrix approach to linear models.  The cartoon was created by Kylie Lynch, a student at the University of Virginia.  The cartoon won first place in the non-song categories of the 2023 A-mu-sing competition.
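    For readers who want the formula the cartoon alludes to: in the matrix form of the linear model y = X beta + error, the least-squares estimate solves the normal equations (X'X) beta_hat = X'y. A small numpy sketch with made-up data (one intercept column and one predictor):

    import numpy as np

    # Made-up data: n = 5 observations, an intercept column plus one predictor.
    X = np.column_stack([np.ones(5), np.array([1.0, 2.0, 3.0, 4.0, 5.0])])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    # Solve the normal equations (X'X) beta_hat = X'y for the least-squares fit.
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    print(beta_hat)  # [estimated intercept, estimated slope]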

