
Showing posts with the label sampling error

Sampling Error (Taylor's Version)

Friends. You don't know what finding fun stats blog content has been like over the last few years. All of the data writers/websites I followed were always writing about, explaining, and visualizing COVID or political data (rightfully so). I prefer examples about puppies, lists of songs banned from wedding receptions, and ghosts. Memorable examples stick in my students' heads and don't presuppose any knowledge about psychological theory. Due to the lack of silly data and my own life as a professor, mom of two, wife, and friend, my number of posts during The Rona definitely dipped. But now, as the crocuses bloom in Erie, PA, the earth, and I, are finding new life and new examples. Nathaniel Rakich, writing for FiveThirtyEight, wrote a whole piece USING TAYLOR SWIFT TO EXPLAIN POLLING/SAMPLING ERRORS. Specifically, the article tackles three different polling firms and how they went about asking Americans which Taylor Swift album is their favorite Taylor Swift album....
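If you want to put numbers on why different polls disagree, a quick in-class sketch (my own illustration, not FiveThirtyEight's analysis, with a made-up 30% figure) is the standard 95% margin of error for a proportion from a simple random sample:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Half-width of the approximate 95% CI for a sample proportion."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Same question, different sample sizes: suppose 30% of respondents
# name the same favorite album in each poll.
for n in (500, 1000, 2000):
    moe = margin_of_error(0.30, n)
    print(f"n={n}: 30% +/- {moe:.1%}")
```

Students can see that even a well-run poll of 1,000 people carries roughly a three-point margin, which is often enough to reshuffle the album rankings between firms.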

How to unsuccessfully defend your brand using crap data: A primer

As I write this blog post, Francis Haugen testifies on Capitol Hill and sheds light on some of Facebook's shady practices. TL;DR: Facebook realizes that its practices support terrorism. This led to a public relations blitz from Facebook, including Monika Bickert, who appeared on CNN. Of particular relevance is the repeated reference made to an Instagram survey of 40 teens (here is the documentation I was able to find). I saw this reaction tweet from Asha Rangappa about one of those interviews: https://twitter.com/AshaRangappa_/status/1445487820580081674 LOL I had to listen to this twice to make sure I didn't mishear: This @Facebook exec repeatedly refers to a “survey” of FORTY teen Insta users— as in 4-0 — to support her assertion that the “majority” of teens have a great experience on the platform. For real. Listen to it https://t.co/Ye7ocWcnzG — Asha Rangappa (@AshaRangappa_) October 5, 2021 This example packs a lot of punch. It is a good one for the youths because it is ...
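To show students just how shaky a "majority" claim from 40 respondents is, here is a rough illustration (the 26-of-40 split is my invented number, not anything from Facebook's survey) of the 95% margin of error at n = 40:

```python
import math

# Invented example: suppose 26 of 40 surveyed teens (65%) report a
# good experience. How wide is the 95% interval around that estimate?
n, successes = 40, 26
p_hat = successes / n
moe = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)
low, high = p_hat - moe, p_hat + moe
print(f"Estimate: {p_hat:.0%} +/- {moe:.0%} -> ({low:.0%}, {high:.0%})")
```

Even at 65%, the interval's lower edge hovers right around 50%, so the data can barely distinguish "majority" from "coin flip" — before we even get to how the 40 teens were selected.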

Sampling bias example via NASA, Pew Research Center, and Twitter

Today's post is one small, to-the-point example of sampling bias. On May 27, 2020, my family and I were awaiting lift-off for the (subsequently grounded) NASA/SpaceX launch. To no one's surprise, I was following NASA on Twitter during the hoopla, and I noticed this Tweet: https://twitter.com/NASA/status/1265724481009594369 And I couldn't help but think: That is some sampling bias. Admittedly, their sample size is very impressive, with over 54K votes. But this poll went out to a bunch of people who love NASA so much that they follow it on Twitter. What is a less biased response to this question? As always, Pew Research Center had my back. 58% of Americans responded that they definitely/probably weren't interested in traveling into space: https://www.pewresearch.org/fact-tank/2018/06/07/space-tourism-majority-of-americans-say-they-wouldnt-be-interested/ If you want to expand upon this example in class, you could ask your students to Google around for information on the ...
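A toy simulation can make the point vivid: sample size doesn't fix a biased sampling frame. The numbers below are invented (42% interested is loosely inspired by the Pew finding; the 85% follower mix is pure assumption), and the code simply polls two different pools:

```python
import random

random.seed(0)

# Invented population: 42% interested in space travel (1 = interested).
population = [1] * 42 + [0] * 58
# Assumed NASA-follower mix: heavily skewed toward space enthusiasts.
followers = [1] * 85 + [0] * 15

def poll(group, n):
    """Share of 'interested' responses in a random poll of size n."""
    return sum(random.choices(group, k=n)) / n

pop_share = poll(population, 10_000)
fan_share = poll(followers, 10_000)
print(f"Poll of everyone:       {pop_share:.0%}")
print(f"Poll of NASA followers: {fan_share:.0%}")
```

Ten thousand votes from the follower pool lands confidently on the wrong answer — a big n just gives you a very precise estimate of the wrong population.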

Aschwanden's "Why We Still Don’t Know How Many NFL Players Have CTE"

This story by Christine Aschwanden from 538.com describes the limitations of a JAMA article. That JAMA article describes a research project that found signs of Chronic Traumatic Encephalopathy (CTE) in 110 out of 111 brains of former football players.

How to use in stats and research methods:
1) It is research, y'all.
2) One of the big limitations of this paper comes from sampling.
3) The 538 article includes a number of thought experiments that grapple with the sampling distribution for all possible football players.
4) Possible measurement errors in CTE detection.
5) Discussion of replication using a longitudinal design and a control group.

The research: The JAMA article details a study of 111 brains donated by the families of deceased football players. They found evidence of CTE in 110 of the brains. Which sounds terrifying if you are a current football player, right? But does this actually mean that 110 out of 111 football players will develop CTE...
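The sampling limitation lends itself to a thought-experiment simulation. All rates below are invented for illustration (a 10% true CTE rate and assumed donation probabilities — not estimates from the JAMA paper): families who noticed symptoms are assumed far more likely to donate a brain, so the donated brains wildly overstate the true rate.

```python
import random

random.seed(1)

# Invented ground truth: 10% of former players actually have CTE.
players = [random.random() < 0.10 for _ in range(100_000)]

def donated(has_cte):
    # Assumed donation rates: 50% if symptoms were present, 1% otherwise.
    return random.random() < (0.50 if has_cte else 0.01)

donations = [cte for cte in players if donated(cte)]
rate_in_donations = sum(donations) / len(donations)
print(f"True CTE rate: 10%; rate among donated brains: {rate_in_donations:.0%}")
```

Under these made-up numbers, the donated-brain sample shows CTE in the overwhelming majority of brains even though only one player in ten has it — a clean illustration of why 110/111 does not generalize to all football players.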

The Knot's Real Wedding Study 2017

The Knot, a wedding planning website, collected data on the amount of money that brides and grooms spend on items for their weddings. They shared this information, as well as the average cost of a wedding in 2017. See the infographic below: BUT WAIT! If you dig into this data and the methodology, you'll find out that they only collected price points from couples who ACTUALLY PAID FOR THOSE ITEMS. https://xogroupinc.com/press-releases/the-knot-2017-real-weddings-study-wedding-spend/

Problems with this data to discuss with your students:
1) No one who got stuff for free/traded for stuff would have their $0 counted towards the average. For example, one of my cousins is a tattoo artist and he traded tattoos for use of a drone for photos of their outdoor wedding.
2) AND...if you didn't USE a service, your $0 wasn't added to their ol' mean value. For example, we had our wedding and reception at the same location, so we spent $0 on a ceremony site.
3) As poi...
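A tiny worked example (with made-up ceremony-site costs, not The Knot's data) shows how much the mean inflates when the $0 couples are dropped before averaging:

```python
# Invented ceremony-site costs for ten couples; four couples spent $0
# (free venue, a trade, or ceremony and reception at the same site).
costs = [0, 0, 0, 0, 2000, 2500, 3000, 3500, 4000, 5000]

paid_only = [c for c in costs if c > 0]          # The-Knot-style filter
mean_paid_only = sum(paid_only) / len(paid_only)  # payers only
mean_everyone = sum(costs) / len(costs)           # includes the $0s

print(f"Mean over payers only: ${mean_paid_only:,.0f}")
print(f"Mean over all couples: ${mean_everyone:,.0f}")
```

With these numbers, averaging only the payers reports about $3,333 while the average over all couples is $2,000 — same weddings, very different headline.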

Great Tweets about Statistics

I've shared these on my Twitter feed, and in a previous blog post dedicated to stats funnies. However, I decided it would be useful to have a dedicated, occasionally updated blog post devoted to Twitter Statistics Comedy Gold. How to use in class? If your students get the joke, they get a stats concept. *Aside: I know I could have embedded these Tweets, but I decided to make my life easier by using screenshots.

[Screenshot: How NOT to write a response option]
[Screenshot: Real-life inter-rater reliability]
[Screenshot: Scale Development]

Alright, technically not Twitter, but I am thrilled to make an exception for this clever, clever costume: This whole thread is awesome... https://twitter.com/EmpiricalDave/status/1067941351478710272 Randomness is tricky! And not random! ...