
Showing posts with the label p-hacking

SPSP 2021 Keynote Address

I gave one of the keynote addresses at the STP Teaching Pre-conference this year. I'm so sad that we weren't all able to come together face to face, but I am so glad that I was invited to give this talk at a pre-conference I've attended off and on for years. Here is the PowerPoint; it contains a bunch of links.

TL;DR: You should do discussions in your stats classes. No, don't talk about degrees of freedom or bar graphs for an hour. Instead, help your novice statisticians see data (and your class!) IRL.

A few details on my discussion days and how they go: I spend a full 55-minute period on the discussions, and my students must submit a brief reflection piece about the readings. Here is how I describe the reflection piece and Discussion Day in my syllabus: Here is how I present the Discussion Day materials to my students:

Naro's "Why can't anyone replicate the scientific studies from those eye-grabbing headlines?"

Maki Naro created a terrific comic strip detailing the replication crisis: where it came from, where we are, and possible solutions. You can use it in class to introduce the crisis and solutions. I particularly enjoy the overall tone: Hope is not lost. This is a time of change in statistics and methodology that will ultimately make science better. A few highlights:

*History of science, including the very first research journal (and why the pressure to get published has led to bad science)
*Illustration of some statsy ways to bend the truth in science
*References big moments in the Replication Crisis
*Discusses the crisis AND solutions (PLOS, SIPS, COS)

Sense about Science USA: Statistics training for journalists

In my Honors Statistics class, we have days devoted to discussing thorny issues surrounding statistics. One of these days is dedicated to the disconnect between science and science reporting in popular media. I have blogged about this issue before and use many of these blog posts to guide this discussion: This video by John Oliver is hilarious and touches on p-hacking in addition to more obvious problems in science reporting. This story from NPR demonstrates what happens when a university's PR department does a poor job of interpreting research results. The Chronicle covered this issue, using the example of mis-shared research claiming that smelling farts can cure cancer (a student favorite), and this piece describes a hoax that one "researcher" pulled in order to demonstrate how quickly the media will pick up and disseminate bad-but-pleasing research to the masses. When my students and I discuss this, we usually try to brainstorm ways to fix this problem. Pro...

John Oliver's "Scientific Studies" with discussion questions

This hilarious video is making the rounds on the Interwebz. Kudos to John Oliver and his writing team for so succinctly and hilariously summarizing many different research problems: why replication is important but not rewarded, how research is presented to the public, how researchers over-reach about their own findings, etc. I Tweeted about this, but am making it canon by sharing it as a blog post. Note: This video has some off-color humor (multiple references to bear fellatio), so it is best suited to college-aged students. I will use this in my Online and Honors classes as discussion prompts. Here are some of the prompts I came up with:

1) In your own words, why aren't replications published? How do you think the scientific community could correct this problem?
2) In your own words, explain just ONE way a RESEARCHER can manipulate their own data and/or research findings. It should be one of the methods of manipulation described in the video. Also, don't just na...

Aschwanden's "Science Isn't Broken: It's just a hell of a lot harder than we give it credit for"

Aschwanden (for fivethirtyeight.com) did an extensive piece that summarizes the data/p-hacking/what's-wrong-with-statistical-significance crisis in statistics. There is a focus on the social sciences, including some quotes from Brian Nosek regarding his replication work. The report also draws attention to Retraction Watch and the Center for Open Science, as well as retractions of findings (as an indicator of fraud and data misuse). The article also describes our funny bias of sticking to early, big research findings even after those findings are disproved (the example used here is the breakfast eating:weight loss relationship). The whole article could be used for a statistics or research methods class. I do think that the p-hacking interactive tool found in this report could be an especially useful illustration of How to Lie with Statistics. The "Hack your way to scientific glory" interactive piece demonstrates that if you fool around enough with your operationalized...
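If you want to show students the same idea without the interactive tool, the multiplicity problem behind p-hacking is easy to simulate. Here is a minimal sketch (my own illustration, not taken from the article): each simulated "study" draws pure noise for two groups, tries ten different outcome measures, and declares victory if ANY one of them comes out p < .05. The hypothetical function names (`two_sample_p`, `hacked_study`) and the parameter choices (10 outcomes, 50 participants per group) are just assumptions for the demo; with independent tests, the expected false-positive rate is roughly 1 - .95^10 ≈ .40 instead of the nominal .05.

```python
import math
import random

def two_sample_p(a, b):
    """Approximate two-sided p-value for a two-sample t-test,
    using a normal approximation (fine for n >= 30 per group)."""
    na, nb = len(a), len(b)
    mean_a, mean_b = sum(a) / na, sum(b) / nb
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    se = math.sqrt(var_a / na + var_b / nb)
    z = (mean_a - mean_b) / se
    # Two-sided tail probability from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def hacked_study(n_outcomes=10, n=50):
    """One 'study' on pure noise: keep trying outcome measures
    until something is significant, then stop and report it."""
    for _ in range(n_outcomes):
        group_a = [random.gauss(0, 1) for _ in range(n)]
        group_b = [random.gauss(0, 1) for _ in range(n)]
        if two_sample_p(group_a, group_b) < 0.05:
            return True  # "publishable" false positive
    return False

random.seed(42)  # reproducible demo
trials = 1000
hits = sum(hacked_study() for _ in range(trials))
print(f"False-positive rate with 10 tries per study: {hits / trials:.2f}")
# Expected near 1 - .95**10, i.e. about .40 -- eight times the nominal .05.
```

In class, the payoff line is asking students which of the ten outcomes would make it into the published paper, and which nine would quietly disappear.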