NPR's "Data linking aspartame to cancer risk are too weak to defend, hospital says"

This story from NPR is a good example of 1) the media misinterpreting statistics and research findings, 2) Type I errors, and 3) the fact that peer-reviewed does not mean perfect. Here is the print version of the story, and here is the radio/audio version. (Note: the two links don't take you to exactly the same story. The print version provides greater depth, but the radio version eats up class time when you forget to prep enough for class AND it doesn't require any pesky reading on the part of your students.)




Essentially, researchers at Brigham and Women's Hospital were studying the relationship between aspartame consumption and certain types of cancer (click here for the actual study). They got an unexpected gender-related finding: men had a higher risk of cancer than women whether they drank regular or aspartame-sweetened soda. So the real story of the research was this gender effect, which isn't unique to aspartame consumption.


Anyway, the PR folks at Brigham and Women's Hospital got hold of the story, sensationalized the findings to the media, and hyped up the publication of the manuscript. This led to a big ol' mess, as the authors had to try to stop the speeding train of a news story about how diet soda causes cancer and will make you dead and is evil. The authors also had to admit that the study had been rejected by at least four other journals before it found a home at the American Journal of Clinical Nutrition. And one of the co-authors is on the editorial board of the American Journal of Clinical Nutrition. Hmmm...


Do Schweddy Balls contain aspartame?
When I present this to my students, I ask them to 1) identify whether the authors were trying to prevent a Type I or a Type II error, 2) suggest what could be done to keep the media from sensationalizing research reports, and 3) name an alternate explanation for the findings.
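If you want to make the Type I error discussion concrete for students, here is a quick simulation sketch. All the numbers are made up for illustration (they are not from the actual study); the point is just that when a study tests many subgroups, the chance that at least one comes up "significant" by luck alone gets surprisingly large, which is one way an unexpected finding like the gender effect can turn out to be a false alarm:

```python
# Illustrative simulation (numbers invented, not from the aspartame study):
# run many "studies" in which EVERY null hypothesis is true, test several
# subgroups per study, and count how often at least one test hits p < .05.
import math
import random

random.seed(42)

def two_sample_p(n=50):
    """Draw two groups from the SAME normal distribution and return an
    approximate two-sided p-value from a z-test on the difference in means."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    se = math.sqrt(2 / n)  # sigma = 1 in both groups, so SE of the difference
    z = (mean_a - mean_b) / se
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

trials = 2000           # number of simulated studies
tests_per_study = 10    # e.g., risk examined across several subgroups
false_alarms = 0
for _ in range(trials):
    ps = [two_sample_p() for _ in range(tests_per_study)]
    if min(ps) < 0.05:  # at least one "significant" subgroup finding
        false_alarms += 1

rate = false_alarms / trials
print(f"Studies with at least one 'significant' result: {rate:.2f}")
# Theory predicts 1 - 0.95**10, roughly 0.40, even though every null is true.
```

Students usually guess the answer should be near 5%, so seeing roughly 40% of the simulated studies produce a "finding" tends to land the point about unexpected subgroup effects.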
