Posts

NYT's "You Draw It" series

As I've discussed in this space before, I think it is just as important to show our students how to use statistics in real life as it is to show them how to conduct an ANOVA. The "You Draw It" series from the New York Times provides an interactive, personalized example of using data to prove a point and challenge assumptions. Essentially, each feature asks you to predict the data trend for a social issue, then shows you how the data actually looks. So far, there are three of these features: 1) one that challenges assumptions about Obama's performance as president, 2) one that illustrates the impact of SES on college attendance, and 3) one that illustrates just how bad the opioid crisis has become in our country.

Obama Legacy Data: This "You Draw It" asks you to predict Obama's performance on a number of measures of success. Below, the dotted yellow line represents my estimate of the national debt under Obama. The blue line shows t...

Sense about Science USA: Statistics training for journalists

In my Honors Statistics class, we have days devoted to discussing thorny issues surrounding statistics. One of these days is dedicated to the disconnect between science and science reporting in popular media. I have blogged about this issue before and use many of these blog posts to guide the discussion: This video by John Oliver is hilarious and touches on p-hacking in addition to more obvious problems in science reporting; this story from NPR demonstrates what happens when a university's PR department does a poor job of interpreting research results; The Chronicle covered this issue, using the example of mis-shared research claiming that smelling farts can cure cancer (a student favorite); and this piece describes a hoax that one "researcher" pulled in order to demonstrate how quickly the media will pick up and disseminate bad-but-pleasing research to the masses. When my students and I discuss this, we usually try to brainstorm ways to fix this problem. Pro...

Reddit's data_irl subreddit

You guys, there is a new subreddit just for sharing silly stats memes. It is called r/data_irl/ . The origin story is pretty amusing. I have blogged about the subreddit r/dataisbeautiful previously. The point of that sub is to share useful and interesting data visualizations, and it has a hard-and-fast rule about only posting original content or well-cited, serious content. It is a great sub. But it leaves something to be desired. That something is my deep desire to see stats jokes and memes. On April Fools' Day this year, they got rid of their strict posting rules for a day, and the dataisbeautiful crowd provided lots of hilarious stats jokes, like these two I posted on Twitter. The response was so strong (there are so many people who love stats memes) that a new sub, data_irl, was started JUST TO SHARE SILLY STATS GRAPHICS. It feels like coming home to my people.

Day's Edge Production's "The Snow Guardian"

A pretty video featuring Billy Barr, a gentleman who has been recording weather data in his corner of Gothic, Colorado for the last 40 years. This brief video highlights his work, and his data provide evidence of climate change. I like this video because it shows how ANYONE can be a statistician, as long as... They use consistent data collection tools... They are fastidious in their data entry techniques... They are passionate about their research. Who wouldn't be passionate about Colorado?

Shameless Self Promotion: I wrote a chapter in a book about Open Educational Resources!

Let's make the academy better for science and better for our students, and let's make it better for free. Want to learn how? I recommend Open: The Philosophy and Practices that are Revolutionizing Education and Science, edited by Rajiv Jhangiani and Robert Biswas-Diener. In the spirit of open resources, it is totally free. In the spirit of open pedagogy and quick sharing of teaching ideas, I wrote a chapter for the book about how I've gone about sustaining a blog dedicated to teaching for the last four years. The basic message of my chapter: I blog about teaching, and you can, too! Here are all the chapters from the book:
-Introduction to Open (Rajiv S. Jhangiani & Robert Biswas-Diener)
-A Brief History of Open Educational Resources (M. Smith & T. J. Bliss)
-Open Licensing and Open Education Licensing Policy (Cable Green)
-Openness and the Transformation of Education and Schooling (David M. Monetti & William G. Huitt)
-What Can OER Do f...

Johnson's "The reasons we don’t study gun violence the same way we study infections"

This article from The Washington Post summarizes research published in the Journal of the American Medical Association. Both are simple, short articles that show how you can use regression to make an argument. Here, the authors use regression to demonstrate the paucity of funding and publications for research on gun-related deaths. A regression line was generated to predict how much money is spent studying common causes of death in the US. Visually, we can see that firearm deaths aren't receiving research funding proportional to the number of deaths they cause. See the graph below. How to use in class: 1) How is funding meted out by our government to better understand the problems that plague our country? Well, it isn't being given to researchers studying gun violence, because of the Dickey Amendment. I grew up in a very hunting-friendly/gun-friendly part of Pennsylvania. I've been to the shooting range. And it upsets me that we can't better understand and stu...
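For anyone who wants to walk students through the regression logic itself, here is a minimal sketch: fit a least-squares line predicting funding from deaths, then look at which cause falls furthest below the line. The (deaths, funding) pairs are invented round numbers for illustration, NOT the published JAMA figures.

```python
# Hypothetical data: (annual deaths, research funding in $ millions)
causes = {
    "cause_a": (40_000, 200),
    "cause_b": (90_000, 500),
    "cause_c": (150_000, 800),
    "cause_d": (35_000, 20),   # funded far below its death toll
}

xs = [d for d, _ in causes.values()]
ys = [f for _, f in causes.values()]
n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Ordinary least squares: slope = cov(x, y) / var(x)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar

# Residual: observed funding minus what the line predicts
residuals = {name: f - (intercept + slope * d)
             for name, (d, f) in causes.items()}
most_underfunded = min(residuals, key=residuals.get)
print(most_underfunded)  # the cause sitting furthest below the line
```

With these made-up numbers, the most negative residual flags the underfunded cause, which mirrors the visual argument in the published graph.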

Retraction Watch's "Study linking vaccines to autism pulled following heavy criticism"

This example from Retraction Watch illustrates how NOT to do research. It is a study that was accepted by and then retracted from Frontiers in Public Health. It purported to find a link between childhood vaccination and a variety of childhood illnesses. This would be a good case study for Research Methods. In particular, this example illustrates: 1) Retraction of scientific studies 2) The problems with self-report surveys 3) Sampling and trying to generalize from biased samples 4) How what counts as a small sample size depends on the research you are conducting 5) Conflict of interest. The study, since retracted, compared unvaccinated, partially vaccinated, and fully vaccinated children. It found that "Vaccinated children were significantly less likely than the unvaccinated to have been diagnosed with chickenpox and pertussis, but significantly more likely to have been diagnosed with pneumonia, otitis media, allergies and NDDs (defined as Autism Spectrum Disorder, Attenti...

I've tracked all my son's first words since birth [OC]

Reddit user jonjiv conducted a case study in human language development. He carefully monitored his son's speaking ability, and here is what he found: https://imgur.com/gallery/KwZ6C#qLwsn9S ...go to this link for a clearer picture of the chart! How to use in class: 1) Good for Developmental Psychology. Look at that naming explosion! 2) Good to demonstrate how nerdy data collection can happen in our own lives. 3) Within- versus between-subject design: instead of sampling separate 10-, 11-, and 12-month-old children, we have real-time data collected from one child. AND this isn't retrospective data, either. 4) jonjiv even briefly describes his "research methodology" in the original post: a word had to be used in a contextually appropriate manner AND observed by both him and his wife (inter-rater reliability!). He also stored his data in a Google sheet for convenience and ease of tracking via cell phone.
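The "both parents must observe it" rule is an inter-rater agreement criterion, and one standard way to quantify that kind of agreement is Cohen's kappa. Here is a small sketch with invented yes/no judgments from two raters about whether each utterance counted as a real word (not jonjiv's actual data):

```python
# Hypothetical judgments: did each of 10 utterances count as a word?
rater_1 = ["yes", "yes", "no", "yes", "no", "yes", "no", "no", "yes", "yes"]
rater_2 = ["yes", "no",  "no", "yes", "no", "yes", "no", "yes", "yes", "yes"]

n = len(rater_1)
# Raw proportion of trials where the raters agreed
observed = sum(a == b for a, b in zip(rater_1, rater_2)) / n

# Chance agreement: P(both say yes) + P(both say no),
# using each rater's own base rate of "yes"
p1_yes = rater_1.count("yes") / n
p2_yes = rater_2.count("yes") / n
expected = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)

# Kappa rescales observed agreement by what chance alone would give
kappa = (observed - expected) / (1 - expected)
print(round(kappa, 2))
```

Kappa is a nice classroom follow-up because it shows why 80% raw agreement is less impressive once you account for how often two raters would agree by chance.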

Annenberg Learner's "Against All Odds"

Holy smokes. How am I only now learning about this amazing resource (thanks, Amy Hogan, for the lead)? The folks over at Annenberg, famous for Zimbardo's Discovering Psychology series, also have an amazing video collection about statistics, called "Against All Odds". Each video couches a statistical lesson in a story. 1) In addition to the videos, there are student and faculty guides to go along with every video/chapter. I think that, using these guides, an instructor could go textbook-free. 2) The topics listed approximate an Introduction to Statistics course. https://www.learner.org/courses/againstallodds/guides/faculty.html

rStats Institute's "Guinness, Gossett, Student, and t Tests"

This is an excellent video for introducing t-tests AND finally getting the story straight regarding William Gossett, Guinness Brewery, and why Gossett published under the famous "Student" pseudonym. What did I learn? Apparently, Gossett DID have Guinness's blessing to publish. The story also shows statisticians working in quality assurance: the original t-tests were designed to assess the consistency of the hops used in the brewing process, and those jobs are still available in industry today. Credit goes to the RStats Institute at Missouri State University. This group has created many other statistics tutorial videos as well.
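If you want to pair the story with the machinery, here is a hand-rolled two-sample Student's t-test in the brewery's quality-assurance spirit: comparing a measurement across two hops shipments. The numbers are invented for illustration, not Gossett's data.

```python
from statistics import mean, variance

# Hypothetical hops measurements from two shipments
batch_a = [8.1, 7.9, 8.4, 8.0, 8.2]
batch_b = [7.2, 7.5, 7.1, 7.4, 7.3]

n_a, n_b = len(batch_a), len(batch_b)

# Pooled variance: the two sample variances combined,
# weighted by their degrees of freedom
pooled = ((n_a - 1) * variance(batch_a) + (n_b - 1) * variance(batch_b)) \
         / (n_a + n_b - 2)

# Student's t: mean difference over its standard error
t_stat = (mean(batch_a) - mean(batch_b)) \
         / (pooled * (1 / n_a + 1 / n_b)) ** 0.5
df = n_a + n_b - 2
print(df)  # 8 degrees of freedom
```

In class you would then compare t_stat against a t distribution with df degrees of freedom, which is exactly the small-sample problem Student's distribution was invented to solve.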

Raff's "How to read and understand a scientific paper: a guide for non-scientists"

Jennifer Raff is a geneticist, professor, and enthusiastic blogger. She created a helpful guide for how non-scientists (like our students) can best approach and make sense of research articles. The original article is very detailed and explains how to make sense of expert research. Personally, I appreciate that this guide was born out of her debates with non-scientists about research. She wants everyone to benefit from science and make informed decisions based on research. I think that is great. I think this would be an excellent way to introduce your undergraduates to research articles in the classroom. I especially appreciated her summary of the steps (see below), which could be turned into a worksheet with ease. Note: I still think your students should chew on the full article before they are ready to answer these eleven questions. http://blogs.lse.ac.uk/impactofsocialsciences/2016/05/09/how-to-read-and-understand-a-scientific-paper-a-guide-for-non-scientists/#author ...

NY Magazine's "Finally, Here’s the Truth About Double Dipping"

New York Magazine's The Science of Us made a brief, funny video that investigates the long-running question of the dangers of double dipping. It is based on a Scientific American report of an actual published research article about double dipping. Yes, it includes the Seinfeld clip about George double dipping. The video provides a brief example of how to test a research hypothesis by operationalizing the question, collecting data, and analyzing the data. Here, the abstract question is how dirty it is to double dip, and they operationalized it as follows. Research design: The researchers used a design that, conceptually, demonstrates ANOVA logic (the original article contains an ANOVA; the video itself makes no mention of ANOVA). The factor is "dips," and there are three levels of the factor. Before they double dipped, they took a baseline bacterial reading of each dip. Good science, that. They display the findings in table form (aga...
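The one-factor, three-level logic described above can be sketched with a hand-computed F statistic. The bacteria counts below are invented for illustration (NOT the published values); the structure is what matters: one factor, three levels, and between-groups variance compared against within-groups variance.

```python
from statistics import mean

# Hypothetical bacteria counts per dip condition (three levels of one factor)
dips = {
    "salsa":     [5000, 4700, 5300, 4900],
    "cheese":    [1500, 1700, 1400, 1600],
    "chocolate": [800, 900, 850, 950],
}

groups = list(dips.values())
grand_mean = mean(x for g in groups for x in g)
k = len(groups)                      # number of levels
n = sum(len(g) for g in groups)      # total observations

# Between-groups sum of squares: how far each level's mean
# sits from the grand mean
ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
# Within-groups sum of squares: noise around each level's own mean
ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)

# F = mean square between / mean square within
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(round(f_stat, 1))
```

A large F says the dips differ far more from each other than observations within a dip differ among themselves, which is exactly the comparison the study's table invites.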

Refutations to Anti-Vaccine Memes' Vaccination rates vs. infection rates

Refutations to Anti-Vaccine Memes came up with this excellent illustration to explain why anti-vaxxers shouldn't claim a "win" just because more vaccinated people than unvaccinated people get sick during an outbreak. This example has a bit more credence if paired with actual immunization rate/infection rate data: for instance, an outbreak in which most of the infected were immunized but some un-immunized individuals were present. To further this case: yes, most people in America are immunized. However, here is an example of an outbreak that has been linked to un-vaccinated folks. How to use it in class: -Base rate fallacy (which DOES matter when making an argument with descriptive stats!) -Relative v. absolute risk. -Making sense of and contextualizing descriptive statistics.
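The base-rate point can be made concrete with a quick back-of-the-envelope calculation. All numbers below are invented for illustration (not from any real outbreak): when nearly everyone is vaccinated, the vaccinated can supply most of the cases even though their individual risk is far lower.

```python
# Hypothetical outbreak in a population of 10,000
population = 10_000
coverage = 0.95       # fraction vaccinated
efficacy = 0.90       # vaccine blocks 90% of infections
attack_rate = 0.50    # infection rate among the susceptible

vaccinated = population * coverage
unvaccinated = population - vaccinated

# Vaccinated people are susceptible only when the vaccine fails
cases_vaccinated = vaccinated * (1 - efficacy) * attack_rate
cases_unvaccinated = unvaccinated * attack_rate

print(round(cases_vaccinated))    # 475 vaccinated cases
print(round(cases_unvaccinated))  # 250 unvaccinated cases

# But the per-person risk tells the real story:
risk_vaccinated = cases_vaccinated / vaccinated        # 5%
risk_unvaccinated = cases_unvaccinated / unvaccinated  # 50%
```

So a headline of "most of the infected were vaccinated" is true here, and yet the unvaccinated were ten times as likely to get sick. That is the base-rate fallacy, and the relative vs. absolute risk contrast, in four lines of arithmetic.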