
Posts

Showing posts with the label statistical literacy

My other favorite stats newsletter: The Washington Post's How to Read This Chart

 Like the Chartr newsletter, this one feeds my fascination with data and provides interesting examples for class. As I sit here writing (5/11/24), I am enjoying my other favorite stats newsletter, How to Read This Chart . The current issue discusses data visualizations used on the front page of the Post. Philip Bump lovingly curates this newsletter. One time, he found historic, unlabeled charts and asked readers for help interpreting them . I also found this one, which compared the margins of error and sample sizes used by major national polling firms, fascinating .

New STP resources for teaching statistical reasoning in Intro Psych

A bit over a year ago, Susan Nolan asked me to chair the Statistical Literacy, Reasoning, and Thinking Guidelines 2.0 group for the Society for the Teaching of Psychology. We were asked to explore and provide guidance on a) teaching statistical thinking in Intro Psychology and b) understanding how statistical thinking is taught across the psychology curriculum. This post highlights the accomplishments of the first group, which created easy-to-implement teaching exercises that emphasize statistical reasoning skills in Intro Psych. The Guidelines 1.0 group provided lists of topics included in Intro Psych. The Guidelines 2.0 group convened and created a series of brief, easy-to-apply exercises that correspond to the core topics typically taught in Intro. The sub-committee chair, Dr. Garth Neufeld, shared his considerable expertise in Intro Psychology, leading the group and grounding each exercise in American Psychological Association and American Statistical Association guidelines for under...

SPSP 2021 Keynote Address

I gave one of the keynote addresses at the STP Teaching Pre-conference this year. I'm sad that we weren't able to come together face to face, but I am so glad that I was invited to give this talk at a pre-conference I've attended off and on for years. Here is the PowerPoint; it contains a bunch of links. TL;DR: You should hold discussions in your stats classes. No, don't talk about degrees of freedom or bar graphs for an hour. Instead, help your novice statisticians see data (and your class!) IRL. A few details on my discussion days and how they go: I spend a full 55-minute period on the discussions, and my students must submit a brief reflection piece about the readings. Here is how I describe the reflection piece and Discussion Day in my syllabus: Here is how I present the Discussion Day materials to my students:

APA's "How to Be A Wise Consumer of Psychological Research"

This is a nice, concise handout from APA that touches on the main points for evaluating research, particularly research that has been distilled by science reporters. It may be a bit light for a traditional research methods class, but I think it would work well for the research methods section of most psychology electives, especially if your students are working through source materials. The article mostly focuses on evaluating proper sampling techniques, and it includes a good list of questions to ask yourself when evaluating research: This also carries an implicit lesson: introducing psychology undergraduates to the APA website and the type of information shared at APA.org (including, but not limited to, this glossary of psychology terms ).

Pew Research's "The art and science of the scatterplot"

Sometimes we need to convince our students that taking a statistics class changes the way they think for the better. This example demonstrates that one seemingly simple skill, interpreting a scatter plot, is tougher than it seems. Pew Research conducted a survey on scientific thinking in America ( here is a link to that survey ) and found that only 63% of American adults could correctly interpret the linear relationship illustrated in the scatter plot below. And that 63% came out of a survey with multiple-choice responses! How to use in class: -Show your students that a major data collection/survey firm decided that interpreting statistics was an appropriate question for their ten-item quiz of scientific literacy. -Show your students that many randomly selected Americans can't interpret a scatter plot correctly. And for us instructors: -Maybe a seemingly simple task like the one in this survey isn't as intuitive as we think it is!

Paul Basken's "When the Media Get Science Research Wrong, University PR May Be the Culprit"

Here is an article from the Chronicle of Higher Education ( .pdf , in case you hit the paywall) about what happens when university PR promotes research findings in a way that exaggerates or completely misrepresents them. Several examples are included (smelling farts cures cancer? What?), along with an empirical study of how health-related research is translated into press releases ( Sumner et al. , 2014). The Sumner et al. piece found, among other things, that 40% of the press releases studied contained exaggerated advice based upon research findings. I think this is an important topic to address as we teach our students not simply to perform statistical analyses, but to be savvy consumers of statistics. This could be a nice reading to pair with the traditional research methods assignment of asking students to find research stories in popular media and compare and contrast the news story with the actual research article. If you would like more di...

Justin Wolfers' "A Persuasive Chart Showing How Persuasive Charts Are"

NEVER MIND ABOUT THIS ONE, GUYS! https://hal.sorbonne-universite.fr/hal-01580259/file/Dragicevic_Jansen_2017.pdf (Note the second author.) ___________________________________________________________ Wolfers (writing for the New York Times) summarizes a study by Wansink and Tal (2014) in which participants were presented with either a) just in-text data about a drug trial or b) the same text along with a bar graph conveying the exact same information. The results can be read below: Wolfers/NYT According to Wansink and Tal, the effects seem to be strongest in people who agreed with the statement "I believe in science." So, a graph makes a claim "sciencier" and, therefore, more credible? Also, does this mean that science believers aren't being as critical because they already have an underlying belief in what they are reading? I think this is also a good way of conveying the power of graphs to students in a statistics class ...

Correlation =/= Causation

Tyler Vigen's Spurious Correlations

Tyler Vigen has created a long list of easy-to-paste-into-a-PowerPoint graphs that illustrate that correlation does not equal causation. For instance, while per capita consumption of cheese and the number of people who die by becoming tangled in their bed sheets may have a strong relationship (r = 0.947091), no one is saying that cheese consumption leads to bed-sheet-related death. (Although you could pose the third-variable question to your students for some of these relationships.) Property of Tyler Vigen, http://i.imgur.com/OfQYQW8.png Vigen has also provided a menu of frequently used variables (deaths by tripping, sunlight by state) to help you look for specific examples. This portion is interactive, as you and your students can generate your own graphs. Below, I generated a graph of marriage rates in Pennsylvania and consumption of high fructose corn syrup. Generated at http://www.tylervigen.com/
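If you want to show students the mechanics behind these examples, a minimal sketch in plain Python makes the point: any two series that happen to trend in the same direction over the same years will produce a large Pearson r, causation or not. The numbers below are hypothetical stand-ins, not Vigen's actual dataset.

```python
# Two unrelated yearly series (hypothetical numbers, not Vigen's data):
# both simply drift upward over the same decade, which is enough to
# produce a large Pearson correlation.
cheese_lbs = [29.8, 30.1, 30.5, 30.6, 31.3, 31.7, 32.6, 33.1, 32.7, 32.8]
bedsheet_deaths = [327, 456, 509, 497, 596, 573, 661, 741, 809, 717]

def pearson_r(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

# Comes out around 0.95 for these trending series -- "strong," but meaningless.
print(round(pearson_r(cheese_lbs, bedsheet_deaths), 3))
```

A nice follow-up exercise is asking students to name the lurking third variable (here, simply "time passing") that drives both trends.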

UPDATE: The Knot's Infographic: The National Average Cost of a Wedding is $28,427

UPDATE: The average cost of a wedding is now $33,391, as of 2017 . Here is the most up-to-date infographic: Otherwise, my main points from the original version of this survey are still the same: 1) The to-be-weds surveyed for these data were users of a website used to plan/discuss/squee about pending nuptials, so this isn't a random sample. 2) If you look at the fine print for the survey, the average cost figures quoted come only from people who paid for a given service. So, if you didn't have a reception band ($0 spent), your data weren't used to create the average, which probably inflates all of these numbers. _________________________________________ Original Post: This infographic describes the costs associated with an "average" wedding. It is a good example of non-representative sampling and bending the truth via lies of omission. For the social psychologists in the crowd, this may also provide a good example of persuasion by establishing ...
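That fine-print point about dropping the $0 spenders can be made concrete with a toy calculation (the figures below are hypothetical, not The Knot's):

```python
# Hypothetical reception-band spending for ten couples: six hired a band,
# four spent $0. A Knot-style category average drops the $0 entries.
band_spending = [4000, 3500, 5000, 4500, 3800, 4200, 0, 0, 0, 0]

paid_only = [x for x in band_spending if x > 0]

mean_all = sum(band_spending) / len(band_spending)   # average over everyone
mean_paid = sum(paid_only) / len(paid_only)          # average over payers only

print(f"Average over all couples: ${mean_all:,.0f}")   # $2,500
print(f"Average over payers only: ${mean_paid:,.0f}")  # $4,167
```

Same underlying spending, but excluding the zeros pushes the "average" up by two thirds. This makes for a quick in-class demonstration of how a sampling decision buried in the fine print changes the headline number.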