
Posts

Showing posts with the label Center for Open Science

Using manly beards to explain repeated measures/within-subject design and interactions

There are a lot of lessons in this one study (Craig, Nelson, & Dixson, 2019): within-subject design, factorial ANOVA and interactions, and the data is available via OSF. Let's begin. TL;DR: The original study looked at the presence or absence of beards and whether or not this affected participants' ability to decode the emotional expression on a man's face. Or, more eloquently: their stimuli were pictures of the same dudes with and without beards. And those weren't just any dudes; they had been trained in the Ekman facial coding system so as to make distinct expressions. One participant, rating the same man in the bearded vs. non-bearded condition, provides a clear example of within-subject research design. This article also provides examples of interactions and two-way ANOVA. Here, look at aggression ratings for expression (happy v. angry) and face hairiness (clean-shaven v. bearded). Look at that bearded face interaction! Bearded guy...
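If you want to show students what that interaction looks like in numbers, here is a minimal Python sketch. All the ratings and condition labels below are made up for illustration; they are not the study's actual OSF data.

```python
# Hypothetical aggression ratings for a 2x2 within-subject design:
# Expression (happy vs. angry) x Face (clean-shaven vs. bearded).
# Each participant rates the SAME men in all four conditions (within subject).
from statistics import mean

ratings = {
    ("happy", "clean"): [2.1, 2.4, 1.9, 2.3],
    ("happy", "beard"): [2.0, 2.2, 1.8, 2.1],
    ("angry", "clean"): [4.0, 4.3, 3.9, 4.1],
    ("angry", "beard"): [5.6, 5.9, 5.4, 5.8],
}

cell_means = {cond: mean(vals) for cond, vals in ratings.items()}

# An interaction shows up when the simple effect of Beard differs by Expression:
beard_effect_happy = cell_means[("happy", "beard")] - cell_means[("happy", "clean")]
beard_effect_angry = cell_means[("angry", "beard")] - cell_means[("angry", "clean")]

print(f"Beard effect when happy: {beard_effect_happy:+.2f}")
print(f"Beard effect when angry: {beard_effect_angry:+.2f}")
# Unequal simple effects like these are what the "bearded face interaction" means.
```

In a real analysis you would run a two-way repeated-measures ANOVA on data like this; the point here is just that the beard bumps aggression ratings for angry faces but not happy ones, which is the interaction in miniature.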

Naro's "Why can't anyone replicate the scientific studies from those eye-grabbing headlines?"

Maki Naro created a terrific comic strip detailing the replication crisis: where it came from, where we are, and possible solutions. You can use it in class to introduce the crisis and solutions. I particularly enjoy the overall tone: Hope is not lost. This is a time of change in statistics and methodology that will ultimately make science better. A few highlights:
*History of science, including the very first research journal (and why the pressure to get published has led to bad science)
*Illustration of some statsy ways to bend the truth in science
*References big moments in the Replication Crisis
*Discusses the crisis AND solutions (PLOS, SIPS, COS)

Granqvist's "Why Science Needs to Publish Negative Results"

This link is worth it for these pictures alone: I know, right? Perfect for teaching research methods and explaining the positivity bias in publication. These figures also sum up the reasoning behind the new journal described in this article. New Negatives in Plant Science was founded in order to combat the file drawer problem. It publishes non-significant research. It is open access. It publishes commentaries. It even plans special issues for specific controversial topics within Plant Science, which absolutely, positively are NOT my jam. However, the creators of this journal hope that it will serve as a model for other fields. Given the recent flare-up in the Replication Crisis (now Replication War?), this new journal provides a model for ongoing, peer-reviewed replication and debate. I think this journal (or the idea behind this journal) could be used in a research methods class as a discussion piece. Specifically, how else could we reduce the file dra...

Aschwanden's "Science is broken, it is just a hell of a lot harder than we give it credit for"

Aschwanden (for fivethirtyeight.com) did an extensive piece that summarizes the data/p-hacking/what's-wrong-with-statistical-significance crisis in statistics. There is a focus on the social sciences, including some quotes from Brian Nosek regarding his replication work. The report also draws attention to Retraction Watch and the Center for Open Science, as well as retractions of findings (as an indicator of fraud and data misuse). The article also describes our funny bias of sticking to early, big research findings even after those findings are disproved (the example used here is the breakfast-eating/weight-loss relationship). The whole article could be used for a statistics or research methods class. I do think that the p-hacking interactive tool found in this report could be an especially useful illustration of How to Lie with Statistics. The "Hack your way to scientific glory" interactive piece demonstrates that if you fool around enough with your operationalized...
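You can demo the same idea behind that interactive piece with a few lines of standard-library Python. This is my own illustrative sketch, not fivethirtyeight's code: under a true null, testing enough outcome variables almost guarantees at least one "significant" result.

```python
# p-hacking demo: simulate null studies and test many outcomes per study.
import math
import random

random.seed(1)

def z_test_p(sample_a, sample_b):
    """Two-sided p-value from a large-sample z test (stdlib only)."""
    n = len(sample_a)
    mean_a, mean_b = sum(sample_a) / n, sum(sample_b) / n
    var_a = sum((x - mean_a) ** 2 for x in sample_a) / (n - 1)
    var_b = sum((x - mean_b) ** 2 for x in sample_b) / (n - 1)
    z = (mean_a - mean_b) / math.sqrt(var_a / n + var_b / n)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def hacked_significance(n_outcomes, n=100, trials=200):
    """Share of simulated null 'studies' with at least one p < .05."""
    hits = 0
    for _ in range(trials):
        for _ in range(n_outcomes):
            a = [random.gauss(0, 1) for _ in range(n)]  # group means truly equal
            b = [random.gauss(0, 1) for _ in range(n)]
            if z_test_p(a, b) < 0.05:
                hits += 1
                break  # the 'researcher' stops at the first significant outcome
    return hits / trials

print(f"1 outcome:   {hacked_significance(1):.2f}")   # near the nominal .05
print(f"20 outcomes: {hacked_significance(20):.2f}")  # false positives balloon
```

With one outcome per study the false-positive rate hovers near .05; with twenty outcomes it climbs toward 1 - .95^20, roughly two-thirds, which is the whole grim joke of the interactive tool.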

Dread Fall 2015 Semester

It's coming, guys. But let's get ahead of it. I thought I would re-share some resources that you may want to consider working into your curriculum this year. I picked out a few lessons and ideas that also require a bit of forethought and planning, especially if they become assessment measures for your class. Center for Open Science workshops: As previously discussed on this blog, COS offers free consultation (face-to-face or online) to faculty and students in order to teach us about the open framework for science. They provide guidance about more traditional statistical issues, like power calculations and conducting meta-analyses, in addition to lessons tailored to introducing researchers to the COS framework. Take your students to an athletic event, talk about statistics and sports: I took my students to a baseball game and worked some statsy magic. You can do it, too. If not a trip to the ballpark, an on-campus or televised athletic event will w...

Center for Open Science's FREE statistical & methodological consulting services

Center for Open Science (COS) is an organization that seeks "to increase openness, integrity, and reproducibility of scientific research". As a social psychologist, I am most familiar with COS as a repository for experimental data. However, COS also provides free consulting services to teach scientists how to make their own research processes more replication-friendly. As scholars, we can certainly take advantage of these services. As instructors, the kind folks at COS are willing to provide workshops to our students (including, but not limited to, online workshops). Topics that they can cover include: Reproducible Research Practices, Power Analyses, The ‘New Statistics’, Cumulative Meta-analyses, and Using R to create reproducible code (for more information on scheduling, see their availability calendar). I once heard it said that the way you learn how to conduct research and statistics in graduate school will be the way you...
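Since power analyses are one of the workshop topics, here is a hedged, back-of-the-envelope sketch of what an a priori power calculation does. The normal-approximation formula and the numbers are my own illustrative assumptions, not COS materials (their workshops use R and dedicated tools).

```python
# Approximate power for a two-sample test via the normal approximation.
import math

def power_two_sample(d, n_per_group, alpha_z=1.96):
    """Approximate power to detect standardized effect d with n per group
    (two-sided alpha = .05 via the default z critical value of 1.96)."""
    ncp = d * math.sqrt(n_per_group / 2)  # noncentrality under H1
    # Power ~ P(Z > z_alpha - ncp) = Phi(ncp - z_alpha)
    return 0.5 * (1 + math.erf((ncp - alpha_z) / math.sqrt(2)))

# Classic textbook case: a 'medium' effect (d = .5) with 64 per group
# lands near the conventional 80% power target.
print(f"{power_two_sample(0.5, 64):.2f}")  # ~0.81 with these numbers
```

The practical lesson for students is the shape of the trade-off: halve the effect size and the required n per group roughly quadruples, which is why underpowered studies feed the replication crisis in the first place.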