
Daily Cycles in Twitter Content: Psychometric Indicators

Here is a YouTube video that summarizes some research findings. The researchers looked at Tweets in order to study how our focus and emotions change with our sleep/wake cycles. The findings are interesting and not terribly surprising: folks are mellow and rational in the morning ("Make money, get paid") and contemplate their mortality at 2 AM. And THIS is why I go to bed by 9 PM. I don't need to think about death at 2:20 AM. How to use in class: 1) Archival data (via Tweets) to explore human emotion. 2) What are the shortcomings of this sampling method? To be sure, their data set is ENORMOUS, but how are Twitter users different from other people? Do your students think these findings would hold for people who work the night shift? 3) Go back to the original paper and look more closely at the findings: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0197002 4) This data represents one of the ways that researchers collect real-time information ...

Raff's "How to read and understand a scientific paper: a guide for non-scientists"

Jennifer Raff is a geneticist, professor, and enthusiastic blogger. She created a helpful guide for how non-scientists (like our students) can best approach and make sense of research articles. The original article is very detailed and explains how to make sense of expert writing. Personally, I appreciate that this guide was born out of her experience debating research with non-scientists. She wants everyone to benefit from science and make informed decisions based on research. I think that is great. I think this would be an excellent way to introduce your undergraduates to research articles in the classroom. I especially appreciated her summary of her steps (see below). This could be turned into a worksheet with ease. Note: I still think your students should chew on the full article before they are ready to answer these eleven questions. http://blogs.lse.ac.uk/impactofsocialsciences/2016/05/09/how-to-read-and-understand-a-scientific-paper-a-guide-for-non-scientists/#author ...

Kevin McIntyre's Open Stats Lab

Dr. Kevin McIntyre from Trinity University has created the Open Stats Lab. OSL provides users with research articles, data sets, and worksheets for studies that illustrate statistical tests commonly taught in Introduction to Statistics. (Topics covered, illustrated beautifully by Natalie Perez.) All of his examples come from Open Science Framework-compliant publications in Psychological Science. McIntyre presents the OSF data (SPSS, R, and CSV files are available), the original research article, AND a worksheet to accompany each article. (Pictured: the layout for each article/data set/activity; this article demonstrates one-way ANOVA.) I know. It can be challenging to find 1) research an undergraduate can follow that 2) contains simple data analyses. And here, McIntyre presents it all. This project was funded by a grant from APS.
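If you want students to see what a one-way ANOVA worksheet is actually computing, here is a minimal sketch in plain Python. The three groups of scores are invented for illustration; they are not from any Open Stats Lab data set.

```python
# One-way ANOVA by hand: partition variability into between-groups
# and within-groups sums of squares, then form the F ratio.

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of score lists."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand_mean = sum(sum(g) for g in groups) / n

    # Between-groups SS: spread of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-groups SS: spread of scores around their own group mean
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)

    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical scores for three groups of four participants each
scores = [[4, 5, 6, 5], [7, 8, 6, 7], [3, 2, 4, 3]]
f_stat, df_b, df_w = one_way_anova(scores)
print(f"F({df_b}, {df_w}) = {f_stat:.2f}")  # → F(2, 9) = 24.00
```

In class, students could run the same scores through SPSS or R's `aov()` and confirm they get the identical F; seeing the sums of squares computed step by step makes the software output much less mysterious.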

Everything is fucked: The syllabus, by Sanjay Srivastava (with links to articles)

This syllabus for PSY 607: Everything is Fucked made the rounds last week. The syllabus is for a course that purports that science is fucked. The course readings are a list of articles and books that hit on the limitations of statistics and research psychology (p-values, shortcomings of meta-analysis, misuse of mediation, the replication crisis, etc.). PSY 607 isn't an actual class (as author/psychologist/blogger Srivastava explains in this piece from The Chronicle), but it does provide a fine reading list for understanding some of the current debates and changes in statistics and psychology. Most of the articles are probably too advanced for undergraduates but perfectly appropriate for teaching graduate students about our field and for staying up to date as instructors of statistics. Here is a link to the original blog post/syllabus.

Chris Taylor's "No, there's nothing wrong with your Fitbit"

Taylor, writing for Mashable, describes what happens when carefully conducted public health research (published in the Journal of the American Medical Association) becomes attention-grabbing, poorly represented clickbait. Data published in JAMA (Case, Burwick, Volpp, & Patel, 2015) tested the step-counting reliability of various wearable fitness-tracking devices and smartphone apps (see the data below). In addition to checking the reliability of various devices, the article argues that, from a public health perspective, lots of people have smartphones but not nearly as many people have fitness trackers. So, a way to encourage wellness may be to encourage people to use the fitness capacities within their smartphones (easier and cheaper than buying a fitness tracker). The authors never argue that fitness trackers are bad, just that 1) some are more reliable than others and 2) the easiest way to get people to engage in more mindful walking...

Anya Kamenetz's "The Past, Present, And Future of High-Stakes Testing"

Kamenetz (reporting for NPR) talks about her book, The Test, which is about the extensive use of standardized testing in our schools. Largely, this is a story about the impact these tests have had on how teachers instruct K-12 students in the US. However, a portion of the story discusses alternatives to annual testing of every student. Alternatives include using sampling to assess a school as well as numerous alternate testing methods (stealth testing, assessing children's emotional well-being, portfolios, etc.). Additionally, this story touches on some of the implications of living in a Big Data society and what it is doing to our schools. I think this would be a great conversation starter for a research methods or psychometrics course (especially if you are teaching such a class for a School of Education). What are we trying to assess: individual students, teachers, or schools? What are the benefits and shortcomings of these different kinds of assessments? Can your students come up with...

Quoctrung Bui's "Who's in the office? The American workday in one graph"

Credit: Quoctrung Bui/NPR. Bui, reporting for NPR, shares interactive graphs that demonstrate when people in different career fields are at the office. Via drop-down menus, you can compare the standard workdays of a variety of fields (here, "Food Preparation and Serving" versus "All Jobs"). If you scoff at pretty visualizations and want to sink your teeth into the data yourself, may I suggest the original government report, the "American Time Use Survey," or a related publication by Kawaguchi, Lee, & Hamermesh (2013). Demonstrates: bimodal data, data distribution, variability, work-life balance, different work shifts.
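To show students what "bimodal" means here before they open the interactive graph, you could sketch a toy version in Python. The hourly percentages below are invented for illustration, not taken from the American Time Use Survey.

```python
# Toy sketch of a bimodal workday: the share of food-service workers
# on the job peaks around lunch and again around dinner.
hourly_share = {   # hour of day -> percent of workers at work (made up)
    8: 15, 10: 35, 12: 70, 14: 40, 16: 30, 18: 65, 20: 45, 22: 20,
}

# Crude text histogram: one '#' per 5 percentage points
for hour, pct in hourly_share.items():
    print(f"{hour:2d}:00 {'#' * (pct // 5)} {pct}%")

# The two tall bars (12:00 and 18:00) are the two modes; an "All Jobs"
# curve would instead show a single midday hump.
```

Students can then contrast this shape with a unimodal 9-to-5 distribution and discuss why a mean "time at work" would be misleading for a bimodal field.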