Monday, February 8, 2016

Davies' "Ted Cruz using firm that harvested data on millions of unwitting Facebook users"

So, this is a story of data mining, Mechanical Turk, data privacy, and political campaigns. Lots of good stuff for class discussion about data privacy, applied uses of data, etc. It won't exactly teach your students how to run an ANOVA, but it is a good and timely discussion piece.

Short version of the story: Ted Cruz's campaign hired a consulting firm (Strategic Communications Laboratories, SCL) to gather information about potential voters. They did so by using Amazon's Mechanical Turk to recruit participants. Participants were asked to complete a survey that would give SCL access to their Facebook accounts. SCL would then download all visible user information from them. And then it would download the same information FROM ALL OF THEIR FRIENDS, who did not consent to be involved in the study. Some mTurk users claim this was a violation of Amazon's Terms of Service.

This data was then used to create psychological profiles for campaigning purposes.

Discussion pieces:
-Would you be mad if your data were accessed this way?
-Did the Cruz campaign violate ethics by gathering data from individuals who did not consent to having their data collected?
-Do you think that such data collection and research should be subject to an IRB? If yes, were IRB rules violated (e.g., the right to withdraw from a study, informed consent)?

Monday, February 1, 2016

Dr. Mages' "APA Exposed: Everything you wanted to know about APA formatting but were afraid to ask."

Teaching undergraduates APA style is not fun. It is not fun for teachers. It is not fun for students. However, I think that the more tools that we, the teachers, have to convey the rules of APA style, the more likely we are to find something that finally sticks for our students. This week, I offer one such tool created by Dr. Wendy K. Mages. Dr. Mages created an online, self-paced, free PowerPoint presentation that teaches the essentials of APA style.

Lessons are presented in a PowerPoint-esque format with a voice-over (as well as a transcript).

I like that Dr. Mages includes some of her own experiences grading students' papers in order to keep current students from making the frequent mistakes she has encountered. She also offers plenty of original examples and uses appropriate PowerPoint animations/highlighting to engage the viewer.

Monday, January 25, 2016

Statistics/RM videos from The Economist

TED isn't the only source of videos for teaching statistics. The Economist also makes animated videos that are lousy with data. One easy, no-paywall source for such videos is The Economist's Videographic playlist on YouTube (there is a limit on the number of article views per month at their website).

One really statsy video from The Economist that I've featured previously on this blog explains the real-life implications of Type I/Type II error in research (and, specifically, how it leads to errors in published research).

The other videos may not be as directly related to the teaching of statistical topics, but they do illustrate data. Topics range from American union membership trends to this video about world population growth. As you may have inferred from the source, many of these videos focus on national and global economic information, but all of the videos do present data that you can integrate into your classes.

Some are more applicable to teaching statistics: This video describes why we have so much data and keep on generating more data. Others are particularly applicable to social/interpersonal psychology, like this illustration of how "like likes like" in terms of education level (and how this may contribute to income inequality).

Like likes like

Others are about non-teaching topics but very relatable, like this video about shifting world age demographics or this one explaining why textbooks in America are so expensive.

These videos demonstrate to your students a) how data can be used to make a logical argument, b) how illustrated data can help to visualize a compelling story, and c) that statistics are used by people who do not work in explicitly stats-y careers.

Monday, January 18, 2016

Explaining between and within group differences using Pew Research data on religion/climate change

I am a big fan of Pew Research Center. They collect, share, and summarize data about a wide variety of topics. In addition to providing very accessible summaries of their findings, they also provide more in-depth information about their data collection techniques, including original materials used in their data collection and very thorough explanations of their methods.

One topic Pew studies is religion and the attitudes (religious and secular) held by people of different religions. And it got me thinking that I could use their data to explain the within- and between-group differences at the heart of a conceptual understanding of ANOVA.

Specifically, Pew gathered data looking at between-group differences in beliefs in global climate change by religion...

Chart created by Pew Research

...and belief in climate change within just Catholics, divided up by political affiliation.

Chart created by Pew Research

The questionnaires differed slightly for the two surveys. However, both groups were asked whether or not global warming was caused by human activity. The data table illustrates the between-group differences among religions in their views on climate change, while the bar graphs demonstrate how, within one group (Catholics), there is a fair amount of variability in beliefs about climate change based upon political affiliation.

How to use in class? Well, Catholics, as a group, report 45% agreement with the idea that climate change is caused by human activity, which is pretty different from White Evangelicals, who report 28% agreement with this statement. So that would be significantly different, right? But wait...62% of Catholic Democrats agree that global climate change is caused by human activity, while only 24% of Catholic Republicans agree with this statement. That is an awful lot of within-group variance.
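If you want to make the contrast concrete for students, here is a minimal Python sketch using the percentages quoted above. (The numbers come straight from the Pew figures as reported here; the variable names and the framing are mine, and this is a back-of-the-envelope illustration, not a real ANOVA on the raw data.)

```python
# Percent agreement that climate change is caused by human activity,
# taken from the Pew figures discussed above.
religions = {"Catholics": 45, "White Evangelicals": 28}

# Within just Catholics, agreement varies widely by political affiliation.
catholic_subgroups = {"Democrats": 62, "Republicans": 24}

# Between-group gap: Catholics vs. White Evangelicals
between_gap = religions["Catholics"] - religions["White Evangelicals"]

# Within-group gap: Catholic Democrats vs. Catholic Republicans
within_gap = catholic_subgroups["Democrats"] - catholic_subgroups["Republicans"]

print(f"Between-group gap: {between_gap} percentage points")  # 17
print(f"Within-group gap:  {within_gap} percentage points")   # 38
```

The punchline for class: the spread *within* one religious group (38 points) is more than double the gap *between* the two groups (17 points), which is exactly the intuition behind comparing between-group to within-group variability in the F ratio.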

More data on more religions is available from Pew.

Monday, January 11, 2016

Stein's "Is It Safe For Medical Residents To Work 30-Hour Shifts?"

This story describes 1) an efficacy study that 2) touches on some I/O/health psychology research and 3) has gained the unwanted attention of government regulatory agencies charged with protecting research participants.

The study described in this story is an efficacy study that questions a 2003 decision made by the Accreditation Council for Graduate Medical Education. Specifically, this decision capped the number of hours that first-year medical residents can work at 80 per week, with a maximum shift of 16 hours. The PIs want to test whether or not these limits improve resident performance and patient safety. They are doing so by assigning medical residents to either 16-hour maximum shifts or 30-hour maximum shifts. However, the research participants didn't have the option to opt out of this research. Hence, an investigation by the federal government.

So, this is interesting and relevant to the teaching of statistics, research methods, I/O, and health psychology for a number of reasons.

1) As an I/O instructor, it is nice to double dip with a research methods example that studies an I/O topic (shift work/night shift and employee well-being).

2) Efficacy research must be conducted because intuition isn't always right. Here, they question whether a 30-hour shift is really worse than multiple 16-hour night shifts over the course of a week. The PIs argue that the longer shifts lead to more consistent care (your doctor doesn't change in the middle of your care), which may lead to fewer mistakes and better patient care.

3) This multi-location study looked at two different conditions: maximum 16-hour shift versus maximum 30-hour shifts.

4) None of the medical residents or their patients consented to be part of this research study, nor were the residents able to opt out without leaving their residency. The research is being investigated by the federal government, even though it was classified as "minimal risk" (and they use that applicable IRB term in the story).