Monday, December 30, 2013

The United Nations' "2013 World Happiness Report"

I am teaching positive psychology for the first time this semester. One way to quickly teach students that this isn't just Happy Psych. 101 is to show them convincing data collected by an international organization (here, the United Nations) that demonstrates the link between positive psychology and the well-being of nations.

This data isn't just for a positive psychology class: You could also use it more broadly to demonstrate how research methods have to be adjusted when data is collected internationally (see item 4) and as examples of different kinds of data analysis (as described under item 1).

1) Report on international happiness data from the United Nations.

If you look through the data collected, there is a survival analysis related to longevity and affect on page 66. A graphic on page 21 describes factors that account for global variance in happiness levels across countries. There is also a lot of data about mental health care spending in different nations.

2) A quick summary of a few data points from National Geographic.

Including points that have been made previously in positive psychology circles: a) living someplace with nice weather doesn't lead to happiness, b) after basic financial stability has been achieved, happiness doesn't increase in proportion to one's income.

3) Data, visualized, via Huffington Post.

4) Another rich source of well-being data comes from the Gallup polling organization. They collect data on various aspects of wellness, including subjective well-being as well as health data from the US and around the world. Included are weekly polls on whether Americans feel that they are thriving, struggling, or suffering, as well as information on how well-being data is collected internationally.

5) Finally, The Onion's take on America's ranking as the 17th happiest country in the world.

Monday, December 23, 2013

The Economist's "Unlikely Results"

A great, foreboding video (here is a link to the same video at YouTube in case you hit the paywall) about the actual size and implications of Type I and Type II errors in scientific research. This video does a great job of illustrating what p < .05 means in the context of thousands of experiments.

Here is an article from The Economist on the same topic.
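The arithmetic behind the video's punchline is easy to reproduce in class. A minimal sketch; the numbers below are illustrative assumptions of my own, not figures taken from The Economist:

```python
# Back-of-the-envelope false discovery rate, given plausible assumptions.
n_hypotheses = 1000
true_rate = 0.10      # fraction of tested hypotheses that are actually true (assumed)
power = 0.80          # P(significant | true effect) = 1 - Type II error rate
alpha = 0.05          # P(significant | no effect)   = Type I error rate

true_effects = n_hypotheses * true_rate                   # 100 real effects
true_positives = true_effects * power                     # 80 of them detected
false_positives = (n_hypotheses - true_effects) * alpha   # 45 flukes reach p < .05

false_discovery_rate = false_positives / (true_positives + false_positives)
print(f"{false_positives:.0f} of {true_positives + false_positives:.0f} "
      f"'significant' results are false ({false_discovery_rate:.0%})")
```

Even with respectable power and the conventional alpha, more than a third of the "discoveries" in this scenario are false, which is the point the video drives home.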

From The Economist

Monday, December 16, 2013

The Atlantic's "Congratulations, Ohio! You Are the Sweariest State in the Union"

While it isn't hypothesis-driven research, this data was collected to see which states are the sweariest. The data collection itself is interesting and a good, teachable example. First, the article describes previous research that looked at swearing by state (typically, using publicly available data via Twitter or Facebook). Then, they describe the data collection used for the current research:

"A new map, though, takes a more complicated approach. Instead of using text, it uses data gathered from ... phone calls. You know how, when you call a customer service rep for your ISP or your bank or what have you, you're informed that your call will be recorded? Marchex Institute, the data and research arm of the ad firm Marchex, got ahold of the data that resulted from some recordings, examining more than 600,000 phone calls from the past 12 months—calls placed by consumers to businesses across 30 different industries. It then used call mining technology to isolate the curses therein, cross-referencing them against the state the calls were placed from."

Nice big sample size, archival data, AND data collected in the very naturalistic setting of folks calling and complaining to companies. You could also discuss how this data may be more representative of the average American than data collected only from folks who use Facebook or Twitter.

In addition to swearing, they also analyzed the data for courtesy. Way to go, South Carolina!

From The Atlantic

Monday, December 9, 2013

Washington Post's "GAO says there is no evidence that a TSA program to spot terrorists is effective" (Update: 3/25/15)

The Transportation Security Administration (TSA) implemented SPOT training in order to teach airport security employees how to spot problematic and potentially dangerous individuals via behavioral cues. This intervention has cost the U.S. government over $1 billion. It doesn't seem to work.

Discussing this with your class is a chance to cover the importance of program evaluation as well as validity and reliability. The actual government-issued report goes into great detail about how the program evaluation data was collected to demonstrate that SPOT isn't working. The findings (especially the table and figure below) do a nice job of demonstrating the lack of reliability and the lack of validity. This whole story also implicitly demonstrates that the federal government hires statisticians with strong research methods backgrounds to conduct program evaluations (= jobs for students).

Here is a summary of the report from the Washington Post.

Here is a short summary and video about the report from CBS.

Here is the actual report.

This table from the official report demonstrates a lack of inter-rater reliability.

This figure from the report demonstrates a lack of validity in terms of SPOT leading to arrests (it also demonstrates the concept of false positives/Type I error).
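Inter-rater agreement of the sort the report examines is often quantified with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch; the two screeners and their flag/pass ratings below are entirely made up, not taken from the GAO report:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Raw proportion of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal label frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical screeners deciding whether to 'flag' or 'pass' eight passengers
a = ['flag', 'pass', 'pass', 'flag', 'pass', 'pass', 'flag', 'pass']
b = ['pass', 'pass', 'flag', 'flag', 'pass', 'flag', 'pass', 'pass']
print(round(cohens_kappa(a, b), 3))
```

A kappa near (or below) zero, as here, means the raters agree no better than chance, which is exactly the kind of unreliability the table illustrates.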

UPDATE (3/25/15): As reported by Brian Naylor (for NPR), the ACLU is suing for access to this efficacy data. They argue that the SPOT program has led to racial profiling, and they filed a Freedom of Information Act petition in order to examine the data themselves. This update also describes in greater detail the debate about whether or not people are very good lie detectors and includes a brief interview with psychologists Nicholas Epley and Ann Kring, making this a good applied social psychology example.

Monday, November 25, 2013

Burr Settles's "On “Geek” Versus “Nerd”"

Settles decided to investigate the difference between being a nerd and being a geek via a pointwise mutual information (PMI) analysis (using archival data from Twitter). Specifically, he measured the association/closeness between various hashtag descriptors (see below) and the words nerd and geek. Settles provides a nice description of his data collection and analysis on his blog.

A good example of archival data use as well as PMI.
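Pointwise mutual information, the measure behind Settles's analysis, is simple enough to compute by hand. A toy sketch over an invented five-"tweet" corpus (Settles's actual corpus and counts are different, of course):

```python
import math

# Invented toy corpus; the real analysis used Twitter hashtag co-occurrence
tweets = [
    "geek culture comic con",
    "nerd statistics homework",
    "geek gadgets tech",
    "nerd books library statistics",
    "geek tech startup",
]

def pmi(word, context, docs):
    """PMI: log2( P(word, context) / (P(word) * P(context)) )."""
    n = len(docs)
    p_w = sum(word in d.split() for d in docs) / n
    p_c = sum(context in d.split() for d in docs) / n
    p_wc = sum(word in d.split() and context in d.split() for d in docs) / n
    if p_wc == 0:
        return float("-inf")
    return math.log2(p_wc / (p_w * p_c))

print(pmi("nerd", "statistics", tweets))  # positive: co-occur more than chance
print(pmi("geek", "statistics", tweets))  # -inf: never co-occur in this toy corpus
```

Positive PMI means two words appear together more often than chance would predict, which is how Settles located "nerdy" versus "geeky" hashtags.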

Monday, November 18, 2013

Joshua Katz's visualizations of American dialect data (edited 11/30)

I love American dialects. There might be a Starbucks in every city, but our regions are still uniquely identifiable by the way we talk. Joshua Katz (a statistics graduate student at NC State) created graphical representations of data from Cambridge that identified dialect differences in how Americans speak. Here is a story about the maps and here are the maps themselves. AND: You can even take the Dialect Similarity Quiz that tells you (via map) what parts of the country tend to have language patterns like your own.

I think this demonstrates that 1) graphs are interesting ways of conveying information, 2) data can be used to make predictions (of what portion of the U.S. you hail from), and 3) statisticians and social scientists gather interesting and varied data.

Edited to add: The Atlantic has created a video that contains audio of folks providing examples of their awesome accents whilst completing the original survey.

Monday, November 4, 2013

The Onion's "Son-Of-A-Bitch Mouse Solves Maze Researchers Spent Months Building"

Ha. This story is a good example of just how frustrating research can be, how well conceived research can go wrong, the ceiling effect, and why you should pre-test measures before going live.

"Above, researchers discuss plans for a new maze, since the prick of a mouse, right, destroyed their chances of making any new discoveries whatsoever about the nature of synaptical response."

Monday, October 21, 2013

"If the P is low, then the H0 must go"

Created by Kevin Clay

Priceless. More from Kevin Clay here

Aside: I am so, so pleased to now have Snoop Dogg as a label for my blog.

Monday, October 14, 2013

Lesson Plan: SIDS and plagiocephaly

I like the following examples because they are accessible, demonstrate statistics disproving convention (and saving lives!), and provide a good argument for program evaluation.

For decades, prevailing wisdom stated that we should put babies to sleep on their stomachs so that they wouldn't choke on their own spit-up in their sleep.

Then, lo-and-behold, data suggested that putting babies to sleep on their back reduced deaths due to Sudden Infant Death Syndrome (SIDS). BY HALF. Data disproved convention AND improved public health dramatically and cheaply as the American Academy of Pediatrics rolled out the Back To Sleep campaign to inform parents about this research and best practices for bedtime.

Now, the law of unintended consequences: Wee little babies are developing flat heads! My own son did (he is the cutie in the helmet), and required a helmet and physical therapy to correct the condition. More on the flat head (technical name: plagiocephaly) via the USA Today summary of a study from Pediatrics.

I think that the SIDS reduction rates far, far outweigh the possibility of developing plagiocephaly. However, I think this could be a good example for introducing your students to the importance of empirically studying things that seem obvious (like putting a baby to sleep on their stomach), collecting data to demonstrate that interventions work (see the graph above), and always being aware of the possibility of unintended consequences following interventions (per the Pediatrics article).

My son, who wore a helmet for four months (and attended PT for 10 months) to correct a flat spot on the back of his head.

Monday, September 30, 2013

Lesson Plan: The Hunger Games t-test review

Hey, nerds-

Here is a PPT that I use to review t-tests with my students. All of the examples are rooted in The Hunger Games. My students get a kick out of it, and this particular presentation (along with my Harry Potter-themed ANOVA review) is oft-cited as an answer to the question "What did you like the most about this class?" in my end-of-semester reviews.

Essentially, I have found various psychological scales, applied them to THG, and present my students with "data" from the characters. For example, the students perform a one-sample t-test comparing Machiavellianism in Capitol leadership versus Rebellion leadership (in keeping with the final book of the series, the difference between the two groups is non-significant). So, as a psychologist, I can introduce my students to various psychological concepts in addition to reviewing t-tests. Note: I teach in a computer lab using SPSS, which would be a necessity for using these exercises.
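For instructors without an SPSS lab, the same kind of one-sample t-test can be demonstrated in a few lines of code. The Machiavellianism scores and scale norm below are invented for illustration, not the "data" from my PPT:

```python
import math
from statistics import mean, stdev

# Hypothetical Machiavellianism scores for one group of characters
capitol = [78, 85, 72, 90, 81, 76, 88, 79]
norm_mean = 77.0  # assumed scale norm to test the sample against

def one_sample_t(sample, mu):
    """t statistic and degrees of freedom for a one-sample t-test."""
    n = len(sample)
    t = (mean(sample) - mu) / (stdev(sample) / math.sqrt(n))
    return t, n - 1

t, df = one_sample_t(capitol, norm_mean)
print(f"t({df}) = {t:.2f}")
```

Students can then compare the t statistic against a critical value (or a p-value from a table) exactly as they would read SPSS output.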

Caveat: I would recommend using this only if you are familiar with and love The Hunger Games trilogy.

Monday, September 16, 2013

Northwestern Mutual's "The Longevity Game"

I guess "The Longevity Game" sounds better than The Death Calculator, which is what Northwestern Mutual has created and shared with us. Essentially, you answer questions about yourself (weight, exercise, stress management, driving habits, drug and alcohol habits, etc.) and the Game will give you an estimate of how long you should live based on the data you provide.

The Longevity Game, from Northwestern Mutual

I use this in class to demonstrate how data and statistics influence certain aspects of our lives (like whether or not an insurer is willing to provide us with insurance coverage). This can also be used to introduce multiple regression, since multiple factors are taken into account when predicting the outcome measure of life expectancy.
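To make the multiple-regression connection concrete, you can show students the kind of weighted linear combination such a calculator implies. Every coefficient and variable below is invented purely for illustration; Northwestern Mutual's actual model is proprietary and certainly different:

```python
# Toy illustration of the multiple-regression idea behind longevity calculators:
# predicted life expectancy as a weighted sum of several predictors.
# All coefficients are made up for teaching purposes.
def predict_life_expectancy(exercise_hrs_per_week, smoker, bmi, seatbelt_always):
    base = 72.0
    return (base
            + 0.4 * exercise_hrs_per_week   # each weekly hour of exercise adds years
            - 7.0 * smoker                  # smoking indicator (0/1) subtracts years
            - 0.2 * max(bmi - 25, 0)        # penalty only above a BMI of 25
            + 1.5 * seatbelt_always)        # consistent seatbelt use indicator (0/1)

print(predict_life_expectancy(exercise_hrs_per_week=3, smoker=0, bmi=27,
                              seatbelt_always=1))
```

The point for students: the outcome is predicted from multiple weighted predictors at once, which is exactly what a fitted multiple regression gives an insurer.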

I also make sure to emphasize to my students that this calculator was created by an insurance company that was founded in 1857 and that this calculator isn't just some random interwebz quiz.

Warning: I wouldn't ask students to discuss their personal life expectancy (as it deals with their weight, ability to handle stress, alcohol and drug abuse, and other personal things), but I do think it is safe to ask them which measured variables were surprising (seat belt usage is usually one that is surprising).

Monday, September 9, 2013

r/skeptic's "I was practicing GraphPad and I think I may have discovered the 'real' cause of autism..."

NOTE: I'm not entirely certain about the origin of this graph, so I apologize if my citation isn't correct. The earliest version I could find was posted to imgur by user r/skeptic (yes, associated with the Skeptic subreddit).


I think the illustration above is a good way of a) demonstrating that correlation does not equal causation and b) sticking it to anti-vaxers who use a lot of correlational data (see below) to back up their theories about why rates of autism have been increasing.


Monday, September 2, 2013

The Colbert Report's "Texas Gun Training Bill & Free Shotgun Experiment"

The Colbert Report's take on Kyle Copland's research studying whether or not gun ownership lowers crime. Copland's method? Handing out free .22s in high-crime areas (to folks who pass a background check and take a gun safety course).


This applies more to a research methods class (Colbert expresses a need for a control group in Copland's research. His suggestion? Sugar guns, as well as a second experimental condition in which EVERYONE is given a gun). However, I imagine that you could show your students this video, pause it before the research project is introduced, and ask your students how we could finally answer the question of whether or not gun ownership lowers crime.

Thanks to Chelsea for pointing this out!

Monday, August 26, 2013

University of Cambridge's Facebook Research

University of Cambridge's Psychometric Center has used statistics to make personality predictions based upon an individual's Facebook "likes".

For instance, your likes can be used to create your Big Five personality trait profile. Your students can have their FB "likes" analyzed to determine their Big Five traits. After your students complete the FB version of the scale, you could have your students complete a more traditional paper-and-pencil version of the inventory and discuss differences/similarities/concurrent validity between the two measures. Below, I've included a screen grab of my FB-derived Big Five rating. Note: Yes, that is how I score on more traditional versions of the same scale.

Generated at

In addition to Big Five prediction, the researchers also used the "like" data to make predictions of other qualities, like sexual orientation, intelligence, etc., based upon what you have liked on FB. Highlights: Liking curly fries is related to high intelligence, liking fan fiction to introversion, and a fondness for Timmy from South Park seems to be related to competitiveness.

And...if you register as a collaborator at the UofC website, you can access some of their data. Which is pretty generous, I think.

Here is the information regarding the data from the actual source.

Monday, August 19, 2013

Geert Hofstede's website

Hofstede is a psychology rockstar who studies multiculturalism (specifically, how his cultural dimensions vary from country to country and how this can impact organizations). This page generates bar graphs that illustrate how the two countries you specify vary on his dimensions. Below is a screen grab of the U.S. compared to Brazil along his dimensions. Note: If this all sounds vaguely familiar, it may be because you read Malcolm Gladwell's Outliers and he discusses Power Distance in the context of the Korean Air safety issues.

How could you use this in the classroom?
1) This could be a quick example of the importance of multicultural research (as the Western view of the world/attitudes are not the default setting for humans). 
2) A quick way of demonstrating bar graphs.
3) A good example of applied social psychology. 

Monday, August 12, 2013

Meme III

Want a good way to waste time when you should be prepping for the semester ahead? Go generate some stats/research methods memes. If you are feeling extra generous, please feel free to send them to me so I can share them with the group.

Created at by Jess Hartnett

Created at by Jess Hartnett

Monday, August 5, 2013

US News's "Poll: 78 Percent of Young Women Approve of Weiner"

Best. Awful. Headline. Ever.

This headline makes it sound like a large majority of young women support the sexting, bad-decision-making former NY representative Anthony Weiner. If one takes a moment to read the article, one learns that the "young women" sampled were recruited from a website for women looking for sugar daddies.

If you want your brain to further explode, read through the comments section for the article. Everyone is reacting to the headline; very few people actually read the article themselves...which provides further anecdotal evidence that most folks can't tell good data from bad (and that part of our job as statistics instructors, in my opinion, is to ameliorate this problem).

Thursday, August 1, 2013

Statistics and Pennsylvania's Voter ID Law

Prior to the 2012 presidential election, Pennsylvania attempted to enact one of the toughest voter ID laws in the nation. This law has been kicked up to the courts to examine its legality. One reason that so many people protested the law was because it would make it more difficult for the elderly and the poor to vote (as it would be more difficult for them to obtain the ID required). Here is an NPR story that gives a bit of background on the law and the case in court. Also, for giggles and grins, here is Jon Stewart's more amusing explanation of the law and why it was struck down prior to the election, including video footage of a PA legislator flat-out stating that the Voter ID law would allow Romney to win the 2012 election.

In order to support/raise questions about the impact of the law on the ability to vote, statisticians have been brought in on both sides in order to estimate exactly how disenfranchising this law will be.

Essentially, the debate in court centers on an analysis performed by Dr. Bernard Siskin. His analysis found that 11,000 of PA's 8.2 million voters lacked the proper ID required to vote. The state argues that this number is inflated and that Dr. Siskin's research methods did not take into account the variety of different kinds of IDs that would count as valid.

More on the court case available below:

I think this could be useful in demonstrating a) statistics being used in court in order to persuade people about a big, important social justice issue, b) statisticians having jobs that don't involve non-stop number crunching, c) statistics in the news, and d) experts arguing about research methods, thus reinforcing to your students that the problems most people have with statistics don't have to do with the math, but the methods.

Monday, July 29, 2013

Gerd Gigerenzer on how the media interprets data/science

Gerd "I love heuristics" Gigerenzer talking about the misinterpretation of research by the media (in particular, how misinterpretation of data about oral contraceptives led to increases in abortions). He argues that such misinterpretation isn't just bad reporting, but unethical.

Monday, July 22, 2013

Lesson plan: Posit Science and Hypothesis Testing

Here is a basic lesson plan that one could use to teach the hypothesis testing method in a statistics course. I teach in a computer lab but I think it could be modified for a non-lab setting, especially if you use a smart classroom. The lesson involves learning about a company that makes web-based games that improve memory (specifically, I use the efficacy testing the company did to provide evidence that their games do improve memory).

Posit Science is a company that makes computer based games that are intended to improve memory. I use material from the company's website when teaching my students about the scientific method. Here is what I do...

Property of

Thursday, July 11, 2013

Khan Academy's Central Limit Theorem

Khan Academy has plenty of fair-use videos for "learning anything". They have a number of statistics/probability examples in their library, including the Central Limit Theorem video below (I highlight this one as the CLT usually leads to a lot of head scratching in my class).
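If your students would rather see the CLT happen than take it on faith, a short simulation makes the same point as the video. A sketch using an exponential (decidedly non-normal) population; the sample size and repetition count are arbitrary choices:

```python
import random
from statistics import mean, stdev

random.seed(1)  # fixed seed so the demo is reproducible

# Draws from a very skewed, non-normal population (exponential: mean 1, sd 1)
def population_draw():
    return random.expovariate(1.0)

# ...yet the means of repeated samples of size n pile up in a normal-looking bell
n, reps = 30, 5000
sample_means = [mean(population_draw() for _ in range(n)) for _ in range(reps)]

print(round(mean(sample_means), 2))   # close to the population mean of 1
print(round(stdev(sample_means), 2))  # close to sigma/sqrt(n) = 1/sqrt(30)
```

Plotting a histogram of `sample_means` (e.g., with matplotlib) makes the bell shape, and hence the theorem, visible at a glance.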

Monday, July 8, 2013

Andy Field's Statistics Hell

Andy Field is a psychologist, statistician, and author. He created a funny, Dante's Inferno-themed web site that contains everything you ever wanted to know about statistics. I know, I know, you're thinking, "Not another Dante's Inferno themed statistics web site!". But give this one a try.

Property of Andy Field. I certainly can't take credit for this.

Some highlights:

1) The aesthetic is priceless. For example, his intermediate statistics page begins with the introduction, "You will experience the bowel-evacuating effect of multiple regression, the bone-splintering power of ANOVA and the nose-hair pulling torment of factor analysis. Can you cope: I think not, mortal filth. Be warned, your brain will be placed in a jar of cerebral fluid and I will toy with it at my leisure."

2) It is all free, including worksheets, data, etc. How amazing and generous. And, if you are feeling generous and feel the need to compensate him for the website, he asks that you make a donation to a child-welfare organization. Or you could buy his textbook.

3) He is a psychologist who teaches statistics, which is different from being a statistician teaching statistics. Many of his examples are psychological.

4) Another thing I like about this site is that he has plenty of SPSS examples. Most of the texts I've reviewed have Excel, Minitab, various TI calculator instructions, but not many integrated SPSS examples. Not only does he provide data, he also provides some jing/video tutorials.

Monday, July 1, 2013

Cracked's "The five most popular ways statistics are used to lie to you"

If you aren't familiar with Cracked, it is a website that composes lists. Some are pretty amusing (6 Myths About Psychology That Everyone (Wrongly) Believes, 6 Things Your Body Does Every Day That Science Can't Explain). And some are even educational, like "The five most popular ways statistics are used to lie to you".

The list contains good points to encourage critical thinking in your students. Some of the specific points it touches upon:
1) When it is more appropriate to use the median than the mean
2) False positives
3) Absolute versus relative changes in amount
4) Probability
5) Correlation does not equal causation
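For the first point, a two-line demonstration shows how a single outlier drags the mean but leaves the median alone (the incomes below are invented):

```python
from statistics import mean, median

# Hypothetical neighborhood incomes: one millionaire distorts the mean
incomes = [32_000, 35_000, 38_000, 41_000, 45_000, 1_000_000]
print(mean(incomes))    # inflated by the outlier
print(median(incomes))  # still describes a typical resident
```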

And you'll get mad street cred points from undergraduates for using a Cracked list. Trust me.

Monday, June 24, 2013

Lesson plan: Teaching margin of error and confidence intervals via political polling

One way of teaching about margin of error/confidence intervals is via political polling data.


Here is a good site that has a breakdown of polling data taken in September 2012 for the 2012 US presidential election. I like this example because it draws on data from several well-reputed polling organizations and includes their point estimates of the mean and their margins of error.

This allows for several good examples: a) the point estimates from the various polling organizations all differ slightly (illustrating sampling error), b) the margins of error are provided, and c) it can be used to demonstrate how CIs can overlap, hence muddying our ability to predict outcomes from point estimates of the mean.
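It's also worth showing students the standard formula behind those reported margins of error. A sketch using the usual simple-random-sample approximation for a proportion; the 48%-of-1,000-voters figures are made up for illustration:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# e.g., a candidate polling at 48% among 1,000 likely voters (invented numbers)
moe = margin_of_error(0.48, 1000)
print(f"48% ± {moe:.1%}")
```

Students can then see directly why quadrupling n only halves the margin of error, and why overlapping CIs muddy the horse-race headlines.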

I tend to follow the previous example with this gorgeous polling data from Muhlenberg College:

This is how sampling is done, son! While stats teachers frequently discuss error reduction via big n, Muhlenberg takes it a step further by only polling registered voters who plan on voting in the upcoming election.

I have my students review the .pdf, which contains 9/2012 pre-election voting data for various state and national elections. NOTE: I live in PA, making this data more applicable to my own students. You may want to look up data from your home state.

For extra credit, I have my students answer the following questions (which forces them to read for statistics):

a) How do they define a "likely voter"?
b) What is the CI used?
c) What is the margin of error?
d) What is the n-size for this sample?

Monday, June 17, 2013

Jon Mueller's CROW website

I have been using Mueller's CROW website for years. It is a favorite teaching resource among my fellow social psychologists, with TONS of well-categorized resources for teaching social psychology. This resource is also useful to statistics/research methods instructors out there as it contains a section dedicated to research design with a sub-section for statistics.

Monday, June 3, 2013

Discover Magazine's "If a baby can do statistics you have no excuse"


Hahahaha. Like my C-students don't already feel bad enough about themselves, evidence now suggests that babies have a rudimentary understanding of probability (this summary is also a good example of research methods in developmental psychology).

Monday, May 20, 2013

Stats in the News: Bloomberg Data Privacy Breach

Bloomberg LP makes a lot of money by compiling financial data and making it available to clients who pay $20K a year to access the data via special terminals.

Bloomberg also has a news branch. And reporters from the news branch have been collecting data from Bloomberg clients about how they are using/analyzing the Bloomberg data, which has the clients up in arms as it could reveal business practices, proprietary information, etc. When this story first made the news, the stock market plummeted. Currently, Bloomberg is launching its own investigation into the data abuse.

Here is one of the earlier news stories detailing the case as well as an NPR story about Bloomberg's reactions.
While this doesn't teach statistics, per se, it does provide you with an example to share with your students about real life application of statistics, the value of statistics, data mining, and how our current legal system is facing challenges in regards to regulating data.

Wednesday, May 8, 2013

"Packages sealed with "Atheist" tape go missing 10x more often than controls"

I originally came across this story via another site. More information from the source is available here.

Essentially, these high-end German shoes are made by a company of devoted atheists. They even have their mailing materials branded with "atheist". And they had a problem with their packages being lost by the USPS. They ran a wee experiment in which they sent out packages that were labeled with the atheist tape vs. not, and found that the atheist packages went missing at a statistically significantly higher rate than the non-denominational packages.
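If you want students to check a claim like this themselves, a two-proportion z-test is the natural tool. The loss counts below are invented stand-ins, not the company's actual numbers:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: both groups have the same underlying loss rate."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)            # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Invented counts: lost packages out of those shipped with each kind of tape
z = two_proportion_z(x1=17, n1=178,   # "atheist" tape
                     x2=2,  n2=178)   # plain tape
print(round(z, 2))
```

A z well beyond 1.96 corresponds to p < .05, so with counts like these students can see why the company called the difference statistically significant.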

Property of

I think this could be used in the classroom because it is a pretty straightforward research design: you can challenge your students to question the research design, simply have your students read through the discussion of this article at the atheistberlin website, or introduce your students to Milgram's "lost letter" technique and other novel research methods.

Monday, May 6, 2013

Jess Hagy's "This is Indexed"

Jess Hagy illustrates her observations about life using simple graphs. I use her illustrations in order to provide examples to my students.

Does this illustrate a positive or negative correlation?

Property of Jess Hagy

Would a correlation detect this relationship? Why or why not?

Property of Jess Hagy

According to this diagram, what two different factors may account for the shared variance between the two variables?

Property of Jess Hagy

Monday, April 29, 2013

Statistics Meme 2

After months of hard work, hypothesizing, data collection...then you hold your breath and click "OK" in SPSS...

From "I fucking love science" FB page

Tuesday, April 23, 2013

Shameless self-promotion

Here is a publication from Teaching of Psychology in which I outline not one, not two, not three, but FOUR free/cheap internet based activities to be used in statistics/research methods classes.

(If you have access to ToP publications, you can also get it here.)

Monday, April 22, 2013

Media Matters' "Today in dishonest Fox News charts"

How to lie with accurate data...note how Fox News used a "creative" graph in order to make an 8.6% unemployment rate look like a 9% unemployment rate. Full story available at Media Matters (which, admittedly, is very left-leaning).

From Media Matters

Monday, April 15, 2013

io9's "You're bitching about the wrong things when you read an article about science"

Colorful title aside, this article teaches critical thinking when analyzing scientific writing for validity and reliability.

Property of

As a social psychologist, I'm especially grateful that they covered the "Study of Duh" criticism. It also addresses the difference between bad science and bad journalism and why one needs to see the source material for research before one is in a position to truly evaluate a study.

Monday, April 8, 2013

Newsweek's "What should you really be afraid of?" Update 6/18/15

I use this when introducing the availability heuristic in Intro and Social (good ol' comparison of fatal airline accidents vs. fatal car crashes), but I think it could also be used in a statistics class. For starters, it is a novel way of illustrating data. Second, you could use it to spark a discussion on the importance of data-driven decision making when it comes to public policy/charitable giving. For instance, breast cancer has really good PR, but more women are dying of cardiovascular disease...where should the NSF concentrate its efforts to make the biggest possible impact?

Property of Newsweek

More of same from

Thursday, April 4, 2013

The Onion's "Are tests biased against students who don't give a shit?"

The language is blue, so use at your own risk... but this faux debate is hilarious. I use it in my I/O and statistics classes to illustrate reliability, psychometric concerns related to test takers who are not totally engaged in their task, etc.

Monday, April 1, 2013

Franz H. Messerli's "Chocolate consumption, cognitive function, and Nobel Laureates"

A chocolate study seems very appropriate for the day after Easter.

Messerli's study found a strong, positive correlation between a nation's per capita chocolate consumption and the number of Nobel prizes won by that nation (see graph below). The research article is pretty straightforward: the only statistical analysis conducted was a correlation, the journal article is very short, and it used archival data. As such, you can use this example to illustrate correlation and archival data as well as the dreaded "third variable" problem (by asking students to generate variables that may increase both chocolate consumption and top-notch research/writing/peace/etc.).

Property of Messerli/New England Journal of Medicine