
Posts

Showing posts with the label medical interventions

The Washington Post, telling the story of the opioid crisis via data

I love dragging on bad science reporting as much as anyone, but I must give All Of The Credit to the Washington Post and its excellent, data-centered reporting on the opioid epidemic. It is a thing of beauty. How to use in class: 1) Broadly, this is a fine example of using data to better understand applied problems, medical problems, drug problems, etc. 2) Specifically, this data can be personalized to your locale via WaPo's beautiful, functional website. 3) After you pull up your localized data, descriptive data abound...# of pills, who provided them, who wrote the scripts (y'all...Frontier Pharmacy is like two miles from my house)... 4) Everyone teaches about frequency tables, right? Here is a good example: 5) In addition to localizing this research via the WaPo website, you can also personalize your class by looking for local reporting that uses this data. For instance, the Erie newspaper reporter David Bruce reported on our local problem (.pdf of the...
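If you want a quick in-class companion to the frequency-table point, here is a minimal sketch of building one from prescription-style records. The pharmacy names and pill counts below are made up for illustration; they are not from the WaPo/ARCOS data.

```python
import pandas as pd

# Hypothetical prescription records; NOT actual WaPo/ARCOS data
records = pd.DataFrame({
    "pharmacy": ["Frontier Pharmacy", "Main St. Drug", "Frontier Pharmacy",
                 "Lakeview Rx", "Main St. Drug", "Frontier Pharmacy"],
    "pills": [1200, 800, 950, 400, 1100, 700],
})

# Frequency table: how many records per pharmacy, with proportions
freq = records["pharmacy"].value_counts()
freq_table = pd.DataFrame({
    "count": freq,
    "proportion": (freq / freq.sum()).round(2),
})
print(freq_table)

# Total pills per pharmacy, mirroring the "who provided them" summaries
pills_by_pharmacy = records.groupby("pharmacy")["pills"].sum().sort_values(ascending=False)
print(pills_by_pharmacy)
```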

A big metaphor for effect sizes, featuring malaria.

TL; DR- Effect size interpretation requires more than numeric interpretation of the effect size. You need to think about what would be considered a big deal, real-life change worth pursuing, given the real-world implications for your data. For example, there is a  malaria vaccine with a 30% success rate undergoing  a large scale trial in Malawi . If you consider that many other vaccines have much higher success rates, 30% seems like a relatively small "real world" impact, right? However, two million people are diagnosed with malaria every year. If science could help 30% of two million, the relatively small effect of 30% is a big deal. Hell, a 10% reduction would be wonderful. So, a small practical effect, like "just" 30%, is actually a big deal, given the issue's scale. How to use this news story: a) Interpreting effect sizes beyond Cohen's numeric recommendations. b) A primer on large-scale medical trials and their ridiculously large n-sizes and tra...
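To make the scale argument concrete in class, here is a back-of-the-envelope sketch using just the figures quoted above (the case count and efficacy are the numbers from the story, not a model of the trial itself):

```python
# Back-of-the-envelope arithmetic for the "small effect, big scale" point
cases_per_year = 2_000_000   # diagnoses per year, as quoted in the post
efficacy = 0.30              # vaccine success rate quoted in the post

print(f"{cases_per_year * efficacy:,.0f} cases potentially prevented per year")
# -> 600,000: a "small" 30% effect, applied at scale, is a huge real-world impact

# Even a 10% effect would still matter at this scale
print(f"{cases_per_year * 0.10:,.0f} cases at 10% efficacy")
```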

Watson's "For Women Over 30, There May Be A Better Choice Than The Pap Smear"

Emily Watson, writing for NPR, describes medical research by Ogilvie, van Niekerk, & Krajden. This research provides a timely, topical example of false positives, false negatives, and medical research, and it gets your students thinking a bit more flexibly about measurement. It also provides valuable information about a debate in medicine: Which method of cervical cancer detection is most accurate, the traditional Pap smear or an HPV screening? The Pap smear works by scraping cells off of the cervix and having a human examine them for abnormal, potentially cancerous cells. The HPV test, as the name suggests, detects HPV. Since HPV causes 99% of cervical cancers, its presence signals a clinician to perform further screening, usually a colposcopy. The findings: Women over 30 benefit more from the HPV test. How to use this example in class: - This is a great example of easy-to-follow research methodology and efficacy testing in medicine. A question existed: Which is better, Pap or HPV test? The questi...
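If you want students to work the false positive/false negative logic by hand, here is a minimal 2x2 screening sketch. The prevalence, sensitivity, and specificity values are hypothetical placeholders, not figures from Ogilvie, van Niekerk, & Krajden.

```python
# Illustrating false positives/negatives with a 2x2 screening breakdown.
# All numbers are hypothetical placeholders, NOT values from the actual study.
population = 100_000
prevalence = 0.01          # hypothetical rate of precancerous lesions
sensitivity = 0.95         # hypothetical: P(test positive | disease)
specificity = 0.90         # hypothetical: P(test negative | no disease)

with_disease = population * prevalence
without_disease = population - with_disease

true_positives = with_disease * sensitivity
false_negatives = with_disease - true_positives       # missed cases
false_positives = without_disease * (1 - specificity) # unnecessary follow-up
ppv = true_positives / (true_positives + false_positives)

print(f"True positives:  {true_positives:,.0f}")
print(f"False negatives: {false_negatives:,.0f} (missed cases)")
print(f"False positives: {false_positives:,.0f} (sent for further screening unnecessarily)")
print(f"Positive predictive value: {ppv:.1%}")
```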

My favorite real world stats examples: The ones that mislead with real data.

This is a remix of a bunch of posts. I brought them together because they fit a common theme: examples that use actual data that researchers collected but still manage to lie or mislead. So, lying with facts. These examples hit upon a number of themes in my stats classes: 1) Statistics in the wild 2) Teaching our students to sniff out bad statistics 3) Vivid examples are easier to remember than boring examples. Here we go: Making Graphs: Fox News using accurate data and inaccurate charts to make unemployment look worse than it is. Misleading with Central Tendency: The mean cost of a wedding in 2004 might have been $28K...if you assume that all couples used all possible services, and paid for all of them. Also, maybe the median would have been the more appropriate measure to report. Don't like the MPG for the vehicles you are manufacturing? Try testing your cars under ideal, non-real-world conditions to fix that. Then get fined by the EPA. Mis...
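For the central tendency example, a minimal sketch of why the median can be the more honest summary for skewed cost data; the wedding costs below are invented to show the skew, not the actual 2004 survey data.

```python
# Why the median can be the more honest summary for skewed cost data.
# These wedding costs are invented for illustration, not the 2004 survey data.
import statistics

costs = [4_000, 6_500, 8_000, 9_000, 11_000, 12_500, 15_000, 95_000, 120_000]

print(f"Mean:   ${statistics.mean(costs):,.0f}")    # pulled upward by a few lavish weddings
print(f"Median: ${statistics.median(costs):,.0f}")  # closer to what a 'typical' couple spent
```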

Hancock's "Skip The Math: Researchers Paint A Picture Of Health Benefits And Risks"

Two scientists, Lazris and Rifkin, want to better illustrate the risks and benefits associated with preventative medicine. They do so by asking people to imagine theaters filled with 1,000 people, and describing the costs and benefits of different preventative procedures by discussing how many people in the theater will be saved or perish based on current efficacy data. One such video can be viewed here and illustrates the absolute and relative risks associated with mammography. They are attempting to demystify statistics and better explain the risks and benefits by showing an animated theater filled with 1,000 women, and showing how many women actually have their lives saved by mammograms (see screenshot below)... ...as well as the number of women who receive false positives over the course of a lifetime... [Screenshot of the video, which is trying a new way to illustrate risk.] ...the video also illustrates how a "20% reduction in breast cancer deaths" ca...
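To unpack the relative-vs-absolute risk point in class, here is a minimal sketch in the spirit of the 1,000-person theater; the baseline death count and the 20% figure are hypothetical teaching numbers, not values taken from the video.

```python
# Absolute vs. relative risk in a 1,000-person "theater".
# Baseline counts are hypothetical teaching numbers, not taken from the video.
theater_size = 1_000
deaths_without_screening = 5    # hypothetical deaths per 1,000 without mammography
relative_risk_reduction = 0.20  # the kind of "20% reduction" headline being unpacked

deaths_with_screening = deaths_without_screening * (1 - relative_risk_reduction)
absolute_risk_reduction = (deaths_without_screening - deaths_with_screening) / theater_size

print(f"Deaths without screening: {deaths_without_screening} per {theater_size}")
print(f"Deaths with screening:    {deaths_with_screening:.0f} per {theater_size}")
print(f"Relative risk reduction:  {relative_risk_reduction:.0%}")
print(f"Absolute risk reduction:  {absolute_risk_reduction:.2%} "
      f"({deaths_without_screening - deaths_with_screening:.0f} life per {theater_size} women)")
```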