"(Algorithms) are used so heavily, they don't just predict the future, they are the future." -Cathy O'Neil
^This quote from this NPR story made me punch the air in my little Subaru after dropping my kid off at school. What a great sentence. There are many great one-liners in this little five-minute review of algorithms.
This NPR story by Dina Temple-Raston is a great primer for All The Ethical Issues Related To Algorithms, accessible to non- or novice-statisticians. It clocks in at just under five minutes, perfect as a discussion prompt or quick introduction to the topic.
How to use in class:
They talk about regression without ever saying "regression":
"Algorithms use past patterns of success to predict the future."
So, regression, right? Fancy regression, but that one line can take this fancy talk of algorithms and make it more relatable to your students. Sometimes, I feel like I'm just waving my hands when I try to explain this very, very important piece of regression, but this report describes the prediction side of regression succinctly.
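If you want to make that one-liner concrete for a class demo, here is a minimal sketch in Python (every number below is invented for the example; nothing here comes from the NPR story). The "algorithm" is just ordinary least-squares regression: fit a line to past data, then plug a new case into the fitted line.

```python
# A toy "algorithm": learn from past patterns, then predict a new case.
# All numbers are invented for the demo -- they are not from the NPR story.
import numpy as np

# Past "patterns of success": years of experience vs. a performance rating
years = np.array([1, 2, 3, 4, 5, 6, 7, 8])
rating = np.array([2.1, 2.8, 3.2, 3.9, 4.1, 4.8, 5.2, 5.9])

# Ordinary least-squares fit -- plain old regression
slope, intercept = np.polyfit(years, rating, deg=1)

# "Predicting the future" is just plugging a new x into the fitted line
new_case = 10
print(f"Predicted rating for {new_case} years: {slope * new_case + intercept:.2f}")
```

The fancier algorithms in the story have many more predictors and wigglier functions, but the logic students need to see is the same: yesterday's data picks the line, and the line picks tomorrow's prediction.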
Bias In Algorithms:
"The feedback loop that reinforces lucky people's luck." Historically, who is likely to get promoted at large, successful organizations? White dudes. If that is part of your algorithm, you'll keep promoting only white dudes. Which isn't to say that there aren't qualified white dudes, but you will miss out on great women and POC.
"By learning from the past, algorithms are doomed to repeat the past" (Another great one-liner!)
False Positives:
Who are spies? White men who work for the US government and speak Russian. There are a lot of those! If the federal government flagged everyone who met that description, they would flag many, many non-spies.
Similarly, people who are engaging in corporate espionage tend to show up before everyone else or stay after everyone else, so they can be sneaky and unobserved. But a new parent may also work very early hours so they can leave early to accommodate their kid's school schedule, or an employee might routinely stay late to make up the hours for a standing early-morning PT appointment. An algorithm can't know this.
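The false-positive problem is also easy to show with back-of-the-envelope arithmetic. Here is a sketch with invented counts and rates (nothing here comes from the NPR story), just to show how a rare condition plus a broad profile buries the true positives under a pile of innocent people.

```python
# Base-rate arithmetic behind "flag everyone who fits the profile."
# All counts and rates below are made up to illustrate the point.
population = 100_000        # people who fit the broad profile
true_spies = 5              # actual spies hidden among them
sensitivity = 0.95          # chance the screen flags a real spy
false_positive_rate = 0.02  # chance it flags an innocent person

flagged_spies = true_spies * sensitivity
flagged_innocent = (population - true_spies) * false_positive_rate

print(f"Innocent people flagged: {flagged_innocent:,.0f}")  # about 2,000
print(f"Real spies flagged:      {flagged_spies:.1f}")      # about 4.8

# Even a seemingly accurate screen drowns a handful of real cases in
# thousands of false positives: the early-bird new parent and the
# late-staying PT patient all get swept up.
```

This also doubles as a nice tie-in to conditional probability or Bayes' theorem if your course covers them.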
Algorithms still can't beat human insights (n = 2):
The bigger narrative in this piece has to do with how various government agencies attempt to use algorithms to uncover likely spies within their ranks. In two very high-profile spying cases (Aldrich Ames and Jerry Chun Shing Lee), the spies were uncovered not by an algorithm but by human analysts who noticed odd behaviors and acted on those observations.