SHOW AUDIO: Link is usually posted within about 72 hours of show broadcast. We take callers during this show at 713-526-5738.
Thinkwing Radio with Mike Honig (@ThinkwingRadio), a listener call-in show airing live every Monday from 2-3 PM (CT) on KPFT-FM 90.1 (Houston). My engineer is Don.
Listen live on the radio or on the internet from anywhere in the world! When the show is live, we take calls at 713-526-5738. (Long distance charges may apply.)
Please take a moment to visit Pledge.KPFT.org and choose THINKWING RADIO from the drop-down list when you donate.
For the purposes of this show, I operate on two mottoes:
- You’re entitled to your own opinion, but not your own facts;
- An educated electorate is a prerequisite for a democracy.

[Photo caption: Houston Mayor Annise Parker [L] with Mike, just before the show. (Dec. 7, 2015)]
SIGNOFF QUOTES:
“You see things; and you say ‘Why?’ But I dream things that never were; and I say ‘Why not?’” ~ George Bernard Shaw (1856–1950), Back to Methuselah, act I, Selected Plays with Prefaces, vol. 2, p. 7 (1949). The serpent says these words to Eve.
Senator Robert F. Kennedy used a similar quotation as a theme of his 1968 campaign for the presidential nomination: “Some men see things as they are and say, why; I dream things that never were and say, why not.” Senator Edward M. Kennedy quoted these words in his eulogy for his brother in 1968. ~ The New York Times, June 9, 1968, p. 56.
______________________________________________________________________
- Make sure you are registered to vote:
- HarrisVotes.com (Election Information Line (713) 755-6965)
- VoteTexas.gov
- Next Election: August 25, 2018 – Harris County Flood Control District Bond Election
- Be sure to early-vote when polling places and hours are announced
- Election Day polls are open from 7 a.m. to 7 p.m.
- You may vote early by-mail if you are registered to vote and meet one of the following criteria:
- Away from the county of residence on Election Day and during the early voting period;
- Sick or disabled;
- 65 years of age or older on Election Day;
- Confined in jail, but otherwise eligible to vote.
- Today’s show is pre-recorded. I hope you find it interesting, but unfortunately we will not be able to take phone calls today. If I get emails or tweets asking for Melissa to return for an open forum, she has agreed to do so.
- What is risk assessment?
- What is the history of risk assessment in criminal justice?
- What decisions do risk assessment outcomes inform?
- These tools predict the risk of what, exactly?
- I understand that some risk assessment tools are now based on algorithms. What does that mean in this context?
- To what extent are these systems an Artificial Intelligence (AI)?
- What is sexual dimorphism, and how does it apply here?
- Can an algorithm be sexist?
- While your paper is about gender bias specifically, how do these predictive algorithms work more generally, and how are they created?
- There has been some media attention about a “racist” algorithm. What is that about?
- Ethnicity subdivides into cultures.
- How can a computer algorithm be biased?
- What are the perceived values of automated risk assessment?
- If risk assessment tools are driven by a computer algorithm, does that not mean they are objective and transparent?
- What are some of the common factors included in risk assessment tools?
- I assume these algorithms are proprietary, so no “looking under the hood”?
- Financial prospectuses usually state, “Past Performance Is No Guarantee of Future Results.” In the case of predictive algorithms, to what extent are past histories successfully modeled, and how well do they predict real past outcomes? Is there any standard or field-recognized test for this?
- Are these tools better than gut instinct?
- There’s a saying that behind every computer error there’s a human error. Isn’t that the place for first assumptions?
- Another saying: “To err is human. To really screw up takes a computer.” Computers take human error and/or assumptions and magnify them exponentially. Once a computer has decided “the future”, does case management become in some ways a self-fulfilling prophecy?
- In the broadest sense, is this a bit like “Minority Report” (probably everyone’s first question), which involves predicting crimes and arresting people before the crimes are committed?
- What are some of the relevant ethical or legal issues that are being discussed?
- What would an objective experiment look like? What kinds of permissions would be necessary from offenders?
- A “double blind” study might not permit judges and prosecutors to see or speak directly to offenders in order to eliminate subconscious biases based on things like skin color, clothing, speech (accent, idioms) etc.
- Have any similar studies been attempted?
- More broadly:
- There is gender-bias in all sorts of research. How/why is this different?
- How long have these tools been in place?
- How widely are they used? How many algorithms have passed into common use?
- Do conditions of initial incarceration (“dangerous” vs “safe”, environmentally adequate vs. chronically uncomfortable, etc.) affect recidivism, and is this figured in somehow?
- In science, models are tweaked until they mimic reality, but physics is physics; variables are confined to physical laws. In sociology, every “fact” has a butterfly effect based on human choices. How can theoretical bases be standardized?
LINKS:
SOURCES WHICH MAY BE RELEVANT TO OTHER DISCUSSION:
======================================================
__________________________________________________________________