What if the data tell you to be racist? Without the right precautions, machine learning, the technology that drives risk assessment in law enforcement as well as hiring and loan decisions, explicitly penalizes underprivileged groups. Left to its own devices, the algorithm will count a black defendant's race as a strike against them. Yet some data scientists are calling for turning off the safeguards and unleashing computerized prejudice, signaling an emerging threat that goes beyond the well-known concerns about inadvertent machine bias.

Imagine sitting across from a person being evaluated for a job, a loan or even parole. When asked how the decision process works, you inform them, “For one thing, our algorithm penalized your score by seven points because you’re black.”
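To make that mechanism concrete, here is a deliberately simplified, hypothetical sketch. All feature names and point values, including the seven-point figure from the scenario above, are illustrative and not drawn from any real risk-assessment tool. It shows what an explicit penalty on a protected attribute looks like in a point-based score, and what the safeguard of withholding that attribute does:

```python
# Hypothetical illustration only: a toy point-based risk score showing what
# an "explicit" race penalty looks like, versus a model denied that input.
# Every feature name and weight here is invented for illustration.

WEIGHTS_UNCONSTRAINED = {
    "prior_arrests": 3,   # points per prior arrest
    "age_under_25": 5,    # flat penalty for younger defendants
    "race_black": 7,      # the explicit penalty described above
}

WEIGHTS_SAFEGUARDED = {
    "prior_arrests": 3,
    "age_under_25": 5,
    # "race_black" deliberately excluded: the safeguard is simply refusing
    # to let the model score a protected attribute directly.
}

def risk_score(defendant: dict, weights: dict) -> int:
    """Sum points for every feature the weight table is allowed to see."""
    return sum(weights[f] * defendant.get(f, 0) for f in weights)

defendant = {"prior_arrests": 2, "age_under_25": 1, "race_black": 1}

print(risk_score(defendant, WEIGHTS_UNCONSTRAINED))  # 18: includes the 7-point race penalty
print(risk_score(defendant, WEIGHTS_SAFEGUARDED))    # 11: explicit penalty removed
```

Real systems are subtler, of course: even with the protected attribute withheld, correlated inputs can reproduce the penalty indirectly, which is the inadvertent machine bias already widely discussed.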
