Lisa Miller


In 2029, AI Will Make Prejudice Much Worse

November 11, 2019 By Lisa Miller

If you’re female, the machines may not recognize you as human. They may not see you if you’re trans or a person of color, nor, possibly, if you have poor dental hygiene or carry a cane or are diminutive in stature or extraordinarily tall. The machines understand the world based on the information they’ve been given, and if you aren’t well represented in the data — if the white-male prejudice of history itself has disenfranchised you to date — then chances are to the machine you don’t exist. A dark-skinned woman in the U.K. couldn’t renew her passport online because the digital form looked at her photo and didn’t recognize it as a proper face. Trans people confound airport body scanners and are regularly hauled out of security lines to be frisked as if they were terrorist suspects. Worst-case scenarios are not so far-fetched. A self-driving car knows to brake in the crosswalk when it sees a person. But what does it understand a person to look like?

If you think structural bias is bad now, in other words, just wait until the machines take over. “Bias,” warned Kate Crawford, co-founder of the AI Now Institute at NYU, in a lecture she gave last year, “is more of a feature than a bug of how AI works.” And the worst of it is that you may never know how the machines have judged you, or why they have disqualified you from that opportunity, that career, that scholarship or college. You never see the ad on your social-media feed for your dream job as a plumber or roofer or software engineer because the AI knows you’re female, and it perpetuates the status quo. (Instead, you only see ads for waitresses or home health-care workers — lower paying and with less opportunity for advancement.) These are real-life examples, by the way.

The reason recruiting engines downgrade candidates with names like Latanya is that people named Latanya have always had a harder time finding a job, according to research conducted by Harvard’s Latanya Sweeney, who used her own name as a sample. (And if you do happen to be searching for Latanya online, you will find ads alongside your search for criminal-background checks.) One recent experiment showed that AIs gave special preference to the résumés of job candidates named Jared who played lacrosse, which corroborates every one of your worst fears about the world. But there it is, replicating ad infinitum and without oversight.

The data are only part of the problem. There are also the men (mostly men: AI researchers are 88 percent male) who build the algorithms that tell the machines what to do. Sometimes the faulty design is unintentional, as when Amazon decided to create an AI that sorted résumés to find optimal employees. The men in the Amazon AI lab built their algorithm around a question — what kinds of people get hired to work at Amazon — and then loaded generations of résumés into the machine to teach it the attributes of a successful Amazon employee. And what did they find? That maleness was a prerequisite for getting hired at Amazon because for as long as Amazon has been in business it has promoted and rewarded men. Ashamed of themselves, the AI geniuses scrubbed their program. They tried to make the AI neutral, but they couldn’t guarantee it would ever unlearn its biased beginnings and wound up killing the project dead.
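The Amazon story can be sketched in miniature: a system that scores candidates by historical hire rates will simply reproduce whatever skew the history contains. The sketch below is a toy illustration with invented data, not Amazon’s actual system or figures; the keywords and numbers are assumptions made up for the example.

```python
# Toy illustration of training-data bias in hiring (all data invented).
# Historical record: (résumé keyword, was the candidate hired?)
# The history is skewed: résumés with a female-coded keyword were
# hired far less often, because past hiring favored men.
history = (
    [("womens_chess_club", False)] * 40 + [("womens_chess_club", True)] * 10
    + [("chess_club", False)] * 20 + [("chess_club", True)] * 30
)

def hire_rate(keyword):
    """Estimate P(hired | résumé contains keyword) from the biased history."""
    outcomes = [hired for kw, hired in history if kw == keyword]
    return sum(outcomes) / len(outcomes)

# A naive model that ranks candidates by historical hire rate
# learns the bias as if it were a legitimate signal.
print(hire_rate("womens_chess_club"))  # 0.2
print(hire_rate("chess_club"))         # 0.6
```

The point of the sketch is that nothing in the code mentions gender at all; the model only ever sees a keyword and an outcome, yet the female-coded keyword ends up penalized because the outcomes themselves were biased. That is why scrubbing an obvious field like “gender” from the inputs is not enough to make such a system neutral.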

Filed Under: New York Magazine

Lisa Miller

Lisa Miller is a staff writer at New York magazine. She is the former religion columnist for the Washington Post, former senior editor of Newsweek magazine, and author of "Heaven: Our Enduring Fascination with the Afterlife."


In 2014, Lisa Miller was nominated for the National Magazine Award and featured in Best Magazine Writing of 2014.



Copyright © 2023 Lisa Miller