
Google Researcher Discusses Departure

STEVE INSKEEP, HOST:

Google faces pressure from outside and inside. The outside pressure comes from lawsuits. A third big suit came yesterday. State attorneys general accused Google of giving its own products priority in search results. We report this morning on the inside pressure. Many employees are protesting the departure of a Google researcher. The story highlights tensions over race and gender in tech. The company says it accepted the researcher's resignation, which Timnit Gebru says she never gave.

TIMNIT GEBRU: People on our team have come up with this term, resignate - to resignate someone - I love this term.

INSKEEP: Resignate - so that is like the word resignation. I resign; this is something that I do. But to be resignated means that someone announces that you resigned when you didn't.

GEBRU: Yeah.

INSKEEP: Is that - OK.

GEBRU: Yeah, you know? Because they're saying I resigned, but I obviously didn't.

INSKEEP: What we're about to hear is one side of a personnel decision. Google is a financial supporter of NPR, which we cover like any other company. Google has not said much about Gebru's case, though the CEO apologized for causing some employees to doubt they had a place at Google. Let's start at the beginning. Timnit Gebru is an Ethiopian-born engineer who studied at Stanford and worked in the male-dominated tech firms of Silicon Valley.

GEBRU: When I started my Ph.D., I felt very isolated. And, you know, I was the only woman - I mean, everything - right? - like, all of the isolation that you hear about - the only Black person, et cetera.

INSKEEP: And that feeling of isolation led her to become a technology researcher.

GEBRU: I started becoming very concerned with the potential negative societal impacts of some of this technology. At the same time, I would go to my conferences and other places and note that there were barely any Black people there at all.

INSKEEP: She began to feel that some people in tech might not grasp the social power of what they were doing. She pondered data-driven policing, where police used computers to identify drug use hot spots.

GEBRU: Looking at the national survey of drug use, drug use was very evenly distributed around Oakland, but the people who are arrested for drug use are not evenly distributed. So when you use this kind of data to train a model that then says crime hot spots are going to be in certain locations, you send more police to those locations, and you arrest more people in Black and brown communities, and then you feed that once again into your training data. This creates what they call a runaway feedback loop, where you're actually widening societal inequity. You are actually making things a lot worse than they already are.
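To make the loop Gebru describes concrete, here is a minimal Python sketch - a hypothetical two-neighborhood simulation, not her actual model or any real department's data. Drug use is identical in both neighborhoods, but patrols are allocated according to past arrest counts, and arrests can only happen where patrols are, so an initial skew in the data reinforces itself year after year:

```python
# A hypothetical sketch of a runaway feedback loop in data-driven policing.
# Assumptions (not from the interview): two neighborhoods, 100 patrols per
# year, a 5% chance any patrol makes a drug arrest, and historical arrest
# data slightly skewed toward neighborhood A.

import random

random.seed(0)

TRUE_USE_RATE = 0.05            # identical in both neighborhoods
arrests = {"A": 12, "B": 10}    # historical data, slightly skewed toward A

for year in range(1, 11):
    # "Train" on past data: allocate patrols in proportion to each
    # neighborhood's share of recorded arrests.
    total = arrests["A"] + arrests["B"]
    patrols = {n: round(100 * arrests[n] / total) for n in arrests}

    # Arrests scale with patrol presence, not with behavior, since the
    # true use rate is the same everywhere. New arrests become next
    # year's training data.
    for n in arrests:
        arrests[n] += sum(random.random() < TRUE_USE_RATE
                          for _ in range(patrols[n]))

    share_a = arrests["A"] / (arrests["A"] + arrests["B"])
    print(f"year {year:2d}: patrols={patrols}, "
          f"A's share of arrest data = {share_a:.0%}")
```

Even though behavior is identical in both neighborhoods, the early skew toward A never washes out: each year's patrol allocation reproduces the bias in the next year's training data, which is the self-confirming dynamic Gebru is pointing to.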

INSKEEP: Yeah. I mean, I'm even thinking about an insidious part of this, which is that someone using this particular model might tell themselves, of course it's not biased; it's computers; it's science; it's objective.

GEBRU: That's exactly the issue. Scientists can be some of the most dangerous people in the world because we think we're objective. We have this illusion of objectivity.

INSKEEP: She built a reputation for studying the ethics of artificial intelligence. And then she went to work for Google.

Why did they hire you?

GEBRU: Well, they hired me exactly because of this expertise that I have, which is working to mitigate the potential negative consequences of this technology to society.

INSKEEP: She began exploring Google's AI work with languages. The company has financed enormous studies of millions and millions of words online, training computers in the intricacies of how people speak. This kind of research is already used to help computers recognize your voice or guess which terms you're searching for on Google.

GEBRU: It can also be used, for instance, to predict the next word that you might write or to predict the next sequence in a sequence of words that you might want to write, so that can be used to generate language.

INSKEEP: I suppose it also might be used to manipulate me - right? - to figure out which words could be said to me to influence me in a certain way.

GEBRU: Yeah. And it could also be used - so people have done these experiments - if you seed it with certain extremist views, it can be used to write other extremist views and to spread misinformation as well.
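For a concrete sense of the language technology being discussed, here is a toy next-word predictor in Python - a hypothetical bigram model, vastly simpler than anything Google trains, but built on the same idea of learning word patterns from a corpus and then generating text from them. The corpus string is invented for illustration:

```python
# A toy next-word predictor (a bigram model), not Google's system:
# count which word follows which in a training corpus, then generate
# text by sampling the next word in proportion to those counts.

import random
from collections import Counter, defaultdict

corpus = ("the model learns from the text it is fed so "
          "the text it is fed shapes what the model says").split()

# Count next-word frequencies for each word.
next_words = defaultdict(Counter)
for word, following in zip(corpus, corpus[1:]):
    next_words[word][following] += 1

# Generate text by repeatedly sampling a likely next word.
random.seed(1)
word, output = "the", ["the"]
for _ in range(10):
    candidates = next_words.get(word)
    if not candidates:
        break
    word = random.choices(list(candidates),
                          weights=candidates.values())[0]
    output.append(word)

print(" ".join(output))
```

Because the model can only echo patterns in its training text, whatever goes in - including the racist or extremist speech Gebru warned about - shapes what comes out.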

INSKEEP: Working with outside academics, she wrote a research paper that warned of possible risks. There's a lot of awful speech on the Internet. What if Google computers sucked it all in and learned to talk the way racists do? What if we unknowingly transmit our own more subtle biases to computers? The paper even questioned the environmental cost of using so much electricity to run so many data farms that analyze so much information.

At what point did you realize there was a problem?

GEBRU: So a week before Thanksgiving, there was a meeting invite from my manager's manager, randomly. And then in that meeting, we're told that, you know, we should retract the paper. And I have to tell you, I was so upset, I started crying because it wasn't really about the paper for me. One paper is really not the end of the world. You know, I'm a researcher. I've been rejected many times.

But, you know, I have been - I've had to fight for everything at Google. It's been just marginalization after marginalization. And all of a sudden, we have this order. You know, it's not even a discussion. It's an order saying, you have to retract it by next week.

INSKEEP: She wrote an objection to the company, referring to the possibility that she might have to leave. That, she says, is when they resignated her - welcomed what they said was a resignation.

GEBRU: My best guess is that they had been looking for a reason to terminate me, and a lot of it has to do with me speaking up. A lot of it has to do with the kinds of papers I want to write.

INSKEEP: Did they bring you in to be a dissenting voice and then realized they didn't want quite that much dissent?

GEBRU: I think they just wanted to have a name associated - like, they wanted to have my presence but not me, exactly. You know what I mean? I don't know. They wanted to have the idea of me being at Google but not the reality of me being at Google. And this is what you see over and over again in tech companies. So my story is really not unique, in my opinion. It's just getting a lot of attention because I didn't stay quiet.

INSKEEP: After Gebru's dismissal, more than 2,000 Google employees signed an open letter accusing their own company of, quote, "research censorship" and "a retaliatory firing." Now her former colleagues are demanding that she be rehired and promoted. Our NPR colleague Bobby Allyn obtained an email from CEO Sundar Pichai promising to work to regain employees' trust. Gebru is unimpressed.

GEBRU: A lot of people are waiting for some sort of change in the tech industry at this point. I think we've been talking about issue after issue, and I think a lot of people are ready for some sort of actual, meaningful change, whether it's regulation, whether it's some sort of oversight by government. I think that people rallying behind me like this is showing me that.

INSKEEP: Timnit Gebru says she wants to see a better environment inside the company where she once worked.

(SOUNDBITE OF ALIJOSHA KONSTANTY'S "AK - PS")

Transcript provided by NPR, Copyright NPR.
