
22 comments

[–] Eriomra 32 points (+33|-1)

I guess that it's not a bias of the algorithm, but of our societies? If it's more often a woman who is said to do the laundry (in news, documents, blogs, which Google Translate uses too), the results make sense? I mean, you can't expect machine learning not to pick up any bias when all the material it's learning from is full of our bias.

Of course, then it's the human team's responsibility to code it differently (maybe use a neutral pronoun in that case).

[–] BogHag 41 points (+41|-0)

It's pretty much impossible for a biased society to create an unbiased AI. Machine learning is going to be essential to creating intelligence. It can only learn what we show it.

[–] bluestocking 6 points (+6|-0)

Humans are designing these things though, and it is possible to make it so that this kind of thing doesn't happen. Likely the developers working on this were men and didn't see the need to bother, or it wasn't tested thoroughly or with someone who knew the actual language.

[–] immersang 6 points (+6|-0) Edited

Don't underestimate the HUGE amount of data Google uses to train these language models. I work in that tech area (not for Google, but also in AI) and I know how much data is already used when we are just speaking about domain language models, i.e. models that get created to understand language for a specific vertical such as Banking or Retail.

Now, what does Google use for these? X times that, all raw data.

Sure, this stuff can be tuned up to a certain point, but....

[–] bluestocking 0 points (+0|-0)

My point wasn't that the quantity of data is the issue; it's the quality of the data and the bias (or incompetence) of the people deciding how to use it. They can do better than just saying "here, computer, go learn from this pile of garbage," and actually refine the pile, or refine how it's used. Correct for sexist or whatever other issues are in there. But given that most people in tech are men, that's not happening.

Google has shown it's possible, like when people pointed out the sexist and racist results in image searches (like only showing men when you searched for doctors). They and other tech companies can do better, but whether they will is another matter.
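As a toy sketch of what "refining the pile" could mean in practice (the corpus, the keyword, and the helper function here are all made up for illustration, not anything Google actually does): balance the data so that an activity like laundry shows up with "he" and "she" equally often before the model ever sees it.

```python
# Toy example: downsample the training text so that a given activity appears
# with "he" and "she" equally often. The corpus and helper are invented here
# purely to illustrate the idea of curating the data before training.
import random
import re

corpus = [
    "she is doing the laundry",
    "she folded the laundry",
    "she hung the laundry out to dry",
    "he is doing the laundry",
    "he is drawing",
]

def balance_by_pronoun(sentences, keyword):
    """Keep equal numbers of 'he' and 'she' sentences that mention the keyword."""
    he_hits  = [s for s in sentences if keyword in s and re.search(r"\bhe\b", s)]
    she_hits = [s for s in sentences if keyword in s and re.search(r"\bshe\b", s)]
    rest     = [s for s in sentences if keyword not in s]
    n = min(len(he_hits), len(she_hits))
    return rest + random.sample(he_hits, n) + random.sample(she_hits, n)

print(balance_by_pronoun(corpus, "laundry"))
# -> "he is drawing", plus one he-laundry sentence and one she-laundry sentence
```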

[–] z6iiab 1 points (+1|-0)

yes, it is a bias of our society. the algorithm just reproduces it, because that's what it's learning. like any human child.

[–] Zaftig 17 points (+17|-0)

I think Google Translate might just start with "he" and then alternate, because I decided to try it for myself but started with the second one first, and it translated it as "He is doing laundry".

I just tried the first 4 and it translated it to "he" every time. 🤷‍♀️

[–] z6iiab 1 points (+1|-0)

I tried them all and I got the same results as the picture...

[–] fireworks 8 points (+8|-0) Edited

My language also doesn't have gendered pronouns, and translation programs always translate the pronoun as "he". "She" never comes up unless I write a female pronoun, which didn't exist in Korean until English translation became a thing.

[–] [Deleted] 6 points (+6|-0)

i've tried this out with several languages, and it's the same thing all around. it's because the linguistic corpus they feed into translators is full of bias, and machines only reflect what they've been taught.

[–] SterlingWitch 5 points (+5|-0)

Also, fun fact: I am learning Greek and my teacher pointed out that they do not have feminine words for doctor or lawyer, but instead change the modifier in front and keep the masculine suffix on the word.

[–] emptiedriver 3 points (+3|-0) Edited

Is it just a coincidence that the ones ending in "a" are the ones with female pronouns and the ones ending in "o" get the male pronouns? I don't know Filipino...

ok, when I tried rewriting with the a's and o's switched, I got all he's:

Translation results

he is doing laundry

he is blurring

he was playing

he is drawing

... not sure what that means

[–] RikkiTikkiTavi 3 points (+3|-0)

Not that I am into the whole pronoun thing, but I really don't think it would be too hard to program in a translation choice of 'he/she' or 'her/him', especially for languages that do not have gendered pronouns.

Is this just lazy coding? Can we not make the equivalent 'if/then' clause in the code? (IF Filipino (or another non-gendered-pronoun language) THEN he/she)
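For the curious, a minimal sketch of what that if/then could look like (the function name and the exact language list are my own assumptions for illustration, not anything from Google's actual code or API):

```python
# Rough sketch only: "render_pronoun" and the language-code set are made up
# to illustrate the idea; this is not Google's actual implementation.

# Languages whose third-person pronoun carries no gender (ISO codes):
# Filipino/Tagalog, Korean, Finnish, Hungarian, Turkish
GENDERLESS_PRONOUN_LANGUAGES = {"fil", "tl", "ko", "fi", "hu", "tr"}

def render_pronoun(source_lang: str, guessed_pronoun: str) -> str:
    """IF the source language has no gendered pronoun THEN show 'he/she' instead of guessing."""
    if source_lang in GENDERLESS_PRONOUN_LANGUAGES and guessed_pronoun.lower() in {"he", "she"}:
        return "he/she"
    return guessed_pronoun

print(render_pronoun("tl", "she"))  # -> "he/she"
print(render_pronoun("en", "she"))  # -> "she"
```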

[–] skaskull 1 points (+1|-0)

it would be interesting to see how AI interprets signed languages (they are in 4-D format). I know that my language, ASL, uses non-gendered pronouns (unlike the manually coded English systems, which do rely heavily on gendered pronouns).

[–] z6iiab 0 points (+0|-0)

and they say english is the "least gendered language in the world". hmpf.

[–] Kimaris 0 points (+0|-0)

They also do this for the Filipino/Tagalog language.