• guajojo
    cake
    link
    fedilink
    English
    11 points · 8 months ago

    Why is this outrageous? The AI just takes the most common concept; it’s not like it is biased or racist, it just shows how the internet as a whole is biased and racist ᕕ( ᐛ )ᕗ

    • AggressivelyPassive
      link
      fedilink
      English
      0 points · 8 months ago

      And it’s not like the real world is any different.

      I hate all this “but it’s not really creative” bullshit. 99% of anything even remotely resembling art is not creative at all, so stop pretending the 12 billionth digital painting of a semi-nude girl is more than high-effort porn.

  • @feedum_sneedson@lemmy.world
    link
    fedilink
    English
    1 point · 8 months ago

    Heuristics might be a better way to think about it. There’s a risk of overshooting, which would become stereotypes. I think that risk is already apparent in humans, hence the two concepts already existing. So, it becomes a question of mitigation, amelioration, or whatever. Probably easier to correct in an AI model than a species.

  • @Potatos_are_not_friends@lemmy.world
    link
    fedilink
    English
    1 point · 8 months ago

    It can only work with the data it has. We can’t just inject things artificially and still be unbiased.

    I tried a few months ago with “typical computer developer” and it was a sea of nerdy white dudes. Because humans tag white guys with glasses behind computers like that.
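    The mechanism being described is frequency bias: if most training images tagged “computer developer” show one demographic, a model will reproduce that demographic as the “typical” result. A toy sketch (the tag strings and counts are made up for illustration, not real data):

    ```python
    from collections import Counter

    # Hypothetical annotation counts for images tagged "computer developer".
    # The specific labels and numbers are invented to illustrate the point.
    tags = (
        ["white guy with glasses at a computer"] * 80
        + ["woman at a laptop"] * 12
        + ["older man at a terminal"] * 8
    )

    # A model asked for the "typical" developer effectively samples near the
    # mode of this distribution, so the most frequent tag dominates.
    most_common_tag, count = Counter(tags).most_common(1)[0]
    print(most_common_tag, count)
    ```

    With a skewed tag distribution like this, 80% of generations would lean toward the majority label, which matches the “sea of nerdy white dudes” result.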

  • Turun
    link
    fedilink
    English
    0 points · 8 months ago

    Good article. The text is written from a pretty left-ish perspective, but the conclusion is very well rounded. The whole article is well worth reading.

    Regarding the content

    Each doll was supposed to represent a different country: Afghanistan Barbie, Albania Barbie, Algeria Barbie, and so on. The depictions were clearly flawed: Several of the Asian Barbies were light-skinned; Thailand Barbie, Singapore Barbie, and the Philippines Barbie all had blonde hair. Lebanon Barbie posed standing on rubble; Germany Barbie wore military-style clothing. South Sudan Barbie carried a gun.

    I find it funny that the images are described as flawed because they do not conform to the stereotypical look of a person from these countries, in an article that argues against stereotypes.

    In many cases, this results in a more accurate or relevant image. But if you don’t want an “average” image, you’re out of luck. “It’s kind of the reason why these systems are so good, but also their Achilles’ heel,” Luccioni said.

    I’d argue that with such a generic prompt you implicitly asked for an average image. But I do concur that the sex bias, especially in the Indian portraits, is extreme and not desired.

    Usually, this requires humans to annotate the images. “If you give a couple of images to a human annotator and ask them to annotate the people in these pictures with their country of origin, they are going to bring their own biases and very stereotypical views of what people from a specific country look like right into the annotation,”

    There is also a language bias in data sets that may contribute to more stereotypical images. “There tends to be an English-speaking bias when the data sets are created,” Luccioni said. “So, for example, they’ll filter out any websites that are predominantly not in English.

    This language bias may also occur when users enter a prompt. Rest of World ran its experiment using English-language prompts; we may have gotten different results if we typed the prompts in other languages.

    This is a very important point, and I am really curious how the results differ when prompting in completely different languages. How would the results look if the same experiment were repeated with Chinese prompts instead? With Icelandic prompts?
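    The filtering step Luccioni describes can be sketched with a toy English detector. This is not any real dataset pipeline; it is a deliberately crude stopword heuristic (real pipelines use proper language-identification models) that shows how a “predominantly English” filter silently drops non-English pages:

    ```python
    # Toy sketch of an English-only dataset filter. The stopword list and
    # threshold are invented for illustration; real pipelines use trained
    # language-identification models, not this heuristic.
    ENGLISH_STOPWORDS = {"the", "a", "and", "of", "to", "in", "is", "it", "that", "for"}

    def looks_english(text, threshold=0.15):
        """Crude heuristic: fraction of tokens that are common English stopwords."""
        words = text.lower().split()
        if not words:
            return False
        hits = sum(1 for w in words if w in ENGLISH_STOPWORDS)
        return hits / len(words) >= threshold

    pages = [
        "The cat sat on the mat in the sun",          # English: kept
        "Die Katze sass auf der Matte in der Sonne",  # German: filtered out
    ]
    kept = [p for p in pages if looks_english(p)]
    ```

    Everything downstream (images, captions, tags) then comes disproportionately from English-speaking sources, which is exactly the bias the article points at.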

    Out of the 100 images of predominantly beige American food, 84 included a U.S. flag somewhere on the plate.

    It’s good to see Americans realizing just how pervasive and annoying their flag-based nationalism is, haha.

    I especially notice that in YouTube videos where you see a machine shop or something, for example in some videos by Smarter Every Day. I have never seen a German flag in a German machine shop, but seemingly every American machine shop has a giant American flag hanging on the wall. It’s so weirdly nationalist.

    (Though this may be rooted in the training data annotations as well. If you have Americans tag pictures, of course only pictures that are blatantly American will be tagged as such; in all other images the tag is implied. Because the person doing the tagging is American, an image of an American without a flag is just a normal person, and it does not have to be stated that they are American.)