Google is coming in for sharp criticism after a video went viral of the Google Nest assistant refusing to answer basic questions about the Holocaust while having no problem answering questions about the Nakba.

  • MxM111@kbin.social · 6 months ago

    That’s not how an ANN would behave if it were simply trained on images of past popes. The diversity had to be part of the training. This is a simple technical statement.

    • VirtualOdour@sh.itjust.worksOP · 6 months ago

      That’s not really true. They learn based on layers of data, so it might have learned that a pope is a person in a silly outfit, and in the layer below that, that a person can be old or young, of a range of ethnicities or genders… That’s why you can ask for a gopnik pope or a sexy pope.

      You would expect it to make stereotypical old male popes, but people wrote similar articles complaining that asking for a doctor gave male doctors and asking for a nurse gave female ones. So instead of telling people to ask for what they actually want, they added nonsense to the prompt. Now people run it, still don’t ask for what they want, and complain it goes the other way.
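
      To make concrete what “added nonsense to the prompt” could mean, here is a hypothetical sketch of that kind of server-side rewriting (the function name, word lists, and matching rules are all invented for illustration; Google’s actual pipeline is not public):

      import random

      # Invented word lists for illustration only; the real rewriter
      # and its vocabulary are not public.
      DIVERSITY_TERMS = ["female", "male", "Black", "South Asian", "East Asian", "white"]
      GENERIC_PERSON_WORDS = {"doctor", "nurse", "pope", "ceo", "scientist"}

      def augment_prompt(prompt: str) -> str:
          """Insert a random demographic modifier before a generic person
          word, unless the user already specified demographics."""
          if any(term.lower() in prompt.lower() for term in DIVERSITY_TERMS):
              return prompt  # the user asked for what they actually want
          words = prompt.split()
          for i, word in enumerate(words):
              if word.lower().strip(".,") in GENERIC_PERSON_WORDS:
                  words.insert(i, random.choice(DIVERSITY_TERMS))
                  break
          return " ".join(words)

      print(augment_prompt("a pope blessing a crowd"))
      # e.g. "a South Asian pope blessing a crowd"

      The point is that the rewrite happens before the model ever sees the prompt, which is why a vague prompt drifts while a specific one still gives you what you asked for.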

    • snooggums@midwest.social · 6 months ago

      So if someone wrote a prompt to make an image of a black woman as a pope, would you expect the model to only return historical popes?

      If the model is supposed to be able to produce both historically accurate images and hypothetical possibilities, why would the expectation be for a vague prompt to be historical rather than possible?

      If the model is supposed to default to historical accuracy, how would it handle a request for a red dragon? Just the painting named Red Dragon, dragons from mythology, or dragons from popular media?

      Yes, there could be something that promotes diversity, or it could just be that the default behavior has no context for which content ‘should’ be historically accurate and which is just a randomized combination of position/race/gender.

      • MxM111@kbin.social · 6 months ago

        Of course it will draw a black female pope if you request one, but if you do not, it will not. As a gross approximation, an ANN is an interpolator of known data points (with some noise), and if you ask simply for a pope, it will interpolate between the images of popes it has learned. Since all of them are white males, it is highly unlikely for the ANN to produce a black female (the noise would have to be very high). If you ask for a black female pope, it will start to interpolate between the images of popes and images of black females. You have to tune the model so that when you ask just for a pope, something else pushes it to consider otherwise irrelevant images.
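
        As a toy illustration of that interpolation picture (the coordinates and clusters below are invented; a real model interpolates in a much higher-dimensional learned space, not in 2-D):

        import numpy as np

        # Pretend each training image is a 2-D point: axis 0 ~ skin tone
        # (0 = dark, 1 = light), axis 1 ~ gender (0 = female, 1 = male).
        # All numbers are made up for the example.
        TRAINING_POINTS = {
            "pope":        np.array([[0.90, 1.0], [0.95, 1.0], [0.85, 1.0]]),  # all white men
            "black woman": np.array([[0.10, 0.0], [0.15, 0.0]]),
        }

        rng = np.random.default_rng(0)

        def generate(concepts, noise=0.05):
            """Return a random convex combination of the training points
            matching the prompt's concepts, plus a little noise."""
            points = np.vstack([TRAINING_POINTS[c] for c in concepts])
            weights = rng.dirichlet(np.ones(len(points)))
            return weights @ points + rng.normal(0.0, noise, size=2)

        print(generate(["pope"]))                 # stays near the white-male cluster
        print(generate(["pope", "black woman"]))  # lands between the two clusters

        With only “pope” in the prompt, samples stay near the white-male cluster unless the noise is enormous; adding “black woman” pulls the interpolation between the two clusters, which is the kind of push I mean.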