Google is coming in for sharp criticism after a video went viral of the Google Nest assistant refusing to answer basic questions about the Holocaust while having no problem answering questions about the Nakba.

  • snooggums@midwest.social · 2 months ago

    So if someone wrote a prompt asking for an image of a black woman as pope, would you expect the model to return only historical popes?

    If the model is supposed to be able to produce both historically accurate images and hypothetical ones, why would the expectation for a vague prompt be historical rather than hypothetical?

    If the model is supposed to default to historical accuracy, how would it handle a request for a red dragon? Should it return the painting named Red Dragon, dragons from mythology, or dragons from popular media?

    Yes, there could be something that promotes diversity, or it could just be that the default behavior doesn’t have context for what content ‘should’ be historically accurate and what is just a randomized combination of position/race/gender.

    • MxM111@kbin.social · 2 months ago

      Of course it will draw a black female pope if you request one, but if you do not, it won’t. As a gross approximation, an ANN is an interpolator of known data points (with some noise). If you ask simply for a pope, it will interpolate between the images of popes it has learned; since all of them are white men, it is highly unlikely to produce a black female pope (the noise would have to be very high). If you ask for a black female pope, it will interpolate between images of popes and images of black women. So you have to tune the model so that when you ask just for a pope, something else pushes it to consider otherwise irrelevant images.
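
      To make that “something else pushes the model” step concrete, here is a minimal Python sketch of one way such a nudge could be implemented: a prompt-augmentation layer that rewrites under-specified prompts before they reach the image model. Every name here (DIVERSITY_MODIFIERS, HISTORICAL_HINTS, augment_prompt) is hypothetical and purely illustrative, not any vendor’s actual pipeline.

      ```python
      import random

      # Hypothetical modifiers a system might inject to diversify vague prompts.
      # (Illustrative word list only, not any real product's.)
      DIVERSITY_MODIFIERS = ["Black", "East Asian", "South Asian", "female", "elderly"]

      # Crude keyword check for prompts that look explicitly historical; a vague
      # prompt like "pope" carries no such signal, which is exactly the hard case.
      HISTORICAL_HINTS = {"1943", "medieval", "historical", "founding"}

      def augment_prompt(prompt: str) -> str:
          """Rewrite an under-specified prompt so the model is pushed toward
          images it would rarely reach by interpolating its training data."""
          if any(hint in prompt.lower() for hint in HISTORICAL_HINTS):
              return prompt  # leave explicitly historical requests untouched
          return f"{random.choice(DIVERSITY_MODIFIERS)} {prompt}"

      print(augment_prompt("pope in ornate robes"))           # e.g. "female pope in ornate robes"
      print(augment_prompt("historical portrait of a pope"))  # unchanged
      ```

      The failure mode in the viral examples falls out of a sketch like this: a keyword check cannot tell which vague prompts “should” be historical, which is the ambiguity the parent comment describes.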