Adobe is selling fake AI images of the Israel-Hamas war::Adobe Stock’s image service has fake, artificially generated images of Israel, Gaza and Hamas which are being used by online news sites.

    • PilferJynx@lemmy.world · 33 up · 1 year ago

      It’s a mess. I can’t trust anything online as coming from a real human anymore. It’s all filters, generated, or paid for. The God of profit corrupts everything it touches.

      • RGB3x3@lemmy.world · 2 up · 1 year ago

        I miss the internet from the days when South Park made fun of it for not being a serious place to do business and featured the early YouTube memes.

        Things were simpler then.

    • makyo@lemmy.world · 5 up · 1 year ago

      It will become more and more important to know the source of information and how trustworthy it is. This is, of course, why authoritarians like Trump keep trying to discredit the media, especially the outlets with some scruples left.

  • Tygr@lemmy.world · 20 up · 1 year ago

    Adobe Stock Contributor. Someone listed these photos as a contributor to earn a commission on sales of photo content they claim is theirs and that they hold the copyright to.

    It’s a decent side hustle but I’d never upload AI crap.

      • brambledog@lemmy.today · 7 up · 1 year ago

        I’m not surprised some stock photo companies are selling AI work. I imagine it’s not an easy industry to make money in if you’re one of the few remaining firms not owned by Getty.

  • lloram239@feddit.de · 12 up / 3 down · 1 year ago

    Not great, but also not really Adobe’s fault. Journalists using them are the problem. Back when the fake news about Israel hitting that hospital with 500 dead went around, plenty of news articles ran regular stock images from completely unrelated bombings, which did nothing but mislead the reader. That’s the kind of thing that really shouldn’t be acceptable, but it happens all too often.

    The media needs much better standards when it comes to photos (e.g. include GPS coordinates and a timestamp so we can verify and cross-check them more easily). Just plastering stock images into articles, AI-generated or not, is rarely helpful and often misleading.
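
    For example, something as simple as this (a minimal Python sketch, assuming Pillow is installed; photo.jpg is just a placeholder) could pull the GPS coordinates and capture time out of a photo’s EXIF data for cross-checking against the reported place and date. Of course those tags only exist if the camera recorded them and nobody stripped or forged them, so treat it as one signal, not proof:

    ```python
    # Sketch: read GPS coordinates and capture time from a photo's EXIF data.
    from PIL import Image

    EXIF_IFD = 0x8769  # Exif sub-IFD, holds DateTimeOriginal
    GPS_IFD = 0x8825   # GPSInfo sub-IFD, holds the coordinates

    def dms_to_decimal(dms, ref):
        """Convert EXIF (degrees, minutes, seconds) rationals to decimal degrees."""
        deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -deg if ref in ("S", "W") else deg

    def photo_provenance(path):
        exif = Image.open(path).getexif()
        gps = exif.get_ifd(GPS_IFD)
        sub = exif.get_ifd(EXIF_IFD)
        lat = lon = None
        if 2 in gps and 4 in gps:
            lat = dms_to_decimal(gps[2], gps.get(1, "N"))  # GPSLatitude(Ref)
            lon = dms_to_decimal(gps[4], gps.get(3, "E"))  # GPSLongitude(Ref)
        taken = sub.get(0x9003)  # DateTimeOriginal, e.g. "2023:10:17 21:05:12"
        return lat, lon, taken

    print(photo_provenance("photo.jpg"))  # placeholder file name
    ```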

    • P03 Locke@lemmy.dbzer0.com · 1 up · 1 year ago

      Why does the most reasonable and balanced take have the most downvotes?

      This is absolutely the journalists’ fault. It doesn’t matter if it’s AI-generated or not.

  • Kusimulkku@lemm.ee · 7 up / 1 down · 1 year ago

    If we had these before we wouldn’t have to be fighting for real smh

        • pewter@lemmy.world · 3 up · 1 year ago

          I thought so too, but it’s getting harder to tell the difference between AI and weird people.

          • oillut@lemm.ee · 2 up · 1 year ago

            The OP account definitely seems to be AI.

            All the posts they’ve made have a header, subheader, “::”, and description, with the copy around the same length each time. It’d be a weird amount of effort to not be using GPT for this.