Data poisoning: how artists are sabotaging AI to take revenge on image generators
As AI developers indiscriminately suck up online content to train their models, artists are seeking ways to fight back.

  • Dr. Moose@lemmy.world · 11 months ago

    How are you going to stop that lol, it’s ridiculous. Would you stop a corporate suit from viewing your painting because they might learn how to make a similar one? It makes absolutely zero sense, and I can’t believe delulus online are failing to comprehend such a simple concept as “computers being able to learn”.

    • Cyber Yuki@lemmy.world · 11 months ago

      Ah yes, just because lockpickers can enter a house, suddenly everyone’s allowed to break and enter. 🙄

    • BURN@lemmy.world · 11 months ago

      Computers can’t learn. I’m really tired of seeing this idea paraded around.

      You’re clearly showing your ignorance here. Computers do not learn; they create statistical models based on input data.

      A human seeing a piece of art and being inspired isn’t comparable to a machine reducing that art to 1s and 0s and then adjusting weights in a table somewhere. It does not “understand” the concept, nor does it “learn” about a new piece of art.
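
      To make the “adjusting weights in a table” part concrete, here is a minimal toy sketch of a single training step: plain gradient descent on made-up numbers, not any actual image model’s code.

      ```python
      # Toy illustration only: "learning" here is just nudging numbers in a table.
      # All values are made up for the example.
      weights = [0.2, -0.5, 0.1]   # the model's entire "knowledge": a table of numbers
      pixels  = [0.9, 0.3, 0.7]    # one training input, already reduced to numbers
      target  = 1.0                # the output the model is being pushed toward
      lr      = 0.01               # learning rate (how big a nudge each step gives)

      # Forward pass: combine input and weights into a prediction.
      prediction = sum(w * x for w, x in zip(weights, pixels))
      error = prediction - target

      # Gradient-descent step: shift each weight slightly to shrink the error.
      weights = [w - lr * error * x for w, x in zip(weights, pixels)]
      print(weights)   # the "table" after one adjustment
      ```

      Real models do this with billions of weights, but the mechanism is the same kind of arithmetic.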

      Enforcement is simple. Any output from a model trained on material that the developers don’t hold the copyright for is a violation of the copyright of every artist whose art was used illegally to train the model. Only if the copyright holders of all the training data are compensated and have opted in to being used for training should the output of the model be usable.

      • cm0002@lemmy.world · 11 months ago

        they create statistical models based on input data.

        Any output from a model trained on material that they don’t have copyright for is a violation of copyright

        There’s no copyright violation; you said it yourself, any output is just the result of a statistical model, and the original art would fall under fair use as derivative work (if it falls under copyright at all).

        • BURN@lemmy.world · 11 months ago

          Considering most models can spit out training data, that’s not a true statement. Training data may not be explicitly saved, but it can be retrieved from these models.

          Existing copyright law can’t simply be applied here, because it was never written to cover something like this.

          It 100% should be a copyright infringement for every image generated using the stolen work of others.

          • cm0002@lemmy.world · 11 months ago

            You can get it to spit out something very close, maybe even an exact copy, depending on how much of your art was used in the training (because that would make your style influence the weights and the model more).

            But that’s no different from me tracing your art, or taking samples of your art to someone else and paying them to make an exact copy; in that case, that specific output is a copyright violation. Just because it can do that doesn’t mean every output is suddenly a copyright violation.

        • BURN@lemmy.world · 11 months ago (edited)

          That’s just one of the dumbest things I’ve heard.

          Naming has nothing to do with how the tech actually works. Ignorance isn’t an excuse. Neither is stupidity.