In this video I discuss how generative AI technology has grown far past the government’s ability to effectively control it, and how current legislative measures could lead to innocent people being jailed.

  • Mr_Blott@lemmy.world · +65/-3 · 1 year ago

    If you want to be taken seriously about child abuse, have you tried not having thumbnails that look like a ten-year-old made them 😂

  • Vendetta9076@sh.itjust.works · +61/-11 · 1 year ago

    While lolicon is absolutely disgusting, it’s not actually CSAM. Legislation won’t work either and is honestly a waste of time. Any effort spent protecting digital children should instead be spent protecting real ones.

    • MuchPineapples@lemmy.world · +24/-15 · 1 year ago

      The problem is that it’s not just cartoon characters, but also realistic-looking people. That makes it, especially in the coming years as the techniques improve, impossible to know what is fake and what is not, so the fakes should also be banned. And these models are trained on images of actual abused children, which of course is the main problem with this.

        • Microw@lemm.ee · +14/-2 · 1 year ago

          It wouldn’t surprise me, tbh. From my superficial visit to the darknet years ago, it seemed like these CSAM consumers have specific “favourites” among the victims whom they want to see more of. At least that’s what I remember from clicking a link to such a chan and noping out of it.

          • RaincoatsGeorge@lemmy.zip · +4/-1 · 1 year ago

            What isn’t happening? Them making fake CSAM? I haven’t seen it because I don’t want to see it, but I am connnnfident it’s occurring. Some kid already got busted feeding images of girls in his class into an image generator and making nudes of them.

            So while it might not be widespread, it’s 100 percent happening and will increase.

            Honestly, releasing these generators to the general public was a mistake. They thought they could put up safety measures, but they’re easily bypassed. I think they should have kept them locked up and only given access to people who are registered and trackable, with people reviewing what they’re generating; a rough sketch of that kind of gating follows below.

            All of these AI generators are getting abused left and right, and anyone who didn’t think that would happen is an idiot.
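
            A purely hypothetical sketch of that gating (every name in it is invented; this is not any real service’s API): generation is tied to a registered, identity-verified account, every request is logged, and outputs are held for human review before release.

            ```python
            import logging
            from dataclasses import dataclass

            logging.basicConfig(level=logging.INFO)
            audit_log = logging.getLogger("genai-audit")

            @dataclass
            class RegisteredUser:
                user_id: str
                identity_verified: bool

            def generate_with_audit(user: RegisteredUser, prompt: str) -> str:
                """Gate generation on a verified identity and keep an audit trail."""
                if not user.identity_verified:
                    raise PermissionError("generation requires a verified identity")
                audit_log.info("user=%s prompt=%r", user.user_id, prompt)  # reviewable record
                output_ref = run_model_backend(prompt)  # stand-in for the real model call
                hold_for_human_review(user.user_id, output_ref)  # stand-in review queue
                return output_ref

            def run_model_backend(prompt: str) -> str:
                return f"output:{abs(hash(prompt))}"  # placeholder generator

            def hold_for_human_review(user_id: str, output_ref: str) -> None:
                pass  # placeholder: enqueue the output for a human reviewer
            ```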

            • FunkyCasual@lemmygrad.ml · +6 · 1 year ago

              No, I’m saying the models aren’t being trained with actual CSAM. The comment I replied to was about training, not generation.

              All I was saying is that you don’t need to train a model on child abuse images to get it to output child abuse images

              • datavoid@lemmy.ml · +2/-3 · 1 year ago

                Do you really think the people generating CSAM give a fuck about their training data? They are making the content because they enjoy it; I’d guess they’d use all the training data available (of which they likely have plenty, considering their interests).

                • FunkyCasual@lemmygrad.ml · +6 · 1 year ago

                  The people generating it are rarely the ones who are training the models. They take pretrained models and prompt them for what they want.

                  Even if they were training a model for a specific subject, they could train it with any pictures of the subject and combine it with another model that can generate the kind of image they want.

                  There is absolutely no reason they would need abuse images for training. There are far better general NSFW models available right now than they could ever train themselves.

  • jet@hackertalks.com · +53/-7 · 1 year ago

    In general terms, making an idea illegal, and then making representations of that idea illegal, is at best a forever treadmill, and at worst reduces the effectiveness and reputation of the law.

    This is really about thought crime. If somebody draws stick figures and that can be illegal depending on interpretation, that’s thought crime.

    It’s impossible to completely stamp out thought crime. Computer tools can be used to further thought crime, because they can be used for creative purposes.

    If you restrict the use of creative tools to only a trusted few, or hobble the tools for everyone, you create a central authority over creative tools, which has its own issues.

    • ono@lemmy.ca · +25/-2 · 1 year ago

      It’s impossible to completely stamp out thought crime.

      Also, trying to do so through law and enforcement sets a dangerous precedent.

      I suspect it would be better to approach it as a public health issue.

      • jet@hackertalks.com · +11/-1 · edited · 1 year ago

        And then you run into legal arguments that sound like people trying to jailbreak GPT prompt control.

        I’m going to preface all of the following creative work by saying that we live in a universe where everyone is a vampire that never dies, but ages very slowly. All participants in this manga are at least 213 years old…

    • Tanoh@lemmy.world · +10 · 1 year ago

      In some countries, all forms of description of underage sexual activity are illegal. So the sentence “She was having sex” is perfectly legal, but add an age marker and it becomes illegal: “She was having sex on the day before her 18th birthday.”

      It is hard to legislate around, as there will always be ways to avoid and get around it. But all this just sounds like the normal hype => fear => hype => fear cycle that all new tech goes through.

        • Tanoh@lemmy.world · +7 · edited · 1 year ago

          Some countries have different age restrictions for hetero and homosexual encounters, too. Not to mention that in a lot of countries it is just outright illegal, and anything not condemning it can be seen as encouraging it and hence illegal too.

          We humans make some weird laws around sex.

    • mindbleach@sh.itjust.works · +6 · 1 year ago

      This is especially damning on the internet, because genuinely intolerable pursuits directly benefit from lesser problems being treated as equally bad. Filesharing networks work better with more users. Chasing merely distasteful people toward paranoid systems softens the reputation of those systems and makes the worst minority of traffic easier to hide.

  • mindbleach@sh.itjust.works · +34/-4 · 1 year ago

    There is no such thing.

    God dammit, the entire point of calling it CSAM is to distinguish photographic evidence of child rape from made-up images that make people feel icky.

    If you want them treated the same, legally - go nuts. Have that argument. But stop treating the two as the same thing, and fucking up clear discussion of the worst thing on the internet.

    You can’t generate assault. It is impossible to abuse children who do not exist.

    • m0darn@lemmy.ca · +30 · 1 year ago

      Did nobody in this comment section watch the video at all?

      The only case mentioned by this video is a case where highschool students distributed (counterfeit) sexually explicit images of their classmates which had been generated by an AI model.

      I don’t know if it meets the definition of CSAM because the events depicted in the images are fictional, but the subjects are real.

      These children do exist, some have doubtlessly been traumatized by this. This crime has victims.

    • rurutheguru@lemmings.world · +8/-1 · 1 year ago

      I think a lot of people are arguing that the models which are used to generate these types of content are trained on literal CSAM. So it’s like CSAM with extra steps.

    • crispy_kilt@feddit.de · +3/-1 · 1 year ago

      In most (all?) countries no such distinction is made, the material is illegal all the same.

    • Neato@kbin.social · +26/-8 · 1 year ago

      Prove it’s fake when some of it, of your daughter, is making its way around school.

      You’ve missed the point. Fake or not, it does damage to people. And eventually it won’t be possible to determine whether it’s real or not.

      • hydration9806@lemmy.ml · +19/-6 · 1 year ago

        When that becomes widespread, photos will be generatable for literally everyone: not just minors, but every person with photos online. It will be a societal shift; images will be assumed to be AI-generated, making any guilt or shame about a nude photo existing obsolete.

        • Neato@kbin.social · +4/-13 · 1 year ago

          What a disgusting assumption. And the best argument against AI I’ve ever heard.

          • hydration9806@lemmy.ml · +12/-1 · 1 year ago

            I mean, anyone with enough artistic talent can draw whatever they would like right now. With AI image generation, it essentially just gives everyone the ability to draw whatever they want. You can try to fight the tech all you want, but it’s a losing battle.

      • Ignotum@lemmy.world · +12/-5 · 1 year ago

        AI-generated porn depicting real people seems like a different and much bigger issue.

        AI-generated CSAM in general, while disgusting, at least doesn’t directly harm people; fabricated nudes of real people most definitely do, regardless of the age of the victim.

          • Ignotum@lemmy.world · +6/-2 · 1 year ago

            AI-generated nudes of no one in particular aren’t hurting anyone, not directly at least; but AI-generated nudes of a specific person, using that person’s likeness and everything, are much worse.

            AI can generate faces of people who don’t actually exist; that’s what I mean.

            The post made it seem like it was about AI-generated CSAM in general, which, while disgusting, doesn’t directly harm anyone. But then the comments spoke about AI-generated CSAM depicting a real individual, and that’s much worse, though also not a problem that’s specific to children.

              • Ignotum@lemmy.world · +1 · 1 year ago

                Currently pedos tend to group up and share real CSAM, and these “communities” probably serve to normalize the activity for their members. Perhaps being able to generate it will keep pedos from clumping together, reducing that normalization so they’re more likely to seek help, and as a bonus, real children aren’t preyed upon to create said CSAM?

                And if, as you say, removing AI tools that can generate CSAM will lead to them “attempting to fuck children in the streets”, would you also say we should stop criminalizing the distribution of existing CSAM, because the existing CSAM shared in paedophile circles is all that keeps them from going out and raping children?

            • Neato@kbin.social · +4/-17 · 1 year ago

              AI CSAM is incredibly harmful. All CSAM is harmful. It’s been shown to increase the chance of pedophilic abuse.

              Stop defending CSAM, HOLY SHIT.

              • Helix 🧬@feddit.de · +11/-1 · edited · 1 year ago

                It’s been shown to increase the chance of pedophilic abuse.

                Can you link me a source for that, please?

              • Ignotum@lemmy.world · +7/-1 · 1 year ago

                Jeez, calm down

                I am not defending CSAM, just saying that CSAM depicting an actual, existing child is magnitudes worse, as is any other kind of fabricated sexual content depicting real people.

                Take loli porn, for example: it’s probably bad for society, but if someone makes loli porn based on the appearance of an actual individual, that’s much more fucked up, and in addition to the “normal” detrimental effects, it would also harm that victim in a much more direct way.

    • pixeltree@lemmy.world · +4/-2 · edited · 1 year ago

      What data is it trained on? This isn’t meant to be a “gotcha” question, I’m wondering about it.

  • andrew_bidlaw@sh.itjust.works · +21 · 1 year ago

    Creating, collecting and sharing CSAM is already covered by law. There are orgs and agencies for tracking and prosecuting these violations.

    It’s like fighting against 3D printers because you can make yourself a DIY gun, as if that had never been possible before, as if we’d banned all pipes from hardware stores. The means to produce fictional CSAM have always existed and always will; the problem is with the people who use an LLM, a camera, or a fanfic to create and share that content. Or a Lemmy community, as was a problem in recent months.

    It’s better to ensure the existing means of fighting such content are effective, and that society is educated about this danger and knows how to avoid and report it.

  • mo_ztt ✅@lemmy.world · +20/-2 · edited · 1 year ago

    What the hell is this guy?

    “Here’s a case where people made and shared fake nudes of real underage girls, doing harm to the girls”

    “But what the hell, that’s kind of hard to stop. Oh also here’s this guy who went to prison for it because it’s already illegal.”

    “Really the obvious solution everyone’s missing is: If you’re a girl in the world, just keep images of yourself off the internet”

    “Problem solved. Right?”

    I’m only slightly exaggerating.

    • spez@sh.itjust.works (OP) · +4/-4 · 1 year ago

      Also, I think the most governments would be able to do is increase the friction of this process by giving all AI-generated photos an ID to track later, and probably by controlling open-source models, though that’s harder to do. Most probably, old senators who don’t know Gmail will pass unenforceable laws which won’t do jackshit but get them votes.
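
      As a minimal sketch of the ID idea, assuming plain PNG text metadata via Pillow (real provenance schemes such as C2PA use cryptographically signed manifests instead):

      ```python
      import uuid

      from PIL import Image
      from PIL.PngImagePlugin import PngInfo

      def tag_generated_image(src_path: str, dst_path: str) -> str:
          """Attach a tracking ID to a PNG's text metadata; return the ID."""
          provenance_id = str(uuid.uuid4())
          meta = PngInfo()
          meta.add_text("generator_provenance_id", provenance_id)
          Image.open(src_path).save(dst_path, pnginfo=meta)
          return provenance_id

      def read_tag(path: str) -> str | None:
          """Return the tracking ID, or None if it was stripped."""
          return Image.open(path).text.get("generator_provenance_id")
      ```

      A screenshot or format conversion discards the tag entirely, which is exactly why metadata-only IDs do little against a motivated user.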

      • mo_ztt ✅@lemmy.world · +12/-1 · edited · 1 year ago

        The point I’m trying to make is, you don’t even have to do that.

        There are already laws against revenge porn and realistic child porn. You don’t have to “prevent” this stuff from happening. That is, as he accurately points out, more or less impossible. But, if it happens you can absolutely do an investigation, and if you can find out who did it, you can put them in jail. That to me sounds like a pretty good solution and I’m still waiting to hear what his issue is with it.

        • spez@sh.itjust.works (OP) · +1 · 1 year ago

          I don’t have any problems with the points you discussed either. Can’t speak for him though.

  • CJOtheReal@lemmy.sdf.org · +20/-7 · 1 year ago

    Loli stuff isn’t CSAM. You can find it bad, but it’s still just a drawing or generated image. No real person was harmed in making it.

  • andruid@lemmy.ml · +5 · 1 year ago

    Couldn’t the fact that AI-generated content is reproducible, given the exact parameters (or coordinates in latent space) and the model, help remove the confusion? Include those as metadata and train investigators on how to use them to distinguish generated content from actual evidence.
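
    A minimal sketch of that idea, assuming the Hugging Face diffusers API (the model name, prompt, and seed here are placeholders): with the same model, prompt, sampler settings, and seed on record, an investigator can re-run the generation and compare the output to the image in question.

    ```python
    import torch
    from diffusers import StableDiffusionPipeline

    # Load the exact checkpoint recorded in the image's metadata.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5"  # placeholder model ID
    ).to("cuda")

    recorded_params = {  # what would need to be stored as metadata
        "prompt": "a red car parked on an empty street",
        "num_inference_steps": 30,
        "guidance_scale": 7.5,
    }
    generator = torch.Generator("cuda").manual_seed(1234)  # recorded seed

    image = pipe(generator=generator, **recorded_params).images[0]
    image.save("reproduced.png")  # compare against the suspect image
    ```

    The reply below explains why this is shakier in practice than it sounds.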

    • Send_me_nude_girls@feddit.de · +3 · 1 year ago

      There’s an option to speed up generation, but it makes the output less deterministic: it’s 98% the same image, but a little different. It’s also very hard to reproduce the exact same hardware and software setup. That’s the first issue.

      The second: I had examples of images with generation data that I could reproduce to look 99% like the original, and then just updating a single word or part of the generation data (a different LoRA version, for example) switched the person away or changed their appearance completely. (Imagine a picture of a street, and a car is suddenly not there, or it’s blue instead of red.) That makes reproducibility an unreliable option. Backgrounds of images are even less reliable than the focus object.
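
      That fragility is easy to demonstrate: byte-level hashes of a reproduction almost never match the original across different hardware or library versions, so any comparison has to be perceptual and thresholded. A small sketch of both checks (file names are placeholders):

      ```python
      import hashlib

      import numpy as np
      from PIL import Image

      def sha256_file(path: str) -> str:
          with open(path, "rb") as f:
              return hashlib.sha256(f.read()).hexdigest()

      def pixel_similarity(path_a: str, path_b: str) -> float:
          """Fraction of pixels matching within a small per-channel tolerance."""
          a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
          b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
          close = np.all(np.abs(a - b) <= 2, axis=-1)
          return float(close.mean())

      # Exact match will usually fail even for a faithful reproduction:
      print(sha256_file("original.png") == sha256_file("reproduced.png"))
      print(pixel_similarity("original.png", "reproduced.png"))  # e.g. ~0.98
      ```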

  • LemmyIsFantastic@lemmy.world · +2/-4 · 1 year ago

    And very much supported by lemmy.

    None of these servers are doing this. Some dude is running a script and calling it a day.

  • spez@sh.itjust.works (OP) · +3/-11 · 1 year ago

    What do you people think this will lead to? Is it solvable or not? And if yes, how?

  • Neato@kbin.social · +8/-25 · 1 year ago

    Most of this thread is defending CSAM, which loli definitely is. WTF. Disgusting community.

    • Vendetta9076@sh.itjust.works · +12/-2 · 1 year ago

      I think you’re confused. No one is defending CSAM. Lolicon isn’t CSAM. Also I don’t understand why we would spend effort protecting digital children instead of protecting real ones.

      • limitedduck@awful.systems · +8/-6 · 1 year ago

        Nobody is protecting digital children, and it’s almost always disingenuous when this argument is made. The effort is to stop the normalization of the sexualization of children. Lolicon is exclusively about romancing or sexualizing children. Deluded adults who think what happens in lolicon material is OK are potential risks to real children. Allowing such a risk to children for the pleasure of these adults is absurd.

        • random65837@lemmy.world · +5/-2 · 1 year ago

          So by that logic, why didn’t all of us who grew up playing COD “normalize” walking around shooting everybody? That stupid claim used to be made all the time. I’ve yet to meet a serial killer who blamed video games.

          • limitedduck@awful.systems · +1/-2 · 1 year ago
            1. The amount of people warped by COD or Lolicon is not 100%, but it’s certainly not 0%
            2. It sounds like you haven’t actually played COD because the game is about WARFARE, not domestic terrorism. Maybe ask people who joined the US military how inspired they were by the game
            • random65837@lemmy.world · +2/-4 · 1 year ago

              You’re a complete moron, you know that? Aside from owning every COD for the first handful of years they came out, I also served. But you knew that, right, genius?

              Maybe ask people who joined the US military how inspired they were by the game

              That’d be a first, and go figure, I just happen to know a LOT of other people in the military.

              I also never claimed ZERO people were ever affected; I said it never normalized any of it. Next time, try reading with your eyes open and try not to inject your make-believe facts based on nothing.

        • Vendetta9076@sh.itjust.works · +4/-1 · 1 year ago

          Fair enough. Imo lolicon is disgusting. And I’m not making an argument in bad faith; I just see how much general society fails at protecting children and would rather see any effort spent cracking down on lolicon used to help real children instead.

          • limitedduck@awful.systems · +2/-1 · 1 year ago

            I understand what you’re saying, but fighting lolicon doesn’t necessarily take away from the fight against real CSAM. The reality is that serious, far-reaching, and ultimately human issues like the exploitation of children are complex and require effort on multiple fronts to be effective.

    • random65837@lemmy.world · +4/-3 · 1 year ago

      Nobody has done that; you can’t redefine CSAM to mean what you want it to. Funny how people only label others as “disgusting” when they’re being driven by emotion rather than logical thinking.

        • random65837@lemmy.world · +5/-3 · 1 year ago

          There very well could be; I’m sure there’s no shortage of them that will never be found out, and I’m sure that number is petrifying. BUT that still doesn’t justify people trying to redefine words to make them mean what they want, to fit a context that isn’t there, simply to suit their needs. Just like free speech, you have to take the good with the bad.

          Pedos are wired wrong; there’s something literally wrong with them. Nobody would choose to be that, and thinking otherwise is no different from thinking you can “re-educate” gay people into being straight. Simply not gonna happen. If some basement-dwelling pedo gets off on some cartoon and that keeps them functioning normally in society, while no children in real life are harmed, good luck arguing against that. Only a person ignorant of reality makes the reverse argument, that doing so somehow condones pedophilia. NOBODY other than other pedos is OK with how they are.