Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • @atomicorange@lemmy.world · 102 days ago

    If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place, and the kids would be unaware they were being photographed. Is it still abuse?

    If so, how is the psychological effect of a convincing deepfake any different?

    • @General_Effort@lemmy.world · 92 days ago

      If someone puts a camera in a locker room, it means that someone entered a space where you would usually feel safe. It implies the potential for a physical threat.

      It also means that someone observed you when you were doing “secret” things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.

      I would think it is very different. Unless you’re only thinking about the psychological effect on the viewer.

    • BombOmOm · 2 days ago (edited)

      Taking secret nude pictures of someone is quite a bit different from…not taking nude pictures of them.

      It’s not CSAM to put a picture of someone’s face on an adult model and show it to your friend. It’s certainly sexual harassment, but it isn’t CSAM.

        • BombOmOm · 2 days ago (edited)

          It’s absolutely sexual harassment.

          But, to your question: you can’t just say something has underage nudity when the nudity is of an adult model. It’s not CSAM.

          • @atomicorange@lemmy.world · 102 days ago

            Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs. a fake: the impact on the victim is the same, and the impact on the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn,” people wouldn’t be so hesitant to call this what it is.