Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • LostXOR
    202 days ago

    Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

    • Lka1988
      2 days ago

I would consider that as qualifying, because it’s targeted harassment in a sexually explicit manner. All the girl would have to do is claim it’s her.

      Source: I’m a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.

    • @lath@lemmy.world
      72 days ago

I don’t know, personally. The admins of the fediverse likely do, considering it’s something they’ve had to deal with from the start, so they can probably answer much better than I can.

      • @cole@lemdro.id
        216 hours ago

This is actually quite fuzzy and depends on your country, and even on the jurisdiction within your country.