Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?
I would consider that qualifying, because it’s targeted harassment in a sexually explicit manner. All the girl would have to do is claim it’s her.
Source: I’m a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.
I don’t know personally. The admins of the fediverse likely do, considering it’s something they’ve had to deal with from the start, so they can probably answer much better than I can.
Drawing a sexy cartoon that looks like an adult, with a caption that says “I’m 12”, counts. So yeah, probably.
This is actually quite fuzzy and depends on your country, and even on the jurisdiction within your country.