• 0 Posts
  • 16 Comments
Joined 1 year ago
Cake day: June 12th, 2023

  • Currently pedos tend to group up and share real CSAM, and these “communities” probably serve to normalize the activity for their members. Perhaps being able to generate it will keep pedos from clumping together, reducing that normalization so they’re more likely to seek help, and as a bonus, no real children are preyed upon to create said CSAM?

    And since you say that removing AI tools that can generate CSAM will lead them to “attempt to fuck children in the streets”, would you also say that we should stop criminalizing the distribution of existing CSAM, because the existing CSAM shared in paedophile circles is all that keeps them from going out and raping children?


  • Jeez, calm down

    I am not defending CSAM, just saying that CSAM depicting an actual existing child is orders of magnitude worse, as is any other kind of fabricated sexual content of real people.

    Take loli porn, for example: it’s probably bad for society, but if someone makes loli porn based on the appearance of an actual individual, that’s much more fucked up, because in addition to the “normal” detrimental effects, it also harms that victim in a much more direct way.


  • AI-generated nudes of no one in particular aren’t hurting anyone, not directly at least, but AI-generated nudes of a specific person, using that person’s likeness and everything, are much worse.

    AI can generate faces of people who don’t actually exist; that’s what I mean.

    The post made it seem like it was about AI-generated CSAM in general, which, while disgusting, doesn’t directly harm anyone, but the comments were about AI-generated CSAM depicting a real individual. That’s much worse, but it’s also not a problem specific to children.