Surely you meant -rating:safe?
400? Seconds or years?
Firefox has built-in cookie deletion when you close the browser/tab; it's under Settings → Privacy, I believe
I heard a guy say that Linux was trash; he had tried it once, but it didn't have drivers for anything, and what drivers did exist were difficult to install
So I asked him when it was that he tried it
I think he said something like 1998…
How much is a “substantial amount”? There’s not thaaat much porn on e621; most of it is marked safe
Well a lot of it is…
Well some of it is…
I’m relatively sure I saw one marked safe once…
Haven’t had it happen to me, but a guy I know dealt with it by saying “no, don’t touch it, it doesn’t bother me”
His screen got covered in porn whenever he turned it on, hot singles in his area…
Considering how common rape is in American prisons and how often innocent people are getting locked up, that does not sound unlikely
I used to get documents sent in a password-encrypted zip file. They regularly messed up the password, so I ended up just brute forcing the archives when I received them, since that was easier and faster (usually about 15 seconds)
Not very relevant here, since I knew roughly the length of the password and it was quite short, but I thought it was pretty funny
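For the curious, that kind of brute force is only feasible because the password is short and roughly known; a minimal sketch in Python might look like this (the archive path and helper names are made up, and note that Python's `zipfile` can read legacy ZipCrypto archives but cannot create them):

```python
import itertools
import string
import zipfile
import zlib

def brute_force(try_password, alphabet=string.ascii_lowercase, max_len=4):
    # Try every candidate of length 1..max_len; return the first that works.
    # Only feasible when the password is short and the alphabet is known.
    for length in range(1, max_len + 1):
        for combo in itertools.product(alphabet, repeat=length):
            candidate = "".join(combo)
            if try_password(candidate):
                return candidate
    return None  # exhausted the search space

def zip_password_works(path, candidate):
    # A wrong password usually raises RuntimeError immediately; roughly
    # 1 in 256 wrong guesses passes ZipCrypto's 1-byte check and fails
    # later with a zlib/CRC error instead, hence the broad except.
    try:
        with zipfile.ZipFile(path) as zf:
            zf.read(zf.namelist()[0], pwd=candidate.encode())
        return True
    except (RuntimeError, zipfile.BadZipFile, zlib.error):
        return False

# Hypothetical usage:
# found = brute_force(lambda p: zip_password_works("documents.zip", p))
```

With a lowercase-only alphabet and length ≤ 4 that's under half a million candidates, which lines up with the "about 15 seconds" figure.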
Note: my phone was already connected to my own headphones.
Have to assume they would prefer keeping those connected
Currently, pedophiles tend to group up and share real CSAM, and these “communities” probably serve to normalize the activity for their members. Perhaps being able to generate it will keep pedophiles from clumping together, reducing the degree of normalization so they’re more likely to seek help, and as a bonus, no real children are preyed upon to create said CSAM?
And if removing AI tools that can generate CSAM will lead them to “attempt to fuck children in the streets”, as you say: would you also say that we should stop criminalizing the distribution of existing CSAM, because the existing CSAM shared in paedophile circles is all that is keeping them from going out and raping children?
Jeez, calm down
I am not defending CSAM, just saying that CSAM depicting an actual existing child is magnitudes worse, as is any other kind of fabricated sexual content of real people.
Take loli porn for example, it’s probably bad for society, but if someone makes loli porn based on the appearance of an actual individual, that’s much more fucked up, and in addition to the “normal” detrimental effects, that would also harm that victim in a much more direct way.
AI-generated nudes of no one in particular aren’t hurting anyone, not directly at least, but AI-generated nudes of a specific person, using that person’s likeness and everything, are much worse
AI can generate faces of people who don’t actually exist; that’s what I mean
The post made it seem like it was about AI-generated CSAM in general, which, while disgusting, doesn’t directly harm anyone. But then the comments spoke about AI-generated CSAM depicting a real individual, and that’s much worse, but also not a problem that’s specific to children
AI-generated porn depicting real people seems like a different and much bigger issue
AI-generated CSAM in general, while disgusting, at least doesn’t directly harm people; fabricated nudes of real people most definitely do, regardless of the age of the victim
Either the watchers watch each other, or the great kraken watches us all
All that matters is that you like the car, and that it is trying its best <3
I’ve always done this; it reduces the car’s sensitivity so it lasts longer
Exported everything in LastPass as a CSV, imported it into KeePass, and the vault file is synchronized between my devices using Syncthing. I never looked back