- cross-posted to:
- [email protected]
Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.
I’m unconvinced by this attempt to create a moral panic. IMO nothing here is shocking or offensive once I remember that people could already use their imaginations to picture celebrities naked.
It’s not like celeb porn fakes are anything new, anyway.
The main issue with this would be public defamation, i.e. wrongfully portraying someone as a porn actor, which might destroy their career. You can’t really do that with written or drawn fiction.
But for that the pictures would have to be photorealistic, which is not quite the case yet. The tech is going to improve, though, and the generated images could be further manipulated (e.g. adding blur/noise to make them look like a bad phone picture).
Once the ability to make photo-realistic images like that becomes commonplace, those images won’t be evidence of anything anymore. Now I can tell you a story about how I had sex with a celebrity, and you won’t believe me because you know I easily could have made it all up. In the future I will be able to show you a 100% realistic video of me having sex with a celebrity, and you won’t believe me because you’ll know that I easily could have made it all up.
The obvious thing is that at some point any camera worth its salt will have an embedded key that it uses to sign its output, traceable to a vendor’s CA at the least. No signature, and the image would be considered fake.
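Roughly, the scheme would look something like the sketch below – Python, using the cryptography library, with an Ed25519 key standing in for whatever would actually be baked into the camera. The vendor-CA part is only gestured at; a real provenance scheme would be a lot more involved.

```python
# Minimal sketch, not a real provenance spec: the camera signs a digest of the
# capture with a key burned in at the factory; a verifier checks it against the
# device's public key (which, in the real scheme, would be certified by the
# vendor's CA).
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Stand-in for the key embedded in the camera's secure element.
device_key = Ed25519PrivateKey.generate()


def sign_capture(image_bytes: bytes) -> bytes:
    """Camera side: sign a SHA-256 digest of the raw capture."""
    return device_key.sign(hashlib.sha256(image_bytes).digest())


def verify_capture(image_bytes: bytes, signature: bytes,
                   device_pub: Ed25519PublicKey) -> bool:
    """Verifier side: does the signature match the image and the device key?"""
    try:
        device_pub.verify(signature, hashlib.sha256(image_bytes).digest())
        return True
    except InvalidSignature:
        return False


photo = b"...raw sensor data..."
sig = sign_capture(photo)
print(verify_capture(photo, sig, device_key.public_key()))              # True
print(verify_capture(photo + b"edited", sig, device_key.public_key()))  # False
```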
Yeah, I think there may be something like that – the ability to prove things with a camera is useful – but it’s gonna be more complicated than just that. It’s consumer hardware. If you just do that, someone is gonna figure out how to extract the keys on at least one model and then you can forge authenticated images with it.
As a programmer, I gotta say, that’s probably not technically feasible in a sensible way.
Every camera has got to have an embedded key, and if any one of them leaks, the system becomes worthless.
No, that would actually be feasible with enough effort.
The real question is what do you do if someone takes a screenshot of that image? Since the picture must be in a format that can be shown, nothing is stopping people from writing software that just strips the authentication from the camera file.
Edit: misread the problem. You need to get a private key to make forgeries and be able to say “no look, this was taken with a camera”. Stripping the signature from photographs is the opposite of what we want here.
The point is, without the signature there’s plausible deniability – it could easily not be real. If you want to prove something happened, then it should have a signature and be validated.
If someone is showing off a screenshot of an image, then in the future (really, already now) one probably needs to assume it’s fake unless there’s some overriding proof otherwise.
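In other words, the verifier-side policy would be something like this hypothetical sketch – none of these names come from any real spec, it’s just the logic above: stripping a signature (or screenshotting) doesn’t forge anything, it only lands the image in the untrusted bucket.

```python
# Hypothetical verifier policy, illustrating the reasoning above.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Image:
    pixels: bytes
    signature: Optional[bytes]  # None after a screenshot or signature stripping


def classify(img: Image, verify: Callable[[bytes, bytes], bool]) -> str:
    """`verify` is whatever checks the signature against the vendor's CA chain."""
    if img.signature is None:
        # Removing the signature doesn't let you forge anything; the image
        # simply loses its claim to authenticity.
        return "unsigned: assume fake unless there is other evidence"
    if not verify(img.pixels, img.signature):
        return "bad signature: treat as tampered or fake"
    return "camera-authenticated"
```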
It will kill celebrity rather than become a constant issue of stolen images.
Good. Fame is overrated, anyway. Let’s praise the era where no one person completely dominates the cultural zeitgeist, and people are talking about their own indie discoveries, the ones algorithms and bots recommended to them.
Shit, Spotify’s discovery systems are so good that we’re almost there with the music industry.
I kind of get what you’re saying, but it’s also definitely not the same as imagination. It’s vivid, almost real, shareable, and permanent. Imagine if someone generated an AI image of you doing something you consider embarrassing or compromising and sent it to your coworkers or family.
That said, I don’t think there’s much to be done about it. This isn’t containable.
To be fair, compared to just imagining something, sharing an image like that with someone’s family is more like spreading a rumor verbally and getting others to imagine the same thing. Which, while it certainly happens, is also behavior we already recognize as extremely rude, and sometimes even illegal.
Speak for yourself. Some of us can’t do that.
The difference is that the images AIs spit out are, well, real. Imagining someone naked doesn’t produce a potentially very convincing actual image that can be shared.
I do think that AI can’t really be effectively regulated (my fucking laptop can run Stable Diffusion), but that doesn’t mean that there’s no need for a debate.
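For what it’s worth, “my laptop can run it” really is about this much code these days – a rough sketch using the Hugging Face diffusers library; the model id and the CPU fallback here are just assumptions, fp16 on a small GPU is the more usual setup.

```python
# Rough sketch of a local Stable Diffusion run with the diffusers library.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # assumed model id; any SD checkpoint works
    torch_dtype=torch.float32,          # use torch.float16 with a GPU
)
pipe = pipe.to("cpu")                   # slow but workable on an ordinary laptop

image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("out.png")
```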