I get that most of these are hilarious (this is my favorite comment on HN in some time, https://news.ycombinator.com/item?id=37239909 ). But still, I find this incredibly frightening. These are only going to get better. Does anyone doubt that in a couple of years' time (if that) we'll be able to put the image of any known public person into whatever generated photo we want, indistinguishable from reality? We're not that far off already (see the Pope in a puffy jacket).
My only hope is that this extreme enshittification of online images will make people completely lose trust in anything they see online, to the point where we actually start spending time outside again.
> The most powerful image deepfake AI ever created. See any girl clothless with the click of a button.
This is just disgusting. I thought it would just be an uncensored generative AI, but I certainly wasn't expecting peeping-tom-as-a-service. And advertising it so blatantly as being able to virtually strip any girl you have pictures of just makes me sick.
I agree it’s gross, but I’m not sure how to articulate why. It’s making real something people have done in their minds, that is, imagining people with no clothes on. These pictures aren’t the actual subject naked. There’s nothing being discovered or disclosed. It’s pure fiction. But it still bothers me.
It is fiction, yes, but if it is lifelike enough, does the difference matter? Even without the ick factor of making porn of someone without their consent, it would be so easy to destroy someone's career or relationship by making these deepfakes and then spreading them around. Especially once the tech gets more lifelike and loses the current AI-gen tells.
The same would apply to doing this the old-fashioned way in Photoshop. However, you have to admit that going from "need special software and experience in using that software" to "just upload their image to a website and get back their AI-generated nudes" is a huge change in how accessible this is.
> it would be so easy to destroy someone's career or relationship by making these deepfakes and then spreading them around.
If nude pics can get you fired, work culture needs to change.
Same with relationships.
Basically, people need to learn not to trust digital media at all without some kind of authentication, and to be a little more tolerant of nude human bodies when they do pop up.
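"Some kind of authentication" could mean cryptographic provenance along the lines of C2PA-style signing, where a trusted party signs an image at capture time so any later edit is detectable. A minimal sketch of the idea, using Python's standard-library `hmac` as a toy stand-in for real public-key signatures (all names here are illustrative, not from any actual provenance standard):

```python
import hashlib
import hmac

# Illustrative shared secret; real provenance schemes (e.g. C2PA)
# use public-key signatures embedded in the file's metadata instead.
SECRET_KEY = b"camera-private-key"

def sign_image(image_bytes: bytes) -> str:
    """Produce a tag over the raw image bytes at capture time."""
    return hmac.new(SECRET_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time; any edit to the bytes fails."""
    return hmac.compare_digest(sign_image(image_bytes), tag)

original = b"\x89PNG...raw image bytes..."
tag = sign_image(original)
print(verify_image(original, tag))           # untouched image verifies: True
print(verify_image(original + b"!", tag))    # any modification breaks it: False
```

The point isn't this particular scheme; it's that "is this photo real?" becomes a verifiable property of the file rather than a judgment call about pixels.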
Potentially the opposite. It may become more difficult to use such images to harm someone's career or relationship. Perhaps nothing is believable in the future.
I’d like to believe that this is the future. Already with the rise of digitally native relationships nudes have become commonplace (even Jeff Bezos has sent some). Now with these widely accessible deep-fake generators any leaked nude photo can be chalked up to digital malfeasance!
If by virtue signal you mean call out creeps and perverts then yes. The world would be a much better place if people did that more often instead of just leaving them be to hurt women and teenage girls. Maybe then they would at least not act on it in public.
It's really interesting how many people see a product/service with an impressive result (something that previously seemed impossible), albeit with flaws that humans find funny or obvious, and are then unable to imagine what it might be like once improvement irons those issues out.
A few years ago generating images like many of these was unimaginable, and now this does it, but perhaps 50% have some silly flaw in the image. Unless there were some reason to believe that further improvement isn't possible, as you say, it is likely to become indistinguishable from reality in the relatively near future.
The only difference between that world and our current world for the last 10 years is that before you needed to have some photoshop skills and now anyone can do it.
In some ways, I think that actually makes it safer now… the more trivial it is the more people will stop trusting photos as automatically being real.
The good news is that legal courts have already lost faith in all things digital imagery, and have for a good long while. They're actually way ahead of the curve.
> My only hope is that this extreme enshittification of online images will make people completely lose trust in anything they see online, to the point where we actually start spending time outside again.
Well in a weird way it will provide cover. You could post nudes of yourself online and just explain it away as bad actors using AI.
I don't see much to be frightened of. It has always been possible to create convincing fake photographs, at a price, while a photograph on its own (without proper provenance) has not usually been treated as important evidence (though there are some famous exceptions, e.g. Duchess of Argyll). The new technology just makes it cheaper for mischievous people to make fakes, and easier for people to dismiss an unprovenanced photo as a fake.
> It has always been possible to create convincing fake photographs...
Not really, depending on who you are "convincing". Up until relatively recently it was usually quite easy for photo experts (and often even less than experts, just people trained to look for certain "tells") to detect digital manipulation. But, on the flip side, yes, digital manipulation has occurred in the past, and I think it's a mistake to discount the strong negative effects it has had, e.g. many people having a completely unrealistic view of what most real humans actually look like (e.g. https://scottkelby.com/faith-hill-redbook-magazine-retouch-f...).
> The new technology just makes it cheaper for mischievous people to make fakes
That's a huge deal. Just look at the concerns around the use of LLMs to generate (and run) misinformation campaigns. Obviously those campaigns can be and are run now, but the prospect of it being incredibly cheap to do so, by people of extremely little skill, changes the information landscape drastically. Doing it for images is just another piece of the puzzle.
When one says fake photographs, that is not the same as saying digital manipulation. Photo manipulation is older than the transistor. See https://www.imaging-resource.com/news/2012/09/28/before-phot... for some examples; these are admittedly artistic manipulations and fairly obvious, but it's entirely possible to apply the same techniques in other ways.
I don't share your optimism. I expect that there will be a lot of fake news and propaganda using these kinds of images. Perhaps our legal systems will evolve (albeit more slowly) to handle cases that arise from these technologies.
This will make its way into porn, if it hasn't already, and increase the amount of content there. That space is already wild, and with this, I don't expect anything good to happen.
https://getimg.ai already lets you train on submitted photos and then generate using various AI models. This tech is commonly called DreamBooth. There are clauses against misuse, of course. The obvious misuse is creeps taking photos of girls they know and running them through an NSFW model.
> My only hope is that this extreme enshittification of online images will make people completely lose trust in anything they see online, to the point where we actually start spending time outside again.
I'm in full-bore accelerationist agreement with this point. Defense lawyers must love this.