Not a troll. I’ve been doing a lot of self-reflection on this topic lately. Some people seem to enjoy software for the act & craft, where the outcome / artifact is secondary or irrelevant. I don’t. Some people enjoy the artifacts it produces, for their utility or economic value. Not really me either. People often frame it as this dichotomy, but I’ve realized my enjoyment and self-fulfillment come from creating an artifact that is genuinely good and that I can be proud of creating. Too much AI robs me of this. I’ve created cool stuff with AI that leaves me feeling nothing because I didn’t really create it.
This is all valid. Your original comment came across as a troll because it implied that nobody could ever feel good about stuff they built with AI. Asserting that you know more about the emotional state of strangers on the internet than they know themselves is arrogant.
Well, it’s a genuine question. Like, if I have a machine in my house where I give it a recipe and it spits out the food, should I feel good about having “cooked” that food? Or if someone prompts an AI for some art, should they feel proud of “creating” that art? I think not. And it’s the same with code. How much of the work you actually did should influence how you talk and feel about a creation. So many people lazily prompt an AI and then come here to post about something they “made”, and I think that’s wrong.
I’m thinking there are probably degrees to it. Like there is some stuff I absolutely want to hand-craft, but then other stuff I don’t mind so much.
One of the interesting discussions at work (I’m in gamedev) has been about tooling and where AI fits in there.
Previously you’d spend sometimes significant time writing a tool, then polishing it up and giving it to the team (think things like editor extensions that make your workflow easier).
But AI can make this kind of bespoke tool dev so cheap now that it’s possible for every single dev to have their own tool that matches the way they work exactly. At that point, do you really need to spend the last 80% of the effort polishing it and getting it ready for mass consumption?
Stuff like that is interesting. I still can’t imagine never looking at the AI-generated code, but I’ve seen people take the approach of “I’m not interested in the code, only in what the thing does. If it’s wrong, I ask the agent to fix it”.
You could have government-signed models + programs that are approved for generating CP (not CSAM). It's legal if the signature checks out. Something like https://contentauthenticity.org/ but for verifying that something is definitely made by AI.
(You need to sign both the models and the programs to make sure there's no img2img.)
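As a minimal sketch of what that check could look like (all names here are hypothetical, and HMAC with a shared secret is just a stand-in; a real provenance scheme like C2PA would use public-key signatures attached as a manifest):

```python
import hashlib
import hmac

# Hypothetical registry of keys issued to approved generator programs/models.
# A real scheme would distribute public keys and verify asymmetric signatures;
# HMAC is used here only to keep the sketch stdlib-only.
APPROVED_KEYS = {"approved-generator-v1": b"key-issued-by-authority"}

def sign(key_id: str, content: bytes) -> bytes:
    """Signature the approved generator attaches to its output."""
    return hmac.new(APPROVED_KEYS[key_id], content, hashlib.sha256).digest()

def verify(key_id: str, content: bytes, signature: bytes) -> bool:
    """Check that content really came from an approved generator, unmodified."""
    key = APPROVED_KEYS.get(key_id)
    if key is None:
        return False
    expected = hmac.new(key, content, hashlib.sha256).digest()
    # Constant-time comparison to avoid timing leaks.
    return hmac.compare_digest(expected, signature)
```

The img2img concern is why the program has to be signed too: a valid model signature alone wouldn’t prove the pipeline didn’t take a real image as input.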
You don’t even need to give them a model, just generate some images and publish them.
If you find those images, it’s fine, if you find anything else, arrest them.
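That “publish the images” scheme boils down to a hash allowlist, sketched below with hypothetical names. Note the obvious weakness: an exact hash breaks on any re-encode or resize, so a workable version would need perceptual hashing rather than SHA-256.

```python
import hashlib

def digest(image_bytes: bytes) -> str:
    """Exact content hash of an image file."""
    return hashlib.sha256(image_bytes).hexdigest()

# The published set of generated images, stored by hash (hypothetical data).
published_hashes = {digest(b"published-image-1"), digest(b"published-image-2")}

def is_published(image_bytes: bytes) -> bool:
    # Exact-match only: any re-encoding or crop changes the hash, which is
    # why real matching systems use perceptual hashes instead.
    return digest(image_bytes) in published_hashes
```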
You don't need scientific arguments for everything, you know. What's your argument against consenting 10-year-old siblings having sex together if they use protection? I don't have one, but I know it's morally wrong and won't bring anything good.
This reminds me of a voting method I've seen some anarchists advocate for: the rules passed by votes should only be enforced on those who voted for it.
I mean, this started with Stable Diffusion 1.x→XL, which were only loosely open, and it has only gotten worse, with progressively less open-licensed image-gen models being described as “open weights”. But yes, Flux.1 Krea (like the weights-available versions of Flux.1 from BFL itself) is not open even to the degree of the older versions of Stable Diffusion: weights-available and “free-as-in-beer licensed for certain uses”, sure, but not open.
I don’t think censorship of nearly any kind has any place on the internet, but neither do kids.
It’s a parent’s responsibility to keep their children away from that type of content, not to hand them access to it so they can develop malign, destructive ideas about sex, intimacy, and women.