
It took me a little hunting, but thanks to Reddit I eventually found a cloud GPU host that provides a working Stable Diffusion image, so you basically don't have to do any of the setup the GP described. Everything is pre-installed and you just rent the hardware.

https://www.runpod.io/console/templates

Look for "RunPod Stable Diffusion". I spent a whole $0.35/hr playing around with my own SD instance that I had running in minutes.



You can do the same thing on vast.ai too.

It's a little inconvenient to use non-base models and plugins this way (you pay extra for more storage), but it's definitely an easy way to use the full power of SD.


35c/hr seems crazy expensive compared to Midjourney. Midjourney gives you a set number of fast hours (immediate GPU) and unlimited relaxed hours (delayed GPU). It also has a lot of built-in parameters you can use to easily tweak images. I'd rather pay for MJ than run my own SD.

The main upside of running your own SD is that you can completely automate it, but I'm not sure how useful that really is.


> The main upside of running your own SD is that you can completely automate it

No, the main upside of running your own SD web UI is that you can select and deploy your own checkpoints (not just the base SD models), LoRAs, embeddings, upscaling models, and UI plugins that add further services/models/features: MultiDiffusion (bigger generations and control over which areas of the image different prompts apply to), ControlNet and its associated models, video synthesis, combinatorial prompts, prompt shifting during generation for blending effects, and, well, a million other things.

Also, you can completely automate it.
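
To make the automation angle concrete, here's a rough sketch against the same AUTOMATIC1111 HTTP API: switch to a custom checkpoint, then batch through prompts, pulling in a LoRA via prompt syntax. The checkpoint and LoRA names below are made-up placeholders for whatever you've uploaded to the pod:

    # Rough sketch: point the UI at a custom checkpoint, then batch through a
    # list of prompts (a LoRA is pulled in via prompt syntax). Checkpoint and
    # LoRA names are placeholders for whatever you've copied onto the pod.
    import base64
    import requests

    POD_URL = "https://YOUR-POD-ID-7860.proxy.runpod.net"  # placeholder

    # Select a non-base checkpoint you've uploaded.
    requests.post(
        f"{POD_URL}/sdapi/v1/options",
        json={"sd_model_checkpoint": "my-custom-checkpoint.safetensors"},
        timeout=120,
    ).raise_for_status()

    prompts = [
        "isometric pixel-art castle <lora:my-style-lora:0.7>",
        "isometric pixel-art harbor <lora:my-style-lora:0.7>",
    ]

    for n, prompt in enumerate(prompts):
        r = requests.post(
            f"{POD_URL}/sdapi/v1/txt2img",
            json={"prompt": prompt, "steps": 30, "batch_size": 4},
            timeout=600,
        )
        r.raise_for_status()
        for i, img_b64 in enumerate(r.json()["images"]):
            with open(f"batch{n}_{i}.png", "wb") as f:
                f.write(base64.b64decode(img_b64))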


The Midjourney price would be equivalent to ~100 hours of cloud time at $0.35/hr. How is that crazy expensive?



