If you use https://github.com/johnsmith0031/alpaca_lora_4bit, a 30B model only needs about 24GB of VRAM, so it runs on a single RTX 3090 or a ~$200 Tesla P40.
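The 24GB figure follows from simple arithmetic: at 4 bits per weight, a 30B-parameter model's weights occupy about 15GB, leaving room on a 24GB card for activations, the KV cache, and LoRA adapter state. A back-of-the-envelope sketch (the exact overhead varies by context length and batch size; the numbers below are estimates, not measurements):

```python
# Rough VRAM estimate for a 4-bit quantized 30B model.
params = 30e9          # parameter count
bits_per_weight = 4    # 4-bit quantization
weight_gb = params * bits_per_weight / 8 / 1e9  # bits -> bytes -> GB
print(f"weights alone: {weight_gb:.0f} GB")     # 15 GB

# Remaining budget on a 24 GB card for activations, KV cache,
# and LoRA adapters (illustrative, not a measured figure).
card_gb = 24
print(f"headroom: {card_gb - weight_gb:.0f} GB")
```

By contrast, the same model in fp16 would need ~60GB for weights alone, which is why 4-bit quantization is what makes single-consumer-GPU use possible here.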