No, definitely not on every deploy! For a deploy, I just do a "git pull" on all of the production boxes and restart the web server.
Baking a new AMI only happens when there are new underlying dependencies, like a new package from apt-get or a new Python module from pip. In other words, it's rare.
Is there any reason not to let pip update things automagically from your `prod_requirements.txt` via a simple Fabric `fab update_pip` task when necessary?
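For context, a task like that might look something like the sketch below. The host names and paths are hypothetical, and it assumes Fabric 1.x (the `fabric.api` module):

```python
# fabfile.py -- sketch of a hypothetical `fab update_pip` task.
# Hosts and the requirements path are made up for illustration.
from fabric.api import env, sudo, task

env.hosts = ["web1.example.com", "web2.example.com"]  # production boxes

@task
def update_pip():
    # Install any new or updated modules pinned in the requirements file.
    sudo("pip install -r /srv/app/prod_requirements.txt")
```

Running `fab update_pip` would then execute the install on each host in `env.hosts`.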
Yes, in fact that's what I do! The issue is that I also need to account for any new servers that might spin up later (hence updating the AMI).
I could also change the AMI's "user data" to run pip when the instance boots, but I'm not 100% confident it would always run without errors. I feel better doing it manually and baking the result into the AMI. Personal taste.
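For anyone curious what that user-data approach looks like: cloud-init runs any user data that starts with a shebang line as a script at first boot, so a minimal sketch (hypothetical requirements path) could be:

```python
#!/usr/bin/env python
# Hypothetical EC2 user-data script, run once by cloud-init at first boot.
# Installs pinned dependencies so a freshly launched instance catches up
# with the current requirements file.
import subprocess

subprocess.check_call(
    ["pip", "install", "-r", "/srv/app/prod_requirements.txt"])
```

The downside is exactly what's described above: if pip fails at boot (network hiccup, PyPI outage), the instance comes up broken, which is why baking into the AMI feels safer.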
For the record, we have a very similar production environment and we bake on every deploy. We have some tricks to make it as fast as possible, though. A deployment takes 2-3 minutes for code changes and longer when requirements change. We then place the new machines in the ELB server pool and pull out the old ones once everything behaves correctly. We're looking into autoscaling now.