
For what it's worth,

    python -m venv venv
    source venv/bin/activate
    pip install -r requirements.txt
has never let me down and doesn't abstract away anything.


In your example, one of the abstractions is 'what Python does it use'. I'm dealing with this very issue right now, distributing a simple Python tool to a Windows end user. They're on heavily managed government machines, and it turns out that their 'python' command invokes a python.exe somewhere on a network drive, and for some reason the '-m venv venv' fails, which then causes all the rest to fail as well, of course. I don't know yet if this network thing is really the root cause; I only received some error logs showing it this morning, so I'm on another round of trying to remote-debug this, via email, with someone who is (while patient and willing) not a Python programmer.
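A couple of one-liners can help with this kind of remote debugging; something like the following, assuming 'python' is on the user's PATH, reveals which interpreter is actually being invoked:

```shell
# diagnostics to have the end user run and paste back
python --version
python -c "import sys; print(sys.executable)"   # path of the python actually being run
```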

Python packaging is such a ridiculous PITA, it would be laughable if it weren't so sad. I tried nuitka as suggested upthread and sent my customer a single binary to try; let's see if it works.


It's probably Python 2, which won't have venv.
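That's quick to confirm; a sketch, assuming the standard CLIs:

```shell
python --version        # Python 2 prints e.g. 'Python 2.7.18'
python -m venv --help   # fails on Python 2: venv was added in Python 3.3
py -3 -m venv venv      # Windows py launcher: explicitly request Python 3
```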


That's pretty much the procedure the article links to in the intro, but with some additional subtleties on how to source and run Python.


Except many Python projects have started to move away from requirements.txt to pyproject.toml files, so you have to deal with that. Plus now you have to either remember where you put all your venvs and which tool lives in which, or do weird hacky things with $PATH.
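For what it's worth, a pyproject.toml project still installs into a plain venv with `pip install .`; a minimal sketch, where the name and dependency are placeholders:

```toml
# minimal pyproject.toml (PEP 621 metadata); 'mytool' is illustrative
[project]
name = "mytool"
version = "0.1.0"
dependencies = ["requests"]

[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
```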


The recommended process nowadays is to use pipx, which just creates a fresh venv for each tool.
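For context, pipx roughly amounts to this manual procedure, where the tool name and paths are illustrative:

```shell
# one throwaway venv per tool, so dependencies never clash
python3 -m venv ~/.venvs/sometool                    # 'sometool' is a placeholder
~/.venvs/sometool/bin/pip install sometool           # needs network access
ln -s ~/.venvs/sometool/bin/sometool ~/.local/bin/   # expose just the entry point
```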


That doesn't work for development but I agree that it's extremely useful for using Python projects.


That might work for a subset of projects, but try that with anything that does numerical work, for instance machine learning. There is a reason why Conda is a requirement for so many Python projects.


If the provider of requirements.txt did their homework, yes. However, I often get incomplete requirements files, or ones that contain mutually incompatible versions (numba and numpy sometimes don't get along).
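One mitigation on the author's side: generate the file from a known-good environment with pip freeze, which pins every transitive dependency at its exact installed version:

```shell
# inside the working venv: record the exact version of everything installed
pip freeze > requirements.txt
# consumers then reproduce the same resolved set (same platform/Python assumed)
pip install -r requirements.txt
```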

(This is in academia btw)


I use a variation of this with direnv and asdf and it's fine, no problems.

    asdf install python 3.11.0
    asdf local python 3.11.0  # .tool-versions

    python -m venv .venv

    cat >.envrc <<'EOF'
    export VIRTUAL_ENV=$PWD/.venv
    export PATH=$PWD/.venv/bin:$PATH
    EOF

    direnv allow .
Now I just `cd` to project directories and I'm good to go.


> has never let me down

Oh, you sweet summer child. You are like those people who have to discover the hard way why you should make backups.


What if native dependencies are required? It starts to get a bit hairy whenever something cannot be provided by pip alone. Shipping binary libs is not a suitable option.



