In your example, one of the abstractions is 'which Python does it use'. I'm dealing with this very issue right now: distributing a simple Python tool to a Windows end user on heavily managed government machines. It turns out their 'python' command invokes a python.exe somewhere on a network drive, and for some reason '-m venv venv' fails, which of course causes everything downstream to fail as well. I don't know yet whether the network drive is really the root cause; I only received error logs showing it this morning, so I'm on another round of remote-debugging, via email, with someone who is (while patient and willing) not a Python programmer.
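(For anyone else stuck in a similar remote-debugging loop: a first step that has worked for me is asking the user to run something like the following and mail back the output. It only touches the standard library, so it tends to work even when venv creation doesn't; this assumes cmd.exe on Windows.)

    :: list every python.exe on PATH (may reveal the network drive)
    where python
    :: show which interpreter 'python' actually resolves to, and its version
    python -c "import sys; print(sys.executable); print(sys.version)"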
Python packaging is such a ridiculous PITA; it would be laughable if it weren't so sad. I tried Nuitka as suggested upthread and sent my customer a single binary to try. Let's see if it works.
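(In case it's useful to others, the invocation was roughly the following, with tool.py standing in for the real entry point. --onefile bundles the interpreter and all dependencies into one executable, so the target machine needs no Python install at all.)

    python -m pip install nuitka
    python -m nuitka --onefile tool.py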
Except many Python packages have started to move away from requirements.txt to pyproject.toml files, so you have to deal with that too. Plus you now have to either remember where you put all your venvs and which tool lives in which one, or do weird hacky things with $PATH.
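For what it's worth, consuming a pyproject.toml isn't much worse once you know the incantation. A minimal sketch (project name and version pins made up here) looks like this, and running 'pip install .' in that directory pulls the dependencies in much the same way 'pip install -r requirements.txt' would:

    [project]
    name = "mytool"
    version = "0.1.0"
    dependencies = [
        "numpy>=1.24",
        "requests",
    ]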
That might work for a subset of projects, but try it with anything that does numerical work, like machine learning. There is a reason Conda is a requirement for so many Python projects.
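(Rough sketch of why: Conda resolves the native pieces alongside the Python ones in a single step; package names here are just illustrative.)

    conda create -n ml-env -c conda-forge python=3.11 numpy scipy numba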
If the provider of the requirements.txt did their homework, yes. But I often get incomplete requirements files, or ones that pin mutually incompatible versions (numba and numpy sometimes don't get along).
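One cheap sanity check after installing is pip's own consistency checker; the comment below describes typical output rather than quoting it verbatim:

    pip install -r requirements.txt
    pip check   # flags e.g. a numba build whose numpy requirement conflicts with the installed numpy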
What if native dependencies are required? It starts to get hairy whenever something can't be provided by pip alone, and shipping binary libs yourself is not a suitable option.