Indeed, AI packages share code like junkies. From my observation, Nvidia has been one of the worst offenders here. I wish they understood that “optional dependencies” are a thing in Python packaging. You don’t need to ship a web application framework and all its dependencies if it’s not a core part of the modeling process. But hey, maybe I’m the one who’s wrong.
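As a sketch of what I mean: `pyproject.toml` supports optional-dependency groups (“extras”), so the heavy server stack can stay opt-in. Package and dependency names below are illustrative, not any real Nvidia package:

```toml
[project]
name = "some-model-lib"   # hypothetical package
version = "0.1.0"
dependencies = [
    "numpy",              # core modeling deps only
]

[project.optional-dependencies]
serve = [
    "fastapi",            # web framework, only needed for serving
    "uvicorn",
]
```

Then `pip install some-model-lib` gives you just the modeling core, and `pip install "some-model-lib[serve]"` pulls in the web stack for those who actually want it.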
There are kinda two Python cultures: programmers and data scientists/engineers.
The data guys often use Anaconda, which is pretty much like installing every package in advance and then crossing your fingers.
They care about loading matrices and multiplying them, and not so much `__init__.py` or `pyproject.toml`.
Having programmed Python for 20 years, I find that “everything is a sandbox” style impossible to maintain and hard to read, but as long as their regressions, classifiers, and encoder/decoders run, the data guys don’t care ;)