
On that note, why are we trying to use language-specific packaging tools to build packages rather than just building OS-specific packages, where dependencies are handled by apt for deb packages or dnf for rpm packages?

These are language agnostic and can get the job done.



Because I have dozens of colleagues, each with their own OS/distribution, developing an application that needs to lock down dot releases of dependencies to the versions running in production and update them at the same time. There is no way to do that with distribution-specific tools without going insane.


>> why are we trying to use language specific packaging tools to build packages rather than just building OS specific packages where dependencies are handled by apt for deb packages or dnf for rpm packages?

>>

>> These are language agnostic and can get the job done.

> Because I have dozens of colleagues, each with their own os/distribution, developing an application that needs to lock down dot releases of dependencies to the versions running in production

I assume the production environment isn't running dozens of OS versions and distributions. For development, you could use a VM or container running the same OS version and distro as production, use that OS's package format to package the software, and install it in that dev environment (the container or VM). You're testing whether the software works in the production environment, not on someone's preferred OS/distribution.
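
Concretely, that dev workflow might look something like the sketch below (the base image, repo URL, and package name are placeholders, not anything from the thread):

    # start a throwaway dev container matching the production OS
    podman run -it --rm rockylinux:9 bash

    # inside the container: enable the team's internal repo and install the app as an OS package
    dnf config-manager --add-repo https://repo.internal.example/app.repo   # hypothetical URL
    dnf install myapp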

> and update them at the same time

I'm not as familiar with apt, but dnf has a version-lock feature that lets you pin dependencies to specific versions. You could update and test them during development, then update the version lock to pull in the updated versions when updating production.
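
For what it's worth, here's a rough sketch of what that looks like with dnf's versionlock plugin (package names and versions below are illustrative):

    # install the versionlock plugin (exact package name varies by distro)
    sudo dnf install python3-dnf-plugin-versionlock

    # pin dependencies to the versions running in production
    sudo dnf versionlock add libfoo-1.2.3 libbar-2.0.1

    # inspect or remove locks when it's time to upgrade
    sudo dnf versionlock list
    sudo dnf versionlock delete libfoo

    # apt has a rough equivalent: apt-mark hold <package>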


For every dependency that's missing from the upstream distribution in the exact version we need, we'd have to package it ourselves. We have nothing to gain here; nobody pays for that.


Once you've packaged a dependency, updating to a new version requires only minor changes unless it's a major version bump. What you do gain is the ability to easily upgrade and downgrade a particular dependency and to verify the integrity of the installed files (something that pip, for example, doesn't provide as far as I'm aware, but the OS package manager does).
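
As a hedged illustration of what the package manager gives you here (package names and versions are placeholders):

    # RPM: verify checksums, sizes, and permissions of a package's installed files
    rpm -V mypackage

    # Debian/Ubuntu equivalent
    dpkg --verify mypackage

    # upgrades and downgrades are first-class operations
    sudo dnf downgrade mypackage
    sudo apt install mypackage=1.2.3-1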


This is just a crazy amount of work, given that the alternative is updating whatever lock file you use and letting the transitive dependency resolution of your language of choice do the rest. Note that I very much like proper, clean .debs as an end user, and if that's my customer base I'd publish like that as well. But if I'm last in the chain and my customers are internal, I'd never in a million years take the route you propose.
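
For comparison, the lock-file workflow being described looks roughly like this (shown with pip-tools as one example; other ecosystems have their own equivalents):

    # requirements.in lists only the direct dependencies
    echo "requests==2.31.0" > requirements.in

    # resolve the full transitive tree and pin exact versions into a lock file
    pip-compile requirements.in              # writes requirements.txt

    # every developer and the production build install from the same lock file
    pip install -r requirements.txt

    # later, bump a single dependency and re-resolve
    pip-compile --upgrade-package requests requirements.in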


Because where you have to build one or two packages for Windows and macOS (97% of the computers used by end users), you have to build tens of packages for the main Linux distributions (and that's not even all of them).


> you have to build tens of packages for the main Linux distributions (and that's not even all of them)

This is something that's typically handled by the package maintainers for each Linux distro, rather than by the application's developers. Some developers maintain their own public repositories for the packages they build and instruct end users to add those repositories to their package manager config, but they'll typically also include the source archive (if it's open source) and build instructions for people running distributions they haven't built packages for.
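
For the end user, adding such a repository is a one-time step; a rough sketch (the URLs and package name here are made up for illustration):

    # Debian/Ubuntu: add the vendor's signing key and repository
    sudo install -d /etc/apt/keyrings
    curl -fsSL https://example.com/vendor.gpg | sudo gpg --dearmor -o /etc/apt/keyrings/vendor.gpg
    echo "deb [signed-by=/etc/apt/keyrings/vendor.gpg] https://example.com/apt stable main" \
        | sudo tee /etc/apt/sources.list.d/vendor.list
    sudo apt update && sudo apt install vendor-app

    # Fedora/RHEL: add a .repo file and install
    sudo dnf config-manager --add-repo https://example.com/rpm/vendor.repo
    sudo dnf install vendor-app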

Looking at VirtualBox[1], for example: in addition to Mac and Windows, they build packages for Red Hat, Ubuntu, Debian, openSUSE, and Fedora, and they provide the sources along with build instructions[2] for those running distributions with no pre-built package. In fact, that last option is the most flexible one, though it requires a bit more work from the end user.

[1] https://www.virtualbox.org/wiki/Linux_Downloads

[2] https://www.virtualbox.org/wiki/Downloads


I just checked, and there are 1.3 million packages in npm, 600k in PyPI, and 100k in NuGet. While I'm sure most of them are obsolete or useless, that's still an order of magnitude more than what a Linux distribution offers (60k packages for Ubuntu).

And VirtualBox is actually a good example of what I'm saying: despite being one of the major pieces of software in a field where Linux is strong, if not dominant:

- they feel obliged to distribute their Linux packages themselves

- they have to distribute 1 package for Windows and 2 for macOS, but 12 for Linux


I'm not sure what goes into the decisions package maintainers make about which packages to include in the OS repository, but in my experience with Python applications, most dependencies we needed could be found in the OS repository or in other supported package repositories (for RPM, repositories like EPEL or RPM Fusion). The few we couldn't find weren't that difficult to package and add to our internal package repository.
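
As a sketch of what "package it ourselves and add it to an internal repo" can look like on the RPM side (pyp2rpm is one spec generator among several; the package name and paths are illustrative):

    # generate a spec file for a PyPI package
    pyp2rpm somepylib > python-somepylib.spec

    # fetch the source tarball into the rpmbuild tree and build
    spectool -g -R python-somepylib.spec
    rpmbuild -ba python-somepylib.spec

    # publish to an internal dnf/yum repository
    cp ~/rpmbuild/RPMS/noarch/python3-somepylib-*.rpm /srv/repo/
    createrepo_c /srv/repo/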

But this also brings up the issue of vetting dependencies. If you're pulling in a dependency that itself pulls in tens of other dependencies (direct and indirect), it gets difficult to vet them all. PyPI and npm have already had issues with malicious packages being uploaded. On the other hand, I haven't found the number of dependencies to be a problem for Python packages available in the OS package repositories, and I'm not aware of any such incidents with those repositories, unlike PyPI and npm.
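
One way to get a feel for the size of that transitive tree, on either side of the fence (pipdeptree is just one tool for this; the package names are examples):

    # PyPI side: show the full dependency tree of an installed package
    pip install pipdeptree
    pipdeptree --packages requests

    # OS side: the same question asked of the distro's metadata
    dnf repoquery --requires --resolve python3-requests
    apt-cache depends python3-requests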


The last thing Linux package maintainers need is the entirety of PyPI dumped in their lap to package and maintain.



