
Headlight regulation obviously stopped making any sense at all when they allowed bigger cars to put them up higher. Like you are gonna regulate all kinds of beam parameters and then miss the most important thing.

There is a reason US school buses look like WW2 troop transports and the long-haul trucks are museum pieces in every respect. It's not even NIH, it's just protectionism.

Sorry, but I think you just don't know a lot about LLMs. Why did they start spamming code with emojis? It's not because that's what people actually do, something that would be in the training data. It's because someone reinforcement-trained the LLM to do it by asking clueless people whether they prefer code with emojis.

And so at this point the excessive bullet points and similar filler trash are also just an expression of whatever stupid people think they prefer.

Maybe I'm being too harsh, and it's not that the raters are stupid in this constellation; rather, it's the people who think you can improve an LLM by asking raters to make a few very thin judgements.


I know the style that most LLMs are mimicking quite well, and I also know people who wrote like that prior to the LLM deluge that is washing over us. The reason people are choosing to make LLMs mimic those behaviours is that the style used to be associated with high-effort content. The irony is that it is now associated with the lowest-effort content. The further irony is that I have stopped proofreading my comments and put zero effort into styling or flow, because right now the only human thing left to do is make low-effort content of the kind only a human can.


Ironically, there is precedent of Google caring more about this. When they realized the location timeline was a gigantic fed honeypot, they made it per-device, locally stored only. No open letters were written in the process.

They made the feature, now they get to live with it. So they can spare us the feigned surprise and outrage.

Instead of writing open letters they could of course do something about it. Even Google stopped storing your location timeline on their servers and now keeps it per-device only.


We’re talking about two different things. It would be like Gmail not storing your emails. Expecting ChatGPT to not store your chats is ridiculous.

There are some good points about how limited the available entropy is, but the article entirely skips over who the fuck needs hotplug memory in the first place. That is a very niche feature that has no application in the vast majority of devices and should never inform the defaults.

It made it very clear: virtualization builds, where memory can be dynamically added and removed by the emulator. I haven't done this with Android, but it can be quite useful for running lots of test emulators; they can adapt their memory to the workload so as not to overwhelm the host.
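
For reference, this is roughly what that looks like with QEMU's virtio-mem (a sketch; device names and sizes are illustrative, and nothing here is specific to the Android emulator):

  # Boot with 2G of fixed RAM plus a virtio-mem device that can grow to 8G
  # (add your usual disk/kernel options):
  qemu-system-x86_64 -m 2G,maxmem=10G \
    -object memory-backend-ram,id=vmem0,size=8G \
    -device virtio-mem-pci,id=vm0,memdev=vmem0,requested-size=0
  # Resize the guest's memory at runtime from the QEMU monitor:
  # (qemu) qom-set /machine/peripheral/vm0 requested-size 4G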

So you agree, it has no place or purpose when running on an actual device.

Aaand 10 years later you just learned to compare versions by equality instead of being impossibly clever.

There is no free lunch: the same people who can't be bothered to make atomic, semantic commits are the same people who will ruin your bisect with a commit that doesn't build or has some other unrelated failure. People who don't care can't be fixed by tools.
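
(For what it's worth, git can route around the non-building commits: an exit status of 125 from the command given to `git bisect run` marks a commit as untestable and skips it. A sketch, with a made-up tag and test script:)

  git bisect start HEAD v1.0
  # exit 125 = "skip this commit"; any other non-zero exit = "bad"
  git bisect run sh -c 'make || exit 125; ./run_tests'

But yes, that only papers over the problem after the fact.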

The advice around PRs rings hollow; after all, they were invented by the very people who don't care, which is why they show all changes by default and hide the commits away, commit messages buried five clicks deep. And because this profession is now filled with people who don't care, add the whole JIRA-ticket-and-fix-version rigmarole on top: all kinds of things that show up in some PM's report but not in my console when I'm fixing an issue that requires history.


It is such a non-problem that it forced them to hack a „fuck you, this Python is owned by the distribution, not you“ message into pip, requiring you to agree to „breaking your system“ to use it.

Of all the languages, Python in the base system has been an unmitigated garbage fire.


> it forced them to hack a

It was not their action, nor is it hacked, nor is the message contained within pip.

The system works by pip voluntarily recognizing a marker file, the meaning of which was defined by https://peps.python.org/pep-0668/ — which was the joint effort of people representing multiple Linux distros, pip, and Python itself. (Many other tools ignore the system Python environment entirely, as mine will by default.)
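
Concretely, the marker is just a file named EXTERNALLY-MANAGED sitting in the stdlib directory. The path and wording below approximate Debian's; see the PEP for the exact format:

  $ cat /usr/lib/python3.11/EXTERNALLY-MANAGED
  [externally-managed]
  Error=To install Python packages system-wide, try apt install
   python3-xyz, where xyz is the package you are trying to
   install. Alternatively, create a virtual environment.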

Further, none of this causes containers to be necessary for installing ordinary projects.

Further, it is not a problem unique to Python. The distro simply can't package all the Python software available for download; it's completely fair that people who use the Python-native packaging system should be expected not to interfere with a system package manager that doesn't understand that system. Especially when the distro wants to create its tools in Python.

You only notice it with Python because distros don't come with JavaScript, Ruby, etc. pre-installed in order to support the system.


Well, the essential system Python should be in /usr/sbin and read-only (insofar as Python allows that, with its __pycache__ spam).

The fact that users have to keep up with multiple PEPs, error messages, --single-version-externally-managed, --break-system-packages, config files everywhere, stealth packages in .local, and uv to paper over all of this shows that Python packaging is completely broken.


> the essential system Python should be in /usr/sbin

There's still quite a bit you can do with the "system Python". Mine includes NumPy, bindings for GTK, QT5 and QT6, Freetype, PIL....

> insofar as Python allows that, with its __pycache__ spam

This is, to my understanding, precisely why the standard library is pre-compiled during installation (when the process already has sudo rights, and can therefore create the `__pycache__` folders in those locations). This leverages the standard library `compileall` module — from the Makefile:

   @ # Build PYC files for the 3 optimization levels (0, 1, 2)
   -PYTHONPATH=$(DESTDIR)$(LIBDEST) $(RUNSHARED) \
    $(PYTHON_FOR_BUILD) -Wi $(DESTDIR)$(LIBDEST)/compileall.py \
    -o 0 -o 1 -o 2 $(COMPILEALL_OPTS) -d $(LIBDEST) -f \
    -x 'bad_coding|badsyntax|site-packages' \
    $(DESTDIR)$(LIBDEST)

> The fact that users have to keep up with multiple PEPs, error messages, --single-version-externally-managed, --break-system-packages, config files everywhere, stealth packages in .local, and uv to paper over all of this shows that Python packaging is completely broken.

Please do not spread FUD.

They don't have to do any of that. All they have to do is make a virtual environment, which can have any name, and the creation of which is explicitly supported by the standard library. Further, reading the PEPs is completely irrelevant to end users. They only describe the motivation for changes like --break-system-packages. Developers may care about PEPs, but they can get a better summary of the necessary information from https://packaging.python.org ; and none of the problems there have anything to do with Linux system Python environments. The config files that developers care about are at the project root.
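
To illustrate (the path and package name below are placeholders), the whole workflow can be as little as:

  python3 -m venv ~/.venvs/scratch
  ~/.venvs/scratch/bin/pip install requests
  ~/.venvs/scratch/bin/python my_script.py

Note that no activation step is needed; more on that below.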

Today, on any Debian system, you can install an up-to-date user-level copy of yt-dlp (for example) like so, among many other options:

  sudo apt install pipx
  pipx install yt-dlp

You only have to know how one of many options works in order to get a working system.


> All they have to do is make a virtual environment

Okay, so to create a five-line script I have to make a virtual environment. Then I have to activate and deactivate it whenever I use it. And I have to remember to update the dependencies regularly. For my five-line script.

Seems to me the companies managing mloc-codebases pushed their tradeoffs on everyone else.


I found the whole 'activation' model deeply confusing and annoying. The underlying mechanism of venv is simple and elegant, but the original interface is a terrible piece of design.

You, too: please do not spread FUD.

> Okay, so to create a five-line script... For my five-line script.

I can guarantee that your "five-line script" simply does not have the mess of dependencies you imagine it to have. I've had projects running to thousands of lines that use nothing but the standard library.

> Then I have to activate and deactivate it whenever I use it.

No, you do not. Activation scripts exist as an optional convenience because the original author of the third-party `virtualenv` liked that design. They just manipulate some environment variables, and normally the only relevant one is PATH. Which is to say, "activation" works by putting the environment's path to binaries at the front of the list. You can equally well just give the path to them explicitly. Or symlink them from somewhere more convenient for you (like pipx already does for you automatically).
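
In other words, this is roughly all that sourcing bin/activate does (prompt tweaks omitted), and you can skip it entirely:

  # what "activation" boils down to:
  export VIRTUAL_ENV="$HOME/.venvs/scratch"
  export PATH="$VIRTUAL_ENV/bin:$PATH"
  # the no-activation equivalent, calling the interpreter directly:
  ~/.venvs/scratch/bin/python my_script.py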

> And I have to remember to update the dependencies regularly.

No, you do not in general. No more so than for any other software.

Programs do not stop working because of the time elapsed since they were written. They stop working because the world around them changes. For many projects this is not a real concern. (Did you know there is plenty of software out there that doesn't require an Internet connection to run? It is automatically invulnerable to, say, web sites changing their APIs.) You don't have to remember to keep on top of that; when the program stops working, you check whether an update resolves the problem.

If your concern is with getting security updates (for free, applying to libraries you also got for free, all purely on the basis of the good will of others) for your dependencies, that is ultimately a consequence of your choice to have those dependencies. That's the same in every language that offers a "package ecosystem".

This also, er, has nothing to do with virtual environments.

> Seems to me the companies managing mloc-codebases pushed their tradeoffs on everyone else.

Not at all. They are the ones running into the biggest problems. They are the ones who have created, or leveraged, massive automation systems for containers, virtualization etc. — and probably some of it is grossly unnecessary, but they aren't putting in the time to think about the problem clearly.

And now we have a world where pip gets downloaded from PyPI literally billions of times a year.


Thank you! Exactly what I wanted to explain.


Yet if I write a Dockerfile and need to use Perl, system Perl is fine.

If I need a Python script, I have to arrange for all the RUN lines to live inside a virtual environment inside the container.
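
The usual workaround (a sketch; the base image and package are illustrative) is to create the venv once and put its bin directory on PATH, so every subsequent RUN picks it up without any activation step:

  FROM debian:bookworm
  RUN apt-get update && apt-get install -y python3-venv
  RUN python3 -m venv /opt/venv
  ENV PATH="/opt/venv/bin:$PATH"
  RUN pip install requests

It works, but it is still ceremony that the Perl case simply doesn't have.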


People are too harsh on this. It's not hard to install a version manager and set your primary Python to that, which is just good hygiene.
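
For example, with pyenv (assuming it is already installed per its README; the version number is arbitrary):

  pyenv install 3.12.3
  pyenv global 3.12.3
  python --version   # now resolves to the pyenv-managed interpreter, not the distro's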

My understanding of the reasoning is that Python-based system packages whose dependencies are managed through pip or the like present a system-stability risk. So they chose this more conservative route, as is their MO.

Honestly, if there is one distribution to expect these kinds of shenanigans from, it would be Debian. I don't know how anybody chooses to use that distro without adding a bunch of APT sources and a language version manager.


Yes, because then you're starting to use non-distro Python packages. If you want to do that, use a virtualenv; there is no other safe way (even if there were no Python in the base system).


Yes, the distro people are strong believers in virtual environments as best practice - for you, not them.


There's a good reason for this. The average user has no idea, and doesn't care, what language some random distro-packaged program is written in. They want to be able to run ubxtool or gdal_calc or virt-manager or whatever without setting up a virtual environment. Python developers, on the other hand, should be adept at such things, should they choose to use a non-distro-packaged version of something.

The tricky part is when "users" start using pip to install something because someone told them to.


This should become the official error message!

