Hacker News | jibal's comments

Of course science is consensus based ... consensus is a fundamental part of the scientific process, which is conducted by a community of scientists. Consensus is the end result of attempts at reproducibility and falsification, of the ongoing process by which scientists challenge the claims and purported findings of other scientists. Without it, all you have are assertions from which people can pick and choose based on their biases (as we see, for instance, with people who deny climate or vaccine science by cherrypicking claims).

https://en.wikipedia.org/wiki/Scientific_consensus

https://skepticalscience.com/explainer-scientific-consensus....

https://tomhopper.me/2011/11/02/scientific-consensus/

And even if you reject consensus as being essential to science, calling the consensus view "the non scientific view" is obviously mistaken, a basic error in logic.

This is all well understood by working scientists so I'm not going to debate it or comment on it further.


No it isn't and that clickbait article doesn't say it is.

What makes it a clickbait article?

>There is a federal law that prohibits people from communicating with dolphins.

>It’s called the Marine Mammal Protection Act. Signed in 1972 by President Richard Nixon, the federal law was created to protect marine mammals from being hunted, harassed, captured or killed.

>In a sense, talking to or communicating with dolphins could qualify as harassment under the Marine Mammal Protection Act.

>There are two levels of harassment, according to the National Oceanic and Atmospheric Administration. Harassment at one level is considered “any act of pursuit, torment, or annoyance that has the potential to injure a marine mammal or marine mammal stock in the wild.”

>On another level, harassment is defined by the NOAA as “acts having the potential to disturb (but not injure) a marine mammal or marine mammal stock in the wild by disrupting behavioral patterns, including, but not limited to, migration, breathing, nursing, breeding, feeding, or sheltering.”


No, the whole point is to eliminate dependencies that they have to maintain. "not obligate" really doesn't mean anything if it's available as a backend--the obligation is on the Zig developers to keep it working, and they want to eliminate that obligation.

And the original question was "how will they reinvent the man-years of optimization work that went into LLVM in their own compiler infrastructure?" -- the answer is that Andrew naively believes that they can recreate comparable optimization.

There are a whole lot of misstatements about Zig and other matters in the comments here by people who don't have much knowledge about what they are talking about--much of the discussion of using low-level vs high-level languages for writing compilers is nonsense. And one person wrote of "Zig and D" as if those languages are comparable, when D is at least as high level as C++, which it was intended to replace.


First I was naive to believe I could make a new programming language, then I was naive to believe it would be anything but a toy project, then I was naive to believe that we could make our own backends for debug mode, now I'm naive to believe that we can add optimizations to the pipeline. It's getting old. Just because you lack the creativity, willpower, and work ethic to accomplish something, doesn't mean I do.

I admire your creativity, willpower, and work ethic, and a few other things about you, but I don't admire reactionary garbage like this ... I'm actually rather shocked by it and how it leans heavily on the strawman "I'm naive to believe that we can add optimizations to the pipeline" which is not the statement that was made, but I will maintain my high regard for you and your efforts despite it ... no human is perfect. I have a lengthy list of technical brilliancies in Zig that I admire that I won't bore you with but do often bore others with.

At least you acknowledge that I am correct about your belief, whereas someone else said I was exactly wrong.

As for me, while I had a successful software development career spanning 6 decades, received a mention in a two-digit RFC, and hold several networking patents, my best years are far behind me, but even in my heyday I couldn't hold a candle to your creativity, willpower, work ethic, or productivity ... but how is that at all relevant?


> the answer is that Andrew naively believes that they can recreate comparable optimization.

That's exactly wrong.

> There are a whole lot of misstatements about Zig and other matters in the comments here by people who don't have much knowledge about what they are talking about.

Well spoken. You should look in the mirror.


To clarify, my statement was based on comments I have seen and heard from Andrew Kelley when discussing this subject. I can't locate those at the moment, but here is https://news.ycombinator.com/item?id=39156426 by mlugg, a primary member of the Zig development team (emphasis added):

"To be clear, we aren't saying it will be easy to reach LLVM's optimization capabilities. That's a very long-term plan, and one which will unfold over a number of years. The ability to use LLVM is probably never going away, because there might always be some things it handles better than Zig's own code generation. However, trying to get there seems a worthy goal; at the very least, we can get our self-hosted codegen backends to a point where they perform relatively well in Debug mode without sacrificing debuggability."

The current interim plan (which I think was developed after the comments that I heard from Andrew, perhaps in recognition of their naivete) is for Zig to generate LLVM bitcode files that can be passed to a separate LLVM instance as part of the build process. Is that "a first-class supported backend target for compilation"? I suppose it's a matter of semantics, but that certainly won't be the current LLVM backend that does LLVM API calls.

P.S. It may be helpful to read through https://github.com/ziglang/zig/issues/13265


> The current interim plan...

What do you mean by "interim"? As I explicitly stated in the comment you quoted, it has never been, and likely never will be, planned for the Zig compiler to become incapable of using LLVM. The LLVM backend still sees plenty of active development by the core team [0] -- that's perfectly compatible with improving the experience of users (including ourselves) by avoiding unnecessary uses of LLVM [1].

> ...is for Zig to generate LLVM binary files that can be passed to a separate LLVM instance as part of the build process. Is that "a first-class supported backend target for compilation"? I suppose it's a matter of semantics, but that certainly won't be the current LLVM backend that does LLVM API calls.

I think you are incorrectly assuming that we currently make heavy use of the LLVM API. As indicated by #13265 being closed, that is not true. The Zig compiler already generates bitcode by itself, without touching the LLVM API. The only thing we actually use the LLVM API for is feeding that bitcode to LLVM, which can easily be done by invoking a CLI instead. Users quite literally would not be able to tell if, for instance, we changed the compiler to pass the bitcode to Zig's embedded build of Clang over CLI.
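As a rough sketch of what that flow looks like (file names are illustrative; this assumes a zig and clang on your PATH):

```shell
# Illustrative only: Zig emits LLVM bitcode itself, with no LLVM API calls;
# a separate LLVM/Clang instance can then consume that bitcode.
zig build-obj -femit-llvm-bc=hello.bc hello.zig   # Zig writes hello.bc directly
clang -c hello.bc -o hello.o                      # any LLVM toolchain compiles the bitcode
```

Whether the bitcode is handed to a linked-in LLVM or to an external process like this is invisible to the user, which is the point being made above.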

[0]: https://ziglang.org/devlog/2026/#2026-04-08

[1]: https://ziglang.org/download/0.15.1/release-notes.html#x86-B...


Fitbit Inspire 3?

The location of the Earth is completely irrelevant. "closer" and "farther" refer to the center of the galaxy.

Earth isn't relevant. The stars at the center of the galaxy developed first, and development proceeded from the inside out, so the youngest stars are on the edge ... then they get older from there on out, as the stars beyond the edge broke away from the galaxy. The bottom of the age U is the location of the formative edge.

It's not objectively false, people just can't read.

> the high-caliber hurricanes that, before climate change, didn’t come to the Ridge


You're making the parent's point.

These recent storms only got to Cat4.

Similar storms hit the aforementioned areas in 2004-05 including Cat4.

How do these revelations not contradict the article?


Which people? And no, it's not defined that way: "radiation having a wavelength between about 700 nanometers and 1 millimeter"

I don't think you understand how jokes work. They are mostly "distortions" of real dialog or events to add incongruous or absurdist elements. Here, Hardy's not uncommon momentary doubt about whether a statement really was obvious, while faintly amusing, is made into a joke by turning the momentary doubt into a 15-minute excursion. People then riff on the joke by turning that excursion into a mathematician presenting an elaborate proof that a statement is "obvious", quite contrary to the meaning of "obvious".

> I don't think you understand how jokes work.

That makes sense, I was just born yesterday.


loser

(Didn't you notice being mocked for the spelling error?)

