lerno's comments | Hacker News

I just have to add another reflection:

One thing that tends to be overlooked when discussing changes is the ecosystem effect of frequent changes.

A language that breaks frequently doesn't just impose upgrade work on apps; it also discourages the creation of long-lived libraries and tools. Anything that sits between the language and the user (linters, bindings, frameworks, teaching material, tutorials, etc.) has to "chase the language" to some degree.

This means that the ecosystem will skew toward very actively maintained libraries and away from "write once, then leave it alone" libs. This trade-off is perhaps reasonable during early language design, but it's worth acknowledging that it has real consequences for ecosystem growth.

One should note that other newer languages have put significant effort into minimizing this churn, precisely to allow the latter type of ecosystem to also form. So it's kind of an experiment, and it will be interesting to see which approach ends up producing the larger ecosystem over time.


An example of this is the Blender addon ecosystem. Blender moves very fast, with breaking API changes every few versions. Now I am not an addon developer myself, but from the GitHub issues I follow, the changes are fairly often trivial to make.

Yet, someone has to make them. Ideally it's the addon's creator; sometimes it's the users who do it, when the addon is no longer maintained (at least for trivial changes).

It kinda works that way, but it's also something of a gamble for the user. When you see a new addon (and a new addon developer), you can't know whether they're going to stick with it or not.

If you have to pay for the addon, it's more likely the developer will maintain it, of course. But that's no guarantee either.


Zig's worth it though. I don't care for libraries that aren't maintained anyway. If they don't maintain it, it's just bad anyway. Why do you always nag about Zig. Stop shilling your C3 here.


This kind of thinking is popular in the web world, where browsers get an update every 3 days and you don't control the hosting services, so constant maintenance is unavoidable.

But in the world of desktop development it's possible for a library to be "done": a 100% stable codebase going forward, requiring no maintenance. And that's not bad; it's actually good.

Requiring every dependency to be constantly maintained is a massive drain on productivity.


Once Zig hits 1.0 it will essentially be done. They don't plan on making further changes to the language, so they want to get it right while they can.


Looks like all the 2 users of C3 came here to complain about Zig. Why should we listen to you?


I've never even heard of C3, I use C++ and Jai.

I've never used Zig either, I just disagree that requiring constant maintenance is a good thing.


> However we value explicitness and allow the developer to do anything they need to do*

* except for having unused variables. Those are so dangerous the compiler will refuse the code every time.


They are indeed dangerous, and I think this is a pretty good example of why.

https://andrewkelley.me/post/openzfs-bug-ported-zig.html


Don't know if it's still on the table, but Andrew has hinted that the unused-variable error may in the future still produce an executable artefact while returning a nonzero exit code from the compiler. And truly fatal errors would STILL produce an executable artefact too, just one that prints "sorry, this compilation had a fatal error" to stdout.


It’s hard to say that one needs unused variables.


If I comment out sections of code while debugging or iterating, I don't want a compile error for some unused variable or argument. A warning, fine, but this happens to me so frequently that the idea of unused variables being an error is insane to me.


It is insane and you are completely right. This has been a part of programming for over 50 years. Unfortunately you aren't going to get anywhere with zig zealots, they just get mad when confronted with things like this that have no justification, but they don't want to admit it's a mistake.


But even the solution would be trivial: have a separate 'prod' compiler flag. With it, make these errors; without it, make them warnings.

Problem solved, everyone happy.


i think the plan is to make no distinction between error and warning, but have trivial errors still build. that said i wouldn't be surprised if they push that to the end because it seems like a great ultrafilter for keeping annoying people out so they don't try to influence the language.


You are right of course, the solution is trivial.

They also made a carriage return crash the compiler, so it wouldn't work with any default text files on Windows, and then they blamed the users for using Windows (with their own Windows version of the compiler!).

It's not exactly logic land, there is a lot of dogma and ideology instead of pragmatism.

Some people would even reply that they were glad it made life difficult for Windows users. I don't think they had an answer for why there was a Windows version in the first place.


I'm not sure why you wouldn't make your compiler accept CRs (a weird design decision), but fixing it on the user side isn't exactly hard either. I don't know an editor that doesn't have an option for using LF vs CRLF.

The unused variable error is legitimately really annoying though, and has me inserting `_ = x;` all over the place and then forgetting to delete it, which is imo way worse than just... having it be a warning.


> I don't know an editor that doesn't have an option for using LF vs CRLF.

And I don't know any other languages that don't parse a carriage return.

The point is that it was intentionally done to antagonize Windows, even though they put out a Windows version. Some people defend this by saying it's easy to work around; some defend it by saying Windows users should be antagonized.

No zig people ever said this was a mistake, it was all intentional.

I'm never going to put up with behavior like that with the people making tools actively working against me.


> And I don't know any other languages that don't parse a carriage return.

fair enough.


Same for kernel drivers


It’s good to see that this is finally addressed. It’s been a well known broken part of the language semantics for years.

There are similar hidden quirks in the language that will need to be addressed at some point, such as integer promotion semantics.

To address the question about stability: the Zig community is already used to Zig breaking between 0.x versions. Unlike competitors such as Odin or my own C3, there is no expectation that Zig is trying to minimize upgrading problems.

This is a cultural thing; it would be no real problem to be clear about deprecations, but in the Zig community it's simply not valued. In fact, it's a source of pride to be able to adapt as fast as possible to new changes.

I like to talk about expectation management, and this is a great example of it.

In discussions, it is often falsely argued that "Zig is not 1.0, so breaks are expected" in order to justify the frequent breaks. However, there are degrees to how you handle breaks, and Zig is clearly in the "we don't care to reduce the work" camp.

If someone is trying to get a more objective look at the Zig upgrade path, then it’s worth keeping in mind that the tradition in Zig is to offload all the work on the user.

The argument, which is frequently voiced, is that "breaking things will make the language better, so it's good that there are language breaks".

It is certainly true that breaking changes are needed, but most people outside of the Zig community would expect it to be done with more care (deprecation paths etc)

Secondly, it should perhaps be a concern that Zig, now 10 years old, still ships solidly breaking changes every half year.

10 years is the common point where languages go 1.0. However, the outlook for a Zig 1.0 is bleak from what I gather from Zig social forums: the most optimistic estimate I’ve heard is 2029 for 1.0.

This means that in the future, projects using Zig can still expect any libraries and applications to bitrot quickly if they are not constantly maintained.

Contrast this with Odin (9 years old), which is essentially 1.0 already and has been stable for several years.

Maybe this also explains the difference in actual output. For example, the number of games I know of written in Odin is somewhere between 5 and 10 times the number of Zig games. Factoring in that Zig has maybe 5 or 10 times as many users, that makes Odin users somewhere between 25 and 100 times as likely to have written a playable game.

There are several explanations as to why this is: we could discuss whether the availability of SDL, Raylib, etc. is better on Odin (then why is Zig less friendly?), maybe Odin has better programmers (then why do better programmers choose Odin over Zig?), maybe it's just easier to write resource-intensive applications in Odin than in Zig (then what do we make of Zig's claim of optimality?).

If we look past the excuses made for Zig ("it's easy to fix breaks", "it's not 1.0") and the hype ("Zig is much safer than C", "Zig makes me so productive") and compare it with Odin on actual productivity, stability and compilation speed (neither C3 nor Odin requires hundreds of GB of cache to compile in under a second using LLVM), then Zig is not looking particularly good.

Even things like build.zig, often touted as a great thing, make it really hard for a Zig beginner ("to build your first Hello World, first understand this build script in non-trivial Zig"). Then for IDEs, something as simple as reading the configuration of what will be used for building is suddenly hidden behind an opaque Zig script. These trade-offs are rarely talked about, as both criticism and hype are usually based on surface rather than depth.

Well, that’s long enough of a comment.

To round it off I'd like to end on a positive note: I find the Zig community nice and welcoming. So if you're curious about Zig, do try it out; don't let others' opinions, including mine, keep you from trying things for yourself.

If you want to evaluate Zig against competitors, I’d recommend comparing it to D, Odin, Jai and C3.


it seems to me like the acceptance of breaking changes by the Zig community is more coming from "it's worth it to get a nice, polished language" than some "source of pride" which i personally have not seen in my time in a/the Zig discord server.

what is the problem with zig being developed for 10+ years? if people want stable languages there are stable languages to be used. if a language like zig is not achievable in less than 10 years, should it just not be developed from the start?

i think your problems with build.zig are overstated. where do you see someone saying "to build your first Hello World, first understand this build script in non-trivial Zig"? you can literally just do `zig run file.zig`, so if someone is advocating for that then i think many would agree they are teaching the wrong way. i wonder if you saw an example project with a build script that was intended to show the power and possibilities of Zig rather than to be a starter guide.


> Secondly, it should perhaps be a concern for Zig, now at 10 years old, to still produce solidly breaking code every half year.

Not at all, if the team needs 30 more years they should take it.

> However, the outlook for a Zig 1.0 is bleak from what I gather from Zig social forums: the most optimistic estimate I’ve heard is 2029 for 1.0.

Funny that you see it as bleak when most of the community sees it as the most exciting thing in systems programming happening right now.

I think your comment is in bad faith. All the big Zig projects say that the upgrade path is never a main concern; just read the HN comments here or on other Zig threads. People ask about this a lot and maintainers always answer.


> Not at all, if the team needs 30 more years they should take it.

Yes, I understand that is the opinion in the Zig community. As an outsider, it seems odd to me to pick a language that I constantly need to maintain.

>> However, the outlook for a Zig 1.0 is bleak from what I gather from Zig social forums: the most optimistic estimate I’ve heard is 2029 for 1.0.

> Funny you see it as bleak when most of the community sees it as the most excitinh thing in systems programming happening right now.

You misread that one. I was talking about the odds of seeing a 1.0 version of Zig soon.

> I think your comment is in bad faith. All the big Zig projects say that the upgrade path is never a main concern; just read the HN comments here or on other Zig threads. People ask about this a lot and maintainers always answer.

Maybe you didn't read what I wrote carefully enough. This is part of the protectiveness from the Zig community that prompted me to write in the first place.

WITHIN the Zig community it is deemed acceptable for Zig upgrades to break code. Consequently it becomes simple survivor bias that people who use Zig for larger projects don't think that this is a major concern BECAUSE IF THEY FELT IT WAS A CONCERN THEY WOULD NOT USE ZIG.

Whether programmers at large feel that this is a problem is still unknown, since Zig has not yet reached the point of general adoption (where people use Zig because they have to, rather than because they want to).

However, it is INCORRECT to state that just because a language is not yet 1.0 it needs to break older code aggressively without deprecation paths. As an example, Odin removed the old `os` module and replaced it with the new "os2". This break was announced half a year in advance and lots of thought was put into reducing work for developers: https://odin-lang.org/news/moving-towards-a-new-core-os/

In the case of C3, breaking changes only happen once a year with stdlib going through the general process of deprecating functions long before removing them.

I wanted to highlight how these are quite different approaches. For established languages this is of course even more rigorous, but neither C3 nor Odin is 1.0, and both still see this as valuable, so their communities end up expecting it.

So please understand that when you say "it's never a main[sic] concern", this is simple survivor bias.


> In the case of C3, breaking changes only happen once a year with stdlib going through the general process of deprecating functions long before removing them.

Zig releases also happen about once a year, each of them a breaking change. I don't really get how you're defending yourself here; do you think it's any "different"?


Zig is great. If frequent updates are the price for that, so be it.

Honestly it’s kind of narrow-minded not to appreciate how different its approach to low-level programming is. You either see it or you don’t.

C3 is basically just C with a few extra bells and whistles. Still the same old C. Why would I use that when Zig exists? Actually never mind, don’t bother answering.

And the doomer posts are so predictable. That’s usually when you know Zig is doing something right.


A pretty obvious explanation as to why Odin has more games written in it is that the language is somewhat explicitly marketed towards that use case, even going as far as to have a vendor library collection that includes many popular game dev libraries.


I'm using games because they pop up more often (game jams, etc.), but we'd see the same if we looked at utility apps. Do you want to broaden that to "Odin is more explicitly marketed towards writing applications"? If so, what would that say about Zig?


I would begin by questioning the premise. Do you have actual numbers on this? I'm not really aware of any widely adopted software that's written in Odin; I can name multiple in the case of Zig.


C3 0.7.10 introduces constdef, making a clear syntactic and semantic distinction between proper enums and "enum as a collection of constants".

Other improvements in this release:

- Much improved MSVC cross compilation

- Quality-of-life fixes

- Custom LLVM builds to reduce external dependencies


C3 is a language designed as an evolution of C, without retaining strict backwards compatibility, but excellent interop with C.

This version departs from its rather uncommon module-based generics, but doesn't go all the way to C++ style templates. Instead generics can be grouped for common constraint checking on template parameters and instantiation. As usual it also contains fixes and additions to the stdlib.

Some older posts on C3:

- https://news.ycombinator.com/item?id=46463921

- https://news.ycombinator.com/item?id=43569724

- https://news.ycombinator.com/item?id=24108980

- https://news.ycombinator.com/item?id=27876570

- https://news.ycombinator.com/item?id=32005678

Try out C3 in the browser:

- https://www.learn-c3.org

Here are some interviews on C3:

- https://www.youtube.com/watch?v=UC8VDRJqXfc

- https://www.youtube.com/watch?v=9rS8MVZH-vA

Here is a series doing various tasks in C3, slightly dated:

- https://ebn.codeberg.page/programming/c3/c3-file-io/

Repository with link to various C3 resources and projects:

- https://github.com/c3lang/c3-showcase

Some projects:

- Gameboy emulator https://github.com/OdnetninI/Gameboy-Emulator/

- RISCV Bare metal Hello World: https://www.youtube.com/watch?v=0iAJxx6Ok4E

- "Depths of Daemonheim" roguelike https://github.com/TechnicalFowl/7DRL-2025


I didn't see any similarities to C3, quite the opposite.


C2 (http://c2lang.org) similarly compiles to C, but to arguably more readable C code from what I can see. The benefits are (1) easy access to pretty much any platform with little extra work, (2) significantly less long-term work compared to integrating with LLVM or similar, and (3) if it's readable enough, it might be submitted as "C code" in working environments which mandate C.


It would be interesting to hear the motivation for it.


But of course move semantics and destructors affect everything. If the goal is to call and be callable from C without special constructs, how would you make the C code respect move semantics and destructors?


C++ is already callable from C, you can make functions have any call signature that you want. What is the actual problem?


