C++ at the End of 2022 (cppstories.com)
114 points by signa11 on Jan 2, 2023 | 210 comments


I think the most interesting thing about C++ in 2022 is that big C++ names at Google are saying in public that if you can use Rust instead of C++, you should. https://github.com/carbon-language/carbon-lang/blob/trunk/do...


The context behind this endorsement is that senior engineers at Google are unhappy with the decisions taken by the C++ committee. The big one is described in this document - What is ABI, and What Should WG21 Do About It? (https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p20...).

The document ends with “I call on WG21 to make a conscious and explicit choice here, with the clear awareness that status quo is an endorsement of indefinite ABI stability. If we wish to be the systems language known for performance, we have to act now. If not, we have to be aware that we are giving up on some important user bases.”

The committee chose an endorsement of indefinite ABI stability. A defensible technical choice, certainly. However, not one that works for Google.

That explains why they’re going ahead with the development of Carbon, as well as their endorsement of Rust. They don’t think the future of C++ development at Google is bright.

This has other effects too. Someone in this thread pointed out clang is lagging behind in implementing C++20 compared to MSVC and GCC. This might be because Google is committing fewer resources to the maintenance and improvement of clang.

Further reading: Difficulties improving C++ (https://github.com/carbon-language/carbon-lang/blob/trunk/do...)


There was a bit of discussion yesterday about 'immaturity' in the programming community. To me, this is an example of a broader immaturity.

Mature people/organizations are willing to give up their local minimum for greater good of the community (global minimum). This would include staying with a language where the standards don't always go your way.

Immature people/organizations say "I'm going to create my own language! With blackjack and hookers!". And the programming landscape fragments further...


> And the programming landscape fragments further...

This process is called evolution.

There are only two options when trying out new things: you fragment your language, or the language landscape.

Most languages don't want to have incompatible versions, so as a result new languages get created during the evolution process.


While true, I feel the pace of evolution is too fast at the moment.

I am involved in scientific programming, and it's getting kind of crazy. This is a domain where programming is purely utilitarian, and what you are programming is more important. Unfortunately, new languages are driven by people who love programming for the sake of programming, and are often employed in big corporations that can absorb the financial cost of ecosystem duplication and fragmentation.

The time scientists spend on 'infrastructure' is increasing, and I'm not sure that's always a good thing. (Some is good, of course).


>I am involved in scientific programming, and it's getting kind of crazy. This is a domain where programming is purely utilitarian, and what you are programming is more important.

In my experience this leads to a lot of bad programming by people who only care about the utilitarian aspect, not about maintainability, user experience or failure modes.

Scientific programming would benefit the most from a stricter language that catches errors early on, especially because most of the participants are usually not software engineers, who at least have some chance of catching errors in C++. Even they make a lot of mistakes.


In evolution, most branches are dead-ends (just like in programming languages). Carbon for example is by design an evolutionary dead-end designed to solve some problems for Google. They’re trying to sell it as a generic tool, but knowing Google’s track record it’s hard to say how many will bite.


> Carbon for example is by design an evolutionary dead-end designed to solve some problems for Google.

I would concur.

This actually shows nicely that the analogy to biological evolution holds up well.

Google is building some kind of cave fish. Biology also did such things when they were right for a given ecological niche.


A lot of people don't think that a frozen ABI is for the greater good of the community. It is specifically good for the set of people who cannot recompile their dependencies or cannot recompile their binaries. That's an ever-shrinking population.


> That's an ever-shrinking population.

Is that really so? I really can't imagine that "recompiling the world" when the ABI changes would be considered a normal thing to do or expect from users. Like, all of a sudden all the software that we use on our machines becomes subtly broken because, well, a new version of the C or C++ runtime library is out.

I really don't see how this can be considered a reasonable thing to do, and therefore I understand and support the slow and careful ABI increments, because the language is there to serve goals beyond being a hostage to a handful of big players.


> I really can't imagine that "recompiling the world" when the ABI is changed would be considered a normal thing to do or expect from users.

> I really don't see how this can be considered as a reasonable thing to do […]

Well, Linux distributions have done exactly this for decades, and it works just fine.


Except for the endless crying about hardware vendors not making drivers for Linux, because of the lack of a stable ABI.


That's not the language ABI. This is the same term used for two different things. Language ABI is stuff like "this is how name mangling works", "this is how parameters are passed", and "this is the widest integer type."


That makes no sense, as the C++ compiler ABI issue is independent of the OS.

Also the OS ABIs are usually C ABIs because there is no C++ ABI at all…


There is no such thing as C ABI either.


I know.

https://faultlore.com/blah/c-isnt-a-language/

That's why I worded it with "ABIs". ;-)


That's a completely separate problem.


And with the loose ABI model, how much more often they'd have to do that?


Does it matter?

It's not like some dude needs to operate a crank for it to happen…

The computer does the actual work.

You know, a computer, this machine that can automate tedious tasks.


I think it does. It also opens another big can of worms: maintaining and porting the existing software. And it opens another question, which is: how often should we allow the ABI to break? On a regular basis, a semi-regular basis, rarely, very rarely, etc.? And who defines this threshold?

What do you think the reasons are that ABI transition didn't happen already?


"How often" is indeed a decent question, but the ranges aren't too wide. It'd either be once per C++ release (every couple of years) or in some of the C++ releases (every five years or so). It isn't as if the committee would decide it's okay to increase the widest integer type every week.

The reasons why the ABI transition hasn't happened already is because C++ has a historical position of breaking the ABI only very very rarely. This is, in part, a legacy from a world where it really was going to be frustratingly difficult to get new builds of their libraries.


In an ideal world you would just need to "press a button". So this could happen every second day without any issue.

Of course things are more complex. Not everything is a Linux distribution where it's simple to rebuild the world.

But as the history of C++ shows, people managed to live with this issue for a very long time. So it's not a showstopper in any way.

Even other very conservative languages like Java found a balance for breaking changes. The "problem" C++ has is more of a mental one, imho.

Also, C++ could easily introduce the "editions" concept from Rust. This would give you the best of both worlds: you could improve future versions (or editions) but stay compatible with old ones at the same time.


Rebuilding trillions of LoC that are deployed on literally every computer architecture, dozens of toolchains, and OSes is a high-impact and high-risk undertaking with arguable benefits. Although it is technically possible, I don't think it is a process which "just happens"; it is a process which would probably take years to implement on each and every platform.

So, I don't think it's a "C++ mentality"; it's rather a difficult problem to solve.

> Also, C++ could easily introduce the "editions" concept from Rust. This would give you the best of both worlds.

We'll see in about 20 years' time if "editions" solve this problem and if Rust becomes as widespread as C++ is. LoC deployed in Rust are currently a statistical error in comparison to C++. FWIW, C++ also had a similar proposal and it didn't go through, AFAIK.


> Rebuilding trillions of LoC that are deployed on literally every computer architecture, dozens of toolchains, and OSes is a high-impact and high-risk undertaking with arguable benefits. Although it is technically possible, I don't think it is a process which "just happens"; it is a process which would probably take years to implement on each and every platform.

It was mentioned already that Linux distributions are doing exactly this. On every major compiler update.

> So, I don't think it's a "C++ mentality" but its rather a difficult problem to solve.

Sure it's not trivial to set up. But it's possible and done so for a long time already.

Sure, things could be made simpler. For example by using some stable intermediate representation that gets compiled on the target to the appropriate machine code. Oh, wait, all big platforms starting with mainframes do exactly this already… (Mainframes, the JVM, .NET, Android, Apple stuff, etc.)

> > Also, C++ could easily introduce the "editions" concept from Rust. This would give you the best of both worlds.

> We'll see in about 20 years' time if "editions" solve this problem and if Rust becomes as widespread as C++ is. LoC deployed in Rust are currently a statistical error in comparison to C++. FWIW, C++ also had a similar proposal and it didn't go through, AFAIK.

That's not an argument against the "editions" idea. Actually what you've said doesn't even go into the proposal.


> It was mentioned already that Linux distributions are doing exactly this. On every major compiler update.

I guess you're aware that not everything runs on Linux, nor are all Linux deployments vanilla Linux deployments. At this point I think you're either intentionally ignorant or completely unaware of the world outside your bubble environment.

> For example by using some stable intermediate representation that gets compiled on the target to the appropriate machine code. Oh, wait, all big platforms starting with mainframes do exactly this already… (Mainframes, the JVM, .NET, Android, Apple stuff, etc.)

Eh?

> That's not an argument against the "editions" idea. Actually what you've said doesn't even go into the proposal.

It is an argument, because suggesting to "easily introduce" an idea from another, immature language which (1) is not proven and (2) will take a long, long time before it is, is wishful thinking, if not nonsense and far from reality. Rust "editions" do not solve the problem because, well, to begin with Rust doesn't have the problem at the same scale that C++ does.

We'll see what C++ will do in this regard, but I wouldn't hold my breath. The disadvantages of breaking the ABI frequently currently outweigh the advantages, and my opinion is that it will stay that way for the foreseeable future.



Thanks for sharing the links but that's what I already said above in the comment.

> FWIW, C++ also had a similar proposal and it didn't go through, AFAIK.


Ah I missed that! Well now you have the link showing you’re right :)


It's not the users who would recompile the world, though, but the OS vendors. And really, in 2022 with a Threadripper, rebuilding the world is a matter of hours: my 2018 laptop can build a complete Yocto environment from scratch, from GCC down to a Qt GUI running on X11 or Wayland, in 8-ish hours IIRC, and it is waaay slower than a TR, so really there's no excuse not to rebuild.

The previous ABI would still be provided for running proprietary software on Linux, just like you can still install libstdc++5. On Windows, apps ship the C++ runtime along with them, so nothing changes for old apps. macOS users are used to compatibility breaking every 3-4 years and went through actual architecture changes once every decade, so obviously changing the C++ ABI is minor in comparison.


Please don't nitpick, I'm perfectly aware that the OS vendors would be the ones to recompile everything and users the ones to update their systems.

I don't agree with your sentiment as I read it as a little bit narrow-minded and I also consider the examples given to be bad, with design choices not quite feasible to be applied to generic and widespread community language such as C++.

> On windows apps ship the c++ runtime along with them so nothing changes for old apps

That combination is probably the worst of both worlds, so not exactly a shining example.

> MacOS users are used to compatibility breaking every 3/4 years

Apple has always been special and I wouldn't consider their choices, which are almost exclusively aligned with what their business wants to achieve, to be something to be followed. They can do whatever they want because they're under total control of almost everything on their platform. Unless when they aren't and when they start to develop their own programming languages. Pretty much the same story as with Google.


Which users are you even talking about then? If you don't take into account Windows and macOS you remove 99.9% of the people which are going to be impacted by this (and I say this as a full-time linux user for a decade).

> That combination is probably the worst of both worlds, so not exactly a shining example.

I really disagree. Windows's model of shipping DLLs along with the app is the only one that works and does not cause headaches for the end users who want to run their software from 15 years ago (e.g., me, my parents, my non-technical neighbours, etc.); this is the only thing that matters, the end-user experience.


> If you don't take into account Windows and macOS you remove 99.9% of the people which are going to be impacted by this (and I say this as a full-time linux user for a decade).

How about server-side code powering virtually any service that we're using? I think that your 99.9% would quickly become 0.01%. Desktop is important but irrelevant in this regard.

> I really disagree,

I beg to differ. I think it's really terrible for reasons I have no time to dig in through right now. Many of them are in fact quite obvious.


> How about server-side code powering virtually any service that we're using? I think that your 99.9% would quickly become 0.01%. Desktop is important but irrelevant in this regard.

The huge, huge, huge majority of the software I use for doing, like, useful stuff is offline. The internet could lose all interactive features tomorrow and be restricted to just displaying fixed hand-written HTML pages like it's 1993, and my day-to-day activities on my computer would not meaningfully change: making music with the software I develop, editing photos in Krita and Darktable, writing papers in TeXStudio, the occasional 3D / CAD project with Blender / KiCad / LinuxCNC, etc. etc.


In a world where dependencies are either delivered as source or through contract relationships with internet-connected businesses, I imagine that this is a normal thing. In the past you got code shipped to you on CDs. Today, I can get my dependencies via a network request.

If you don't have the ability to rebuild or otherwise get updated dependencies, you've got problems beyond ABI stability. "I cannot rebuild my dependencies" is unacceptable from a vuln management perspective, for example.


And we all know how good Google is at keeping interest in its side projects.


> The committee chose an endorsement of indefinite ABI stability.

It actually didn't. It chose to further delay an actual decision here. The effect is almost the same but it means different things when it comes to understanding how the committee thinks and whether they largely agree or not.


The longer the committee holds onto the current status - that we neither promise stability nor will break it - the more users who value performance over never running a compiler will leave.

The remaining members are increasingly likely to vote in favour of stability, despite that meaning performance overhead. By induction, the expected result should be that stability is chosen over performance in the future as well.


I think that is indeed true, but if that is the future I'd really like the committee to actually say it. Right now, the folks who really do depend on being able to link against code compiled a decade ago can't fully trust that the status quo will remain.


My personal opinion on the whole thing is that the decision to not standardize the ABI was correct, because it has been easy since forever to define a static binary interface, if you (the developer) choose to do so. See OpenGL, for example.

If the compiler randomizes the ABI of stuff that you didn't explicitly export as an interface, that shouldn't bother anyone. Unless, of course, you're using undocumented APIs. And even then, you can solve all those issues by statically linking against your dependencies.

In 20+ years of building distributed C++ systems I've just never seen inter-compiler ABI compatibility being an issue in practice.


Intra-compiler ABI is the problem. I have to support plugins someone else built with an old GCC. Break the ABI and I can't upgrade my code.


Then it seems like your plugin API was not specified well. With extern "C", the ABI will stay the same no matter which compiler is used.


Hindsight is 20/20. Though I will say that management at the time hid this aspect from the engineers, so we didn't know about the risk until it was too late.


> If the compiler randomizes the ABI of stuff that you didn't explicitly export as an interface, that shouldn't bother anyone.

I somewhat agree with the parent's point if we broaden "compiler" to include all of the ecosystem tools that care about the ABI.

E.g., debuggers really benefit from knowing about the ABIs being used.


Yes, but debuggers are useless without the source code. And if you have the latter, re-compiling to match the new ABI is easy.


> Yes, but debuggers are useless without the source code.

That's not true in my experience. Sometimes it's helpful to see the call stack even without symbols. And sometimes there's benefit in debugging at the disassembly level, even without access to the source code.

> And if you have the latter, re-compiling to match the new ABI is easy.

I think we're talking about the scenario where the code was already compiled to the new ABI, but the debugger doesn't understand the new ABI.

A few thoughts on this:

1) Depending on circumstances, even if you have access to the source code, you might not be able to recompile it for the sake of better debugging.

2) If the code is linked against libraries that use the new ABI, or uses compiler features that require the new ABI, you can't necessarily produce a build that uses only the old ABI.


Google isn't some grand authority. It's an old, tired and dysfunctional legacy IT company, like IBM or Oracle.


They aren't a grand authority, but they maintain a huge amount of C++ code.


C++ code using their flavor of strange guidelines... Google C++ is not how the rest of the world writes C++.


Right, every org and individual has their own subset of c++ they use and exclude. C++ may have standards but that hardly matters when projects/orgs/individuals have their own variant of the language that is used.

It’s a fallacy to think that there’s some idiomatic C++; there isn’t in practice.


Not more than Microsoft.


Microsoft has also officially and unofficially started replacing C++ with Rust.


So far it looks like experiments, besides a few projects like Azure Sphere SDK, Azure IoT (C# and Rust), and Rust/WinRT (which is even worse than C++/WinRT in regards to tooling in its current state, no authoring support for COM or VS integration).

Office and Windows teams love C++ and COM too much to use anything else.


> Google isn't some grand authority.

Agreed, but collaborative efforts like C++ language steering, Clang development, etc. can benefit from a large pool of contributors.

So IMHO, Google reducing investment in Clang could be a net loss for most Clang users.


They said the same thing for Java, Kotlin, and Go:

https://github.com/carbon-language/carbon-lang/blob/trunk/do...


As a Rust developer [1], I can't help but see C++ as a very dangerous Perl that has outlived its welcome.

C++ can do all kinds of wacky stuff, all of which feels bolted on as the language tried to grow support for each passing paradigm and fad. The syntax is arcane and makes PHP look downright delectable. Pointer sigil placement and const correctness (dangerously) matter, template compile errors look like alien machine code, and no amount of new best practices will save you from old C++ codebases.

I can't see new people reaching to learn C++. It will die with its current users. It's difficult to learn anyway, mostly because the materials and community are inaccessible and stuck in the 90's. And the important lessons on how to actually properly do memory management aren't enforced and only come from painful failures, direct tutelage, or reading an entire book on the subject twice over.

If all Rust had going for it was Cargo, an improved syntax, and the better docs and compiler messages, it would still win. Thankfully it's got so much more than that, and it fixes many of the systemic problems that C/C++ can't address.

[1] (in production, generating revenue)


> no amount of new best practices will save you from old C++ codebases

That's true.

But at some point, maintaining this code will be worth your weight in gold on a daily basis, like it currently is for COBOL.

No, I'm not looking for such a job. I fear C++ and especially the legacy code. But there will likely be people willing to look after this code.

> I can't see new people reaching to learn C++. It will die with its current users.

Successful languages don't "die". They fade out. Very, very slowly…

I think this process in some way resembles nuclear decay and the concept of half-life.

> If all Rust had going for it was Cargo, an improved syntax, and the better docs and compiler messages, it would still win.

Well, yes, but the syntax is no real improvement. Rust is as ugly as hell. (Which does not say anything about the actual language as such. Syntax is "just syntax". But Rust deliberately chose an ugly syntax to attract C/C++ people, I guess.)


Ugly? The one thing I've seen complaints about wrt. Rust syntax is the lifetime specifiers, and that's a required part of the language.


Of course it's ugly. As hell.

All the unnecessary syntax noise everywhere. Unnecessary braces, unnecessary semicolons, unnecessary commas, angle brackets, some wild mixture of symbols and words (where clauses in types, WTF…), an ill-conceived lambda syntax, etc.

A modern language shouldn't be designed to be foremost convenient for the machine, but instead convenient for the human in front. Parsing is cheap nowadays. Very cheap.

But OK, syntax is largely a matter of personal opinion. It wouldn't be a showstopper, at least for me, if the actual language has merit.


Except that COBOL devs are not paid that well. If they are worth their weight in gold, the average COBOL developer must be pretty skinny.

By all means specialize in a legacy language if that is what you enjoy, but you are setting yourself up for disappointment if you do it for the money.


> Except that COBOL devs are not paid that well.

I think there is no other software developer role even remotely paying as well as COBOL in Europe. You get mid-level FAANG salaries, which are twice or thrice what you get for the usual development gig in the EU.

The EU is banking land. And they are deliberately looking for people with COBOL and mainframe skills.

> By all means specialize in a legacy language if that is what you enjoy, but you are setting yourself up for disappointment if you do it for the money.

I guess this is very true.

Even though the money looks interesting, I would not enjoy such a job, I guess. (It would be fun to find out about mainframes, but that's nothing I would like as a day job; it would be more of a historic interest. And COBOL, naah, there are likely more terrible things, but it's not nice either.)


> I can't see new people reaching to learn C++

I guess you wrote this while knowing very well that it's just wishful thinking. People will keep learning C++ for the foreseeable future, in order to reach the job market that this language opens up.

And if "people" in this sentence meant the companies using it, then again not a chance in the short term. Most companies using C++ have huge codebases that will only be ported to something different when it makes financial sense for them to do it (which is something that almost never happens for an already existing codebase)


> I guess you wrote this while knowing very well that it's just wishful thinking. People will keep learning C++ for the foreseeable future, in order to reach the job market that this language opens up.

Just like FORTRAN.

(Sorry for the snark! I do agree with your points.)

> Most companies using C++ have huge codebases that will only be ported to something different when it makes financial sense for them to do it (which is something that almost never happens for an already existing codebase)

Again, not disagreeing with you, but this is why it's good when companies get eaten by more nimble startups without the baggage. Legacy systems die due to business failure.


Yeah, I agree with you. Note I'm a C++ dev who would like to see less of C++ and more of newer languages that have improved upon mistakes of previous ones. But my opinion is that C++ will remain king for a long time in those niches where it has established itself.


There are billions more lines of C++ code in production than FORTRAN. It's not a good analogy.


And people still write FORTRAN today.


Hell, the school I've been to has one of its curriculums still centered around Fortran - people come out of here and it's their primary language (and they've got no issues finding jobs at all)


Because Fortran is still top of its class at number crunching


> Legacy systems die due to business failure.

You mean like in the case of COBOL? ;-)

OK, admitted, banks can't fail. They're protected by a law of nature. (Or something like that.)


The funny thing is, to somebody who's neither involved too much in C++ nor Rust, Rust syntax looks just as arcane as C++ ;)


Rust looks and feels a lot like Ruby (expressions/functional), Java (generics), and Swift (error handling, though Rust was the inspiration).

The novel parts are lifetime specifiers (infrequent), reference/slice sigils (not a big deal), and the weird macro language.


Also, https://www.val-lang.dev/ seems more interesting. And http://www.jot.fm/issues/issue_2022_02/article2.pdf

> Safe by default: Val’s foundation of mutable value semantics ensures that ordinary code is memory safe, typesafe, and data-race-free. By explicit, auditable opt-in, programmers can use unsafe constructs for performance where necessary, and can build safe constructs using unsafe ones.

Versus Carbon:

> Carbon's premise is that C++ users can't give up performance to get safety.

https://github.com/carbon-language/carbon-lang/blob/trunk/do...


Are you writing this because it’s really interesting or because you’re a Rust promoter? Sneaking in comments about Rust in discussions about other languages is a classic guerrilla marketing move that the Rust community is known for.

Assuming the best intentions, one should think how misguided it is to look up to amoral entities like corporations as role-models. At a macro level, Google does what brings them money. At a micro level what brings the engineers promotions and improves their reputation.


Rather than about Google, which is just a corporation albeit a large one, this is about the direction of C++.

I've written about this before, there's a tendency to insist that if you just asked surely future C++ can accommodate whatever it is that is needed. What a bunch of people (many but not all from Google) did was write a C++ Proposal which says "This is what C++ needs" and the committee said "No". P2137 "Goals and Priorities for C++"

That finally means you can have the next conversation, instead of "Surely C++ can do that" we can get to "OK, C++ won't do that - what are we going to do instead ?"

The answers include a lot more Rust, as you have seen from Google and other entities.

I'm presumably part of the "guerilla marketing" you're talking about, although I don't think it's useful to imagine a community as engaging in "guerilla marketing" when they do what people naturally do, communicate. When people whose first language is Spanish speak Spanish to each other on the bus and you overhear them, that's not "guerilla marketing" for Spanish by any useful definition.

Here's the thing: If you think of this point as "guerilla marketing" then C++ has been riddled with "guerilla marketing" for Rust for several years. Vittorio's (failed) Epochs proposal for C++ 20 more or less just says "Look, Rust has this cool feature [Editions], we should do that too".


> I'm presumably part of the "guerilla marketing" you're talking about, although I don't think it's useful to imagine a community as engaging in "guerilla marketing" when they do what people naturally do, communicate.

Even if I agree with all the rest, it's known that grassroots marketing is very strong at Google.

There are people whose job includes writing on high-impact channels like HN to promote things.

The Rust hype does not come from thin air. It gets generated in part with the help of a lot of money in the background.

Of course, this time the task isn't difficult, as Rust sells itself in large part on the strength of its features. But this process gets accelerated with money, of course.

There are not many languages that made it purely by (or despite ;-)) their virtues.

One honorable exception is Scala. It made it into the Top 20 even though it has no marketing division, only a small team behind it, and a community that seems to love producing bad publicity consistently (even though the language excels at quite a few things!).

And there are cases like PHP, C, Objective-C (and for some likely, JS)… But let's not talk about those, …, historical accidents…


I would consider whether people like writing it to be a virtue, so I would argue that the widespread love for Rust is, in fact, an example of a language thriving on virtue.

I have learned a great many languages (and forgotten some of them, a while back I was prompted to re-discover that last century I wrote a bunch of Scheme, my name is on the work and the timeline checks out but I don't remember it) over my lifetime. Rust is the first language where I want to go back and rewrite stuff in Rust because of how nice it is.


I'm a Scala fan. (Maybe this shows :-)).

But I'm also quite excited by Rust lately.

But coming from Scala my feeling is constantly: Rust is missing so much still!

I would really like a language more like Scala but with the performance of Rust.

Of course, other people get very excited when using Rust, because for many this is the first proper language they've ever encountered.

But I'm quite unimpressed in general by "hyped" Rust features. I had immutable values, HOFs, ADTs, type-classes, pattern matching with exhaustivity checks, macros, and all that "since forever". But I miss HKTs, implicits, proper macros, and some other things in Rust. It will take at least a decade for Rust to catch up. But by then Scala will be even farther ahead…
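For readers who haven't met them, the ADT-plus-exhaustive-match combination mentioned above looks like this in Rust (a minimal sketch with a made-up `Shape` type):

```rust
// A sum type (ADT); `Shape` is a made-up type for illustration.
enum Shape {
    Circle(f64),
    Rect(f64, f64),
}

// `match` must cover every variant, or the compiler rejects the code.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle(r) => std::f64::consts::PI * r * r,
        Shape::Rect(w, h) => w * h,
    }
}

fn main() {
    assert_eq!(area(&Shape::Rect(2.0, 3.0)), 6.0);
}
```

Add a third variant to `Shape` and the `match` stops compiling until you handle it, which is the exhaustivity check in action.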

But OK, at least one can say that the ML family of languages is finally succeeding. (Scala and Rust are both descendants of ML).


Sure, lots of things would be nice to have in Rust that aren't done yet. Some of them you can see in nightly, because their rough shape exists but isn't stabilised. Rust is taking things rather steady because of a strong preference for soundness.

In particular I want a stable niche (my nook crate uses the unstable niches because that's the only way to do it; the intent is to use a stabilised mechanism when one exists), I want a stable way to write const implementations of traits which aren't necessarily const (and thus const for loops), and along the same lines I want const panic.
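For anyone unfamiliar with niches: the effect the stable library already exploits is directly observable. A sketch using `NonZeroU8`, whose forbidden zero bit-pattern gives `Option` a free slot for `None`:

```rust
use std::mem::size_of;
use std::num::NonZeroU8;

fn main() {
    // NonZeroU8 promises the bit pattern 0 never occurs, so Option
    // can reuse that "niche" to encode None: no extra discriminant.
    assert_eq!(size_of::<Option<NonZeroU8>>(), size_of::<u8>());
    // A plain u8 has no spare bit pattern, so Option needs more room.
    assert!(size_of::<Option<u8>>() > size_of::<u8>());
}
```

What's unstable is *defining your own* niches on custom types; consuming the built-in ones, as above, works on stable.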

A strong macro system is very dangerous. I use and like Rust's declarative ("by example") macros, and I appreciate the need for proc macros or similar technology, but they're very dangerous. Are Scala's macros somehow less dangerous? Or do you just prefer how they work?

Rust has come a pretty long way since 1.0. Once upon a time u8::MAX couldn't exist. There was no way to express the idea that a type has an associated constant, so that's why std::u8::MAX is (deprecated but) there.
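A quick illustration of that change (both spellings still compile today; only the module-level constant is deprecated):

```rust
fn main() {
    // `MAX` is now an associated constant on the integer type itself.
    assert_eq!(u8::MAX, 255);

    // The old free-constant path still compiles, but is deprecated.
    #[allow(deprecated)]
    let legacy = std::u8::MAX;
    assert_eq!(legacy, u8::MAX);
}
```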

It's probably never going to be as comfortable to write Rust as Scala, but on the other hand it's definitely never going to be possible to deliver the performance of Rust in Scala.


As someone who had to dabble in Scala:

Most of the stuff Scala has fills me with dread. "Proper macros", implicits, custom operator overloading, etc. are extremely powerful concepts. Too powerful, if you ask me.

They tend to make code utterly unreadable (without an IDE).

Why is `int` a `Date`? Don't know. Use an IDE.

What does the ~%+# operator do? ¯\_(ツ)_/¯ [2]

But I'm a bit conservative about language syntax. I'm against async, and it's in most languages already (it makes debugging more convoluted).

[1]https://lprakashv.medium.com/how-to-keep-your-sanity-working...

[2]http://www.scala-graph.org/guides/core-initializing.html


Have you actually read what you've linked?

From [1]:

> While this can be really disheartening, this is not enough of a reason to move away from an amazing programming language, just because of a really abused feature!

Please also note: the blog post talks only about implicit conversions. That's actually the least interesting part of implicits. (Watch the video there to learn what implicits are actually for.)

Also the post is about the old version of Scala. Implicits got redesigned in Scala 3. They're not even called implicits any more.

https://docs.scala-lang.org/scala3/reference/contextual/inde...

All problems mentioned in the blog post are solved. For example, you can't import `given`s (as implicit definitions are called now) by accident. You need to do this explicitly. They got removed from the normal import scope.

Implicit conversions are, btw., now heavily guarded, with very explicit declarations in the form of a type-class instance & language imports, besides the other changes that make their application much safer, so fewer surprises are possible:

https://docs.scala-lang.org/scala3/reference/changed-feature...

But anyway, everybody has known for years that you should not overuse them.

Other languages like C# also have this feature (and I'm not even talking about dynamic languages, where this is the "normal" way everything works, without the safeguards of a static type system). Nobody ever complained about C# in this regard.

Besides that: The Rust people are looking envious. I've read about some ideas that were more or less a direct copy of Scala's implicits. And I'm quite sure that if Rust introduced something like that, most people would love it! Because it would solve some issues in Rust, and would also allow removing complex boilerplate code.

Then, [2] is pure obscurity. Never heard of it.

But even the "WTF operators" defined in this library look very strange (this is not usual Scala code!). I would not know without looking it up what a `WLkUnDiEdge` is, even when spelled out…

And to stress it once more: this external lib is not part of Scala. Something like that would never be accepted into the std. lib!

The BDFL even wants to remove unrestricted operator syntax. But I hope this does not happen, as it would only make the language more complex for no gain. Someone who defines a `~%+#` operator would likely still do it even if you then had to write it wrapped in back-ticks. So nothing would be won.

> Too powerful if you ask me.

If you don't like powerful languages with modern features I guess Rust is also not for you.


> Also the post is about the old version of Scala.

It wasn't my language by choice. Neither was the version.

> If you don't like powerful languages with modern features I guess Rust is also not for you.

I like Rust's pragmatism. Allow limited operator overloading. Eschew HKTs for simpler abstractions. Don't go off the deep end with type-system power, nor too far in the opposite direction by avoiding any complicated feature.

The more powerful a feature is, the more abusable it is, and Scala loves power at all costs.

Why would anyone care? The more flexible/powerful something is, the harder it will be to parse for humans and tooling.

Plus Scala has the big deal breakers: GC and no custom primitive types.

> Besides that: The Rust people are looking envious. I've read about some ideas that were more or less a direct copy of Scala's implicits.

What do you mean exactly?


> I like Rust's pragmatism.

Me too.

But Scala is also a very pragmatic language. If you want something academic go for Haskell.

> Allow limited operator overload.

Nitpick: Scala does not have any operators. So it doesn't have operator overloading at all.

Scala simulates operators by infix method syntax.

Instead of writing `1.+(2)` you can just write `1 + 2`. But the latter is the same method call as the former!
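For comparison, Rust's "limited operator overloading" goes through fixed traits in `std::ops`: you can overload an existing symbol like `+`, but you cannot invent a new one at all. A sketch with a made-up `Vec2` type:

```rust
use std::ops::Add;

// A made-up 2D vector type for illustration.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Vec2 { x: f64, y: f64 }

// Overloading `+` means implementing the `Add` trait; there is no
// way to define a brand-new operator symbol like `~%+#` in Rust.
impl Add for Vec2 {
    type Output = Vec2;
    fn add(self, rhs: Vec2) -> Vec2 {
        Vec2 { x: self.x + rhs.x, y: self.y + rhs.y }
    }
}

fn main() {
    let a = Vec2 { x: 1.0, y: 2.0 };
    let b = Vec2 { x: 3.0, y: 4.0 };
    assert_eq!(a + b, Vec2 { x: 4.0, y: 6.0 });
}
```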

> Eschew HKTs for simpler abstractions.

AFAIK HKTs are more or less "just postponed" in Rust.

People would of course like to add them. The discussion has been going on forever by now. Some small insights (there is much more when you look for it):

https://github.com/rust-lang/rfcs/issues/324

https://internals.rust-lang.org/t/higher-kinded-types-the-di...

> Don't go off the deep end with type-system power, nor too far in the opposite direction by avoiding any complicated feature.

While having a full ML-style type system with affine types on top, and quite some other type-level mechanics up to singleton types?

Sure, sure, no power in here. :-)

> The more powerful a feature is, the more abusable it is, and Scala loves power at all costs.

Everything is "abusable". This is not an argument.

But that Scala loves power at all costs is simply not true. The contrary is true.

Just to cite one of the most influential posts in Scala land of all time:

https://www.lihaoyi.com/post/StrategicScalaStylePrincipleofL...

This, and the BDFL constantly complaining about the unnecessarily complex code people write, speaks for itself.

Scala lately even reduced the power of some features just to prevent "abuse". (Which is partly an overreaction; but that's another story).

> Why would anyone care? The more flexible/powerful something is, the harder it will be to parse for humans and tooling.

That's also not true.

Scala has a very small and simple syntax (despite all the language features).

Scala is, on the surface, much much simpler and much more regular than Rust!

https://github.com/e3b0c442/keywords

(You could also compare the language grammars. This would be even more in favor of Scala in this regard).

Scala 3 looks even almost like Python!

https://docs.scala-lang.org/scala3/book/scala-for-python-dev...

> Plus Scala has the big deal breakers: GC and no custom primitive types.

What "deal breaker"?

https://github.com/carbon-language/carbon-lang/blob/trunk/do...

You've seen this here in the thread?

Also:

https://docs.scala-lang.org/overviews/core/value-classes.htm...

As soon as Valhalla lands in JVM land, these will become full-blown value types without any limitations.

And in Scala Native you can of course have native structs today. (Only Scala Native isn't ready for prime time just yet.)

In the long run, Scala Native could also run without a GC. The Caprese project will bring something more powerful than Rust's lifetimes. Lifetimes will fall out as a special case of a more general concept.

> > Besides that: The Rust people are looking envious. I've read about some ideas that were more or less a direct copy of Scala's implicits.

> What do you mean exactly?

Implicits get discussed every now and then in Rust land. Even the above Rust internals discussion starts with them.

Or this here:

https://tmandry.gitlab.io/blog/posts/2021-12-21-context-capa...

Also, I once read something that looked like brainstorming for future Rust features. They came up with more or less implicits (only they didn't call them that, so I can't find it any more; I didn't bookmark it).

Someone even once directly proposed Scala's implicits for Rust. But this went nowhere, as the other people on the forum didn't actually understand them (no wonder, as the example was quite terrible and the proponent was not really experienced with Scala, so couldn't explain it well). People then came to quite wrong conclusions (some of them even mixed up implicits in general with the dreaded implicit conversions, which were in fact mostly overused and caused trouble in Scala; but things got redesigned exactly because of that).


Don't respond to the Scala stuff; the poster is just trolling you: https://news.ycombinator.com/item?id=34218609


I'm not trolling anybody!

The parent is not fanboying Rust much. Instead this looks like an interesting exchange of insights and opinions.

I actually want to learn why people seem to really like Rust even though it's "just" a kind of "Scala light".

What was deemed "too complex" or "academic" in Scala is now everybody's darling in Rust. This is actually quite interesting.

I'm looking for hints on how to resolve the marketing issue with Scala.

The other comment was meant more lighthearted and shouldn't be read out of context.


> Rust is taking things rather steady because of a strong preference for soundness.

As does Scala.

Scala even has formal, machine-checked proofs for large parts of the language. Something almost no other language has. Especially no mainstream language.

> A strong macro system is very dangerous.

In which way? What do you mean exactly?

> Are Scala's macros somehow less dangerous?

Hard to say without knowing what is meant by "dangerous". :-)

But I guess you can judge for yourself:

https://docs.scala-lang.org/scala3/reference/metaprogramming...

The new metaprogramming features are at least type-safe.

(Also, this is most likely the most advanced macro system out there. Fresh from top-notch research, after many years of experimentation in the field.)

> It's probably never going to be as comfortable to write Rust as Scala […]

As long as Rust doesn't change its syntax (very unlikely) or add a garbage collector (actually likely, as some explorations have already been done in this direction), this will stay true, yes.

> […] but on the other hand it's definitely never going to be possible to deliver the performance of Rust in Scala.

I see no technical reason for that.

Rust doesn't do magic.

It just optimizes things quite well.

The key is aggressive erasure, specialization, and monomorphisation: things that every compiler can do, if implemented.
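The monomorphisation point can be made concrete in a few lines of Rust: the generic function below gets a separate compiled copy per type, while the `dyn` version goes through a vtable (a minimal sketch, not a benchmark):

```rust
// Generic function: the compiler emits a separate, fully typed copy
// per instantiation (monomorphisation), so it can inline and optimize
// each one independently.
fn double<T: std::ops::Add<Output = T> + Copy>(x: T) -> T {
    x + x
}

// Dynamic dispatch: one shared code path through a vtable, which
// blocks that kind of inlining - the default situation on the JVM.
fn describe(x: &dyn std::fmt::Display) -> String {
    format!("{}", x)
}

fn main() {
    assert_eq!(double(21i32), 42);   // instantiated for i32
    assert_eq!(double(1.5f64), 3.0); // a second, separate instantiation
    assert_eq!(describe(&42), "42");
}
```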

Scala on the JVM can't really do that, as it would break the expectations and semantics of the JVM. But Scala Native can!

It would "only" take someone to build this stuff…

Scala already almost had a quite advanced optimizer. But the dude who was building it left as soon as he got his PhD for that project and never finished it. Someone would need to pick up the remains and push this over the finish line.

As long as you wrote code like in Rust (which is perfectly possible, as Rust is kind of a subset of Scala), performance could be very close. (Of course you would need to avoid some features, like dynamic dispatch, and also excessive OOP patterns, but there is no technical reason why this shouldn't be possible at all.)

Creating all the desugarings like the ones used in Rust would of course be some work. The optimizer that was almost there did not do that. It "only" optimized user-level code, not the implementation of base types and built-in language features. (But still, it could generate code on par with Java hand-written in the best possible way; the performance of such code is almost at the C level. The JVM suffers from issues with memory, not with speed, which is actually very competitive; there are even cases where the JVM beats C/C++ code on performance, but usually while using one to two orders of magnitude more memory.) But like I said, Scala Native could take it one step further and optimize things the way C/C++/Rust compilers do.

In case you missed Scala Native:

https://scala-native.org/en/stable/

Please bear in mind when comparing it to something else: it's still pre-v1.0 and still needs quite some love.

But it partly has better performance than GraalVM, and is much more stable than Kotlin Native (which is still alpha quality).

But, for example, the manual region-based memory management in Scala (which can be used alongside the GC) would need improvement.

To keep up with Rust's borrow checker, the whole capture-checking stuff needs to land first; and this could take some time. But with these features Scala could also do completely without a GC in the long run! (Only for dedicated "no GC code", of course.)

https://www.slideshare.net/Odersky/capabilities-for-resource...

https://docs.scala-lang.org/scala3/reference/experimental/cc...


> Hard to say without knowing what is meant by "dangerous". :-)

Yeah, so when I say strong, I don't mean the sort of minor nibbling Scala is doing in the linked metaprogramming feature, or what C and C++ have in their "pre-processor".

The published nightly_crimes! procedural macro for Rust mostly "just" replaces your running compiler process with a new one that thinks it is compiling itself (or its standard library) and so it's allowed to provide the unstable nightly features even though it's a stable compiler.

But it's clear that if you said "Aha we fixed the compiler to stop that happening" Mara could write a new macro which finds say a WiFi connection, guesses your password, and uses it to download an actual nightly Rust build, and installs it so that it can replace the running compiler with that.

You can't stop Rust's proc macros from seizing control of everything and doing whatever they want, they're full blown code execution during your build process, that is what dangerous means. Rust proc macro authors must exercise extreme care as a result.

This is probably too much power, but when people talk about a "real" macro system they are clearly expecting less power than this, so let's agree we're talking about a less powerful macro system, not a more "real" one.
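To make the contrast concrete: a declarative macro is pure pattern-based token rewriting and cannot execute anything during the build, which is exactly why it is the less dangerous of the two. A small sketch with a made-up `maximum!` macro:

```rust
// A declarative ("by example") macro: pure token rewriting against
// fixed patterns. Unlike a proc macro, it cannot run arbitrary code
// at build time. `maximum!` is a made-up example.
macro_rules! maximum {
    ($x:expr) => { $x };
    ($x:expr, $($rest:expr),+) => {
        std::cmp::max($x, maximum!($($rest),+))
    };
}

fn main() {
    assert_eq!(maximum!(3), 3);
    assert_eq!(maximum!(1, 5, 2), 5);
}
```

The expansion is fully determined by the patterns above; a proc macro, by contrast, is an arbitrary Rust program handed your tokens at compile time.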


Thanks for the explanation. This makes sense!

The described issue has actually been neglected until now. It's like people are waiting for something to happen (like opening a project in an IDE installing malware). VSCode will at least ask whether it should trust build tasks. But nothing is in place for full-blown code-gen through macro systems triggered by a mere compile.

As someone in the Scala community also once pointed out, the code generated by macros or multi-stage compilation could do "anything", and some sandboxing mechanism would be required. But the discussion went nowhere. People just said you need to trust the code anyway…

In case the Rust community tackles this problem, it would also make a nice reference for Scala. I think I should look at how this is handled in Rust and maybe point to that in some discussion on the Scala forums.

Thanks for the pointer. I would agree that something like that is indeed dangerous!


Dang, I should've been collecting paychecks this whole time!

Sadly, my Rust evangelism comes purely from the heart. I've been so impressed by the language -- how more than worthy it is as a replacement for C[++], how it managed to break the ancient yet false dichotomy of "fast but dangerous"/"safe but slow (garbage collected)".

I'm convinced that switching to Rust will lead to fewer losses in both dollars and lives (as in actual deaths) than C[++] bugs have accounted for and will continue to account for.

Heck, if someone spins up a bunch of GPT bots to flood every discussion with pro-Rust propaganda, I can practically read that activity as humanitarian.

I think the zealotry is due to people who have come to the same conclusions I have (and who probably know the pain of CMake vs. Cargo) rather than shady corporate Nakatomi space psyops.


What is needed by whom? Google are essentially an ad-sales corporation. Microsoft are a former OS-development corporation which wants to be Google. Amazon are an online store. It's sad how all these corporations, which have repeatedly shown they don't respect privacy or local laws, ended up influencing technical standards for the entire world.

If they have one thing in common, it's that they don't want to invest in creating reliable products. They re-invent the wheel all the time, invent products that nobody wants and cancel them after a few years, create their own tools and languages and then shove them down everyone's throat. They continuously churn out new software, new features and updates because they don't know when to stop growing, like tumors.

The software industry is a swindle. Rust is the band-aid solution to bad practices that companies don’t want to reform because it’s not profitable to do so.

Perhaps saying guerrilla marketing is being too lenient when it's more like hustling: insinuating themselves into nearly all language discussions and trying to sell Rust. I've seen this in threads about C++, but also Go, Nim, C, or even Python.

See the self-declared Rust programmer replying to the parent comment for a good example.

It doesn’t even matter if Rust is good, these people are as likable as pushy bazaar salesmen. If every time you wrote a comment here in English someone asked if you considered writing it in Spanish because it sounds more exotic, you’d get tired of that pretty fast.


I for my part decided to "fight" the fanboyism with even more fanboyism—instead of hatred. So we can even things out a little bit.

If someone tries to "sell" their language I will start to sell "my" language, which is btw. Scala.

Have you actually seen the new major version of Scala?

Scala 3 is once again way ahead of the pack when it comes to modern language features!

Rust looks like a little stripped down (but fast!) toy in comparison.

Now you may beat me. ;-)


Even more interesting is that Rust has such an obnoxious showing in forum comments and GitHub surveys but little actual corporate adoption. It seems like nobody with actual money on the line wants to take the hit to productivity to fully switch to Rust. Google is literally writing a whole new language instead of using Rust, and Rustaceans are still trying to spin it as a pro-Rust move. And that's after Go, which funnily enough has also seen way more industry usage than Rust. Hobbyist language.


AWS serverless runs on a type 1 hypervisor written in Rust.

Google is using Rust on Android and Fuchsia.

Carbon is for C++ code that they can't write from scratch in Rust, given its size.

Azure Sphere SDK only supports C, and Rust support is now in preview mode. No C++ support planned.

So while it is decades away from reaching C++ adoption level, it isn't as if the big names aren't making use of it on key projects.


Can't wait for the CBMC and Why3 connection efforts for Rust to start being used by AWS and their formal-methods people (I know they're into TLA+, and I wish there were also some work going on to better connect Rust/SPARK and temporal-logic tools).


I bring up Go because it's of a similar age to Rust, and yet significant projects have been written entirely or almost entirely in Go (i.e. Kubernetes, Docker). Nothing of that scale has been written in Rust. You don't get to list things predominantly written in other languages lol.


Go is not similar age to Rust. Rust was not really viable pre-late 2018, when non-lexical lifetimes and async were added. Even Rust 1.0 (half-baked in retrospect) only came out in 2015.


Well, some CNCF projects seem to be moving from Go to Rust, if you want to use that as a measuring point as well.


Google has literally just put 1.5M lines of Rust code inside their Android runtime and is reaping the benefits [0]. Is that pro-Rust enough? Sure, not every project can migrate easily, and they need a language with matching semantics; that said, Carbon is still way (3-5 years) too early to be practical.

[0]:https://security.googleblog.com/2022/12/memory-safe-language...


Little corporate adoption? Rust is most viable for pure greenfield projects, which are rare, or gradual conversions from pure C, most of which is now on weird embedded chips where Rust is inherently at a disadvantage for all sorts of reasons. When you account for that, Rust adoption is actually happening very rapidly. There's even a lot of RIIR activity that would hardly be expected for any other language.


Steve is gonna subtweet you so hard for this!


As noted in my Reddit comment, the low numbers on static-analysis and sanitizer adoption are a bit sad, and show the reality versus the typical comment that everyone is already using them.


Low numbers in the 649 people that found and answered that survey, and are in the intersection of "people that answer online surveys" and "probably aren't beginners at C++".

It would of course be helpful if the many, many, many build systems for C++ had 'integrating sanitizers' in their "hello world" tutorials, rather than buried in some arcane man page.


That is why it was so relevant that clang introduced static analysis as part of the compiler, instead of having to get hold of lint.

However, it isn't because it's there (VC++ can even do analysis per build) that people are rushing to adopt it, after all these decades (lint was born in 1979).

So the social aspect matters a lot, and we keep losing there.


Had I answered that survey, I would have added a 0.1% or so to the sanitizers. Everywhere I've worked (now) runs sanitizers over the build.

> So the social aspect matters a lot, and we keep losing there.

Unfortunately.

For example, there's a "cmake" starter on GitHub that has everything, including sanitizers, but it is, frankly, inaccessible to newbies who just want to learn C++ "from scratch". Compare that to... Rust. You create a new crate from the CLI in one command, and then you never have to mess around with the build system. Or even Java/Kotlin... `gradle init`... and you're mostly done. C++? "Spends 15 minutes reading 'build system X' docs...".


I wish it was only 15 minutes.


Static analysis I strongly associate with wasting my time.

Valgrind on the other hand, I struggle to imagine writing working C++ without it.


Maybe I am behind the times, but it was a little surprising to me to see MSVC have full C++20 support while GCC and Clang are still trying to get there.


Afaik Clang development has massively slowed down since Google gave up on trying to improve C++ as a language (it required breaking ABI, which the C++ standards committee rejected). They moved on and created Carbon instead for future C++ development where they need to work with existing C++ code, and recommend Rust for greenfield development.


Was clang development so dependent on one organisation? I can easily believe that many Google engineers are choosing to spend their time on Carbon now, but it seems strange that there was so much dependence here. Could there be some other reason for clang lagging behind?


Yes, it was also dependent on Apple's contributions, which nowadays focus on Swift and Objective-C.

If you look at the C++ versions being used across Apple's platforms, they are more than happy staying with C++17 (Metal Shading Language is even lower, based on C++14).

Almost everyone else seems to care only about LLVM, otherwise how to explain that LLVM gets almost as many contributions per year as the Linux kernel, while clang is lagging so much behind?


Far more likely to me is that bringing libc++ up to C++20 standards isn't important for google3. Carbon is a O(10y) project, not a O(3y) project.

  * absl strings provides the same compile-time guarantees that <format> does
  * absl date/time libs have integrations with the proto time types
  * google3 has many threading & synchronization primitives that it prefers over the STL types (all of which already has integrations with fleet-wide performance monitoring and other toolchain customizations, e.g. deadlock detection), so there isn't much reason to move the threading libraries forward


> Carbon is a O(10y) project, not a O(3y) project.

So are C++ language updates.


Standard updates, not STL implementation.


Shouldn't that be big Omega?


From what I am aware of, google wanted to break API significantly, not just ABI, by changing a bunch of standard library guarantees.


Good. It is high time for some breakage. When your toe is gangrenous, you should cut it off, even if you're sad to see it go. This applies doubly so if you're as strange and mysterious a beast as C++ is — sprouting body parts in random places seemingly overnight, only incidentally producing anything valuable. Maybe you don't actually need a thumb in your forehead in the first place, especially since it bleeds into your eye whenever you try to use it.


> sprouting body parts in random places seemingly overnight, only incidentally producing anything valuable. Maybe you don't actually need a thumb in your forehead in the first place, especially since it bleeds into your eye whenever you try to use it

I've been using c++ for a while, and this comment is perhaps the best and funniest description of what I think about the language but was never able to put into words. Kudos for the laughs!


It is not surprising, given the licenses, that the two major clang contributors decided to focus elsewhere (Swift, Objective-C, and Carbon), while everyone else[0] is happily just consuming clang and only contributing to LLVM.

At least GCC has Red-Hat supporting ISO C++ improvements.

[0] - ARM, Intel, Embarcadero, IBM, NVidia, IBM, Nintendo, Sony,...


But Red Hat is IBM, nowadays.

Also AFAIK IBM is one of the most dedicated supporters of C++ in general.


Yes and no, while IBM owns Red-Hat, it doesn't dictate how Red-Hat does their own business decisions.

Not in what concerns ISO C++ support, and if it was up to IBM, trigraphs would still be supported.

See what ISO version their xl compilers support, across all mainframe, Aix and Linux deployments.

https://en.cppreference.com/w/cpp/compiler_support


> Yes and no, while IBM owns Red-Hat, it doesn't dictate how Red-Hat does their own business decisions.

Even if that's true, it's now IBM's money that gets invested.

> Not in what concerns ISO C++ support, and if it was up to IBM, trigraphs would still be supported.

I know about the trigraphs. That they were fighting for them shows, imho, exactly how heavily IBM is invested in C++. They have a shitload of old but highly important code that can't realistically be migrated.

So IBM will keep throwing money at C++ to extend its life indefinitely, I guess.


You forgot to mention IBM


Microsoft does a pretty good job there. clang has been lagging behind ever since Google reallocated resources somewhere else.

What bothers me most about C++ is that it's mostly old people now. All the young people who are actively involved in language design, tool development, ... are putting their effort into other languages.


There are still young people, especially those wanting to work in cool domains like game development. However, many older ones have vested interests in keeping C++ the way it is, hence why even advocating for security is such an uphill battle.

Or see how WinDev is so resistant to change: instead of providing .NET-like tooling for C++, like Borland/Embarcadero have done for the last 25 years, they stick to their ways of manually editing IDL files and COM.


While I agree, the young people who work on the languages and frameworks are increasingly working with other languages. The Rust crowd is much younger than the people who are publicly working on popular C++ frameworks.

In 10 years, there won't be many people left working on C++ as a language and on the tooling.


Might be, but until Microsoft et al. rewrite their tooling in Rust, there will be enough people to keep C++ going.

And if it changes, so be it, I am not coding today as I was in 1986, when I typed my first LOAD "".


> but until Microsoft et al. rewrite their tooling in Rust, there will be enough people to keep C++ going.

At least Microsoft has started to do exactly this. They have even started touching the Win API to create Rust bindings, because MS saw that this is the only way to future-proof this stuff.

Other big players, like IBM, don't look like they're moving, though. But OK, IBM moves as slowly as a pitch drop falls. So maybe we just don't see them moving.


Those Rust bindings are a joke for any serious Windows development.

If you feel the C++/WinRT development experience is bad, good luck with Rust/WinRT.

And since it is the same team leads, who have clearly proven that they don't care about developer productivity, given how they managed the C++/CX to C++/WinRT transition, I wouldn't expect too much out of it.


Yes, Microsoft is Microsoft.

And I've heard rumors the Windows team is an issue in general. (As is the product they ship.)

But they're at least moving. And the direction is clear.


What's wrong about old people?

The problem is people who don't want to change anymore. But this trait is independent of age, in my experience.

But nevertheless, that C++ doesn't attract new developers may indeed have reasons. C++ refuses to get substantially modernized. In the long run such a stance will attract fewer and fewer people, I think.

C++ would need a fresh start without all the baggage collected through several decades of backwards compatibility. A C++v2 is overdue, imho.

Only then would it eventually find itself competing with other new languages, without the current killer feature that reads "we can run ancient code". I'm not sure most people behind C++ are keen on such a challenge.


> What's wrong about old people?
>
> The problem is people who don't want to change anymore. But this trait is independent of age, in my experience.

Nothing is wrong with old people. That's not what I was trying to say, but people in their 50s or 60s will not be working for much longer. I don't think there will be many people left working __on__ C++ and its tooling in 10 years.

> But nevertheless, that C++ doesn't attract new developers may indeed have reasons. C++ refuses to get substantially modernized. In the long run such a stance will attract fewer and fewer people, I think.

I think there are some pretty nice additions in the recent standards but all the backward compatibility is definitely a problem for the development of the language.

C++ is basically a "do it all" language and it feels like newer languages have rather specific use cases which they excel in.


> I don't think there will be many people left working __on__ C++ and its tooling in 10 years.

I would also extrapolate this from how things look now.

But in a sense the language has deliberately chosen that way.

> C++ is basically a "do it all" language and it feels like newer languages have rather specific use cases which they excel in.

Yes, I also see this.

I'm not really happy with that as I think it's easier to learn one "real general purpose language" properly than have shallow knowledge in two dozens of glorified DSLs.

But C++ isn't this language for me either. I personally would love to see Scala (v3) become a better general-purpose language. It's modern with a lightweight syntax, it's powerful (up to compile-time verification of your code), it can embed all kinds of DSLs (which makes dedicated external DSLs superfluous), and now it's even starting to "run everywhere"; but it's slower and fatter than C++, Rust, and the other contenders in some very important areas, which sadly rules it out for many tasks.


> But in a sense the language has deliberately chosen that way.

Agreed, which is a pity but it is what it is.

> But C++ isn't this language for me either. I personally would love to see Scala (v3) become a better general-purpose language. It's modern with a lightweight syntax, it's powerful (up to compile-time verification of your code), it can embed all kinds of DSLs (which makes dedicated external DSLs superfluous), and now it's even starting to "run everywhere"; but it's slower and fatter than C++, Rust, and the other contenders in some very important areas, which sadly rules it out for many tasks.

I have been doing C++ for the last 6 years now for scientific computing so safety wasn't the first priority, but chasing pointers while debugging is certainly no fun.

Recently, I have been getting into Rust and I really like what I have seen so far, at least for my use cases.


If you rephrase it, it kind of sounds bad. What bothers you about C++ is that it's mostly experienced people now?


Has been the case for the past half decade, too.

MSVC's stdlib has been open-sourced, and MSVC is used by plenty of companies writing games, native Windows software, etc.


Not fully open-sourced. There are a bunch of pieces that are available only in compiled form (.obj/.lib files), like the math.h functions.


Can someone explain the differences between Google and the C++ standards committee? I understand it has to do with breaking ABI, but what exactly does this imply?

As someone much more experienced in other languages like Python and Swift, my biggest surprise when using C++ is that the language standard and the compiler are two different things. In Python, the default CPython interpreter is the standard - if it's not in CPython, it's not standard Python. But in C++, there's like 3 different compilers, all of which implement different subsets of the latest C++ standards. IIRC, the latest standard that all compilers implement fully is C++ 14. I understand C++ is a broad and complicated language, and I actually have liked using it in the few times I've had the chance, but the compiler/standard situation seems like complete lunacy from where I stand.


> As someone much more experienced in other languages like Python and Swift, my biggest surprise when using C++ is that the language standard and the compiler are two different things.

Yes, that's because C++ is a formally standardized language, and Python and Swift are not. C++ is an ISO standard, and it is published in text form. You are free to create your own compiler, and if it works according to the rules written in the official standard, you may call it a standard-conforming compiler. There are of course other languages which are also officially standardized like this (does not have to be ISO), for instance Common Lisp, Scheme and of course Javascript (ECMA). If a language is defined by a reference implementation, then this is an informal way of standardizing your language, and if you want to create your own compiler/interpreter for such a language, you have to carefully evaluate what the reference implementation does.

> I understand it has to do with breaking ABI, but what exactly does this imply?

Very roughly, it means that you cannot link object code with the current ABI with object code that was generated by older compilers with the old ABI (well, it's usually worse: you can link just fine, but your program might or might not crash). This is not unprecedented in C++, we had this when switching from gcc4 to gcc5, it was definitely pretty painful and I'd rather not have this again.


Google is a company which uses C++ in a specific way. For example, they build most of their code from source with the same compiler and compilation settings, including external dependencies. They do not use exceptions, some parts of the standard library, some parts of the language. They also can upgrade their codebase in more-or-less atomic company-wide refactorings which also run most (if not all) tests for the affected code.

The C++ standards committee is a body of people from different backgrounds, different companies and different needs from C++. Moreover, they kind of have to cater to everybody. There may be people programming microcontrollers, there may be game developers, there may be people supporting/porting decades-old software which uses old versions of Qt, there may be people wanting all the modern bells and whistles in C++, and there may be people using pre-compiled third-party libraries from a long gone vendor or vendor not willing to upgrade their compiler.

These needs may contradict each other. ABI (Application Binary Interface) in C++ can be thought of as a `.pyc` file in Python. You don't expect _any_ compatibility of `.pyc` files between Python versions, so all libraries are distributed in source code in `.py`, and it's up to Python to process them into `.pyc` files as it wants. In C++ world, lots of libraries (including all OS libraries, actually) are distributed in a pre-compiled binary form only. The way your program interacts with the library is the ABI. If you upgrade your compiler, but the library does not, you can no longer use the library.

One example is the memory layout of standard library types. E.g. a release version of `std::vector` (the standard dynamic array container) may only need three fields: a pointer to the allocated memory, maximal capacity of the vector, and its current size. A debug version may also include stuff like "where this vector was created". If one part of your program (or an external library) expects a vector to be 24 bytes and have such and such fields, and another part expects something else, Everyone Dies(tm). To make things worse, everything breaks silently: there are little to no checks for ABI compatibility, and reading a byte almost always works.

As to why the standard is affected by this, even though "ABI" is not mentioned anywhere in the text: you cannot add/remove fields or virtual methods within the standard library in the next standard. If you do, the newer standard library _must_ become incompatible with all the pre-compiled code expecting the older version. (Almost) no way around it. A similar thing in Python would be trying to use both Python 2 and Python 3 in the same project simultaneously, in the same process.

So if everyone starts building all their code and dependencies from the source code, there will be no ABI concerns anymore. I don't see it happening any time soon.

> IIRC, the latest standard that all compilers implement fully is C++ 14.

Not even that: garbage collection from C++11 was never implemented by any compiler: https://en.cppreference.com/w/cpp/compiler_support/11

Not a big problem though, as it's never used by anyone. I've heard of people using non-standard garbage collection extensions instead, long before C++11. That's another thing with the standard and compilers: not everyone needs every feature, so some are given a priority depending on the compiler's users. And there are lots of existing solutions which probably won't be migrated to the standard.


> I've heard of people using non-standard garbage collection extensions instead, long before C++11.

Anyone using Unreal C++, Managed C++ (.NET 1.0) or C++/CLI (.NET 2.0 onwards).


Just another request to stop citing Tiobe. This index is misleading and harmful.

https://blog.nindalf.com/posts/stop-citing-tiobe/


Where's the petition? Need to sign it!

Tiobe is outright nonsense.

Citing it (besides jokes) is usually a sign that you can't trust the source that does so in the first place. Really everybody should know (at least for the last few years) that the Tiobe index is just completely made up nonsense.


The most exciting news about C++ would not be added features, but removed features. That is what everyone in their heart is waiting for.


The only problem being that “everyone” can’t agree on which particular set of features should be removed. Breaking existing code is always going to be a tough sell, for hopefully obvious reasons.


Would code need to be broken? I thought it was normal to deprecate features and then have the compiler throw a warning.


"Treat warnings as errors" is a standard best practice.


> "Treat warnings as errors" is a standard best practice.

It's a good general practice, as long as exceptions can be made.

Some compiler warnings are rooted in the compiler being unable to prove something that is actually true.


Treat warnings as errors is terrible practice.


It's the only sane thing to do.

And everybody who doesn't do so really deserves the pain that results later on.


No it's completely insane because you never know what a different or future compiler is going to decide to bitch about.


A different or future compiler is going to "bitch" about more bugs in your code.

But I see, you're not interested in correcting the bugs you produce. You obviously prefer the "three-monkeys solution" to correctness problems.


What is the point of having warnings then?


You can suppress them in the rare cases this is needed.

Such suppressions then act as big warning signs in the code, telling you that something exceptional is going on which needs extra attention when touched.

Also, warnings may be wrong. Errors mustn't be, and therefore can't be as strict as warnings.

But it shouldn't be possible to "just ignore them". If you do, this needs to be done deliberately and explicitly.


Yeah, at best you can force people unfamiliar with the code to waste their time suppressing a dumb-ass warning that means nothing. Or worse, accidentally introduce a bug.


I don't want good practices to be optional.

I want good practices to be the thing people actually do!


The problem is that other people, or even worse oneself, decide what counts as "good practice", which might actually be terrible practice.


No, I really want reflection and better code generation features, for instance; this would allow me to simplify and remove a lot of useless code. Likewise, "deducing this", which made it into C++23, is likely going to allow me to remove at least a few hundred and hopefully a few thousand lines of the codebase I'm working on. And all this means fewer chances for bugs and more time actually writing features.


> i really want reflection and better code generation features for instance

I know this won't be applicable in many cases regarding C++ projects, but if you want to see state-of-the-art code gen features in action have a look at Scala 3:

https://docs.scala-lang.org/scala3/reference/metaprogramming...

With something like that in place you actually don't need any (runtime) reflection (which is also available on the JVM OOTB, just in case).


Most of what you sent under this link falls under the general idea of reflection, e.g. the quoted macros, and is fairly similar to the kind of things being proposed for C++. Of course, all the reflection systems proposed for C++ are purely compile-time; I don't think anyone has even proposed runtime reflection.


For most people "reflection" is a synonym to "runtime reflection".

In contrast, quotes and splices aren't reflection at all! The whole point about them is that they're "black box". You (usually) can't "extract information" from a quote in a safe way (so it's the exact contrary to reflection). That's still something academia is trying to solve (even though there are some proposals out)¹.

Where can I learn more about the C++ proposals to compare?

---

¹ https://dl.acm.org/doi/10.1145/3136000.3136005

² https://dl.acm.org/doi/10.1145/3158101

And the recent work that underlies the solution included in Scala 3:

³ https://se.informatik.uni-tuebingen.de/publications/stucki21...


> For most people "reflection" is a synonym to "runtime reflection".

I mean, that's just because many languages don't even have a compile-time so of course reflection can only be runtime. Every compiled language has at least some amount of extremely basic concept of compile-time reflection, e.g. sizeof in C for instance which yields a constant expression ; likewise, `if constexpr(std::is_same_v<T, some_other_type>) { ... }` which we can do since C++17 is very much in the domain of compile-time reflection.

Here's the most recent paper: https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2022/p12...

If you want this is a very good read: https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p07...

Finally, there is an implementation of the reflection TS: https://clementpirelli.wordpress.com/2021/12/08/cpp-reflecti...


Oh, cool, direct links to the relevant papers!

It's not always easy to find the relevant stuff in C++ land.

Thanks a lot!

I have to read this stuff first, but my gut feeling would still be that `constexpr` and such isn't reflection at all.

The `sizeof` example seems more in this direction (because some expression gets actually inspected).


I'm talking about "if constexpr", not "constexpr"; "if constexpr" allows making choices at compile time depending on properties of a variable, and that is definitely what reflection is about.

e.g.

     #include <string>
     #include <type_traits>

     void f(auto x) {
       // common code path
       if constexpr (std::is_floating_point_v<decltype(x)>) {
          // code path 1
       }
       else if constexpr (std::is_same_v<decltype(x), std::string>) {
          // code path 2
       }
       // common code path
     }


Oh, I see. You're right.

Thanks for the pointer.


>"That is what everyone in their heart is waiting for."

Nope. Not waiting for that. That does not mean I am using, or ever will use, every feature. However, if it exists and is being used by others, I have nothing against it.


Operator delete still exists. Dereferencing void * still exists. Access like a[i] is still unchecked. Etc.

But indeed, these things can't be removed or reworked without breaking backwards compatibility. (And if you seriously take on reworking them, you'll end up with a different language, of which we already have several.)


> Dereferencing void * still exists.

... huh? I'm fairly confident dereferencing a void* is not valid C++; at the very least gcc/clang/msvc do not compile it: https://gcc.godbolt.org/z/hK6qb1z6o


I'm glad C++ has added more features, it is a much nicer language than it used to be. string_view from C++17 is very convenient and just this past week I had to use std::latch from C++20.


Yes, but the problem in my view is that for each of those small things we find useful they introduce tons of other features no one needs ...


Can you list some that are present in c++20?


    s/no one/I/


One major selling point of C++ over other languages is the vast volume of existing C++ code. If some critical portion of it required major updates to cope with removed features, that might not be something everyone is looking forward to.

I am just worried that some proponents of "move fast and break things" might be undervaluing what they are breaking, although the C++ committee is generally good about backward compatibility.


They should simply kill the compatibility and provide a migration tool. clang-tidy provides the modernize module which can help.

If the only side effect of breaking backwards compatibility is making the people that write C++ as if it was C89 irritated and upset, then that's probably a good thing, let's do it more.

C++11 is 11 years old. If you haven't modernized your code base in 11 years you never will.

If you are using ancient C++, compiler security updates are the least of your security concerns.

If you have code that nobody understands and cannot be touched, you should see that as the real problem and an urgent call to action to fix it, not modernization.


> C++11 is 11 years old. If you haven't modernized your code base in 11 years you never will.

That doesn't make it a dead codebase. New code still gets written - and it helps when it can be written using newer, better techniques while working nicely with old code. I once worked on a codebase that had ancient code from mid-90s with explicit COM AddRef/Release side by side with C++11 lambdas and TMP.


> I once worked on a codebase that had ancient code from mid-90s with explicit COM AddRef/Release side by side with C++11 lambdas and TMP.

Sounds like pure joy.


It was certainly an ... interesting experience. But it would have been much, much more unpleasant if we had to write new code 90s-style, too.

OTOH if we tried to rewrite the whole thing from scratch, we wouldn't ship a new feature for many years - only new bugs.


"I stopped caring 20 years ago" achievement unlocked.


Not everything needs to be rewritten. And a lot of issues can be introduced in rewrites which have nothing to do with the improved resource management that RAII style development brings.

For example, if you port numeric libraries are you sure you maintain the same guarantees for numerical stability? If you are porting containers, do your new backing-array growth strategies achieve the same efficiency as using realloc? Do your new threading primitives have equivalent efficiency on all of the target architectures?

I would prefer to have warnings, or even errors, for 'known bad behaviour' that are enabled by default with new standards - think along the lines of `-fpermissive`. Forcing people to suppress these means that the onus is on them to accept the risks their choices bring.

However, just as with things like void*, the new standards are bringing in their own footguns. Ranges have pointer semantics, not value semantics, for example, which is totally non-obvious and actually means a lot of use cases for ranges just aren't possible. Newer isn't necessarily better, the people writing the original standards were brilliant as well and had their own insights which we now forget.


> For example, if you port numeric libraries are you sure you maintain the same guarantees for numerical stability?

But your test suite would catch any issues, wouldn't it?

You have an exhaustive test suite for your important numeric libraries, right?

> If you are porting containers, do your new backing-array growth strategies achieve the same efficiency as using realloc? Do your new threading primitives have equivalent efficiency on all of the target architectures?

The benchmarks suite, which is part of your exhaustive test suite, and is guarding against performance regressions would catch any issues for sure!

You have a benchmarks suite to protect against performance degradation, right?


> You have an exhaustive test suite for your important numeric libraries, right?

> You have a benchmarks suite to protect against performance degradation, right?

In theory, in a perfect world, yes.

In practice, people use open source libraries like Eigen because they are the best, not because of their extensive test coverage. Doing things which might break these libraries will fork the community.


How can a lib be called "best" when you don't even know whether it computes correct results?

Just because something spits out numbers really fast doesn't say anything about its quality. Especially considering that optimizing for speed most times comes with very hacky code! This, plus the fact that such code is usually written by laymen (scientists aren't software engineers!), makes these things very questionable.

After looking into scientific computing and the usual software "quality" there, I lost quite some trust in anything that comes out of that field. This was actually very disillusioning and extremely sad to find out. It could even destroy the broader trust in science in general (as everything in this field today depends on statistics calculated by computers). That would be a catastrophic outcome! But most people currently even deny the existence of the issue…

Related (together with the parent): https://news.ycombinator.com/item?id=34224186

Also related as people (scientists!) got blinded by the promise of speed without considering correctness:

https://yuri.is/not-julia/

A house of cards… :-(


This is normal for open source. You get what you pay for, and nobody is paying for test suites for potential future users to freeload off of. Why would numerical libraries be any different from OpenSSL pre-heartbleed? And how would you fix this issue without spending other people's time and money on things they don't want to spend it on?

This is why subtle compiler rule changes are so potentially devastating.


Then use an old compiler and be on your own.

It seems you know better than the C++ standard committee and Bjarne Stroustrup anyways.

https://www.stroustrup.com/bs_faq2.html#renew


The C++ standard committee has a lot of internal disagreement on many topics, presenting them as a unified body is disingenuous. What I state is the position of a sizeable minority.

And yes, you shouldn't use realloc... but what should you do if your buffer size is more than half of your physical memory?


Even Go gets to add features.


It's a bit disappointing that the C++ community and stakeholders haven't found a way to address perceived shortcomings and (why not) reinvigorate and generate excitement about the ecosystem. The fact that people choose to embark on enormous investments to build alternatives from scratch (e.g. Rust) might suggest that the cost calculus between maintaining compatibility with legacy code and future-proofing the language has now shifted decidedly against the former. C++ does not feel like the kind of beast that is destined to become the new COBOL. Its continuing popularity in game development is just one of the areas that could see enormous growth and evolution.


> C++ does not feel like the kind of beast that is destined to become the new COBOL.

Strange conclusion.

For me it looks like C++ has chosen to become exactly this. The only reason it's still relevant is all the legacy code. Almost no new projects get started in C++. So it looks like they are doubling down on that.

> Its continuing popularity in game development is just one of the areas that could see enormous growth and evolution.

I don't see this. Game dev is also moving.

The most popular engines may be written in C++, because there was no alternative when those projects started. But the games made on those engines are mostly not written in C++.

Most code in games is "script like". And it's even often written in scripting languages. When not, it's something like C#, or some other more modern high level language.


> Strange conclusion

I don't have a crystal ball but there are important differentiating factors: C++ is much more capable and it is relevant in more diverse domains whereas COBOL was very confined.

> Game dev is also moving

There are several new ECS projects implemented in C++ that have an enthusiastic following.

My hunch is that the entrenched position is still the C++ world's to lose, by not making the right changes; but maybe that won't be the case a few years from now.


> There are several new ECS projects implemented in C++ that have enthousiastic following.

Would you mind sharing?

I'm currently looking into game dev. Would be nice to have a better picture of the landscape.


There are several, following slightly different designs [0]. entt and flecs (a C/C++ mix) are currently the most visible [1]. Older and more established engines are getting into the game as well [2].

It's just a data point that C++ has constituencies that have not given up on it yet.

[0] https://github.com/topics/ecs?l=c%2B%2B&o=desc&s=forks

[1] https://github.com/SanderMertens/flecs

[2] https://github.com/GodotECS


Thanks for the links!

But the two bigger projects don't contribute to the argument actually.

This flecs (which looks very interesting at first sight!) is a C project with C++ API bindings on top, as I see it. The other one is a Godot library (which I have to examine closer, as I'm specifically looking into Godot).

The collection of GitHub projects looks mostly like engines, so not really games as such. (Also, I'm not sure how recently those projects were started. There was no real alternative to C++ for game engines for a long time, so all older projects are likely C++. This does not prove that anybody would grab C++ today.)


I very much doubt that existing large C++ codebases - which are plenty - are going to fully or largely migrate to Rust (or whatever) within the next two decades. Given that C++ originates in mid-80s, that would make it as old as COBOL was in 2000.


This is similar to saying that no one would migrate to TypeScript because there are so many existing JavaScript codebases. The success of TypeScript and Kotlin suggest that one of the languages that hope to interoperate with C++ like Carbon could become more popular over time.


I suspect that the average lifetime of a typical C++ and JS codebase differ by one or two orders of magnitude.


You’re assuming the codebase would need to be rewritten. It’s always possible that there might be a gradual transition within the same codebase. New features could be implemented in the new language while keeping everything else the same.


I have experience with that kind of thing (https://news.ycombinator.com/item?id=34216117). For a sufficiently large codebase, you won't get there gradually even in two decades.


TypeScript is a strict superset of JavaScript, so that one was more of a C/C++ situation (which was a big reason for C++ success to begin with).

OTOH Kotlin, while popular, is still tiny compared to Java. Except for Android, which is a bit of a special case because mobile platforms generally have smaller codebases and shorter app lifecycle due to forced API breakage by the platform.


I haven't kept up with C++ for over a decade, but the little glimpses I catch of it makes me think that though it has received good and useful stuff, it has also grown needlessly complex.


ctrl+f "reflection" no results found.

maybe next year ..


What do they aim for? Runtime reflection? That's something that's indeed not very important. At least as long as you've got some form of "compile time reflection". But OK, C++ does not have either. (You could build something for sure, templates are Turing complete, but I'm not sure I would even like to see the result; could be very scary…)


> That's something that's indeed not very important

To you. Meanwhile many of us who do need it are stuck with buggy half implemented code generation tools.


I keep hearing the Java people saying the same.

But there is (almost) no use for runtime reflection when you have proper compile time features.

Of course this is something one can only find out when using a language with those features.

But now I'm curious: What do you concretely need (runtime) reflection for? Maybe it's indeed one of the very rare cases where one would really need runtime reflection.


Maybe not relevant to your interests, but I found this very interesting on the topic https://www.youtube.com/watch?v=aJt2POa9oCM

A case of "brilliant coder wants to make a game but doesn't know art well, so they create new complicated programming tasks instead", but the brilliant coder part is still pretty relevant.


With luck, maybe after C++26, if at all.


There is something very wrong if they don't have reflection in 26. It is sorely needed.


What is wrong is having 200 plus ISO members with different mindsets in what to vote for.

People often talk about Bjarne Stroustrup as if he were a BDFL, while in reality he only has one vote out of those 200 plus.

Welcome to standard body processes.


Try C++ Builder


Ctrl-F "networking", close tab...


Is ChatGPT actually the most important thing happening... not just for C++, but for all programming languages? How will it affect programming?


I’ve seen ChatGPT spit out both correct and incorrect code.

The idea that you’d rely more on tools is in a sense the Java approach. Java is somewhat more verbose than other languages, in practice, and you deal with the verbosity by making more use of code snippets / templates, autocompletion, etc. Basically, your IDE writes more of the code for you. I think this was, in general, a dead end in language research for various reasons, and improving code generation (via ChatGPT or some successor) does nothing to solve the actual problems with using generated code.

At the end of the day, somebody has to at least read the code and verify that it does what is requested.

Maybe at some point, someone will hook a more formal front-end to ChatGPT or something similar, so you can write the interfaces and specs, and the AI will generate the implementation. That may take a while, however.


> Basically, your IDE writes more of the code for you. I think this was, in general, a dead end in language research for various reasons, […]

I think language researchers would strongly disagree.

Something like that is likely the future of programming!

For example:

https://www.youtube.com/watch?v=X36ye-1x_HQ

From the video description:

> In Idris, types are a first class language construct, meaning that they can be manipulated and computed like any other language construct. It encourages a type-driven style of development, in which programmers give types first and use interactive editing tools to derive programs.


I tried using GPT to write code and it was very impressive but this cartoon cuts to the truth of it: https://www.reddit.com/r/ProgrammerHumor/comments/zdvpwb/how...



