> Embracing Resource Acquisition Is Initialization (RAII) makes C++ the safest, most productive and fun C++ has ever been.
This seems like an extremely low bar.
Anyway, what use is there for C++ in 2025 aside from maintenance of legacy codebases and game engines? Off-hand I'd say C++ programmers are twice as expensive as Rust programmers for the same semantics, and then you're still stuck with a C++ programmer you need to figure out how to socialize with.
Author here, I do prefer a language like Rust that makes all of this automatic but that isn't always an option. There are other parts of Rust that can be hard as well.
I wrote this article for a few friends who have recently started working with an existing, large C++ code base. Some industries like the video game industry have also stuck to C++ (they have their reasons).
A good programmer can work within the given constraints to make a useful program — sometimes, one of those constraints is the choice of language.
Once Qt is properly accessible from Rust, I think your claim will be a lot more realistic. The same goes for other important C/C++ libraries (e.g. libavcodec, vips/ImageMagick, VTK) that lack a solid (maintained, documented, etc.) Rust interface.
Qt is a C++ kitchen-sink library - perhaps it feels very important to C++ programmers, but I can't imagine Rust people are looking forward to having more redundant types that do exactly the same thing as a type they've used for years, but like, now with a Q in their name.
The fact that lists like yours so often end up being "look at all these C libraries", i.e. not actually about C++ at all, is revealing. It's an endorsement of Bjarne's position that he needed that C compatibility; decades later, C++ alternatives remain unpopular. But it also tells you that you're never going to raise the bar this way. C++ is not a route out.
AFAIK there is no equivalent of rustls-openssl-compat for C++. The knowledge that this library (OpenSSL) is trash never spurred any C++ programmers to do better and provide the same ABI but with a C++ implementation.
What's your point? There's a lot to criticize about these libraries, but the fact remains that they're considered important and basically impossible to rewrite from scratch. Of course, Rust could get a rock solid portable GUI toolkit that doesn't rely on a webview (probably not much harder than wrapping Qt, to be honest), but it's not there yet.
I'm not in love with them or the abomination known as C++, if that's what you're implying.
How so? This definitely needs at least a citation of somebody who can explain in detail why this code is "impossible to rewrite". Who wrote the one we have now, Martians?
Is it, though? Most mainstream languages fail to support anything resembling RAII, at least as first-class support. Do you actually have an example of a language that does a better job at resource management than C++?
Rust, simply because the "constructor" is just a static method on the type that can return basically anything, including optional and result types, and because copy/move semantics are handled very easily through derive macros instead of constructor/assignment-operator overloads.
> Rust simply because the constructor being a static method (...)
This assertion makes no sense. RAII is not defined by whether you use factory functions or special member functions to initialize an object. In fact, RAII is only incidentally related to memory management. RAII is a strategy to manage any and all types of resources, ranging from memory to files and TCP connections, by leveraging the assurances the runtime provides regarding scopes and object lifecycles. None of this changes if you employ factory functions to initialize resources.
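A minimal C++ sketch of that orthogonality (hypothetical `UniqueFile` type, untested): the factory function returning `std::optional` is just an initialization detail, exactly as in the Rust pattern above; the RAII part is the destructor tied to the object's lifetime, and the resource it manages is a `FILE*`, not memory.

```cpp
#include <cstdio>
#include <optional>
#include <utility>

// Hypothetical wrapper: the managed resource is a FILE*, not memory.
class UniqueFile {
public:
    // Factory function: initialization can fail, so it returns
    // std::optional instead of throwing.
    static std::optional<UniqueFile> open(const char* path, const char* mode) {
        std::FILE* f = std::fopen(path, mode);
        if (!f) return std::nullopt;
        return UniqueFile(f);
    }

    UniqueFile(const UniqueFile&) = delete;
    UniqueFile& operator=(const UniqueFile&) = delete;
    UniqueFile(UniqueFile&& other) noexcept
        : f_(std::exchange(other.f_, nullptr)) {}

    // The RAII part: release is bound to the object's lifetime,
    // no matter how the object was initialized.
    ~UniqueFile() {
        if (f_) std::fclose(f_);
    }

    std::FILE* get() const { return f_; }

private:
    explicit UniqueFile(std::FILE* f) : f_(f) {}
    std::FILE* f_;
};
```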
> Wikipedia says: "RAII is associated most prominently with C++, where it originated, but also Ada,[3] Vala,[4] and Rust"
From those you listed, only Rust can be described as mainstream. Do you think that one out of a couple dozens refutes the statement that "most mainstream languages fail to support anything resembling RAII"?
> They don't necessarily offer a pattern like RAII, but what about "try-with-resources" in Java, or "use" in Kotlin that goes with `AutoCloseable`?
Try-with-resources and the disposable pattern offer similar features but they still fall a bit short. Unlike RAII, they are not thread-safe, require specialized syntax and boilerplate code, and require manually specifying scopes.
But even if you consider try-with-resources and the Disposable pattern a perfect replacement for RAII, now point out how many mainstream languages support them. You have Java and JVM languages, you have C# and .NET languages, Python, and...
> I find those simpler than RAII.
Arguably this boils down to personal taste, but RAII actually ensures your resources will be released, and you can tell exactly when this will happen. Disposable patterns don't, and screwing up a `using`/`with` statement is all it takes to get your application to leak resources and fail silently.
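A minimal sketch of that determinism (plain `std::ofstream`, nothing hypothetical): the file is closed at the closing brace on every exit path, normal or exceptional, with no per-call-site ceremony to forget.

```cpp
#include <fstream>
#include <stdexcept>

void write_report(bool bail) {
    std::ofstream out("report.txt");       // resource acquired here
    out << "header\n";
    if (bail)
        throw std::runtime_error("bail");  // destructor still closes the file
    out << "body\n";
}   // ...and on normal return it closes here, at a known point
```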
I agree, this is a real benefit of RAII compared to defer. That said, there are disadvantages of making the resource freeing part invisible. Not having to debug destructors that do network IO (e.g. closing a buffered writer that writes to the other side of the world) or expensive computation (e.g. buffered IO writing to a compressor) is a definite plus. Don't get me started on error handling…
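One common compromise for exactly this concern (a sketch with a hypothetical `BufferedWriter` type): pair an explicit, error-returning `close()` with a destructor that is only a silent backstop, so the slow or fallible path stays visible at the call site.

```cpp
#include <system_error>

class BufferedWriter {
public:
    // Visible path: the caller sees (and can handle) flush errors, and
    // the potentially slow network IO is explicit in the calling code.
    std::error_code close() {
        if (closed_) return {};
        closed_ = true;
        return flush_to_network();
    }

    // Invisible path: last resort only; any error is swallowed here.
    ~BufferedWriter() {
        if (!closed_) (void)close();
    }

private:
    std::error_code flush_to_network() {
        // imagine buffered writes crossing the network here
        return {};
    }
    bool closed_ = false;
};
```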
> I agree, this is a real benefit of RAII compared to defer. That said, there are disadvantages of making the resource freeing part invisible.
RAII doesn't make resource freeing invisible. It makes it obvious, deterministic, and predictable. The runtime guarantees that, barring an abnormal program termination that isn't handled well, your resource deallocation will take place. This behavior is not optional, nor does it require ad-hoc blocks or sections.
How does RAII solve that? Developers can "forget" to use RAII, right? Or are you saying that it's easier to spot because RAII requires quite a bunch of boilerplate (whereas a one-liner like "defer" is easier to forget)?
RAII, when supported natively by the language, is done at the implementation level, not at the usage point.
So in languages like Rust, D, Ada, Swift, and C++, the compiler will do the rest, unless you go out of your way to prevent the call from taking place, like placing a value type on the heap using plain pointers.
With the other approach, even if you implement IDisposable, AutoCloseable, ContextManager and similar, you have to remember to manually write the code pattern that takes care of calling close(), or whatever the method/function happens to be called.
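A sketch of that distinction (hypothetical `Connection` type): the cleanup is written once, in the type; every call site gets it without any pattern to remember.

```cpp
struct Connection {
    // Written once, at the implementation level:
    ~Connection() { /* close the socket, flush buffers, ... */ }
};

void handler_a() { Connection c; /* use c */ }  // cleanup inserted by the compiler
void handler_b() { Connection c; /* use c */ }  // here too, nothing to remember

// Contrast the Closeable/Disposable style, where every *usage point*
// must repeat the pattern (Java-ish, in a comment):
//
//   try (Connection c = open()) { ... }   // forget this, and you leak
```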
In languages with good support for FP patterns, like trailing lambdas, currying and such, there is another pattern that is much safer, in case you don't want a static analysis tool to track resource usage: the with pattern.
You do something like `withDBConnection connection (fun db -> all related db operations)`.
Assuming the lambda doesn't do naughty things to let the `db` parameter escape the scope, the `withDBConnection` function will take care of handling the whole connection lifecycle.
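In C++ terms the same pattern looks roughly like this (hypothetical names, a sketch only): a helper owns the whole lifecycle and merely lends the resource to a lambda.

```cpp
#include <utility>

struct DbConnection { /* handle, etc. */ };

// Hypothetical with-style helper: acquire, lend, release.
template <typename F>
void withDbConnection(const char* dsn, F&& body) {
    (void)dsn;                    // imagine connecting to `dsn` here
    DbConnection db{};            // acquire
    std::forward<F>(body)(db);    // lend a scoped reference to the caller
    // release happens here (or in DbConnection's destructor)
}

int main() {
    withDbConnection("db://example", [](DbConnection& db) {
        // all related db operations; as long as &db doesn't escape,
        // the connection cannot outlive this call
        (void)db;
    });
}
```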
When I switched from C++ to a bunch of other languages, I missed RAII initially. However, I quickly learned that other languages just do it differently and often even better (ever had to check an error when closing a resource?). Nowadays, I think that RAII is a solution to a C++ problem and other languages have a better solution for resource management (try-with-resources in Java, defer in Go, with in Python).
> However, I quickly learned that other languages just do it differently and often even better (ever had to check an error when closing a resource?).
I don't think that's even conceptually the same. The point of RAII is that resource deallocation is ensured, deterministic, and built into the happy path. Once you start throwing errors and relying on those to manage resources, you're actually dealing with the consequences of not having RAII.
> try-with-resources in Java, defer in Go, with in Python
Or 'goto error8;' in C. Still, RAII is much more convenient, especially for cases where you allocate a lot of interdependent resources at different points in time. It keeps deallocation logic close to allocation logic (unlike, say, defer), makes sure deallocation always happens in the reverse order of allocation, and doesn't force you to create a nested scope each time you allocate a resource.
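A small sketch of that last point (hypothetical `Logged` type): two resources acquired at different times are released in reverse order at a single scope exit, with no extra nesting.

```cpp
#include <cstdio>

struct Logged {
    const char* name;
    explicit Logged(const char* n) : name(n) { std::printf("acquire %s\n", name); }
    ~Logged() { std::printf("release %s\n", name); }
};

void demo() {
    Logged file("file");   // acquired first
    // ... work that decides we also need a lock ...
    Logged lock("lock");   // acquired later, may depend on `file`
    // ... more work ...
}   // prints "release lock" then "release file": reverse order, no nesting
```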
Where are C++ programmers paid better than Rust programmers? I thought Rust salaries are being driven high by all the crypto/fintech companies + scarcity
Productivity counts, not hours. If you can get the same work done faster, then your cost to the company is lower, possibly even if your salary is higher.
Though it isn't clear how much of Rust's increased productivity is caused by being a new language, where the architecture mistakes of the past decades are not slowing you down. We will need several more decades to answer that.
> stuck with a C++ programmer you need to figure out how to socialize with.
You're obviously trolling, but:
IME Rust attracts the same 'difficult' characters that are also attracted to C++ (because both are 'puzzle-solving languages' instead of 'problem-solving languages'); the typical Rust coder may be even worse because of the missionary fervor and 'holier-than-thou' attitude.
IDK about Haskell, but iterating in a REPL with Lisp is the most practical form of programming I've experienced. In other mainstream practical languages this approach is reintroduced as a productivity tool, like Quokka, etc.
C++ was practical some decades ago (hardware-friendly variant of OOP for GUI), but it failed as a library language and the domain where it's practical on modern hardware is much smaller. I will not say anything about Rust.
This is very inaccurate. Essentially every high-performance library, user-mode driver, desktop application, and more is written in nothing but C++. Give me any library you can think of, and I assure you it is written in C++ (or maybe C, but this is masochism on the part of the developers). Even libraries for other languages like numpy, pandas, pytorch, etc are written in C++.
> or maybe C, but this is masochism on the part of the developers
C is the better choice when interoperability with other languages is needed (technically: a C API, the implementation language doesn't matter - but if a C++ implementation is wrapped in C API as an afterthought the result is usually a shitty C API). Personally I switched to C from C++ for writing libraries ca 2017 and don't regret that decision one bit.
Also, many C++ coders only have a foggy idea how convenient working in modern C can be, because their idea of C is usually the 'common C/C++ subset' that exists in C++, and this is stuck deep in the early 90s (for instance, the designated-init feature in C++20 is a mere shadow of what C99 designated init can do, to the point that the C++20 version is pretty much useless for real-world usage).
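To make the parenthetical concrete, a sketch (the legal C++20 form on top, the C99-only forms left in comments since this thread's examples are C++):

```cpp
struct Point { int x, y, z; };

// Legal C++20: designators must follow declaration order (gaps are fine).
Point a = { .x = 1, .z = 3 };

// All of the following are everyday C99, but ill-formed in C++20:
//   Point b    = { .z = 3, .x = 1 };  // out-of-order designators
//   int arr[8] = { [2] = 42 };        // array designators don't exist in C++
//   struct Line { Point p0, p1; };
//   Line l     = { .p0.x = 1 };       // nested designators don't exist in C++
```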
> only have a foggy idea how convenient working in modern C can be
Here is a list of C++ features that C doesn't have that are immediate deal-breakers for me:
- reference and move semantics
- templates, and type constraints with concepts
- namespaces
- `constexpr`, `consteval`, and other compile-time magic
- `auto` type deduction
- trailing return types
- RAII (can't believe I put this this late, but eh)
- a passable (although still not perfectly complete) standard library that blows the C standard library out of the water
- improved semantics that let programmers reason about the logic in code rather than obsess over pointer arithmetic
- built-in virtual functions, function pointer tables, etc
I can list more, but this is going to end up as a list of 'essentially every feature in C++ that isn't in C', which is the very reason for the former language to exist.
A REPL is cool, but Python also has a REPL, and has a much more intuitive programming model (and far fewer braces).
Totally agree about the domain - no one is writing enterprise applications in C++ these days, luckily. It still does have its domain though, where there is not much choice apart from C++ or Rust (or C if you are a dinosaur)
Oh no, C++ has template meta programming and the ability to mask a DSL as advanced architecture, and then you can even implement Lisp or pick your favorite while claiming you're coding in C...++
> the typical Rust coder may be even worse because of the missionary fervor and 'holier than thou' attitude
Anecdata, but the number of times I actually encounter these missionary Rust coders (the RIIR types) is utterly dwarfed by the number of times I hear people complaining about them. The memes making fun of the insufferable rust evangelists are at least 10x as prevalent as the evangelists themselves.
Why do they use "and"? Why not use an unambiguous joining token like `/`? This just feels like an abuse of informal language to produce fundamentally formal data.
As it stands, it certainly does not resemble readable or parseable English.
The person who designed it was solving primarily for lexical sorting of the author field, thought maybe having more than two authors was an edge case, and wanted the two author case to be a logical extension of the single author one?
Wait, this code uploads data to a server somewhere? To what end? I would not have expected capture to come with mandatory redistribution, nor would I trust any third party with my location, let alone the output of my car's camera feeds. And I definitely wouldn't trust meta with, well, anything, let alone my own personal identifying information.
I'm not sure what you expected; Mapillary is built to take pictures and upload them with as much information as possible. Street View, but for everyone, and with no need for a 360° camera.
I actually think I may have misunderstood, and this doesn't upload to mapillary by default.
...but that said, what kind of person uploads the specific times and places where they were to a private corporation? What would motivate a person to do such a thing? Can you get paid for it?
It is done by cartographers in OpenStreetMap to map where they have been (or where others have been).
I use it to add metadata to my local area, things like business names, postboxes, benches, etc.
Mapillary is not a private cloud for your own personal Google Street View. It's a public Street View equivalent with appropriate (open) licencing for mappers to add data to OSM; using Google Street View for that would be a violation of its licence.
Not to get too snarky, but the number of people that, knowingly or not, have GMaps location history enabled has to be in the many millions. I talked to a few of them personally, and they saw it as a neat little feature to e.g. quickly list the cities they've been to in the last year.
Hostels often offer private rooms, with varying degrees of privacy. But I've certainly stayed at hostels that offer very comfortable single private rooms with private bathrooms for a third the cost of a local hotel room. Expensive for a hostel, but great value for the privacy.
But if you're traveling with your family, just get hotel rooms. Hostels only came up in the first place in response to a gripe about solo travel.
This has the added complication that oral historians were/are a political institution in many parts of the continent (unlike, say, reproducers of folklore). So "official" history very clearly predates written history we have today—and certainly in European languages—but it's still the product of conscious maintenance of image. That said, written records (say, inscriptions on a victory stele) have this issue too.
It's also worth noting that there is strong indication that pre-colonial states in sub-Saharan Africa well outside the Horn of Africa did keep written language for the purposes of managing bureaucracies. Hell, Arabic was adopted in East Africa many centuries before Europeans ever set foot there. The technology was certainly not unknown. However, if indeed this was the case, it clearly did not spread far beyond the needs of centralized bureaucracy, nor was it likely used for what we would now call private commerce, and we have no surviving records showing the scripts.
The nice thing about written records is that the victory stela necessarily tells you the same story that it told the literate subset of Ramesses's subjects 3200 years ago. Oral history can be extremely well preserved, but it can also be tailored to the listener. And it can be hard to date reliably, though there are exceptions. For example, people in many places in the world have oral traditions of having lived there since the world began or for specific numbers of years that are much greater than the archaeological evidence supports.
> It's also worth noting that there is strong indication that pre-colonial states in sub-Saharan Africa well outside the Horn of Africa did keep written language for the purposes of managing bureaucracies. (...) The technology was certainly not unknown. However, if indeed this was the case, it clearly did not spread far beyond the needs of centralized bureaucracy, nor was it likely used for what we would now call private commerce, and we have no surviving records showing the scripts.
> Timbuktu Manuscripts, or Tombouctou Manuscripts, is a blanket term for the large number of historically significant manuscripts that have been preserved for centuries in private households in Timbuktu, a city in northern Mali. The collections include manuscripts about art, medicine, philosophy, and science, as well as copies of the Quran.[6] Timbuktu manuscripts are the most well known set of West African manuscripts. (...) Some 350,000 manuscripts were transported to safety, and 300,000 of them were still in Bamako in 2022.
> The dates of the manuscripts range between the late 13th and the early 20th centuries (i.e., from the Islamisation of the Mali Empire until the decline of traditional education in French Sudan).[11] Their subject matter ranges from scholarly works to short letters. (...)
> Scribes in Timbuktu translated imported works of numerous well-known individuals (such as Plato, Hippocrates, and Avicenna) as well as reproducing a "twenty-eight volume Arabic language dictionary called The Mukham, written by an Andalusian scholar in the mid-eleventh century."[15]: 25 Original books were also written by local authors, covering subjects such as history, religion, law, philosophy and poetry. (...)
> Some manuscripts contain instructions on nutrition and therapeutic properties of desert plants, whilst others debate matters such as "polygamy, moneylending, and slavery."[15]: 27 The manuscripts include "catalogues of spells and incantations; astrology; fortune-telling; black magic; necromancy, or communication with the dead by summoning their spirits to discover hidden knowledge; geomancy, or divining markings on the ground made from tossed rocks, dirt, or sand; hydromancy, reading the future from the ripples made from a stone cast into a pool of water; and other occult subjects..."[15]: 27 A volume titled Advising Men on Sexual Engagement with Their Women acted as a guide on aphrodisiacs and infertility remedies, as well as offering advice on "winning back" their wives.
This is far beyond the needs of centralized bureaucracy, and substantial numbers of records do survive despite the best efforts of the jihadists who occupied Timbuktu in 2012.
Ah yea, sorry, I mean in addition to what we already know for sure—Timbuktu is emphatically not what I was referring to (although—I had forgotten about Timbuktu libraries, and it makes my point better than I did, so I appreciate your bringing it up!). I'm referring to oral evidence of writing in Great Zimbabwe (among other places I'm sure). If they had developed script, we unfortunately lack evidence of it.
My point more broadly is that prevalence of an oral tradition doesn't imply the lack of capacity to develop a written one. As Timbuktu is perfect evidence of—their libraries coexisted (and still do today) with griots, and the two repositories of knowledge seem to serve distinct functions in society.
Britain had a habit of showing all its religious/political (can't really separate them at this point in history) minorities the door (and to be fair, some of them were basically lunatics) which is likely a large part of why things shook out the way they did. A bunch of ideologically opposed groups cast onto another continent had no choice but to learn how to self govern despite their differences.
Religion has a relatively minor influence on UK politics these days. 37% of people are non-religious. 46% identify as Christians, but only 10% actually attend Church. And the majority of those Christians belong to moderate denominations whose politics isn't that different to that of the general population.
Democratic in the modern sense. The past millennium of English history could be understood as a slow progression of the devolution of power. The actual politics were pretty messy, but the evolution in legal and political theory was steadier. Compare that to most other civilizations, where the evolution of democracy was much more abrupt and epochal, not to mention even bloodier and altogether much more recent.
There were democratic movements elsewhere, but almost all were squelched by kings and tsars (domestic or foreign), and the legal and political environments were reset to square one.
Also, the modern notion of the history of democracy is the devolution of power to the masses. But I like to think of the evolution of English history, at least legally, as the (albeit slow and uneven) elevation of the masses to the aristocracy, and in that way something similar to how the Greeks viewed democracy--with power comes responsibility and stricture. Though that was partially the product of the expulsion of certain groups from the island; yet that process was carried over in the US, where many of those groups landed.
The republican tradition never really died out in Europe. From city states to merchant / maritime republics to free imperial cities, there were always polities that can be best understood as republics. Venice lasted for 1100 years, and San Marino is even older, with its origins lost in time.
That's largely acceptable, and certainly preferable to underproduction, for resources that we simply can't do without. Dairy was (and still is) considered one of those resources, a kind of superfood. Now maybe milk might not hold up anymore as being so critical to childhood nutrition (though I'm skeptical), but I think the reasoning behind it makes sense.
> Tariffs just incentivize purchasing local.
Sure, they also incentivize not eating. But commodification of basic resources is nothing new to americans, I suppose.
Some things are worth everyone pitching in for. Tariffs place the burden of living here on the individual. I don't really see any benefit from this.... fuck local businesses if they can't compete. The entire pitch of living here is that we'll let the market determine every aspect of our lives; why would we not double down when it came to letting businesses fail?
I think you distinguished them admirably. It tends to be pretty obvious from context which meaning is intended.
Hell, we use "design language" even if it's clearly not language; I see little reason why this should be different. And of course the rest of the non-verbal Chomsky hierarchy has little relation to how most folks use the word (hell, I bet most coders can't even tell you what a regular language is, despite using regular expressions).
But, particularly when it comes to stuff like bird song, it shows a lot of features of syntax. I just don't want to throw the baby out with the bathwater arguing over what to call it.