Code that is easier to read is easier to maintain and easier to debug. More often than not, readable, intuitive code results in a better product and a better experience for its users.
I disagree with the premise. "Readability" is an excuse people use for writing slow code. It's not an inevitable tradeoff.
Like, most of these people are not saying, "we could do this thing which would speed up the app by an order of magnitude, but we won't because it will decrease readability." They have no idea why their code is slow. Many don't even realise it is slow.
My favourite talking point is to remind people that GTA V can simulate and render an entire game world 60 times per second, or 144 times on the right monitor. Is that a more complex rendering job than Twitter?
Computers are really fast; it doesn't take garbage code to exploit that.
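To put rough numbers on that: 60 frames per second is a budget of about 16.7 ms per frame, and 144 Hz is about 6.9 ms, and GTA V simulates and draws a whole world inside that window. For scale, here is a minimal sketch (assuming Node.js, with a made-up Post record and like-count formula standing in for a tweet) of the most boring, readable code imaginable filtering and ranking five million records. It prints its own elapsed time; on a typical laptop the timed portion should come in well under a second, no cleverness required.

```typescript
// Minimal sketch: plain, readable code over a few million records.
// Assumes Node.js; the Post shape and the numbers are made up for illustration.
import { performance } from "node:perf_hooks";

interface Post {
  id: number;
  likes: number;
  text: string;
}

// Build five million simple records (the text is shared; strings are immutable).
const text = "x".repeat(140);
const posts: Post[] = Array.from({ length: 5_000_000 }, (_, id) => ({
  id,
  likes: (id * 9973) % 10_000, // cheap, deterministic stand-in for a like count
  text,
}));

const start = performance.now();

// The "work": keep the popular posts and rank the ten most-liked ones.
const popular = posts.filter((p) => p.likes > 9_000);
popular.sort((a, b) => b.likes - a.likes);
const topTen = popular.slice(0, 10).map((p) => p.id);

const elapsed = performance.now() - start;
console.log(`Filtered and ranked ${posts.length.toLocaleString()} posts in ${elapsed.toFixed(1)} ms`);
console.log(topTen);
```

Nothing in it trades readability for speed; the two are mostly orthogonal.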
IMHO it's partly because of what I assume are different business models: a game like GTA versus the many (or most) businesses whose core product is a website or web app.
Different business models result in different environments in which to conduct software engineering; different constraints and requirements.
IMHO constant and unpredictable change (which I assume happens less for games like GTA) is one of the big differences, as is the relationship between application performance and profit.
But I like what you’re saying and would love to see that world.
Certainly more complex than a website that preloads its structure into the cache and then transfers blocks of 280 characters, a name, and a small avatar, rather than gigabytes' worth of compressed textures.
Is the difference because GTA has more "readable" code?
I have other games that do load up quicker than Twitter, which I do think is damning, but it's not really the point I'm trying to get across here.
Well, the "less readable code" (i.e., the goddamn mess that a lot of game code is, slapped together barely under deadline by staff working 80 or more hours a week) is part of why AAA games like GTA ship with so many massive bugs requiring patches immediately after release.
But then, you brought up GTA and games, which aren't even apples and oranges with a website. Websites, even the Twitter website, don't require GPUs or dedicated memory; they don't have the advantage of pulling everything from a local hard drive; and yet they actually work as designed, not merely in a low-resolution, low-effects mode, on computers more than a couple of years old.
And while I wouldn't hold up the Twitter home page as remotely fast for a website, have you actually looked at it recently? It shows a lot more than just a few tweets and avatars: it's got images, embedded video, etc.
This is a dumb argument. My point is that readable doesn't imply slow, and "readability" is not actually the reason slow things are slow, most of the time. I don't think you even disagree with me.
There's definitely another discussion to be had about why web tech is so disastrously slow given what computers are capable of, but it's not worth having here. We're never going to settle that one, and regardless, if you're a web guy you're stuck with JS.
I just logged into Nest for the first time on a new laptop. It took 15 seconds to load and get to the screen to change the temperature on the thermostat.
Then I refreshed the page and it still took 10 seconds to reload.
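For anyone who wants to put numbers like that on firmer footing than a stopwatch, the browser's standard Navigation Timing API reports the main milestones of a page load from the console. A quick sketch follows (assuming any modern browser; note that a single-page app like Nest's keeps rendering after the load event, so this only captures the network-and-parse portion of the wait):

```typescript
// Minimal sketch: read the browser's own timing data for the current page.
// Run in the devtools console; values are milliseconds since navigation start.
const [nav] = performance.getEntriesByType(
  "navigation"
) as PerformanceNavigationTiming[];

if (nav) {
  console.log(`Connected:       ${nav.connectEnd.toFixed(0)} ms`);
  console.log(`First byte:      ${nav.responseStart.toFixed(0)} ms`);
  console.log(`DOM complete:    ${nav.domComplete.toFixed(0)} ms`);
  console.log(`Load event done: ${nav.loadEventEnd.toFixed(0)} ms`);
}
```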
It's actually a pretty good analogy, when you're searching for an analogy for something being technically accurate while missing literally the entire point.
Making your code "more intuitive" does not result in a better product. Making a better product does. The argument is that software development is one of the few jobs where the employees' experience seems to be equal to, or more important than, the client's experience. Sacrificing a restaurant-goer's experience (taste) because it improves the restaurateur's experience (customer LTV) is a decent analogy.
A: "Okay, but the new place next door serves identical food with a more efficient kitchen, at lower prices you can't match without making significant changes. If you don't improve efficiency somehow you'll start to lose customers."
Because the universal rule is that 90% of everything is terrible, including software. The corollary is that work expands to fill all available time, and software expands to fill all available resources.
If you go back to before the age of front-end frameworks, you would find that there were still tons of sites that were slow and poorly performing, despite running entirely on server-side technologies.
Something that made Google incredibly appealing when it first came out was its instantly loading front page with a single search box and a single button. This was a drastic, shocking contrast in an age when every other search engine portal piled its front page with news, stock tickers, and the kitchen sink.
In the end, unless the developers of the sites make performance a priority, it makes absolutely no difference what the tech stack is. The problem is that companies don’t prioritize it.
Maintainable code can quickly and easily be extended into new features for the customer.
Unmaintainable code usually results in a ton of support tickets and late nights hunting down bugs introduced by that day's production deploy. That leads to heartache and frustration for the customer.
The customer comes first, yes. Good, maintainable code is a way to achieve that goal.