
He was calling 1995 text editors which used 4 MB of RAM incredibly bloated compared to the lean software of his youth which ran in 32 KB.

Now we do the same, but we look at the text editors of 1995 which used 4 MB of RAM as incredibly efficient and well made, paragons of craftsmanship.

Things never change, the old generation fights the new one and calls it stupid.


That 1995 text editor didn't handle Unicode. Didn't edit all the languages of the world. Didn't handle emoji. Didn't do auto-complete. Didn't replace colors in CSS with their actual color and pop up an inline editor to edit them. It didn't edit remotely (editing remotely is not the same as tmux + vim). VS Code does more than edit files: when you're in a terminal on the remote machine and type 'code somefile', somefile opens on your local machine. When you start a web server in the VS Code terminal, VS Code auto-forwards it to your local machine.

I'm not saying old editors weren't more efficient, but the stuff editors handle today got more complex. LSP servers do way more analysis than any 1995 editor, and they do it in an editor-agnostic way. It costs more memory, but it also lets us all jump into the future faster, rather than every editor having to implement its own support for every language.


I know that this is becoming a trope, but Smalltalk and Lisp Machines did all those things well before 1995. Similarly, GNU Emacs today is capable of all of the above and has been managing it for multiple decades at this point, in a more modern take on the world...

Remote editing back in the 1980s was such a common thing on the Smalltalk and Lisp Machines that all system code was on another machine; more often than not you wouldn't even notice that it was a remote file!

One could do "emoji" just fine as well, and files would have a WYSIWYG-like look to them using "fat strings" -- that is 1980s technology. There is a dungeon crawler using that feature to render its map as graphics; it is how you would implement chess pieces, or other "picture"-like stuff.

Auto-complete was already standard, as was similar lookup of "who calls" / "who uses" functionality to figure out where things are used, online documentation, etc. etc. etc...

So all this was perfectly possible, and already used and abused in 1995 -- VSCode isn't doing anything new in that regard.


> Smalltalk and Lisp machines

Not to mention Plan 9!


And Inferno as well.


None of what you describe requires a lot of resources. Remote editing stubs are decades older than VS Code, and besides, many of us used X: for many years I did all my work over the network because there was no reason not to.

A color dialog was tens of KB of code in the 1980s.

My own editor handles Unicode well enough for most users in a few dozen lines of code. RTL would take a bit more, but not much. LSP servers, if anything, reduce the need for editor resource use to grow.
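For a sense of scale, the core of that is a hand-rolled UTF-8 decoder on this order (a sketch, assuming well-formed input; not my editor's exact code):

    #include <stddef.h>
    #include <stdint.h>

    /* Decode one UTF-8 codepoint starting at s; write its byte length to *len.
       Assumes well-formed input; real code would also validate continuation
       bytes and reject overlong encodings. */
    uint32_t utf8_decode(const unsigned char *s, size_t *len) {
        if (s[0] < 0x80) { *len = 1; return s[0]; }
        if ((s[0] & 0xE0) == 0xC0) {
            *len = 2;
            return (uint32_t)(s[0] & 0x1F) << 6 | (s[1] & 0x3F);
        }
        if ((s[0] & 0xF0) == 0xE0) {
            *len = 3;
            return (uint32_t)(s[0] & 0x0F) << 12
                 | (uint32_t)(s[1] & 0x3F) << 6 | (s[2] & 0x3F);
        }
        *len = 4;
        return (uint32_t)(s[0] & 0x07) << 18 | (uint32_t)(s[1] & 0x3F) << 12
             | (uint32_t)(s[2] & 0x3F) << 6 | (s[3] & 0x3F);
    }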

It's not that these features justify no extra resource use (they do), but they don't need to increase resource use significantly.

A lot of apps get away with huge resource use simply because people aren't used to paying attention to it any more. Each app affects them little enough in isolation that, when it does matter, addressing the resource use of any one app hardly makes a difference.


It's a bit theoretical, because no smaller editors exist that do all of what VS Code does. And a lot of what it does relies heavily on the notion that it's running in a browser. So just tossing that out won't fly, since you kind of need it for at least some of the features.

It's only when you subjectively remove all the features that you don't care about that it becomes doable to make smaller editors. And that's fine. But you can't have your cake and eat it.

The reason most people don't care is because it simply doesn't matter. Not even a little bit. Laptops are cheap. Memory is cheap. CPU is cheap. Your time is not. And it takes investing your time to make this stuff more optimal and faster. VS Code just does a lot of nice things that make you more productive. I use IntelliJ myself, which uses even more resources. But it's a bit smarter and saves me even more time. The point with both is that you lose more than you gain by replacing them with something faster. It's not worth it.

My first computer was a Commodore 64, so I'm well aware what that thing could do (and couldn't do). I'm writing this on an M1 MacBook. Orders of magnitude faster, doing things I could not imagine back when I had a Commodore 64, etc. You can have one second-hand / refurbished for next to nothing. Basically below my day rate when I'm consulting.


Back in 2014 my company switched from Skype to this hot new tool called Slack for messaging. On my £10,000 workstation with dual Xeon processors, 64 GB of memory and a 1 TB SSD, you know what was the second most resource-intensive app, after my C++ compiler and above my IDE? Slack. We used to close our chat program while compiling to save the 1 GB of memory it was using.

> It's a bit theoretical, because no smaller editors exist that do all of what VS Code does

You can't ever compare two things if you look for all features to match. Sublime is a pretty good comparison: it's wicked fast and has a bunch of the same features and language extensions. Emacs handles Unicode just fine and has a huge extension surface area.

> The reason most people don't care is because it simply doesn't matter

Hard disagree here: the reason people don't care is because features sell, and, as you said, the alternative option isn't there. I work in Unreal Engine most of the time, and about 3-4 years ago there was an almost overnight exodus of game programmers who had lived and died by Visual Studio but switched to Rider, primarily because it was faster than VS+VAX (Visual Studio with Visual Assist X).

> Laptops are cheap. Memory is cheap. CPU is cheap. Your time is not

This only applies with one application. Now add Slack/Teams, Postman, Outlook, FF/Chrome, Spotify to the mix, and all of a sudden I'm running 6 full web browsers duplicated with all their resources isolated, using more memory and CPU than IntelliJ does. I'm fine with IntelliJ pegging my 32-core Threadripper to index millions of lines of code. I'm less fine with Postman using more CPU than IntelliJ to display a JSON document.

> I'm writing this on an M1 MacBook

Depending on what software you're working on, your users aren't using M1 MacBooks. My partner's work machine is a 5-year-old i3 with 8 GB of RAM. It's borderline unusable with Teams and Outlook running, IMO. But the person who benchmarks Teams is doing so on the M1 MacBook.


We're talking about developer tools here. Editors are aimed at developers. You can expect developers to have reasonably decent hardware. If you are working with Unreal like you say you do, you presumably aren't using a ten-year-old MacBook Air to do your work. That would be madness.

Anyway, end users care even less. The paying user variety typically has a newish computer (of the last five years or so). The rest are not a great revenue stream. Of course, if you develop for users stuck on really old, crappy laptops, you are going to invest your precious time in making sure they get a great experience and make all sorts of compromises to ensure they do. But for the rest of the users, good enough is good enough. You'll see from your revenue/usage statistics what that is.

I find the people that whine the most about this topic are exactly those people you should expect to have decent hardware (i.e. developers). Either way, use things that are useful to you.

Spotify and Slack, Teams, etc. seem to be doing OK with user popularity, for example, and don't seem to be getting a lot of churn over their application performance. And of course a lot of this stuff is used on mobile as well. I've used both for the last ten years without much issue on modestly sized laptops. 16 GB is more than enough for me running stuff like that, VS Code, IntelliJ, a bunch of Docker things, and a few other bits and bobs.

People using MS Windows seem to get a particularly rough experience. That's why lots of developers prefer Mac- or Linux-based machines.


> Slack/Teams, Postman, Outlook, FF/Chrome, Spotify in the mix, and all of a sudden I'm running 6 full web browsers duplicated with all their resources isolated

If those apps were PWAs instead, no extra browser copies would be running. In my experience this only really accounts for 70-100 MB per app for the browser copy. There's no reason Slack couldn't be a PWA; same with Spotify.

I'm not really sure how Slack and others use so much RAM. I've built quite functional, complicated, and non-trivial web apps. Mine typically use <50 MB, with some coming in at 20-25 MB. When I'm deploying in Electron, I'm still in the 80-150 MB range.

The biggest performance questions for me are network latency vs local data and figuring out ways to mitigate network latency. The difference between 200 ms navigation and 5 ms navigation is pretty stark. Even if most people don't flinch at 200 ms.


Since your time is so valuable and you are obviously very upset about this, your company should pay Spotify to write a more efficient app.

Or your company should buy you a new 96-core Threadripper system with 1 TB of RAM so that when you use Spotify/Slack/Postman it doesn't impact your productivity.


If your only response is a personal attack, don't say anything at all.


I was just reflecting your thinking - somehow you feel that Postman/Slack/.... owe something to you. Pay them to do what you want, or stop using them.

You feel entitled to use a $10K machine to compensate for slow IntelliJ for maximum productivity and convenience, yet deny others (Postman/Slack/...) the use of the most productive and convenient technology for them (Electron). And while continuing to use their convenient products, you say they are bad. Use IRC; use curl instead of Postman.

The Postman programmers say the same thing: our users have $3K+ machines, no point in optimizing code to be fast, instead let's add more features since it's clearly working and our users are not switching. Obviously they love the iteration speed that Electron gives us.


> Pay them to do what you want, or stop using them.

I've been a paying Slack customer for a decade at this point. I pulled up my email; my support ticket for "Slack is using more RAM than Visual Studio" was from February 2015. I don't have the political sway over Salesforce to make them make these sorts of decisions.

> You feel entitled to use a $10K machine to compensate for slow...

You're doing it again. I don't feel entitled. I don't have a choice in my chat app; my employer forces it on me. And even if I did, Slack is, on the whole, the least-worst option. As for Postman: I did the same thing. I was a paying customer, I submitted support tickets, provided traces when asked, and ultimately I did decide to change tools.

> while continuing to use their convenient products, you say they are bad.

Am I not allowed to have an opinion just because I have a fast machine? Am I not allowed to want my software to be better?

> The Postman programmers say the same thing:

No they say "performance is a top priority for us, we're sorry you're not happy with it. Please send us your hardware specs" and the ticket gets auto closed after 2 weeks.

> Obviously they love the iteration speed that Electron gives us.

It's not just Electron; snappy Electron apps exist. Startup time aside, VS Code is pretty damn good. Figma is an excellent example of how good it can be (and if you want to compare what it looks like when a company cares vs. when a company doesn't, see Figma and Miro).


> I've been a paying Slack customer for a decade at this point

$10/month is not what I meant by "paying them". I worked at a company where clients would routinely pay us $200K to implement a particular niche feature which was not on the roadmap. If they asked for a non-roadmap feature, yes, the ticket would be closed "not-planned".


You seem to be choosing to engage with your own least charitable inferences rather than what reflects your counterpart's actual position. Viz:

> the alternative option isn't there

> I don't feel entitled. I don't have a choice in my chat app

Your responses are predicated on the option being there and the person you're responding to is just not taking it. This despite the fact that his or her responses strongly suggest they would take it if it were there, but it's simply not an option.


There is always an option. Threaten your employer that you'll leave if they make you use Slack, or quit programming and become a farmer who touches grass every day.

All this Electron app complaining reads like First-World Problems(TM).


Overly reductionist arguments are not helpful. Suggesting that I quit my job because I disagree with the tech stack of a billion-dollar company might be one of the dumbest things I've seen on this site in the 15 years or so I've been here.


You know we can read back the comments that were posted in this thread and check your response against the context, right? You just moved the goalposts from being willing to pay for the product that would need to be changed to address the complaints, to refusing to use the software complained about.


I'm not moving anything. The parent obviously doesn't want to pay the millions Slack would probably ask to make it "efficient" (whatever that means); you say the parent has no alternative; I'm providing alternatives.

Or one can go back to complaining "how the world is cruel, people are mean and greedy, and I'm a good and misunderstood person who writes the most efficient and user-considerate software, unlike the evil people at Slack".


You're just shamelessly making up strawman non-quotes now.


For me that wouldn't work, because the impact Slack has is not measured in time loss directly (or at least not only, since Slack is truly a laggy piece of crap), but in annoyance and in feeling bad about running what is basically spyware on my system. Each interaction adds a bit of pain and questioning of why I am even doing this shit.

Not GP, but they could buy me a 1024-core monster, if such a thing exists, and it would still not solve the problem of Slack.


I have been running localslackirc (it's in Debian) to access Slack from IRC.

I still have to open it in the browser every once in a while, to search old threads or for other stuff that is not supported. But day to day I can do everything in IRC. There is also a WeeChat plugin, AFAIK.

It's a bit annoying to configure the access, but it seems the tokens never expire (or have not expired yet), so it shouldn't be too frequent for you either.


Thanks, I might try that soon.


Next to nothing of what VS Code does depends on it running in a browser, other than to the extent VS Code has made it so.

It's not special. If anything it's one of the clunkiest editors I've used, because it tries to shoehorn everything into a convoluted UI. It's because my time matters to me that I avoid VS Code as much as possible.

The problem with VS Code is not that it's too slow, or too memory hungry. It could use far less, sure. And it could do so without losing any of the things about it that makes me dislike it.


Markdown, HTML, image, and other previews; documentation; connectivity: it's a lot more than you think.

If you don't like it, use something else, of course. But there are valid reasons for it being browser-based, and a lot of people choose to use it at this point.


None of which requires VS Code itself to be browser-based, and most of which don't benefit much from using a browser.

A lot of people choosing to use it is beside the point being made, which is not that people won't use it, but that it could be a lot leaner without sacrificing functionality.


There are other advantages to running in the browser. The fact that the editor is written in JavaScript/TypeScript and HTML/CSS means it runs in any browser. It's why there's been an explosion of online IDEs like codesandbox.io, StackBlitz, GitHub Codespaces, google code cloud, repl.it, and hundreds of others.


My time is free. I'm not going to see a dime for any time I save by using this tool or that tool on my computer. Hardware, on the other hand, is not free, so I prefer to sacrifice time for being able to use less expensive hardware (within reason).

Of course, Sublime Text exists and does everything I want from VSCode at a fraction of the hardware usage. So I don't have to choose one or the other, because actual good software exists.


If your time is free I think you're devaluing yourself ;)


Consider an economy of time, where you have a finite amount of time to spend, so that time spent on one thing is time not spent on another thing. If you can spend money (or maybe earn less money) to avoid spending time doing things that are uninteresting, boring or unproductive, you are allowing yourself to spend more time on things that are interesting, fun or productive.


People drastically underestimate the cost of things we now expect.

A Unicode font is easily 15 MB in size, and you'll have several of those. Then there's the code and memory it takes to do all the magic of rendering it: hinting, subpixel antialiasing, and so on.

Then there's the fact that a 4K framebuffer is 32 MB in size.

Smooth compositing requires every running program to have a buffer it draws into. So there goes a couple hundred MB more just to make sure you don't see the screen repaint like in Windows 3.1.
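The arithmetic, roughly (assuming 4 bytes per pixel and, for illustration, full-screen buffers):

    3840 x 2160 pixels x 4 bytes/pixel = 33,177,600 bytes, i.e. ~32 MB
    ten composited full-screen windows = ~330 MB of buffers alone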

Yeah, you can have compact software where your only requirement is that it uses ASCII, does it at 80x25 and doesn't do anything more fancy than editing text.


That's data size, not code. There's no fundamental reason that a program that can smoothly render Unicode at 4K needs a GB download when kilobytes could suffice.


We tried that in the Windows 9x days. We called that "DLL hell".

The idea was that programs would share libraries, and so why have a dozen identical frameworks on the same system? Install your libraries into system32. If it's already there but an earlier version, deploy your packaged one on top.

Turns out that nobody writes good installers, binary-level dependency management requires too much discipline, and dependencies are a pain for users to deal with.

So shove the entire thing into a huge package, and things actually work at a cost of some disk space and memory.


> and things actually work at a cost of some disk space and memory.

I have ~10,000 .exe files on this machine; if none of them shared code and/or data (or if they were written in a "modern" language with 50+ MB hello worlds), they would not fit on my 1 TB disk.


You can improve things significantly with a bit of coordination. That's how package managers work!


True, but I personally discovered this has limits.

What if you're working on something reasonably novel, like, say, open source VR? Well, it turns out you may want a quite eclectic mix of dependencies. For some you need the latest version, because it's state-of-the-art stuff. Some are old because the new version is too incompatible. Some are dead.

Getting our work into a Linux distro is on my list, but even if dealing with all the dependencies works out, there's the issue that we sometimes need to make protocol changes and upgrade on our own schedule, rather than whenever the new distro is released.

Distros are great for things that are supposed to integrate all together. They're less ideal for when you're working on something that is its own, separate thing like a game.

So for the time being, shoving it all into an AppImage it is.


You presume one option when the other option is a bundled but smaller renderer. The TrueType renderer my terminal uses is about 700 lines of code. The C it's a translation of is about 1500. There's a sweet spot that might well be a bit higher, to e.g. handle ligatures etc., but the payoff from going from that to some huge monstrosity is very small.


As somebody who actually works on a pretty large program, no, I'm absolutely not going to use your 700 LOC TTF renderer. I'm going to use the 128K LOC FreeType.

Why? Well, because it's the one everyone else uses. It's what comes with everyone's Linux distro. Therefore, if there's something wrong with it, it's pretty much guaranteed it'll break other stuff and somebody else is going to have to fix that. Also it probably supports everything anyone might ever want.

If your 700 LOC TTF renderer doesn't perform as it should, it might become my problem to figure out why, and I don't really want that.


I'm not suggesting you should. I'm pointing out that these things can be done with a whole lot less code. And a lot of the time so much less code that it is less of a liability to learn a smaller option. Put another way, I've had to dig into large font renderers to figure out problems before because they didn't work as expected and it became my problem, and I'd much prefer that to be the case with 700LOC I can be intimately familiar with than a large project. (I'm old enough to have had to figure out why Adobe's Type1 font renderer was an awful bloated mess, and in retrospect I should have just rewritten it from scratch, because it was shit; that it was used by others did not help us at all)

I ended up with this one in large part because it took less time to rewrite libschrift (the C option I mentioned) and trim it down for my use than to figure out how to make FreeType work for me. I now have a codebase that's trivially understandable in an hour or two of reading. That's what compact code buys you.

No, it won't do everything. That's fine. If I need FreeType for something where it actually saves me effort, I'll use FreeType. It's not about blindly rewriting things for the sake of it, but about not lazily defaulting to big, complex options whether or not they're the appropriate choice.

A lot of the time people pick the complex option because they assume their problem is complex, or because it's "the default", not on the merits.

There are tradeoffs, and plenty of times where the large, complex component is right, but far too often it is picked out of laziness and becomes a huge liability.


> We tried that in the Windows 9x days.

You say that as if it was some kind of failed one-off experiment of the 90s. We tried it in the Multics days, it caught on and the design philosophy is still popular to this day. It works quite well in systems with centrally managed software repositories, even if it doesn't in a system where software is typically distributed on a 3rd party shareware collection CD or download.com.


> Didn't handle emoji

Behold! The peak of technological prowess! So many poor souls of the past died in misery and 4 MB of RAM. They could not taste those sweet fruits of progress.


There is waste that is fine, and there's waste that doesn't really come with an upside.

E.g. in "waste" that is fine, I'd categorise AmigaE's choice to read the entire source file into memory, instead of doing IO character by character, or line by line from small buffers. It was a recognition that there was no compelling reason to not sacrifice that small amount of RAM for simplicity and speed. What you gain can differ, but as long as the benefit is proportionally good enough relative to the cost, that's fine.

But so much modern software pulls in huge dependencies for very little benefit, or try to be far too much, instead of being focused tools that interoperate well.

It's not so much that the new generation is stupid, as that a lot of people (of any generation) always choose the easy option instead of stopping to think. Sometimes that's the right tradeoff, often it's not.

And hardware advances mean you can get away with more and more. Sometimes that justifies more extravagant resource use. Often it doesn't.


> in "waste" that is fine, I'd categorise AmigaE's choice to read the entire source file into memory, instead of doing IO

This is only an issue if your OS doesn't have virtual memory and mmap. Modern OSes automatically prefetch files into free RAM (so there's no such thing as truly free RAM; "free RAM is wasted RAM", as the saying goes). I think newer versions of Amiga OS were supposed to be getting virtual memory support at some point, too.
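For instance, on a POSIX system, mapping a whole file is only a few lines (a sketch with error handling omitted):

    #include <fcntl.h>
    #include <stddef.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    /* Map an entire file read-only. The kernel pages it in (and prefetches)
       lazily as the buffer is touched, so "reading the whole file" costs
       almost nothing up front. */
    const char *map_file(const char *path, size_t *size) {
        int fd = open(path, O_RDONLY);
        struct stat st;
        fstat(fd, &st);
        *size = (size_t)st.st_size;
        const char *p = mmap(NULL, *size, PROT_READ, MAP_PRIVATE, fd, 0);
        close(fd); /* the mapping stays valid after close */
        return p == (const char *)MAP_FAILED ? NULL : p;
    }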


Yes, but because it was once an issue, even now, decades later, a lot of compilers still use incremental file IO instead of just reading the file in one go, even though you gain real benefits (e.g. no "ungetting" and no separate token buffer: just keep the indexes of a token's start and end). I'm guilty of that myself.

It was an inspired choice, and about 30 years on it's still underutilized, on machines that typically have 3-4 orders of magnitude more RAM.
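A hypothetical sketch of what the whole-file approach buys a lexer (not AmigaE's actual code):

    #include <ctype.h>
    #include <stddef.h>

    /* With the whole source in memory, a token is just a pair of offsets
       into that buffer: no ungetc(), no copying into a token buffer. */
    typedef struct { size_t start, end; } Token;

    Token next_word(const char *src, size_t len, size_t *pos) {
        while (*pos < len && isspace((unsigned char)src[*pos]))
            (*pos)++;
        Token t = { *pos, *pos };
        while (t.end < len && !isspace((unsigned char)src[t.end]))
            t.end++;
        *pos = t.end;
        return t;
    }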


>Things never change, the old generation fights the new one and calls it stupid.

I was with you until here, which I think is the wrong take. That is, this gets it exactly backwards. It's not just that every generation gets upset at the previous generation so let's all shrug and move on, it's that this is really a thing that is unfolding from one generation to the next.

It seems like the reflex of "oh well, the previous generation said it, so let's ignore it" comes up a lot, to the point that I have a go-to example that I use every time it does. I'm a baseball fan. One thing you used to hear in the '80s, with a guy like, say, Rob Deer or Steve Balboni, was that they tried too hard to hit home runs and struck out too much. Then you heard that in the '90s as well. Then you heard it in the 2000s, especially with Moneyball and guys like Jack Cust. Then it just kept getting more extreme with guys like Carlos Pena and now Joey Gallo.

So one thing you could say is, well, every generation says that there were fewer strikeouts in their day. But there's actually data on this and... it's true! Almost every decade, from the 1800s through every decade of the 1900s through now, strikeouts really have been going up year to year. And so that intergenerational commentary, well, it's describing a real thing that really is happening.

The same can be said of other things, like people saying they remember the environment being better, or people saying attention spans are getting shorter. And they are.

The instinct here, I think, is to dismiss these since every generation says it. But I think the conclusion should be the opposite: that these are real things unfolding on a multi-generational level. So if you see it happening with software, maybe that's because there's really something to it.


There is a real thing happening. But the dissatisfaction from that thing happening is what the criticism is all about. And those are two separate things. The thing changing might be real or not. But the dissatisfaction of the older generation has been constant for thousands of years.


>But the dissatisfaction from that thing happening is what the criticism is all about.

The argument seems to imply that the dissatisfaction would be there regardless of the circumstances. But the point I'm making is that the dissatisfaction can be understood as meaningful and not as just a generalized disposition that's a natural consequence of getting old.

And whether or not I'm right on that, the devil is in the details there, and it's going to vary on a case-by-case basis. But it won't do just to say the older generation is going to complain no matter what, because that doesn't credit them with the possibility of complaining for a legitimate reason.


> But the dissatisfaction of the older generation has been constant for thousands of years

And there is no dissatisfaction from the younger generation? No "this is the old way, let's do it differently because we are more clever" from the young generation?


There is. But it's the other side of this intergenerational coin. Basically, both generations in conflict think the other is stupid.


You are focusing on the generational trends, I was focusing on the individual.

Tell a kid learning to program today "you should program in assembly because it's efficient, like I did back in my day".

Kid looks around and sees it would take him 3 days to implement a hello world in assembly, but only 3 seconds to do it in Python. He has a 16-core computer with 64 GB of RAM. Both hello worlds run instantly. So how does that advice make sense? The kid calls you a crazy old man out of touch with the times, then goes on to run a 50 GB LLM locally to make it write a hello world, and feels very excited about the future of programming.


I'm not sure I would agree that the point I'm making cleanly transposes onto the details that you've selected.

For one, I think the accepted premise in this conversation up till now was that there is a real issue with software bloat. And you've switched that detail out for a different one where we assume no discernible bloat or difference in performance as time passes.

I also don't think I understand what's going on in the pivot from generational examples to individual ones. I feel like at least the comment I replied to was pretty clearly about generational trends. But on another level I think the upshot is the same regardless of whether you're surveying that disagreement at a general level or at its equivalent manifestation at an individual level.

I think the upshot would be the same in each case as long as you keep all the details the same. Somewhere in the transposition from the general to the individual, the agreed assumption about the bloat and underperformance of software, as well as some implications about what that means for prevailing assumptions and practices in software development, got lost in translation.


> So one thing you could say is, well, every generation says that there were fewer strikeouts in their day. But there's actually data on this and... it's true! Almost every decade, from the 1800s through every decade of the 1900s through now, strikeouts really have been going up year to year. And so that intergenerational commentary, well, it's describing a real thing that really is happening.

I agree with the factual observations in your post, but there's an additional bit here: the qualitative value being assigned to what The Youths don't mind and The Olds protest. In baseball, the guys who strike out a lot but hit a ton of home runs create more runs, and therefore more wins, than most base-hit machines (obvious outliers exist, but you get the idea). On my computer, VS Code does more things that benefit me than vim does (and the outlier here, I guess, would be "a lovingly crafted vim monstrosity that uses all the LSPs etc. designed for VS Code et al in the first place" -- doable but not the happy path, etc.).

There's also (and IMO this is more in code than in baseball) some kind of bizarre moral valence assigned that I don't even pretend to understand, but that's a different story.


>I agree with the factual observations in your post, but there's an additional bit here: the qualitative value being assigned to what The Youths don't mind and The Olds protest.

A few things here. I want the main center of gravity in the point that I'm making to be a way of approaching intergenerational reports of a given phenomenon, namely that they shouldn't just be dismissed as a function of old age or a function of changing perspective. After that point, pretty much any point you want to make is fair game as far as I'm concerned. In the case of baseball, there are positives and negatives. It clearly seems to be a positive trade-off for hitters who are choosing which style to take. I suppose there's another consideration at a higher level as to whether it benefits the game itself. So that can go either way in my opinion depending on what's important.

I tried at the end to throw in some other examples: shortening attention spans and environmental degradation. I think in those cases it's clear that there's something negative going on. But in general we don't have to agree with the value judgment if it's negative; the positive or negative value judgment is an independent thing from the phenomenon of multiple generations attesting to something happening.


>Now we do the same, but we look at the text editors of 1995 which used 4 MB of RAM as incredibly efficient and well made, paragons of craftsmanship.

That's because text editors don't exist in a vacuum; the 4 MB text editor would be slow on a 1995 computer but blazingly fast on a 2024 computer.

VS Code is slow and annoying to use, and RAM is just a more measurable symptom of that.


I don't care about memory usage for editors as much as I care about input latency and responsiveness.

JetBrains (IntelliJ IDEA, PyCharm, etc.) put a lot of effort into making their IDEs low-latency, as it was getting to the point of being almost ridiculous. Their editor is built in Java, and they ship their own runtime because they have so many hacks and tweaks to make it work well as a desktop app (font rendering, etc.).

Pavel Fatin has a great article about typing latency and his work on implementing this in IntelliJ, well worth a read: https://pavelfatin.com/typing-with-pleasure/


1995 text editors didn't use a "layout engine" that executes JavaScript, attempting to JIT fragments of code. They were an event loop that processed native OS events and responded by repainting areas of the window. They also weren't able to automatically recognise language syntax and didn't have Git integration :-)
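The whole "engine" of such an editor was on this order (a Win32-flavoured sketch, not any particular 1995 editor):

    #include <windows.h>

    /* The classic shape: block on native OS events, repaint only the
       rectangle the OS marks dirty. No layout engine, no JIT. */
    LRESULT CALLBACK WndProc(HWND h, UINT msg, WPARAM w, LPARAM l) {
        if (msg == WM_PAINT) {
            PAINTSTRUCT ps;
            HDC dc = BeginPaint(h, &ps); /* ps.rcPaint is the dirty area */
            (void)dc; /* ...redraw just ps.rcPaint from the text buffer... */
            EndPaint(h, &ps);
            return 0;
        }
        return DefWindowProc(h, msg, w, l);
    }

    void run(void) {
        MSG m;
        while (GetMessage(&m, NULL, 0, 0) > 0) {
            TranslateMessage(&m);
            DispatchMessage(&m);
        }
    }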


One of my favourite editors from the 1990s was FrexxEd, co-authored by the author of curl. Its main event loop processed ARexx commands and events for every internal command, so you could rebind every event to a script in its own scripting language (FPL; C-like) and access every internal function from it. (It incidentally also came with FPL scripts that provided some degree of syntax highlighting for a few languages, and you could add more, though not as expansive as most modern editors.)

Running a heavily scriptable editor with a GUI was entirely viable on a 7.16 MHz Amiga with 1 MB RAM, and it was more responsive than many modern editors.

Integration with other tooling, like RCS, compilers, or linkers was a given for Amiga apps at that time.

(FrexxEd was also notable for exposing internal buffers as files in a special filesystem, so you could e.g. lint or compile or call your revision control - limited as they were - directly on the buffers without any custom integration)


CSS grid exists now, so it should be easy enough to achieve the same "repaint small areas of the window, don't do global layout" workflow in a web-based app.


BBEdit was first released in 1992; I'm not sure what that version was like, but I'm still using it now. That said, the version on my disk right now (30.3 MB) wouldn't have fitted into the RAM of the Performa 5200 I had in the 90s (8 MB initially; I can't remember what I upgraded it to in the end, 24 or 32…)


But it's true; in fact it's an understatement: an 8 GB Mac makes you look way cooler than a 32 GB PC.


Well yeah, because with only 8GB of RAM you have much more free time to practice looking cool while your machine grinds away in the trenches of swapistan


Maybe to the average consumer, but here it does not.


DDR5's default on-die ECC is not exposed to the outside and not reported to the OS.

You only caught errors on DDR5 because its stability is very brittle. Market forces push it to the limit: everybody cares about the MT/s and runs XMP, unlike in previous generations.


DDR5 seems to have optimistic XMP settings, whereas DDR3/4 were more conservative.

I have, however, had very good luck so far just buying faster DDR5 and then running it a little slower at the same timings, which is how I arrived at the idea that DDR5 XMP is usually too fast to be actually stable.


I think it's just the memory controllers not being as mature yet.

For a point of comparison: Intel 12th~14th gen operate DDR5 at 4 GHz with 2DPC (DIMMs per channel), AMD Ryzen 7xxx at 3.6 GHz with 2DPC.

With 1DPC the numbers are slightly better: Intel 12th gen at 4.8 GHz, 13th/14th at 5.6 GHz, and AMD Ryzen 7xxx at 5.2 GHz.

All a far cry from the ridiculous overclocks at 6GHz and above.


XMP profiles are still seemingly more conservative for DDR4, since I would typically run my 3200MHz DDR4 at 3840MHz. That stopped once I added a couple mismatching sticks though, now I have to run the set at 3105MHz or the system won't POST. Not a particularly big deal.


The neurological basis is quite simple: the brain is a prediction machine, and it's rewarded when it predicts correctly.

This is why music is pleasant: its highly structured nature is easy to predict, thus rewarding.

This can be further optimized: the "drop" in club music, for example, is anticipated and teased over an even longer distance (tens of seconds, even minutes).


The tension between, and variation of, predictable and surprising elements in music are clearly important to its enjoyability.

But to reduce the pleasure of listening to or playing music to "music is easy to predict therefore it feels good" seems overly simplistic.

Maybe you didn't intend to make that reduction, but your comment comes off that way, to me at least.


> The tension between, and variation of, predictable and surprising elements in music are clearly important to its enjoyability.

I've heard this idea a few times as well, and I interpreted it more as being about a piece of music in relation to other pieces in the same genre or similar musically (e.g. harmonically, melodically or rhythmically).

When we talk about popular music, I think for the majority of people it will be nearly perfectly predictable after hearing a composition a few times. Clearly people enjoy re-listening to the tunes they like, so I think the point about predictability being enjoyable still stands.

Though I guess the idea about the tension between novelty and predictability applies in many contexts. Another example I clearly understand is that the melody needs to be a bit surprising with respect to the harmony to be interesting and enjoyable. The structure of the music seems to revolve around this idea as well: even popular songs more often than not incorporate intros, solos, bridges, breaks and/or key changes to blend the chorus-verse base pattern.


Can you elaborate in what sense you find it simplistic?

What I would personally add to that model is that if the music is "too easy" for the brain to predict, there's no challenge and hence no reward. But that complements rather than contradicts the initial theory.


240W per card probably


Indeed.


Derek tackles easier subjects than PBS Space Time.


Human rights have priority over IP rights.


There's a photographer suing a tattoo artist over this right now. The outcome will be interesting.


But I'm sure they are not proposing killing off the tattooed person.


No, but the verdict could have cost an arm and a leg.

@GP: Yes, but what does the law say about transhuman rights? At what point has one added/removed enough that they're no longer a natural person?


The photographer already lost.


How much energy do the data-centers of the 4 hyperscalers use?


These data centers actually serve apps that hundreds of millions of people use every day...


Those data centers generally do useful things though.


To be fair, the bulk of what they do could probably disappear without anybody noticing, other than a large collective sigh of relief. There is so much junk produced that I would not be at all surprised if it dwarfed the crypto computers, but it is much harder to identify.


Normalized per transaction?


At least that is not poor technology like Bitcoin, which is inefficient by design.


That sounds dangerous, since not all emissions will be converted.


If they (or you) use a cap that's opaque to UV but transparent to the visible spectrum, then there won't be any issues with this.


Wait until you find out about fluorescent lamps...


It's complicated.

For 1 employee who disobeyed orders and saved the company, you'll have 99 who disobeyed orders and didn't produce anything useful, or even did harm.

This is called the Halo effect.


Well, if they really thought it was a waste of time, they could have fired him in the first few years, but instead they kept funding his research. The company was doing one thing with its right hand, and something entirely different with its left.

I'm not saying the halo effect isn't real or not applicable here. But a multi-billion dollar invention warrants Extenuating Circumstances, and it's oh so very convenient that the CEO can say "well we don't want to inspire this kind of behavior in other employees!" after the profits are realized.


I think it would have been reasonable to fire him before he made the breakthrough. Sometimes you have to cut your losses. It's his treatment after success that seemed egregious.


It is extremely difficult to fire employees in Japan. Disobeying orders is in general not a fireable offense.


Yes, but the odds that someone who invents the blue LED also invents something else amazing is much better than a random employee who did not invent the blue LED inventing something amazing.


I'm not sure it is the Halo effect here. Imagine how much more money they would have made if more employees ignored the new CEO, or if there was a different CEO.

