
Calling this an embedded system feels like an insult to the spirit of 'running light without overbyte', because it's so comically capable.

My first PC had a 66/33 MHz 80486 processor, 8 MB of RAM, and a 320 MB HDD. You could run AutoCAD, play Doom and Civilization, dual-boot Linux and Windows 3.11, etc. It could compile Linux from source.



You were living large! My first Linux computer was a 20MHz 80386SX with a 40MB hard drive partitioned half for Windows 3.0 and half for Linux (SLS) and swap. It had a whopping 4MB of RAM and I also compiled my own kernel, making sure to trim down features as much as possible. It was magic seeing X11 come up for the first time, just like the big-bucks Sun boxes!


Hah, you had it great! My first PC was a 286, which IIRC either had no HDD or a 20MB HDD (I might be mixing it up with our next PC, which was a 386).

I'm pushing 40, and still shake my head in wonder sometimes about just how much is possible at such small scale and cost.

But this is now turning into "when I were a lad..." :)


Oh, you kids!

My XT had an 8088 @ ~4MHz in it, and we pushed it uphill to school both ways, in the snow!

It was multi-functional though. The power supply was inefficient enough to heat the room, while simultaneously being heavy enough to bludgeon large creatures for sustenance.

Something inside me misses the radiant heat, sitting here on a cold day with my chilly metallic laptop.


My Heathkit H89 had a Z80, 16K of RAM and a built in cassette interface. And we liked it.


I just flipped a bank of 16 switches by hand with vacuum tube gates and core memory...punch cards are for sissies!


You had a computer? Back then I wrote C programs on paper, then executed them by hand...


One of the main causes of the fall of the Roman Empire was that, lacking zero, they had no way to indicate successful termination of their C programs.


C? All I had was a single book on MS-DOS, so I wrote batch files on paper, then executed them by hand.

(Not even joking, by the way - literally how I got into computers and programming.)


oh kids! i used to crank the wheel by hand on my analytical engine that my hacker friends and i built after there was a leak of babbage's plans! he never knew!

you kids and you petabytes of ram! back in my day....

/s

(i'm looking forward to these comments being commented on by a certain website i'm not supposed to name)


you kids. We calculated with charcoal on the cave's wall while the adults were out hunting for wooly mammoths...

ZX81: Z80/1KB RAM... more recently though PIC12 (C501 IIRC) ...


I'm slightly surprised nobody has yet linked https://xkcd.com/505/



There's a website where people make fun of HN comments? Is it Twitter?


The prime directive of said website is to never name it on hacker news. But I'm pretty sure you can Google the prime directive itself and figure out where it comes from. As a hint, it starts with "ng".


more hints?


Webshit, once a week. I won't go further :)


Start a few Electron apps and it'll warm right up


Few? You can only run one! We have yet to invent the technology to run two, let alone a few!


I honestly thought you were gonna reference the Four Yorkshiremen sketch by Monty Python after the second line!


Predates Python by a couple years. It’s from “At Last The 1948 Show” from the 60s.


Didn't know that. Thanks for sharing.


Run a threadripper with the window open!


laughs in 6502 dances in 68k cries in 486


Mine was too, albeit with a 40MB HDD and a CD-ROM!

Ran Windows 3.1, Civilization (VGA graphics), Railroad Tycoon, Star Trek 25th Anniversary - and could have them all installed at the same time. Other programs included Championship Manager 93 and, I think, Day of the Tentacle.

But it wouldn't run Linux - that needs a 386 or above.


I was a kid when my father first acquired a similar computer.

We ran Windows 3.0 and my father configured a boot entry for running Doom or Ultima 7, which required so much memory that it wasn't possible to run Windows, only DOS.

I remember feeling a bit envious about my neighbor having the more powerful 486, which could run Doom much faster than in our PC


Made me laugh about "Dad configured a boot entry"... I remember the hours with my Dad trying to stuff whatever device drivers into that 640K via Autoexec.bat and Config.sys. Got the CD-ROM running, but damn if sound now doesn't work. Those were the days, trying to get (in my case) Tie Fighter to work.

Worst part was my Dad, in his wisdom, bought a family PC without a Pentium, but a Cyrix P166 instead. Its floating point performance was dismal, so it ran like a damn dog on any 3D game.

Any Brits out there might remember Time Computers. On every back page of any newspaper ever selling PoS computers with whatever subpar hardware they could cram into a big beige box ;-)


Not to brag, but I got a 4.77MHz 8086 XT PC for my 8th birthday :) It had a whopping 256KB of RAM, a monochrome display (later upgraded to a 4-color CGA that blew my mind), and two (yes, TWO) floppy drives.


My dad got one of those for us both to use...what was truly "whopping" about it was its price...

If I remember right it was almost $3k...but that included that blazing fast 300bps Hayes modem for wide-area networking.

And my mind was truly blown when we replaced the 2nd floppy with a more-memory-than-we-will-EVER-need 20MB hard drive... what could we do with all those bits??


Yeah, the prices were crazy! My first was a 10MHz AMD 286 clone with 1MB and a 40MB HDD. I recall it being $2600 with a 13" SVGA monitor.


Same! We got 640KB of RAM and a mouse! The HDD didn't arrive until we got a 286, though.


And a decade before that, the distributor DigiKey appeared as a one-column advertisement in Popular Electronics, selling 7400 DIP ICs for $0.29. Inflation-adjusted, that $0.29 now buys an IC that can run Linux.


> 8086 XT PC for my 8th birthday

Wow, I admire the far-sightedness of your parents, to give a child a computer at such a young age (and in those early days of PCs).

I was of a similar age when my dad brought home an NEC PC 9801 - with an 8086 CPU (8MHz), 640KB RAM, kanji (~3000 characters) font ROM, even a Japanese word processor. I think it ran MS-DOS 2~3.

"In 1987, NEC announced one million PC-98s were shipped."

That was a big wave, and I'm so glad my parents let me play with a computer as a "toy" - it was a huge influence on my mental development.

Kinda like the monolith moment in 2001: A Space Odyssey. :)


My parents had no idea, and they couldn't afford it anyway :) My dad asked his cousin, who happened to be working with PCs way, way back and recommended he get a PC instead of a C64 which is what everyone else had. My dad asked his dad and my grandfather forked over what must have been an insane amount back in 1980's Israel.


Aww, that's so nice that your dad asked around and pooled resources together for you. And I'm sure your grandfather knew it was a worthy investment in your future.

My friends around that age all had a wildly popular video game console called Nintendo "Fami-Com". https://en.wikipedia.org/wiki/Nintendo_Entertainment_System

My parents refused to buy it, and instead let me play with the PC-98, where I learned BASIC, Turbo Pascal, even some 8086 assembly language.

I suppose if I have children, I'd discourage mobile phones/apps and instead give them Raspberry Pi, microcomputers, sensors, devices they can build stuff with.


This time of year is ripe for nostalgia. I recall that it was just about this time in 1993, during Christmas break, that I loaded Linux version 0.96(?) onto this 386SX machine. This involved taking a stack of 1.44MB floppies and driving to a place where I had access to the internet. I'd choose SLS packages and copy each to an individual floppy. Then I'd drive home and load them one by one until I had a bootable system. And of course with floppies, it was inevitable that you'd get an error now and then. So, back to the car, download, copy and repeat. All to get away from the limitations of Windows 3.0...


Talking about Christmas, we used to go to my aunt's on Christmas day and I remember the time they bought an Amiga for my cousins. We were very young so between games we used to have a good laugh with the program that did text-to-speech (can't remember the name right now), but it only spoke English and we were easily amused by typing phrases in Italian and having them read back as if it was English (so the pronunciation was all messed up).


Same here - a Packard Bell, but IIRC it was 20/40MHz, with 40MHz being the turbo. Before that I did have a Zenith 8088 dual 5.25" floppy system that had to boot DOS from one disk and run programs off the other (mostly written in GW-BASIC).


Mine was a Mac IIsi - 20MHz 68030, 1MB RAM. To run Linux, you needed to install a daughtercard with a hardware FPU.


My first few PCs weren't even PCs, I didn't know what they were until I was a little bit older. Somehow I intuited them naturally though; I liked exploring and being curious about what I was looking at.

First one I remember was a Commodore 64, along with a 600 page book full of BASIC that you could type out and record on cassette to have your own game. The book was total gibberish to anyone else; it was just pure code with no explanation. But that's what the C64 gave you; an interactive environment on boot where you could program a new game and write it to a cassette. By default. If you wanted to play a game you had to type `RUN` or maybe one or two other things to set it up. But you wouldn't know that, because you just had an interpreter on a basic blue and white screen.

Worst bit was the 10 minutes of spasmodic strobe animations that showed you the game was loading. But also each game controlled those loading animations. You had to know what game you wanted to play, and be sure of it, or otherwise you could just flip to the B-side and get a different game.

After that I think we had a BBC Micro at school, but I'm not sure. All I remember is an adventure game and one of those classic 5" floppies. I still really love the look and feel of inserting a floppy and manually locking it in. Floppies, cassettes, VHS tapes, and MiniDiscs were truly fantastic for the time. They were still mechanical, unlike CDs.

Then on my dad's side I and my siblings got an Acorn PC and a bunch of random floppies. None of them made any sense but some of them did cool things when you ran them. I remember hiding from my family and putting in the floppies that made flashing colours and watching it until time passed.

Must have been 11 or 12 years old before we first got a PC and by that point I was utterly fascinated. It was some shitty off-the-shelf eMachines thing but it was the best we could get; I managed to retrofit a decent graphics card in it a little bit later.


There are embedded devices with an order of magnitude more CPU and two orders more RAM, if you're being strict about what counts as "embedded". If you're not, probably another order of magnitude on each.


For example Nvidia's Jetson AGX Xavier. 512-core NVIDIA Volta™ GPU, 8-core ARM, 16 GB RAM,...


Well now I feel old. I remember buying a 486 with 8MB and thinking "I'm really living in the future now!" The MHz were ridiculous and -- according to my memory -- most instructions ran in a single cycle too! (Warning: my memory is not a reliable gauge of the state of computing at the time.)

8MB was pretty extravagant but it turned out to be a good call even though it could be had for half the price within a few years.


'Computing progress' seems to be mostly resolution increase.


Lucky! We had the same 486 system but only a 40MB disk. I had to spend some lawn mowing money to get a 400MB second drive so I could install Sid Meier’s Gettysburg. :)


Oh geez I can't imagine how long it would take to compile Linux on that slow of a processor.


Remember, there was a lot less to the Linux kernel at the time, so it wasn't too bad. The single-threaded compile time for Linux has stayed fairly consistent for a while, since the tree grows about as fast as the average desktop improves. The big improvement in compile time has come from being able to compile with 8-32+ threads.


I once read somewhere that the time to compile Linux from source on high-end desktop hardware has remained remarkably stable over the years.


Yes, that's true. It has been O(1 hour) for pretty much the entire history of Linux. And if you think about it, this makes sense. If it were significantly less than that, the development cycle would speed up so it would be easier to add capabilities, which would slow down the build. If it were significantly longer, the addition of new features would slow down until the hardware started to catch up. So the compile time acts as a sort of natural control mechanism to throttle the addition of new features.


I'm not sure that is the primary reason -- many other projects have had their compile times explode over many years, even though the same logic should apply.

Not to mention, if you build the kernel regularly, you benefit from incremental compilation. If you change a few non-header files, the rebuild time can be as little as 2-3 minutes. Oh, and "make localmodconfig" will reduce your from-scratch compile times to the 5-15 minute mark. I highly doubt most kernel devs are building a distro configuration when testing (especially since they'd be testing either in a VM or on their local machine).


> many other projects have had their compile times explode over many years

Like, for example?


When I worked at Microsoft, Windows took far, far longer than an hour to build from scratch. I remember walking a few buildings over to the burn lab to pick up DVDs of the latest build. I don’t have any hard data, but running a full build on your dev box was very rarely done.


The dynamics of commercial projects can be very different from open-source. You're much more likely to tolerate sitting through long builds if you're being paid to do so.


Until management notices, multiplies the X hours saved by the number of developers, and then the resources for improvement will be found soon enough.


Definitely not. People are far more expensive than machines.


Was it just the windows kernel? Or did it include all of the utilities in Windows?


It's a lower bound for a highly active project.


I'm convinced that compiler speed is all a matter of how it was initially designed.

Take Delphi, for example, which can compile millions of lines of code in under a minute or something ridiculous. Then we have D, Go, Rust, and such: codebases that would take C++ a good 30 minutes on today's high-end hardware compile in much shorter spans of time (I'm not as familiar with how fast Rust compiles, but I know Go and D do pretty nicely, especially if you enable concurrency for D's build) - and they probably take those same 30 minutes on high-end hardware from about a decade ago.

Sadly, from what I have heard, C/C++ compilers have to run through source files numerous times before they finally compile the darn thing into anything meaningful. Facebook had a preprocessor written in D to speed things up for them; it technically didn't have to be coded in D, but Walter wrote the preprocessor for them, so he got to pick the language.


The C++ language cannot be parsed with a context-free parser. A large part of the build times, however, is due to optimizations.

In the early days, clang was much faster at compiling than gcc. Over the years, it has improved its optimization output but as a consequence has lost the compilation-speed edge.

There are many examples on https://godbolt.org/ showing how much work the optimizer does. As an example, the http://eigen.tuxfamily.org library relies on the optimizer to generate optimized code for all sorts of combinations of algorithms.


Rust doesn't compile very fast, unfortunately, but it's being worked on. General wisdom says it's about as fast as Clang, but comparing compile speeds across languages in a meaningful way is difficult.


Fair enough, thanks for that - I wasn't sure; all my Rust programs have been small enough that I haven't noticed. I wonder if it would have made sense for Rust to consider this earlier on, in the same way that Pascal did. I believe I heard Pascal was designed to be easy to compile in a single pass.


Parsing is not the reason why it takes a while to compile.


"Parsing" was probably not the best choice of word on the GP's part, but they meant that Pascal was specifically designed to be implementable with a naive single-pass compiler; of course that would exclude many optimizations we take for granted these days.


It's the layout of the code that allows Pascal (/Delphi) to compile everything in a single pass. By starting with the main file, you easily build the tree of dependencies and public interfaces/functions vs. the private implementation details.

C and C++, even though they have header files, place no restriction on what goes in a header file, so it takes a while to figure out the dependency graph.


The best part about Facebook's preprocessor (warp) was that it was slower than Clang (https://news.ycombinator.com/item?id=7489532).

But the preprocessor isn't what makes compiling C++ slow. It's a combination of the complex grammar and ahead-of-time optimization of a (mostly) static language. Turns out you can compile code really fast if you don't actually try to optimize it.


My first Linux, Slackware, took me a week to download and came on something like 12 disks. I compiled a kernel; it took two days.

That was on a second-hand Pentium-S. I probably wasn't doing it right.


Linux in the 90's wasn't as bloated as it is now. It definitely took well under an hour to compile a kernel. My first Linux box was a 386SX-16. I later upgraded to a 486DX4-100.


That doesn't sound right; it's still not bloated. The majority is drivers, and then the cpu arch folder takes the next biggest cut. You can check yourself with cloc:

http://cloc.sourceforge.net/


Linux 1.0 was 1 meg, gzipped and less than 200K lines of code, total.

Linux 4.x is over 150 megs, gzipped. Just the lines in the arch/x86 folder are more than the total lines in Linux 1.0.


It used to take me just over two hours to compile a (very minimal stripped down) kernel on my 25 MHz 80486 with 4 MB of RAM (in the late 2.0.x/early 2.2.x days).


Long. I remember it being something like an hour or so to compile the kernel, but the kernel was also much smaller then. I specifically remember Linux 2.0.0 coming in a 2MB tarball, because it probably took me longer to download with my 28800 baud modem from an oversaturated FTP server than it did to compile. 8)


Depending on how you configure your kernel and what storage you're using, it might be doable.



