SGI hardware was beautiful -- the boxes were just colorful molded plastic (and rubber?) yes, but they had a transcendence about them. You knew upon looking that they weren't normal PCs but "workstations", a cut above what us plebeians could afford.
While SGI boxes were beautiful, IRIX itself was... not. It had a Motif look which looked outdated, even compared to OS/2.
I actually loved IRIX for a couple of very simple reasons: it mostly got out of the way of the applications and it 'just worked', reliably, month after month. After many years of using many IRIX boxes as workstations and as servers I realized we'd never seen a machine crash; uptimes were in the years unless we rebooted a machine on purpose. Very solid hardware, very solid software, and for the time the workstations had amazing graphics. The lack of eye candy in the OS was an asset to me. I still set up my terminals the way the default IRIX terminal looked because it feels like home to me.
Linux has some of these properties today so that's what I'm using now.
IRIX users would beg to differ. Towards the end of SGI's life, IRIX had amassed a laundry list of bugs labeled critical that weren't getting fixed, and the list kept growing, prompting many of the staff to leave SGI and many SGI customers to switch to BSD or Linux, driving further nails into SGI's coffin.
Anything after 5.3 I missed because I had already seen the writing on the wall by then, so it may well be that both of these are true, but that we're talking about different times. Mine: 1995 until 2002 or so. Right about when they got VR crazy (a bit like what is happening with Facebook right now).
I actually don't, but I think this has to do with the generation of the hardware and software combo. Early stuff would typically be very stable. Towards the end of their golden era, SGI rushed out new stuff and reliability went down.
In grad school I ended up managing a small cluster of SGI machines in our department. After setting up two new machines the time was wrong. I fixed it. Hours later it’s off by 12 minutes. Back and forth I go for a week thinking SGI has crappy clocks.
Found out SGI machines seek each other out and elect a "TimeMaster" based on uptime, then sync their clocks to it. Turns out we had a machine predating all of the new ones that had not been restarted in 2 years and had the incorrect time.
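The failure mode described can be sketched as a toy model (a simplification, not the real IRIX timed/TSP protocol; all names hypothetical):

```python
def elect_timemaster(hosts):
    """Toy election: hosts is a list of
    (name, uptime_seconds, clock_offset_seconds) tuples.
    The longest-running host wins the election and every
    other host adopts the master's clock offset."""
    master_name, _, master_offset = max(hosts, key=lambda h: h[1])
    return master_name, [(name, master_offset) for name, _, _ in hosts]

master, clocks = elect_timemaster([
    ("indy-new-1",  3_600,        0.0),   # freshly set up, correct clock
    ("indy-new-2",  7_200,        0.0),
    ("old-server", 63_000_000,  720.0),   # ~2 years of uptime, 12 min off
])
# the long-lived box wins and drags every new machine 12 minutes off
```

Fixing the clock on the new machines does nothing; they just re-sync to the stale master a few hours later, which matches the week of back-and-forth above.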
I remember IRIX as that operating system that had an amazing number of exploits, and they went unpatched for so long. It looked like there wasn't a buffer that they wouldn't overflow...
That's fair, it wasn't exactly secure on the OS level, fun stuff like the help function allowing a backdoor to a user account. But to be fair to SGI this was exactly the era when the internet went from 'small village' to 'megapolis' and that brought the vermin right along with the funding.
Yeah, there were so many easy exploits that we regularly used them for our many SGI workstations when we didn't have the root password for that specific machine.
The SGI O2 was my first professional computer. I still customize my bsd desktop env to look like Irix and find it far better looking than the modern default look. I also use Screen font on my older Mac because it looks great. The SGI workstations never had the best performance but they looked great to me and never crashed in my experience. Closed source was simply not the way forward for me.
The Irix (and Solaris) ports used the Latitude porting library from Quorum Software Systems [0]. It was sort of a reverse-engineered portable Macintosh Toolbox library.
This is a general problem with old OS nostalgia and most UNIX variants: there’s literally nothing interesting to do with most of them. Not only that, but what you can do is not radically different from how things are still done.
I worked with stuff like AIX and Solaris for years but like why would I ever bother with them again. What am I going to do, install DB2 for fun?
in the 90s i was collecting unix hardware. most of it i actually used at the time (AIX and IRIX in my case; i had a sun box too, but i was mostly running linux on it[1]). now when i look at that collection, all i can say is that it looks cool. i like the design of the boxes (i am a big fan of the pizzabox format, and of more unusual shapes like cubes instead of the common tower),
but when i think of using them, i ask myself: what am i going to do with these old systems? on some i can install linux, ok, that's nice, but then all i've got is linux in a nice looking box. it was the same even at the time, when i had scored an apollo domain workstation that was acting as a door stop in a university library. it had a very interesting networking system. i looked around on it for a while but then i just let it sit there.
i'd probably be more happy if i could find a modern case designer that makes PC cases look like the sun or sgi boxes of old.
Myself I'd never put the effort in, but I have fond memories of telnetting into the Indigo my brother's college roommate had in their dorm room, so when people do this nostalgia retro stuff and post it on the web I enjoy browsing it for a bit.
I think the point is that you can still telnet into someone's Linux box (if it runs telnet...) and it would look exactly the same as 25 years ago. I find it difficult to be nostalgic for something you not only still recognize, but recognize as clearly inferior. Though I also have fond memories of a friend who had his private SGI workstation on his desk at my first job :)
I beg to differ. I regret giving up my O2 w/ 1600SW display, but I actually used it for something that is difficult to replicate even today. The AV input/output board allows for capture of s-video output (good for digitization of old VHS tapes).
I also used the SPDIF input on my Octane (connected to a Philips DCC player) to digitize a bunch of my old cassette tapes. Many of those tapes don’t have commercial digital releases.
> I actually used it for something that is difficult to replicate even today. The AV input/output board allows for capture of s-video output (good for digitization of old VHS tapes).
This isn’t at all difficult today. You can buy an S-video to USB interface from Wal-Mart for $25. 20 years ago they were more expensive, but still easy to get.
I see these sort of efforts more as archaeology than anything else. Asking "what sort of interesting things can you do with it?" is like asking if we can do anything interesting with stuff we dig out of Flag Fen or Pompeii.
AIX? You'll spend an hour or so to set up the network so it'd even boot.
For the youth: the AIX machines I remember did not even boot without a proper network. They featured obscure key combos, a tiny LCD screen, and even more obscure beep codes to let you configure the network parameters. Only then would they boot into a proper OS.
A mate of mine has an absolutely beautiful old Austin 12 which turns 100 years old this summer. It actually drives surprisingly like a modern car, modulo things like no synchromesh (so you have to double-declutch) - it doesn't have the "funny" pedal layout of a Ford Model T, for example - and its 1800cc engine still produces most of its 27bhp, taking it to a top speed (absolutely flat out, downhill, and homesick) of about 50mph, not that you'd want to with its cable-operated brakes. Most of the mechanicals would be familiar to anyone who'd worked on any sort of car.
You could take that 100-year-old car and daily it, around town. It is a joy to drive, and everyone wants a good look at it. It's clearly a thing of beauty made in a way that nothing will ever be made again. However!
Pretty much every time you go to drive it you need to fiddle with the carburettor, or do something to the magneto, or take the spark plugs out and give them a clean, or prime something, or oil something, or generally fettle some part. You've got to set the throttle, the choke, and the magneto timing *juuuuust* right before you swing the starting handle or it'll break your wrist. The cooling system either overheats or doesn't heat up enough and the only way to tell is it doesn't run properly and sometimes steam comes out - for both conditions.
I love driving it. I also love old OSes like AIX and SunOS. I've even run 2.11BSD on a real PDP11!
Here's the thing, though. I have Linux on my desktop, and a 25-year-old Range Rover parked outside. They're both a bit old-fashioned with some clunky bits of styling that you wouldn't have these days. They're a bit crude mechanically (it has a 1960s V8 boat engine under the bonnet!? It has a *monolithic* kernel under the bonnet?!) but they both start on the button, run all day without having to get out and fiddle, and they're comfortable to sit at for long period and actually *get shit done*.
No-one stops to turn and gawp as you go past though.
Back in the day, we had AIX running on RS/6000 and it was never connected to any kind of network. Didn't even have networking hardware. It served serial terminals. I'm certain that AIX did not, in general, require a network to boot, but rather your system was configured that way.
i must have been lucky, because my AIX box was working fine on ethernet. the most "fun" i had was once when it didn't boot, i had to mess with ed to fix some configuration (that i must have messed up myself before) to get it to run again. vi experience came in handy.
I believe ours were also RS/6000s but I'm not sure; it was a long time ago. Those were rented machines with an expensive per-hour service contract. My company didn't want to pay for the setup so we did it ourselves. It's totally possible the machines were misconfigured deliberately to make us pay for the setup service, or maybe they were just in the state the previous customer had left them and that didn't fit our network.
Anyways, my point was more that with this old tech you'll probably spend a lot of time on boring details before you even get to the fun part, like running DB2.
Funny you should say that. I used to collect old machines (owned a SPARCstation, Mac SE, etc.) and at one point I inherited an IBM RS/6000 with no user manual. I was AIX-curious but I never figured out how to get it to boot. (I didn't connect it to Ethernet)
(this was in the early 2000s when the Internet was still nascent, and I didn't have any IBM experts in my circle)
I recently restored an SGI Indy, put it to work at our office, and showed it to the young ones. Apart from the cool case, it was mostly "yeah, looks like any ol' linux desktop we have", which is right, of course... but 30 years ago!
I also have rose-tinted glasses now for IRIX and SGI, since I worked on those machines for the better part of the '90s and early 2000s in VFX (both VFX work and software for it). There was a mystique about those daylight-robbery machines that made you feel you could, and should, achieve more. There's nothing like that today anymore, except maybe getting a DGX - Carmack even mentioned something similar ( https://twitter.com/ID_AA_Carmack/status/1398519867280609282... ) which I can definitely relate to from the SGI era.
> showed an SGI Indy at our office to young ones. Apart from cool case, it was mostly "yeah, looks like any ol' linux desktop we have" which is right, of course
Except unlike current Linux, the SGI workstations didn't have screen tearing and had HW accelerated GUIs working out of the box. Shots fired! :D
OS nostalgia is stupid but application nostalgia makes sense. If you happen to have applications for IRIX that would be expensive or impossible to replace, then keeping IRIX on life support is reasonable. For example I happen to have a licensed copy of PTC Pro/ENGINEER for IRIX/MIPS. There's no other way to use it. This is the same reason I keep Win2k virtual machines hanging around. That is the only way I can still run my copy of AutoCAD R12.
At least you can use the internet, as opposed to 8-bit computers (without hardware extensions). On IRIX you even have Mozilla 1.7.12, although I fear that many websites will be unusable with that browser. And a terminal, ssh, the GNU toolchain. What else do you need? :)
It's down right now because the hard drive failed last December after 27 years of operation, but the NTP server for my home network used to be (and will be again) a Digital AlphaStation 200 4/233 running Tru64 UNIX, connected to a PPS serial GPS puck.
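For anyone wanting to replicate that setup on a current ntpd, a minimal refclock config for a serial NMEA GPS with PPS might look roughly like this. A sketch, not the poster's actual config: the driver numbers are the classic ntpd reference-clock types (20 = NMEA, 22 = PPS), and the device paths /dev/gps0 and /dev/pps0 are assumed symlinks you create yourself (e.g. via udev rules):

```
# NMEA sentences from the GPS over serial (refclock type 20)
server 127.127.20.0 minpoll 4 prefer
fudge  127.127.20.0 time2 0.400      # rough serial-latency fudge; tune it

# PPS pulse (refclock type 22, the ATOM driver)
server 127.127.22.0 minpoll 4
fudge  127.127.22.0 flag3 1          # discipline the kernel clock with PPS
```

The NMEA source is marked `prefer` so the PPS driver has a coarse time to number its pulses against; the pulse itself is what gives the microsecond-level accuracy.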
That depends on whether you're a casual DB2 enthusiast who plays with it only when it's available, or a hardcore DB2 enthusiast willing to restore an AS/400 just for the fun of it, really.
I'm biased, because I did work with IRIX at SGI and I did in fact touch some kernel code as well.
What I miss from IRIX, that no other system has yet replicated:
1) Realtime mode. RTLinux doesn't count. In IRIX, you could run your own code at a higher priority than the scheduler itself (this was called hard realtime), and you'd give it time slices when you could, or a core or two.
2) Frame scheduling. When rendering 3D, you could have the scheduler connected to the monitor refresh rate to make you less likely to miss a frame boundary.
Yes, Motif was very plain, and X11 was a hot mess to work with, but the thing had capabilities which are still hard to find.
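For comparison, the closest stock-Linux analogue to that first capability is the fixed-priority SCHED_FIFO class. A sketch only: unlike IRIX hard realtime, SCHED_FIFO still runs below the kernel itself, which is the poster's point.

```python
import os

def try_hard_realtime(priority=50):
    """Probe whether this process can enter the SCHED_FIFO class
    (fixed priority, no time-slicing among equals). Returns True on
    success, False if unsupported or unprivileged. Restores the old
    policy immediately, since this is only a probe."""
    if not hasattr(os, "sched_setscheduler"):   # non-Linux platform
        return False
    try:
        old_policy = os.sched_getscheduler(0)
        os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(priority))
        os.sched_setscheduler(0, old_policy, os.sched_param(0))
        return True
    except OSError:                             # EPERM: needs root/CAP_SYS_NICE
        return False
```

On an unprivileged account this typically returns False; even with privileges you get "runs ahead of other userspace", not "runs ahead of the scheduler" as on IRIX.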
> What I miss from IRIX, that no other system has yet replicated: 1) Realtime mode. RTLinux doesn't count.
FWIW, SGI didn't seem to agree.
From SGI's own whitepaper: "In addition, REACT for Linux adds unique capabilities including sgi-shield and kbar that were not available on IRIX. The Linux based platform delivers better real-time performance than SGI Origin running IRIX with realtime extensions: 30µs guaranteed interrupt response time versus 50µs for Origin."
Without the REACT extensions, Irix realtime facilities aren't any different from the scheduling policies of Linux (this is akin to bypassing the normal scheduler).
I have a soft spot for Irix from the early 90s, and it had some clever accommodations for the technology of the time, but things have moved on and advanced.
>SGI hardware was beautiful -- the boxes were just colorful molded plastic (and rubber?) yes, but they had a transcendence about them.
They were also insanely expensive, affordable to only the wealthiest of established companies. A basic SGI workstation would cost as much as a sports car, and professional workstation as much as a house. At those prices they had better damn be beautiful and well built.
In the early '00s, ugly beige boxes with Intel CPUs and Nvidia GPUs running BSD, Windows NT, and Linux started wiping the floor with those beautiful SGI workstations at a fraction of the price, making SGI basically irrelevant overnight. It was a bloodbath. The only markets they held onto for a little while were supercomputers and compute workstations for wealthy customers running massive simulations, like oil & gas, for which money was no object.
The VFX team of the original Matrix movie from 1999 was grateful they could save huge amounts of time and money by running their rendering pipeline on networked commodity Dell workstations with off-the-shelf Intel CPUs.[1] That was one of the last nails in SGI's coffin.
[1] "Manex Visual Effects used 32 Dell Precision 410 Dual P-II/450 Processor systems running FreeBSD as the core CG Render Farm. Charles Henrich, the senior systems administrator at Manex, says, "We came to a point in the production where we realized we just did not have enough computing power on our existing SGI infrastructure to get through the 3-D intensive sequences. It was at that point we decided on going with a FreeBSD based solution, due to the ability to get the hardware quickly as well as the reliability and ease of administration that FreeBSD provides us. Working with Dell, we purchased 32 of these systems on a Wednesday, and had them rendering in production by Saturday afternoon. It was truly an amazing effort on everyone’s part, and I don’t believe it would’ve been possible had we chosen to go with any other Operating System solution."
wow, I never had a chance to learn what pipeline was used. A P-II/450 Dell feels so "average". I wonder what renderer they used... BMRT? RenderMan? Mental Ray?
The DVD commentary (for the original DVD release, not any of the re-issues or blu-rays) mentions RenderMan, and the person speaking (John Gaeta, I think) even said he didn't know why anyone would choose something other than RenderMan at that time, given what he felt was its flexibility and quality of output.
I don't think there ever was PRMan for FreeBSD, but who knows; at that scale they might've got Pixar to do it for them, or somehow emulated it. I also don't remember BMRT ever running without PRMan in the chain (trace(), heh). Mental Ray was all over the place, but not on FreeBSD. It was MOST LIKELY PRMan.
I loved both SGI hardware and IRIX itself. There was a tremendous amount of effort put in to make it a desktop-friendly UNIX, and it showed. Services (NFS, networking, web services, etc.) were easy to configure using desktop apps, and the locations of system and user files were more intuitive than on SunOS or NeXTSTEP.
SGI was my favourite UNIX until OS X, and I still have an SGI Fuel (with all the Nekochan extras) that I boot up each time I need a nostalgia kick:
I was a kid in their prime, but thanks to making-ofs, documentaries, and magazines it's so ingrained that I still get the aura that these machines and OS are above my poor consumer-grade 2023 laptop, and that my work is a toy compared to what real adult pros are doing on them, even though I'm near 40.
> While SGI boxes were beautiful, IRIX itself was... not. It had a Motif look which looked outdated, even compared to OS/2.
That is true, and it gave a super confusing feeling to see stuff like Maya running on it, which was peak software IMO (I mean: algebra-friendly, reactive-DAG based, complex geometry rendering, including near-real-time constrained physics simulation... with a server/client split and a noob-moldable/scriptable UI). So even if the DE felt lagging, you never cared much.
What I remember most was the startup chime - in an era when piezoelectric speakers were the norm, the SGI startup chime was incredibly rich and extravagant.
Apparently they had lots and lots of noises that had to be turned off in university computer labs because they were so obnoxious. Like cliche hacker-movie levels of noises, but worse. I heard some stories as a student working in engineering IT.
We had a few ex-mining company workstations at the computer club at university – I remember that the metal security bar through the Indigo workstation could be pulled out and waved at one's fellow users, usually in good fun. :)
They were fun machines to play around on; even when we first got them c. 1999, PCs with a graphics card were catching up, but there was just something solid about them. I miss those days of *NIX workstations.
Not just wasted resources, a lot of modern eye candy actually makes the computer harder to use. For example, the recent 'flat' appearances make selected, unselected, and entirely-non-button areas look almost indistinguishable! Or animations that cause visible slowness locally and make remote display over any media slower than tens of megabits per second unusable.
A computer is a tool; most tools are not sleek, they have buttons and doohickeys, sharp edges that do work, etc. There is nothing wrong with polishing something up, sure, but it has to be strictly secondary to usability.
The high contrast 3d look in motif wasn't because its creators had bad taste. They may well have also had bad taste, but the appearance served a utilitarian purpose. :)
> Or animations that cause visible slowness locally and make remote display over any media slower than tens of megabits per second unusable.
Hello, Firefox, I'm looking at you, especially during start-up. I don't know whose idea it was to "fade-in" the toolbars on start up, but I noticed that one right away, when firing it up over both x2go and (especially) X-over-SSH. Cute thing to do in the middle of a pandemic where everyone is using remote displays!
Not-so-Common Desktop Environment is a modern attempt at the style. I don't end up using it, but it reminds me wonderfully of a time when computers were made by and for people.
Uh, CDE was corporation-ware. The computers for the people and by the people used a hacked fvwm and rxvt, making magic on Intel PCs with a Pentium MMX or II, running faster and snappier than even the Irix and Sparc machines, except for OpenGL.
CDE on Sun was a daily driver for my working life through the late 90's / early 2000's. KDE under Linux could be configured to have more or less the same UI semantics.
That changed with KDE 4 and a lot of people rejected it. One would assume those people are the same ones who now use Trinity.
Oof. IMHO back in the day all the Unix implementations of Motif were terribly clumsy. You know who did a really great, coherent implementation of Motif that had nothing to do with the Unix folks back then? GeoWorks for the PC. Go figure.
„The cathedral of Notre Dame is eye candy, all houses should be maximally efficient concrete boxes“ is a hive drone mindset and unbefitting of a species that needs beauty to flourish, or else remains spiritually stunted.
What I meant was that who wants 500 plus megs of RAM dedicated solely to transparency and pretty special effects on the desktop? Idiots that's who.
Edit:
You might not see the problem if you have a computer with 32 GB of RAM, but I certainly do. However, I like to ensure that responsiveness comes first.
IRIX Motif is 2D accelerated and very snappy, as it's multithreaded and designed for power users.
I've never used KDE but remember Windows Aero? Metro? What about macOS's Aqua UI? They use tons of resources and often times when the computer is just a couple years old it becomes unusable with the typical system bloat that occurs.
What I'm basically trying to say is that visual effects that aren't well designed aren't worth the resources they take up. Functionality over aesthetics any day.
Jesus, we’re still arguing about the merit of desktop composition after all these years?
Do yourself a favor and try comparing the responsiveness and resource usage of, e.g., KDE Plasma with desktop composition off vs on. You’ll probably be really shocked when you see how much more CPU you need to do something as simple as scrolling a browser window.
Really, try it. Any browser, scroll around on something and look at your CPU usage and the framerate of your screen.
Or image editing — try panning around an image. Doesn’t matter what editor/viewer either, since they all have to draw on your X server.
Even just moving windows around on top of one another — everything is just so much more efficient when you offload it to a hardware accelerator.
Does it use more RAM? Yeah, a little, because the way it works is by keeping the entire contents of the windows in memory rather than culling anything that’s not exposed in front. It’s definitely not 500 MB more, though, and it definitely can (and does) take advantage of any dedicated VRAM available.
And the trade off of being able to just dump a framebuffer to the viewport instead of repeatedly computing what’s been culled 60+ times a second is definitely worth it to me, but if you still prefer not using acceleration, there’s always the option to just not use it.
I didn't really see the previous poster mention desktop composition at all?
What I have seen is many computers that seem to spend more time calculating various animations than actually getting anything done. When I worked in a computer shop years ago I often turned off animations and transparencies for people who came in with slow computers (XP, Vista, 7), and they were generally happy with the speed-up and didn't mind the lesser visuals at all.
Every window looks perfect all the time, and it doesn't matter if a program is busy.
Without composition, each program repaints itself, which means there's appreciable lag, and if a program is stuck you can get a blank box where a previously covered window has been uncovered. This can be an annoyance if you need to read something from there.
E.g., an actual example: a program blocked by a modal dialog stops being repainted. If the dialog asks you to "Enter a password" and what it's for is written on the no-longer-repainting parent, you may have a problem.
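That repaint difference can be sketched as a toy model (all names hypothetical; real X uses Expose events and damage regions, which this glosses over):

```python
def visible_pixels(window, composited):
    """Toy model: a region of `window` was just uncovered.
    Without composition, the X server asks the client to repaint
    (an Expose event); a stuck client leaves the region blank.
    A compositor keeps the whole window's buffer server-side and
    can redraw it itself, no matter what the client is doing."""
    if composited or window["responsive"]:
        return window["buffer"]     # real contents, shown immediately
    return "blank"                  # stuck client, nothing to show yet

# the modal-dialog scenario from the comment above:
stuck_parent = {"buffer": "the password hint is written here",
                "responsive": False}   # blocked by its modal dialog
```

With composition the hint stays readable behind the dialog; without it, the parent window is just a blank box until the client runs again.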
> I've never used KDE but remember Windows Aero? Metro? What about macOS's Aqua UI? They use tons of resources and often times when the computer is just a couple years old it becomes unusable with the typical system bloat that occurs.
You're misattributing blame here. Aqua, Aero, and Metro were themselves just UI themes/design languages. They did not cause performance problems. To the underlying compositor it doesn't matter if a button is a brightly colored beveled triangle or a flat rectangle. A 32x32px button is 1,024 pixels that need to be drawn to a buffer irrespective of what's in those pixels.
The performance problems in those UIs were almost always related to the compositor and underlying hardware (or drivers for same). Without hardware accelerated drawing, even just 2D acceleration, the compositor was limited by the CPU and memory.
The hardware limitations are only problematic at the margins though. At those various systems' introduction the performance issues were only at the low end. As the "low end" improved performance became a non-issue. Aero sucked when it was introduced because the 3D compositor was enabled on underpowered graphics hardware at the request of OEMs. Aqua ran great on then-new PowerMacs but sucked on the mobile graphics chips in iMacs and all the Mac notebooks of the time.
I recall for many years running OSX on a PowerBook with 256MB RAM total, with no problems. That machine was in use until the switch to Intels. It was fine - indeed more responsive than most Windows machines laden with crapware that I encounter today with many times the resources.
I'd agree with that. Aesthetics need not be wasteful; minimalism can be beautiful if done right and be very efficient, as the whole APL ecosystem shows. It's just that Motif attempts neither baroque opulence nor restrained elegance; it's just carelessly ugly, and that is a sin against life.
What is "ugly" or "beautiful" is really subjective. See brutalist architecture for example: opinions are (strongly) divided. Or the divided opinions on various styles of music. etc.
I personally don't care much for the Motif looks (although I've only used OpenMotif – I'm not sure how it compares to IRIX's implementation; from some quick screenshots I looked up it seems IRIX looked better) but describing it as objectively "ugly" or "a sin against life" is just wrong.
>What is "ugly" or "beautiful" is really subjective. See brutalist architecture for example: opinions are (strongly) divided
Strongly divided, as to the strength of like or dislike, yes.
But as for the split, it's pretentious architects and a handful of layman outliers on one side, and billions of people on the other. And whenever people vote with their wallets (as tourists, or when picking where to live, etc.) they shit all over brutalist monstrosities.
I've seen quite a few people defend brutalist architecture on HN, I'm not a fan of dismissing people's tastes as "pretentious". Plus some of those comments (which I can't be bothered to look up right now) have convinced me that the things people really hate is the worst of brutalism, rather than the concept as such (it's still not my favourite style, but overall less bad than I thought before).
But we can use other examples – I don't really want to talk about brutalism as such – like metal music, or paintings from Picasso or Mondriaan, or the discussion about whether or not Alien is a good film that still holds up in 2023 from earlier this week, or any number of things.
While those are not hideous, they're hardly anything to write home about, much less call beautiful. Compare them with a traditional Buddhist shrine, national monument, or church, and they're seen as the regression to ideology and "architect as god" arbitrariness that they are.
> „The cathedral of Notre Dame is eye candy, all houses should be maximally efficient concrete boxes“ is a hive drone mindset and unbefitting of a species that needs beauty to flourish, or else remains spiritually stunted.
That's an extreme view; GP didn't indicate that all UI toolkits needed to resemble Motif.
I mean, would it be fair to sum up your point as "Prisons are maximally efficient concrete boxes, all houses should be eye candy"?
Motif is (was at the time) just one of a large number of GUI designs. I quite liked it myself at the time, too.
Donald Norman has studied this and concluded that attractive things work better. A beautiful UI is one you will want to work at and will inspire creativity and problem solving. It can make your work seem easier.
>X gave Unix vendors something they had professed to want for years: a standard that allowed programs built for different computers to interoperate. But it didn’t give them enough. X gave programmers a way to display windows and pixels, but it didn’t speak to buttons, menus, scroll bars, or any of the other necessary elements of a graphical user interface. Programmers invented their own. Soon the Unix community had six or so different interface standards. A bunch of people who hadn’t written 10 lines of code in as many years set up shop in a brick building in Cambridge, Massachusetts, that was the former home of a failed computer company and came up with a “solution:” the Open Software Foundation’s Motif.
>What Motif does is make Unix slow. Real slow. A stated design goal of Motif was to give the X Window System the window management capabilities of HP’s circa-1988 window manager and the visual elegance of Microsoft Windows. We kid you not.
>Recipe for disaster: start with the Microsoft Windows metaphor, which was designed and hand coded in assembler. Build something on top of three or four layers of X to look like Windows. Call it “Motif.” Now put two 486 boxes side by side, one running Windows and one running Unix/Motif. Watch one crawl. Watch it wither. Watch it drop faster than the putsch in Russia. Motif can’t compete with the Macintosh OS or with DOS/Windows as a delivery platform.
>[...] X will not run in these 4 bit overlay planes. This is because I’m using Motif, which is so sophisticated it forces you to put a 1" thick border around each window in case your mouse is so worthless you can’t hit anything you aim at, so you need widgets designed from the same style manual as the runway at Moscow International Airport. My program has a browser that actually uses different colors to distinguish different kinds of nodes. Unlike a PC Jr, however, this workstation with $150,000 worth of 28 bits-per-pixel supercharged display hardware cannot display more than 16 colors at a time. If you’re using the Motif self-abuse kit, asking for the 17th color causes your program to crash horribly. [...]
>If you have any ANGSTFUL Motif code, comments, documentation, or resources, please share them with me! Here is some of the stronger stuff I've found. Note: this is only for official Open Software Foundation Motif inspired Angst. If you're experiencing TCL/Tk That Only Looks Like Motif But Doesn't Suck Angst, then you should stop whining and fix the problem yourself, if somebody else hasn't already.
/* Note that the text callbacks are "weird" in that they expect values in the callback structure to be set inside the callback proc to determine what actions need to be taken after the callbackproc returns. In particular, the XmTextVerifyCallbackStruct's 'doit' slot is always set to True, and must be set to False if the callbackproc doesn't want the action to be taken. To do this, Set_Call_Data_For_XmTextVerifyCallbackStruct() is called by Wcb_Meta_Callbackproc() after the callback lisp code is evaluated, and the values bound to these settable variables are set inside call_data....
Another inconsistency with the Text widget is that some callbacks on this widget return XmAnyCallbackStruct's (XmNactivateCallback, XmNfocusCallback, XmNvalueChangedCallback), whereas XmNlosingFocusCallback, XmNmodifyVerifyCallback, and XmNmotionVerifyCallback return XmTextVerifyCallbackStruct. In the code below, we look at the 'reason' slot of the call data, (which is present in both XmAnyCallbackStruct and in XmTextVerifyCallbackStruct) to determine the kind of callback that occured and we only bind the values that are appropriate for that kind of callback. Information about which slots are valid for particular callback was taken from the documentation on the XmText(3X) widget, and verified against the Motif 1.1 source -- this is valid for both XmText and XmTextField widgets... */
static LVAL s_CALLBACK_CUR_INSERT, s_CALLBACK_NEW_INSERT,
            s_CALLBACK_START_POS, s_CALLBACK_END_POS, s_CALLBACK_TEXT;

static void Lexical_Bindings_For_XmTextVerifyCallbackStruct(bindings_list,
                                                            lexical_env,
                                                            call_data,
                                                            client_data)
    LVAL bindings_list;  /* a list of symbols to which values from XmTextVerifyCallbackStruct are bound */
    LVAL lexical_env;
    XtPointer call_data;
    LVAL client_data;    /* XLTYPE_CALLBACKOBJ */
{
    extern LVAL true;
    register LVAL s_bindname;
    XmTextVerifyCallbackStruct* cd;
    /* How long can this go on???? */
}
I don't remember if it was IRIX, but the one thing from the SGI I worked on that left a lasting impression was the GUI. From what I remember it was a truly vector-based UI where you could scale everything arbitrarily without ugly pixel artifacts. This, for me, was living in the future; I thought every system would be like that in a year or two. Also, while I didn't witness it in person at the time, my machine had this flashy mousepad that showed an impression of a video running in a window. That was outrageous. Admittedly the UI was a bit minimalist compared to flashy OS/2, but at least it had smooth lines compared to the 32x32-pixel icon charm.
"Indy: an Indigo without the 'go'". -- Mark Hughes
"X and Motif are the reasons that UNIX deserves to die." -- Larry Kaplan
>The performance story is just as bad. I was tempted to write simply,
"Try to do some real work on a 16 megabyte Indy. Case closed.", but
I'll include some details.
>In May, I listed some unacceptable Motif performance measurements.
Just before 5.1 MR, someone reran my tests and discovered that the
performance had gotten even worse. Some effort was expended to tune
the software so that instead of being intolerable, it was back to
merely unacceptable performance.
>We no longer report benchmark results on our standard system. The
benchmarks are not done with the DSO libraries; they are all compiled
non-DSO so that the performance in 5.1 has not declined too much.
>Before I upgraded from 4.0.5 to the MR version of 5.1, I ran some
timings of some everyday activities to see what would happen. These
timings were all made with the wall clock, so they represent precisely
what our users will see. I run a 32 megabyte R4000 Elan.
    Test                                 4.0.5     5.1       % change
    ----                                 -----     ---       --------
    C compile of a small application     25 sec    35 sec    40%
    C++ compile of a small application   68 sec    105 sec   54%
    Showcase startup, May report file    13 sec    18 sec    38%
    Start a shell                        <2 sec    ~3 sec    ~50%
    Jot 2 MB file                        <2 sec    ~3 sec    ~50%
>What's most frightening about the 5.1 performance is that nobody knows
exactly where it went. If you start asking around, you get plenty of
finger-pointing and theories, but few facts. In the May report, I
proposed a "5% theory", which states that each little thing we add
(Motif, internationalization, drag-and-drop, DSOs, multiple fonts, and
so on) costs roughly 5% of the machine. After 15 or 20 of these,
most of the performance is gone.
>Bloating by itself causes problems. There's heavy paging, there's so
much code and it's so scattered that the cache may as well not be
there. The window manager and X and Toto are so tangled that many
minor operations like moving the mouse or deleting a file wake up all
the processes on the machine, causing additional paging, and perhaps
graphics context swaps.
>But bloat isn't the whole story. Rocky Rhodes recently ran a small
application on an Indy, and noticed that when he held the mouse button
down and slid it back and forth across the menu bar, the (small) pop-up
menus got as much as 25 seconds behind. He submitted a bug, which was
dismissed as paging due to lack of memory. But Rocky was running with
160 megabytes of memory, so there was no paging. The problem turned
out to be Motif code modified for the SGI look that is even more
sluggish than regular Motif. Perhaps the problem is simply due to the
huge number of context swaps necessary for all the daemons we're
shipping.
>The complexity of our system software has surpassed the ability of
average SGI programmers to understand it. And perhaps not just average
programmers. Get a room full of 10 of our best software people, and
you'll get 10 different opinions of what's causing the lousy
performance and bloat. What's wrong is that the software has simply
become too complicated for anyone to understand.
IRIX was what an OS is supposed to look like: functional. In fact, things took a major turn for the worse when we pushed the engineers aside and let the avant-garde artists take over.
What I do know is that children and smooth-brains alike in the 1990s could figure out Windows 9x and Mac OS 7 with little effort as "ugly" as they were. Even CDE, the ugliest girl in the village, was still usable.
I have a genius-level IQ (it's been tested) and I cannot figure some of this new shit out. It's often poorly thought out and illogical. Modern OS design has turned into a virtual escape room, where the puzzle is accomplishing basic tasks, but you get to take an acid trip with animations and hamburger buttons along the way.
Before shitting too much on current workstation designs, check how much an SGI workstation would cost back then. An Onyx started at $100k in 2000s money, without adjusting for inflation. For that kind of cheese, they'd better be designed by Gucci.
But an Onyx wasn't really a "workstation" anymore. The one we had at work required a 3-phase plug to support all the rendering pipes. "Mini-supercomputer" would probably define it best.
I'd say it's more about fashion than cost. I bet Apple spends more on designing and manufacturing their computers than SGI spent on their moulded plastic trim, it's just that the fashion these days is incredibly boring and lacking in imagination.
The cost of an SGI wasn't the plastic trim. They were loaded with a lot of custom chips that were all relatively low production volume. Those components had a relatively high cost because of the production volume and then obviously a nice margin added on top. The plastic exterior was literally just decoration.
It's worse than that. Workstations have died. I would argue there is no such thing as a PC workstation. To me, workstation implies a high-end machine with hardware and software developed by the same vendor. Apple still does that. Does anyone else? Oracle?
If it had a RealityEngine in it, nobody was looking at fucking Motif for long; they were looking at dinosaurs or aliens. Unless they were in a dinosaur movie, and then they were looking at that bullshit 3D UX.