I miss when gaming in general was less mainstream and more weird like this. Now the silicon manufacturers hate that they even have to sell us their scraps, let alone spend time on making unique designs for their boxes.
I bought a small press book with a collection of this art and it was a fun little trip down memory lane, as I’ve owned some of the hardware (boxes) depicted in it.
> I miss when gaming in general was less mainstream and more weird like this.
To me, this is a continuum with the box art of early games, where because the graphics themselves were pretty limited the box art had to be fabulous. Get someone like Roger Dean to paint a picture for the imagination. https://www.rogerdean.com/
The peak of this was the Maplin electronics catalogue: https://70s-sci-fi-art.ghost.io/1980s-maplin-catalogues/ ; the Radio Shack of UK electronics hobbyists, now gone entirely. Did they need to have cool art on the catalogue? No. Was it awesome? Yes.
Turns out that the Psygnosis developers in the 1980s used him as a kind of single-shot concept artist. They would commission the box art first, then use that as inspiration for designing the actual game to go inside the box.
Huh. I hadn't actually realised Maplin was gone entirely. They closed in Ireland a while back, but I put that down to the general trend of marginal UK high-street retailers (Argos etc.) pulling out of Ireland while still existing in some form in the UK.
Weird shop; they never really got rid of any stock that was even theoretically useable, so it was at least partially a museum of outdated gadgets.
On the plus side, PC gaming hardware seems to last ages now. I built my gaming desktop in 2020, I had a look lately at what a reasonable modern mid tier setup is and they are still recommending a lot of the parts I have. So I'll probably keep using it all for another 5 years then.
It's a double-edged sword. Yes, back then a two-year-old computer was old, but at the same time every two years a new generation of games came out that looked like nothing seen before. Each generation was a massive step up.
Today, a layman couldn't chronologically sort the CoD games from the past 10 years by looks/play/feel, each new FIFA is _the_ same game with new teams added to it, and virtually every game made is a "copycat with their own twist" with almost zero technical invention.
This is fine? Today’s games look beautiful and developers are hardly restricted by hardware. Games can innovate on content, stories, and experiences rather than on technology.
Feels similar to how painting hasn’t had any revolution in new paints available.
It's not a one-or-the-other. One wouldn't want content, stories, and experiences to stagnate just because graphics were improving, so why would the opposite be assumed?
I disagree, AAA games started nosediving with the seventh generation 20 years ago and only recently have they started to tentatively show signs of recovery.
Do you have any examples in mind from each era? I thought Fallout 3 was quite good around back then. Today we've got stuff like Borderlands 4 (or whatever the newest one is) that barely run on anyone's PC, and general game install size has also shot up drastically so it's no longer really feasible to keep most of your games installed all the time and ready to play.
I mostly play indie/retro/slightly-old games these days, so I mostly hear of the negatives for modern AAA, admittedly. I'm also tempted to complain about live service, microtransactions, gacha, season passes, and so on in recent big releases, but maybe that would be getting off-topic.
> Today we've got stuff like Borderlands 4 (or whatever the newest one is) that barely run on anyone's PC
Just like Crysis did 18 years ago?
>it's no longer really feasible to keep most of your games installed all the time and ready to play.
Crysis took up around 5% of a common HDD back then. The same fraction of a modern drive would be roughly 80 GiB, which is about what Elden Ring with the DLC takes.
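As a rough back-of-the-envelope check of that proportion (all figures below are my own approximations, not from the thread):

```python
# Rough sanity check of the "5% of a drive" comparison.
# All numbers are approximate assumptions, not quoted specs.
crysis_install_gib = 12      # assumed 2007-era Crysis install size
hdd_2007_gib = 250           # assumed common desktop HDD of the era
modern_drive_gib = 1600      # assumed typical 1-2 TB drive today

fraction = crysis_install_gib / hdd_2007_gib
print(f"~{fraction:.0%} of a modern drive is about {fraction * modern_drive_gib:.0f} GiB")
# ~5% of ~1600 GiB lands around 80 GiB, roughly Elden Ring plus its DLC
```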
Yes, thanks, I don't need "technical invention" in the form of more shaders to hide the ass quality of the gameplay. Mirror's Edge Catalyst still looks great despite being almost 10 years old, and it still manages to bring a 2080 Ti to its knees at FullHD+.
I was planning on hanging on to my Win10 PC, which was perfectly fine except that Microsoft were both pestering me to upgrade and telling me it wasn't possible, but the death of its SSD after 7 years put paid to that.
I have a PC that is probably at least 10 years old but works perfectly well for browsing, spreadsheets and the occasional lightweight game but MS has decided, in its infinite wisdom, to say that it can't be upgraded to Windows 11.
So I will probably install Linux (probably Debian) and move on and forget about those particular games... (~30 years since I first installed Linux on a PC!).
I've got a computer that I built in 2008 and upgraded in 2012. It's pretty solid for higher-requirement games up until about 2015, and can still handle lower-requirement indie stuff today.
I built its successor in 2020, using a GPU from 2017. The longevity of the PS4 has given that thing serious legs, and I haven't seen the need to upgrade yet. It still runs what I want to run. It's also the first post-DOS x86/64 machine I've owned that has never had Windows installed.
I still have a 1080ti which does swimmingly. There just aren't enough AAA/"AAAA" games coming out that I care to play. Oblivion remaster almost tempted me into upgrading but I couldn't justify it just for a single game.
The only reason I'd upgrade is to improve performance for AI stuff.
On the other hand, you're also stuck with design mistakes for ages.
The AM5 platform is quite lacking when it comes to PCIe lanes - especially once you take USB4 into account. I'm hoping my current setup from 2019 survives until AM6 - but it seems AMD wants to keep AM5 for another generation or two...
There's minimal demand for Thunderbolt/USB4 ports on Windows desktops. It won't ever make sense to inflate CPU pin count specifically for that use case, especially when the requisite PCIe lanes don't have to come directly from the CPU.
You'd be better off complaining about how Threadripper CPUs and motherboards are priced out of the enthusiast consumer market, than asking for the mainstream CPU platform to be made more expensive with the addition of IO that the vast majority of mainstream customers don't want to pay for.
One person's "design mistake" is another person's "market segmentation".
x16 GPU + x4 NVMe + x4 USB = 24 direct CPU lanes covers 99% of the market, with add-ons behind the shared chipset bandwidth. The other 1% of the market are pushed to buy Threadripper/Epyc.
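For illustration, a minimal sketch of that lane budget as the comment lays it out; the 24-lane total and the x16/x4/x4 split come from the comment, while the labels and the shared-chipset note are my own assumptions:

```python
# Tally a hypothetical mainstream desktop CPU's direct PCIe lane budget.
direct_cpu_lanes = 24  # the budget cited in the comment

allocations = {
    "GPU slot": 16,
    "primary NVMe": 4,
    "USB / chipset uplink": 4,
}

used = sum(allocations.values())
print(f"allocated {used} of {direct_cpu_lanes} direct CPU lanes")
for device, lanes in allocations.items():
    print(f"  x{lanes:<2} {device}")

# Anything else (extra NVMe drives, Thunderbolt/USB4 controllers, capture
# cards) has to share the chipset's uplink bandwidth instead of getting
# its own direct CPU lanes.
```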
Wow, I just checked and that's really underwhelming: tiny page size, lots of padding around the images, and yet often four images per page. The layout makes it seem like the size was a late decision; it would be appropriate for a large art book.
> Now the silicon manufacturers hate that they even have to sell us their scraps, let alone spend time on making unique designs for their boxes.
I genuinely don't believe this to be true for AMD. I bought a 6600xt on Release Day and by the time I was able to build my complete PC, it had upstream linux kernel support. You can say what you will about AMD but any company that respects my freedoms enough to build a product with great linux support and without requiring any privacy invading proprietary software to use is a-ok in my book.
I agree with the spirit of your post, and that AMD is probably the lesser evil, but it's worth noting they "just" moved the proprietary bits into a firmware blob. It's still similarly proprietary but due to the kernel being pretty open to such blobs, it's a problem invisible to most users. You'd have to use linux-libre to get a feel for how bad things really are. You can't really use any modern GPUs with it.
I understand the sentiment, but I don't see how devices with proprietary firmware stored in ROM or NVRAM are any more free or open than devices that require proprietary firmware loaded at boot.
And it looks like Linux-Libre also opposes CPU microcode updates[1], as if bugged factory microcode with known security vulnerabilities is any less non-free than fixed microcode. Recommending an alternative architecture that uses non-proprietary microcode I can understand; this I cannot.
That honestly makes little difference to me. There's no useful computer out there that isn't a bunch of proprietary blobs to interact with proprietary hardware. Wish there was, but if it's practically indistinguishable from just having those blobs burned into the hardware directly instead of being injected on boot it's not perfect but still a pretty good situation.
I remember when running Linux on your computer at all was hit or miss; these days I can go to Lenovo and buy a ThinkPad with Ubuntu preinstalled and know that it will just work.
>I genuinely don't believe this to be true for AMD. I bought a 6600xt on Release Day
That was 2021 though, when AMD was still a relative underdog trying to claw consumer market share from Nvidia. The AMD of today has adjusted its prices and its attitude toward consumers to match its status as half of a CPU and GPU duopoly in the AI/datacenter space.
You can still buy their GPUs, they work perfectly fine on Linux out of the box, and they even make things like the AI Max for local AI that are very end-user and consumer friendly. Yes, GPUs got more expensive, which is a result of increased demand for them. But for a hardware company, as long as their Linux support is as good as it currently is, they're a-ok in my book.
Just the mention of pieces of hardware we don't really need anymore (sound cards, modems, etc.) triggers a flood of nostalgia. I used to spend DAYS poring over PC part catalogues dreaming of my ideal rig. And brands like Hercules, Creative, and Matrox all trigger the same feelings.
Crazy contrast to me having spent the past weekend wondering whether cloud gaming services like GeForce Now have matured enough that I can fully move to a thin-client/fat-server setup for the little bit of gaming I still do.
The technology works, but the business model doesn't, so there's the eternal risk that it might get shut down at short notice with no way to export your saves.
Yeah, that's definitely a worry. Also, the fact that you're dependent on them for adding support for future games, and that (like any cloud service) it might not be available right when you want it.
Yeah that's the issue - nobody wants to just rent you a gaming PC in the cloud, they all want a cut of game sales/licensing. But if someone were to do it, the technology is absolutely there.
You don't even need to create any internal tech - Steam Remote Play already has everything you need, and I successfully used it to play Battlefield from an AWS GPU instance (was even good enough for multiplayer).
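Not the commenter's actual setup, just a minimal sketch of the idea under my own assumptions: rent a GPU instance via boto3 and use it as the "fat server" for Steam Remote Play. The AMI ID, instance type, region, and key name are all placeholders.

```python
import boto3

# Launch a hypothetical GPU instance to stream games from.
ec2 = boto3.client("ec2", region_name="eu-west-1")  # region is an assumption

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: an image with GPU drivers and Steam installed
    InstanceType="g4dn.xlarge",       # placeholder GPU instance type
    KeyName="gaming-key",             # placeholder key pair
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}; log in once, sign in to Steam, then stream via Remote Play.")
```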
Besides the box art, I miss the days when 1) the graphics card didn't cost more than the rest of the components put together, 2) the graphics card got all of its damn power through the connector itself, and 3) MSRP meant something.
I'm not in the market for a 5090 or similar, but the other day I was looking at a lower-end model, an AMD 9060 or Nvidia 5060. What shocked me was the massive variation in prices for the same model (9060 XT 16 GB or 5060 Ti 16 GB).
The AMD could be had for anywhere from 400 to 600 euros, depending on the brand. What can explain that? Are there actual performance differences? I see models pretending to be "overclocked", but in practice they barely have a few extra MHz. I'm not sure if that's going to do anything noticeable.
Since I'm considering the AMD more and it's cheaper, I didn't take that close a look at the Nvidia prices.
Looks. I'm not joking. The market is aimed at people with a fish-bowl PC case who care about a cooler with an appealing design, an interesting PCB colour, and the flashiest RGB.
Some may have slightly better cooling, but the price for that is also likely marked up several times, considering a full dual-tower CPU cooler costs $35.
The manufacturer can use better fans that move more air and run quieter. They can design a better vapor chamber, or lay out the PCB so the VRMs and RAM get more cooling. But all of that still shouldn't account for more than a $30-50 markup.
Hey, c'mon now - some of that is flooding the market so hard that it's ~8:1 nVidia:AMD on store shelves, letting nVidia be the default that consumers will pay for. That's without touching on marketing or the stock price (as under-informed consumers conflate it with performance, thinking "If it wasn't better, the stock would be closer to AMD").
>What shocked me was the massive variation in prices for the same model [AMD v. Nvidia]
I am not a tech wizard, but I think the major (and noticeable) difference would be the available tensor cores — currently Nvidia's tech is faster/better in the LLM/genAI world.
Obviously AMD jumped +30% last week on the OpenAI investment — so that is changing with current-model GPUs.
I just bought an RTX 5090 at MSRP. While expensive, it's also a radically more complicated product that plays a more important role in a modern computer than old GPUs did years ago.
Compared to my CPU (9950X3D), it's got a massive monolithic die measuring 750 mm² with over 4x the transistor count of the entire 9950X3D package. Beyond the graphics, it's got tensor and RT cores, dedicated engines for video decode/encode, and 32 GB of GDDR7 on the board.
Even basic integrated GPUs these days have far surpassed GPUs like the GTX 970, so you can get a very cheap GPU that gets its power through the CPU socket, at MSRP.
Do yourself/me a favor, and give your 5090's power plug/socket a little jiggle test.
I'm a retired data center electrician, and my own GPU's plug has come loose more than once. Really make sure that sucker is jammed in there and latched.
> the graphics card didn't cost more than the rest of the components put together
In fairness, the graphics card has many times more processing power than the rest of the components. The CPU is just there to run some of the physics engine and stream textures from disk.
The existence of scalpers rather shows that the producer set the price of the product (in this case, the GPU) too low [!] for the number of units being produced.
Because the price is too low, more people want to buy a graphics card than there are cards being produced, so even people who would gladly pay more can't get one.
Scalpers solve this mismatch by balancing the market: now people who really want to get a graphics card (with a given specification) and are willing to pay more can get one.
So, if you have a hate for scalpers, complain that the graphics card producer did not increase its prices. :-)
That's nice, but they were interactive - you could move around the scene or change the camera angles. The fact that you could do this and prove it was real-time rather than prerendered was part of the demo and most of the charm. Lacking that, it's just... lacking.
Most of that charm is gone after 20 years; nobody needs proof that the dynamic lights are really dynamic anymore.
They're still somewhat fun to interact with anyway, or just fun as a way to review what was hot shit at the time, but I couldn't get a few of the really old ones to run on Windows 10/11 this summer. A video is a lot better than saying "well, I'm not going to build an old PC just to play this demo" and never seeing it at all.
I would guess part of the reason box art used to matter is that most of these cards were sold through dedicated electronics retailers like Fry's Electronics, Microcenter, and CompUSA. There was basically no such thing as online ordering for this sort of thing. People were physically browsing goods on shelves.
Just chiming in here, but at least two of the generations of cards there are from ~2005-2008 and we old farts definitely bought (or convinced our parents to buy) things from Newegg at the time!
I think it comes from a marketing exaggeration of what the card could do. None of the cards of the day could actually produce their own box art (in real time) but the art implies they could in a way they can get away with. It follows the tradition of box art on 8-bit games wildly exaggerating what the in-game graphics might look like and they'd sometimes post a tiny disclaimer in the corner.
I miss electronics retailers. Any hardware project nowadays requires me to wait several days before I can actually start as I am forced to order online.
From full cases [0] that include the CPU cooler, to themed components [1], when it comes to gaming, makers are going above and beyond to create cool visuals.
The ones in the article are boxes only; the actual cards were different from what was represented on the box. Anime-themed products, by contrast, are themed products in themselves. I'd argue that these are two different phenomena.
Years ago, I picked up a low profile, single-slot GPU that worked well in Linux to throw in old machines when someone gives me one to mess with or recover. The best fit at the time was a Yeston AMD card, and in a world of cards that are all "Black with {{primary_color}}" the choice of blinding magenta made me smile.
That's fantastic. I recently bought a Lofree mechanical keyboard (they're a Chinese brand) and they definitely have the most unusual hardware designs I've ever seen.
Well, there is still an NPC that proclaims Kirkbride's drug-binge-fueled [affectionately] lore.
Besides, in Skyrim you eat the souls of hitherto immortal beings in an act of metaphysical cannibalism, and, among other things, get to witness firsthand exactly what happens to the souls you trap to fuel your fancy sword of zapping.
Meanwhile, in the background, Vivec might or might not have been kidnapped to be on the receiving end of that spear thing, and fascist elves are trying their hardest to undo the universe (it's not plot pertinent though), and also briefly did (or claimed to do) something to the moons (that are not moons, remember) that terrified an entire race into submission.
The point is, the lore is still there. You just have to pay attention, because it's not always spelled outright.
Why play modern games? There is an almost infinite backlog of experiences for you to indulge in from the late 90s/early 2000s alone.
They're also great value; a couple of months back I went to a local store and bought 100 or so "old" game CDs/DVDs for less than $35, none of them scratched. For the price of one triple-A game, I could probably have gotten at least 250.
I'll never forget the big, dumb gorilla on the side of my mid-2000s 7900 GTX from a King Kong game tie-in product launch. Never played the game, but when I'm installing a GPU I sometimes think of that stupid decal and chuckle.
Crazy, outrageous graphics on a graphics accelerator box seems quite fitting. Of course these days they do far more than just render 3D graphics (and that which they do has become quite common), so perhaps that also reflects the shift away from this branding.
This is a blast from the past! I remember being really young and buying a GPU based solely on what art was on the box (and yes, it was a scantily clad woman) and getting really, really lucky that it actually worked with my components. It was my intro to upgrading PCs!
I loved the weird boxes back in the 90s and 2000s. I remember dad would always take us to computer trade shows and ham events, and occasionally you'd see someone from ATi or Nvidia (or one of the integrators) demoing their wares with all sorts of bizarre and funny demo software and renders. I don't know if it was just me or what, but they always sent real nice sales or marketing people and it was fun to talk to them about the GPUs as a kid. I think they were as mystified (I recall several of them laughing about it) about the box art as everyone else was.
I had a Powercolor 9000 pro "Evil Commando". My friends and I thought it looked like a terrorist out of some old action movies. It kinda looked a bit like Universal Soldier.
Later on I bought a Sapphire 9800pro "Atlantis" which had some T1000 esque figure on the box art.
After that, a lot of stuff became more corporate and boring.
When you first got a 3D accelerator you entered a completely new world; the graphics and speed were on a different planet compared with what your computer could do without one.
I think that the boxes initially reflected that.
My first accelerator (rather late) was that 3D Blaster Voodoo 2; the graphics on the box contributed to the emotion of holding it, and they looked better in person than in the picture.
I was mind-blown when I saw what the card could do, and I believe I thought at the time that the box art reflected its capabilities well.
I sure kept the box for many years.
I imagine that then the manufacturers felt compelled to keep making boxes which would stand out; and in part, yes, they tried to attract some purchases from people who didn't originally mean to get a new graphics card.
As usual, when money is to be found, the soulless bean-counting serious MBA types come along and kill all the fun. Not to mention all the money-seeking pretenders who can't code their way out of a paper bag.
> As usual, when money is to be found, the soulless bean-counting serious MBA types come along and kill all the fun.
A reminder: even years after inventing CUDA, Nvidia, now the top GPU manufacturer, was fighting for survival. I'm not sure what saved them - perhaps crypto.
If you ignore the money, they appeared quite strong. But they struggled financially. Intel famously considered buying them around 2010 because they knew they could get them cheap (Nvidia might not survive and wasn't in a position to negotiate). Thankfully, the Intel CEO killed the idea because he knew Jensen wouldn't work well within Intel.
Nvidia may not have been saved by "bean counters", but they do have a place in the world.
Can't believe this was a hobby for me and my dad when I was in primary school, and that understanding how computers work has now led me to my current full-time job, putting food on the table for my own children.
I think what happened is, at the time those were literally more or less examples of the best scenes the cards could render. Nowadays, putting together an example of the best scene the card could render requires a whole art department and a couple months of design. Nobody’s going to spend months on box art, so we get bland rectangles or whatever.
Or it was just a fad when the scene was novel and it ran its course as fads and design elements do. This explanation doesn't require there to be an enemy to demonize but sometimes there just isn't, as much as we might want there to be.
What "the best scene you could render" means is a bit fuzzy, though. In Blender you could render anything at all. But in a game: at what resolution, at what framerate, and are the shadows dynamic or baked in?
TFA calls it unhinged, I call it creative and exciting. Now all we get is rounded edges, solid colours, and "copies of reality" - boring; if I wanted reality I'd go outside and touch grass.
> GPU makers have all abandoned this practice, which is a shame as it provided something different through box art alone. Now, we're drowning in bland boxes and similar-looking graphics cards
I feel like there could be a more positive adjective than “unhinged” if you're going to turn around and praise it. OED sez “wildly irrational and out of touch with reality”. How about “whimsical”? I love this stuff and think we need to bring this kind of whimsy back to computing.
> There's a scantily dressed lady in armor
Author neglects to mention that ATi/AMD had a named ongoing marketing character for many many years — Ruby!
> I bought a small press book with a collection of this art and it was a fun little trip down memory lane, as I’ve owned some of the hardware (boxes) depicted in it.
For anyone else interested: https://lockbooks.net/pages/overclocked-launch