> The evolution of the modern graphics processor begins with the introduction of the first 3D add-in cards in 1995, followed by the widespread adoption of the 32-bit operating systems and the affordable personal computer.
What? No.
The evolution of the modern graphics processor begins with the development of commercial 3D systems in the 1980s. Jim Clark founded SGI in 1982 based on his work at Stanford on hardware acceleration of geometry computations for 3D. By the mid-1980s, SGI workstations could handle 3D modeling and animation locally: http://en.wikipedia.org/wiki/Silicon_Graphics#IRIS_2000_and_....
There's a really neat website going into details of early 3D consumer chips: http://vintage3d.org. It's very interesting, though again it should be noted that these products came out in the 1990's, well after hardware acceleration of 3D was well-established in the workstation market.
My first inclination was to start with E&S, but my understanding is that their 3D technology in the 1970s was based on vector displays: http://archive.computerhistory.org/resources/text/Evans_Suth.... That's clearly hardware-accelerated 3D, but not quite how a modern GPU works, since it doesn't involve rasterizing triangles into a framebuffer. Of course this was before my time, so I'd be happy to be corrected...
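For anyone curious what "rasterizing triangles into a framebuffer" actually means (as opposed to a vector display drawing line segments), here's a toy sketch in Python. All the names are mine, and real hardware does this with fixed-point incremental arithmetic rather than per-pixel multiplies, but the core idea is the same: test pixel centers against the triangle's three edge functions.

```python
# Minimal half-space (edge-function) triangle rasterizer: the core
# operation that raster 3D hardware accelerates, as opposed to the
# line drawing of a vector display.

def edge(ax, ay, bx, by, px, py):
    # Twice the signed area of triangle (a, b, p); the sign tells
    # which side of the edge a->b the point p lies on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri, width, height):
    (x0, y0), (x1, y1), (x2, y2) = tri
    fb = [[0] * width for _ in range(height)]  # the "framebuffer"
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5  # sample at pixel centers
            w0 = edge(x1, y1, x2, y2, px, py)
            w1 = edge(x2, y2, x0, y0, px, py)
            w2 = edge(x0, y0, x1, y1, px, py)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:  # inside (CCW winding)
                fb[y][x] = 1
    return fb

fb = rasterize(((1, 1), (14, 2), (7, 12)), 16, 16)
```

The same three edge values, normalized, double as barycentric coordinates for interpolating color, depth, and texture coordinates across the triangle, which is why this formulation maps so well onto parallel hardware.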
In all fairness, they do specifically refer to consumer graphics in a few spots but yours was my initial reaction as well. It's very PC-centric with a side of game consoles.
It's an interesting topic in that I remember just how much of a revolution the original Voodoo card was to gaming (for me anyway). These new cards didn't just enable slightly prettier graphics like a new card would today, they enabled entire new genres of game. If anything, the art quality actually took a step back for a while (some of those sprites were gorgeous), but the new 3D effects more than made up for it.
I imagine products like the Oculus Rift will hopefully have the same effect on gaming. New ideas that we could barely imagine before will now be within reach.
Such as? There had been great, enjoyable DOS-based 3D games for years, ever since Wolfenstein 3D, so I'm curious which 'new genres' you think the introduction of a 3D graphics processor enabled.. ;)
Very nice. Though I wonder why the author thought we (3Dfx) were "strongly against OpenGL"? One of our founders, Gary Tarolli, worked on Iris GL[1][2]. We weren't against it, but our pixel pipeline was in the wrong order to implement it precisely. Plus there was the GL tradition of handling things in software that your hardware didn't implement, and we hated the "slow path". Ok, maybe we were against it a bit :-) Coming back to me now.
Correct. It was used in some of the 80-column addon cards, but Woz drove the video signal directly from a sea of TTL gates. It was pretty amazing stuff for the time.
It always amazes me how graphics from the 90s and early 2000s looked so bad, yet everybody thought they were great and amazing (at least I did).
We probably think today's graphics look good, but after another 15 years of development they won't seem so good anymore.. AMD/ATI and NVIDIA have done great work over the years.
Typically the things that previously made graphics 'good' were simply resolution and color depth. Early 8-bit home computers were often limited to just 1 or 2 colors per sprite. Machines such as the NES increased this to 3 or 4 colors per sprite, which makes a dramatic difference.
Along came something like the SNES, which, although it shares roughly the same resolution as the NES (256x224 or something along those lines), had access to 32,768 colors instead of the 64 available to the NES, and could use up to 16 per sprite. That's the main reason why SNES screenshots look so much better than NES screenshots.
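For the curious, the SNES's 32,768 colors fall straight out of its 15-bit BGR color words (5 bits per channel), and the 16 colors per sprite come from 4-bit palette indices. A rough sketch of the decode (helper names are mine, not from any official SDK):

```python
# The SNES stores palette entries as 15-bit BGR words (5 bits per
# channel), giving 2**15 = 32768 possible colors; a 4bpp sprite tile
# can index 2**4 = 16 of them at a time.

def bgr555_to_rgb888(word):
    r = word & 0x1F          # bits 0-4
    g = (word >> 5) & 0x1F   # bits 5-9
    b = (word >> 10) & 0x1F  # bits 10-14
    # Expand each 5-bit channel to 8 bits (shift and fill with the
    # channel's own high bits so full-scale maps to 255).
    scale = lambda c: (c << 3) | (c >> 2)
    return scale(r), scale(g), scale(b)

print(2 ** 15)                   # total colors: 32768
print(2 ** 4)                    # colors per 4bpp sprite palette: 16
print(bgr555_to_rgb888(0x7FFF))  # pure white: (255, 255, 255)
```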
Once 3D came along, graphics looking 'good' became more complicated than just resolution and color depth. The original PlayStation suffered from texture maps that showed artifacts and warped across large polygons, and from polygon edges that appeared to 'jump' around as vertices snapped to whole pixels.
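The texture warping came from affine (screen-space linear) texture mapping: the hardware interpolated texture coordinates linearly across the screen without accounting for depth, because it had no per-pixel divide. A toy comparison, with made-up numbers, showing how far affine interpolation drifts from the perspective-correct answer on a polygon that recedes into the distance:

```python
# Affine vs. perspective-correct texture coordinate interpolation.
# Two endpoints of a screen-space edge, each with a texture
# coordinate u and a depth z:
u0, z0 = 0.0, 1.0   # near vertex
u1, z1 = 1.0, 10.0  # far vertex
t = 0.5             # halfway across the edge *on screen*

# Affine (PS1-style): interpolate u directly in screen space.
affine_u = u0 + t * (u1 - u0)

# Perspective-correct: interpolate u/z and 1/z linearly in screen
# space, then divide per pixel to recover u.
u_over_z = (u0 / z0) + t * (u1 / z1 - u0 / z0)
one_over_z = (1 / z0) + t * (1 / z1 - 1 / z0)
correct_u = u_over_z / one_over_z

print(affine_u)   # 0.5
print(correct_u)  # ~0.091: most of the texture belongs on the near half
```

The gap between the two values is the visible "swimming": the error changes as the polygon moves, so textures slide around. PC accelerators like the Voodoo did the per-pixel divide (or a close approximation) in hardware, which is a big part of why their texturing looked so much more solid.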
As the hardware became faster and fixed those issues, more techniques were added for lighting, shading, and a hundred other enhancements, which got us to where we are today.
Mainly because you were comparing the new graphics with the old ones, and marveled at the "progress", even if they were nothing like real life.
Also, around the time the switch to 3D happened, early 3D games often looked a lot uglier than contemporary 2D games, which had better art and colors.
I think it's less of a matter of good/bad than a bunch of other factors such as detail, immersion and novelty compared to what was available before. The graphics looked good in the context of earlier graphics systems, and I doubt it will be different with the current generation of 3D games.
Ha ha, look at the capacitors soldered across the through-hole ICs on that first ATI board from 1987. Only took a couple years to get to sophisticated SMT boards.
You can find some of Clark's papers on the hardware here: http://www.computer.org/csdl/mags/co/1980/07/01653711.pdf (1980); http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.359.... (1982). I can't find a good reference describing the rasterization side of the pipeline, though the second paper describes it very briefly on page 132.