
When I first entered the working world as a programmer and administrator of an "academic computing center", in the early 70s, you met men like Ray - ex-military, GI-bill educated, learned computers from the electricity on up in their mid-career, rather frequently, either as customer engineers for one of the big mainframe manufacturers (there were 7 or 8, depending on when and how you counted), or from the minicomputer upstarts who were then assaulting the mainframe world of computing with their smaller, cheaper, 12 and 16 bit newcomers. Sometimes you'd get the privilege of a lunch or dinner with one sent out "from the lab" who was actually designing and building the machines you were working on.

It's hard to explain just how new it all felt, then. But in 1973, even though we were sitting on the cusp of the single chip microprocessor and personal computer revolution, the commercial computer was less than 20 years old, and college recruiting materials might well brag that at their institution, there were not one, but two computers on campus. I remember the day the total RAM at our institution passed the megabyte mark - closer to the end, than the beginning, of the 1970s. The ability to "program" was a rare skill - even the people who taught it were still just learning it.



My father was a physicist. He learned to program in FORTRAN at university in the 70s.

Decades later, when I was still a teenager, I asked him something like this: "Dad, you were a FORTRAN programmer and a physicist in the 70s; you could have been a very well-paid developer anywhere in the developed world... why didn't you do that?" He answered: "I didn't think this thing about computers would get very far."


We probably are close to the same age. My dad was an engineer who also learned to program FORTRAN in the 70's.

When I asked him a similar question his reply was (quotes are paraphrased): "It was way too tedious to do. You'd spend hours getting the cards just right. We used to put them in a shoebox and mark them with a pen in case we dropped them on the way to the lab. Then you'd wait until the next day to get your results. If you had a mistake you'd repeat the whole process".

Basically, in his opinion at the time, it was tedious grunt work (he has, of course, since come to understand its importance).


> "I didn't think this thing about computers would get very far."

I almost didn't major in Computer Science because in the late 90s, there were so many negative articles in the New York Times, vis-a-vis software. People don't remember it now, but the media and the culture were utterly hostile towards us, and loved to say our jobs were going to India, that everything there was to know about Computer Science could be studied in railyard switching, in existing abstract math textbooks, etc.

By a combination of luck, and my dad's insistence, I ended up at Carnegie Mellon, and while I was there, I saw what folks at Google were doing, and I thought to myself, no, this stuff is hard, and this is just going to be the beginning.

> "It was way too tedious to do. You'd spend hours getting the cards just right. We used to put them in a shoebox and mark them with a pen in case we dropped them on the way to the lab. Then you'd wait until the next day to get your results. If you had a mistake you'd repeat the whole process"

Even what came after that, e.g. in C / C++ was considerably tedious compared to what we do today. Folks sometimes had to do objdumps of compiled binaries to debug what was going on. We had to get coredumps, load them up, and try to determine what memory error had caused things to crash (this is an entire class of problems that doesn't exist today). You used to legit need that CS degree in order to code in your day-to-day because you had to understand the function stack, the network stack, basic syscalls like wait and poll, etc.
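For anyone who never had to do it, a session of that kind looked roughly like this (a sketch only; `./server` is a hypothetical program name, and the gdb commands shown in comments are interactive):

```shell
# Let the kernel write core files, then reproduce the crash
ulimit -c unlimited
./server            # segfaults, leaving a core file behind

# Load the dump and walk the function stack to find the bad access
gdb ./server core
# (gdb) bt              <- backtrace: which frame blew up?
# (gdb) frame 2         <- jump to a suspect frame
# (gdb) info locals     <- inspect what the pointers held at crash time

# Or disassemble the binary to see what the compiler actually emitted
objdump -d ./server | less
```

The point is that none of this told you the answer directly; you reconstructed the memory error by hand from the stack and the disassembly.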

It was a lot of work for relatively little product, and I think software is paid more today partly because of 1. faster processing speeds, 2. better tooling, automation, and higher-level programming languages - all enabled in part by cheaper, faster CPUs (e.g. people don't have to care how slow Python is; you can optimize it after you find product-market fit) - and 3. a better understanding, at all levels of management, of how software should be developed.


> I almost didn't major in Computer Science because in the late 90s, there were so many negative articles in the New York Times, vis-a-vis software. People don't remember it now, but the media and the culture were utterly hostile towards us, and loved to say our jobs were going to India, that everything there was to know about Computer Science could be studied in railyard switching, in existing abstract math textbooks, etc

I'm glad I'm not the only one who remembers this - whenever I try to explain it to someone, they look at me like I'm crazy. In the late 90s and even early 2000s, the common wisdom among guidance counselors and even local recruiters was that programming and software design were a dead end in the U.S. I remember one article literally said "the bud is off the blossom". I wound up majoring in electrical engineering instead of computer science as a result.

It all worked out in the end, but not following my instincts at the time is one of my few regrets.


It was hard to figure out at the turn of the century, when the career fair was literally cut in half after the dot-com bust. Although websites had been around for years, web apps were still pretty clunky, and it felt like the world of internet-based possibilities still had a long way to go. I decided to try doing application development for pay because it seemed interesting and I figured I could easily switch to something else down the road. Plenty of relatives and acquaintances did inform me that my job was going to be outsourced abroad, though. :) Things looked dire again with the financial crisis, but a few years after that, while recruiting at my alma mater, I was shocked to discover that CS had become the most popular major, whereas it had been one of the smallest when I was studying it! So, lots of predicting that turned out differently...


Yeah, that's why I don't take re-kindling of the "it'll get offshored any day now" panic post-Covid that seriously. Time zones haven't gone away. The communication-based hard parts of software development haven't gone away. The way that delivering what someone asks for usually leads to them asking for more things, not fewer, hasn't gone away.


Yes, this is one reason I am personally really sensitive when various people say how privileged I was to get into computers and that we somehow got all this encouragement unlike young women, etc.

In the 80s we were mocked and called nerds for being interested in computers, and both before and after the dot-com era people thought this was a dead-end career.


Yes. Even as the internet started to become a thing in 1994-1995 when I was in middle school, I'd reckon less than half of my class had a computer at home - and fewer still of them would ever want to mention it.


OT, but when I search for "the bud is off the blossom", the only references I get from Google are two links to Hacker News comments... There are zero results on Bing for that phrase. I'd never heard it before.


The idiom is "bloom is off the rose", maybe that's what GP recalled.


In the early days computer programming was considered a clerical job one learned in trade schools. I think people looked down on it partly because many of the early programmers were female, beneath the dignity of a male profession.

It took my alma mater MIT until 2018 to recognize software as worthy of a department of its own (after a huge financial donation). Before then it was a stepchild of Electrical Engineering. This is kind of ironic because most of my classmates and I ended up writing software for money, though almost none of us majored in that field.


> In the early days computer programming was considered a clerical job one learned in trade schools.

That's because in those days, the term "programming" didn't mean "software development", it referred to data entry. It actually was clerical work, comparable to typing a dictation on a typewriter. Only later, when user interface devices (keyboards, displays) considerably improved and it became more efficient to unify those tasks in one person, did "programming" and "software development" start to become synonymous.

It has nothing to do with "dignity of a male profession", or oppression of women, just a misunderstanding of a shift in the meaning of words.


> In the late 90s and even early 2000s the common wisdom with guidance counselors and even local recruiters was that programming and software design were dead end

The career advice I got as a teenager was that there was no point going into software, as Microsoft had already made it all with Microsoft Office.


My mother talked me out of going to school for programming, and a decade after I graduated high school that’s what I ended up doing anyway, realizing it was going to lead to better prospects.


Universities are always several years behind the curve. At college in the 90s they were still teaching token ring networking despite Ethernet already being commonplace. The same college told me that programmers didn't design any of the code they wrote; they only transcribed code from flow charts.

Just yesterday I was talking to a grad about DevOps. He said the field sounded boring from what he was taught at uni. Then when we discussed it more it turned out his “DevOps” course was actually just teaching them how to be a scrum master and didn’t include a single thing about automation, infrastructure as code, etc.

I also remember just how garbage general publications were with regards to IT. And to be fair they still are now. But there was always a wealth of better information in specialist publications as well as online (particularly by the late 90s).


That may well be true of some universities today. In 1970, they were pretty much the only place you could get hands-on experience with a computer unless you somehow slid into a programming job in the financial industry, or one of the few other areas that actually used them. And they were not behind the curve on the technology, although they tended to have lower-end hardware than industry, because any compute was very expensive. The invoice on a 64K-byte HP 3000 in 1972, which on a good day could support half a dozen users actually doing any work, was over $100K. Memory upgrades to 128K ran you about $1/byte installed - maybe $8/byte in today's money. It was a big deal to be allowed hands-on use of them.


I was talking about 90s to modern era. Not just modern era.

And having computers doesn't mean any of the lecturers understand the modern (for that era) trends in computing. More often than not, it's computer clubs rather than course material that hold the really interesting content.

I don’t doubt there will be exceptions to this rule. But for most people I’ve spoken to or read interviews from, this seems to have been the trend.


It definitely is true of local universities. I've met people from the local university who have a master's in machine learning, yet have never heard of Docker.


This is a good thing. Opportunity costs are incredibly important with university educations because students have a limited time to learn.

Why spend the time futzing with a tool like docker? It's not foundational to machine learning, so learning that tool takes away from time that could be spent learning something more relevant. And the student may or may not use it when they get a job.


"Getting shit to work" is more foundational to machine learning than you would think, and containers help a lot with that. If you want to train models on someone else's machine - and you probably will, for anything big - you need to know a little about how that sort of thing is done today.

And if you want to try two different deep learning frameworks, dependent on different versions of cuda, and want them to not break each other, God help you if you try that without containers.
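For instance (a sketch with hypothetical image tags and script names; the point is just that each framework ships its own CUDA userspace inside its container, so the two never touch each other's libraries):

```shell
# PyTorch built against one CUDA version...
docker run --rm --gpus all nvcr.io/nvidia/pytorch:23.10-py3 \
    python train_torch.py

# ...and TensorFlow against another, with nothing shared to clobber
docker run --rm --gpus all nvcr.io/nvidia/tensorflow:23.10-tf2-py3 \
    python train_tf.py
```

(The `--gpus` flag needs the NVIDIA Container Toolkit installed on the host; that's the one piece of host setup you can't containerize away.)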

It's not that they don't have a "course in docker". I understand that. It's that they haven't even heard of it, so they don't even know where to start to look for solutions to problems like that. I have been through that pain myself.

Containers are just one of the many easy things that make your job so much easier, which I've learned the painful way in 20 years as a developer in (mostly) non-elite companies, where no one else knew them either, because they hadn't been taught at the local universities - because no one there knew them either.


Docker I can forgive, but I’ve worked with a lot of grads who haven’t even been taught the basics of using the command line.


It's highly dependent on school. The Ivies, including "public Ivies" will teach you proper comp sci. A lot of other big schools will do you well also. When it comes to smaller regional universities or junior colleges and community colleges, then it's hit or miss. Your intro CS course may be great if you manage to get an instructor who knows it well themselves and wants their students to know it, or you may get someone who teaches students how to do Microsoft Office without a shred of programming.


I went to RIT in the early 2000s. I remember the CS and CE departments were quite good (although the prevalent Sun workstations were already getting outdated). Somehow I ended up taking 1 elective from the "Management Information Systems" department and the instructor kept mixing up search engines and web browsers. I think I dropped the class shortly thereafter.


I dumpster-dove at RIT to pull out a discarded VAX (I think an 11/70) and serial terminals. Probably about 1989 or 1990.


I was having to deal with token ring in '96-'97, and have not touched it since. Seems like it went away quite quickly. Cue up someone replying that they're still maintaining a token ring system in 2022... :)


I had to deal with token ring way up until 2001 when even the most die hard nuts had to admit that you could buy a dozen ethernet cards for the cost of a single TR. IIRC the TR people tried to convince us that ATM was the future.


Not quite 2022, but yeah, I was maintaining a token ring based network for a subway at my last gig in 2019. As far as I know, no work is done on it now, but the subway cars using the system are scheduled to run for at least another decade, so another bugfix release of the networking firmware is not entirely out of the question.


Hah, not quite nowadays but I, too, was dealing with one from around '97-2000'ish. What a pain in the ass. That was just one network in the building, I also had to deal with 10base-t, which was also a nightmare. shudder


I remember taking a graduate level networking course at NYU in the early 1990s. The instructor was an IBM consultant. We studied token ring, FDDI, SNA, HDLC/SDLC and several other commercial products.

One evening, I raised my hand and asked when we were going to study TCP/IP.

He simply quipped, "TCP/IP is not a real networking protocol."

So I wouldn't say that universities are always behind the curve :)


In 2015 or 2016 I was taking the computer architecture class at my local university… the processor they based the whole course upon was the Motorola 68000.


As far as introductory courses go, the older and simpler the processor, the better it is for everyone. My class groused at being taught "old tech" because we used the 68k, but very few of us had done any assembly before; I think most of the class would have failed if we'd started off on amd64.


And why wouldn't they base it on that CPU? If you're trying to learn the basics of shipbuilding, you don't start by going on a deep dive into the construction of an aircraft carrier.

It's a simple chip, with a simple instruction set, that can actually be taught to you in the time allotted over a three-credit class.


The class was worth 10 credits though.


The bit on "DevOps" is pretty egregious. There are two key things at stake here.

1. "DevOps" is an absolutely critical part of automation. It's the reason why we can start tech companies with such small engineering staff compared to 20 years ago. It's as important as all the high-level languages we use. This stuff is the logistics of how software gets deployed. It's the same in business as it is in war. Coding chops is like tactical strategy, and being able to ambush a tank column. It matters, and you won't have an engineering org without it, but the whole chain of how stuff gets deployed and iterated is what keeps the ammo flowing and the fuel pumping.

2. Universities want to teach stuff that'll still be relevant in 50 years. Given their proclivities, that means stuff like algorithms.

On one hand, I think that universities and academics can be somewhat forgiven for their ignorance on this matter. In fact I think we ourselves don't know what's going to be needed in our field in ten, twenty, thirty years. If the folks in industry didn't predict infrastructure-as-code 20 years ago, then the universities couldn't have taught it.

But what I know now is that:

- After all these years, no one is getting rid of shell scripting.

- Old school (i.e. 2nd generation) config management still has its place in many companies. Ansible is great for provisioning an AMI, if you need one, but if you need static infrastructure, puppet and chef are actually better because they track state, which allows you to better manage config drift.

- k8s may be hot and all, but a lot of the underlying "ops" stuff still translates. You average resource usage over pods instead of hosts, for example.

- Put together, there is an "instinct" for ops that is not unlike the "instinct" people learn for math, algorithms, and code. They are completely separate and an engineering org needs both. I think that universities don't "get" ops because computer science is more like math, whereas ops is more like history.

- On one hand, being stuck in an older ops paradigm is pretty awful – if you missed the transition to infrastructure-as-code, then it may be really, really hard to get out of that rut. But the field itself can be pretty bad with being stuck – it took us forever to give up our own datacenter racks.

- But otherwise, the old knowledge about old tools didn't necessarily just go away, in fact it's oftentimes still quite relevant. Linux internals (e.g. iptables) are still useful.

- When I was at CMU, a lot of folks learned some of that ops instinct in the dorm room, and in the computer clusters. But the universities pretty much made it optional. Looking back, I think this was a mistake. Ops is pretty much entirely transmitted through osmosis, whereas we at least try to teach people to code in official uni classes.


> ...and loved to say our jobs were going to India.

They weren't wrong, though; they just omitted delimiting that assertion.

Back in those dark ages, mainframe jobs were still considered by career "experts" the "adult in the room" jobs of programming. It is hard to convey to people who never studied that era or grew up in that era just how much microprocessor-based computers were considered "not real computing" in vast swathes of the industry. The proprietary Unixes thrived under that lay perception, as a "serious business" microprocessor-based computers market segment.

And the mainframe jobs did, by and large, up and decamp wholesale to India from large chunks of the mainframe account base. Those career experts were right, in a way.

Just not quite in the way they thought. The scope they thought in was too absolute, because they lacked the technical (and business, and financial...) perspective and context to understand why the same wouldn't happen to quite the same extent in sectors outside mainframes, nor to foresee the explosion of re-invention of many mainframe tech stacks that would drive the industry forward even to this day and beyond, along with the rapid recombination of new ideas.


I think that the tv series "Halt and Catch Fire" illustrates the rapid recombination of tech and "new ideas" quite well.


I was using objdump and coredumps to debug a kernel crash just last week. Not tedious at all. More like working a difficult puzzle. And very rewarding if you figure it out and fix the crash.


objdump and coredumps today are way less tedious than getting a compiler error the next day (if not few days out!).

At least with punched cards if you kept them sorted (line numbers in front a'la BASIC really helped with that) you could easily edit in place - just replace that one card that was incorrect, because each card = one line.

TECO (which begat Emacs) started out because paper tape, the preferred storage on DEC machines, was harder to edit in place than card stacks. Instead of retyping the whole program, you'd summarise your changes (which you had dutifully marked up on a fanfold greenbar printout - or suffered) into a few complex commands, then use the resulting four tapes (TECO loader tape, TECO commands tape, the incorrect program, and fresh unpunched tape) to produce a corrected one.

For maximum efficiency, the OS/360 team worked around the clock: programmers would write their changes on first shift, then teams would prepare the cards and submit them for compilation, the night shift reprinted the modified documentation, and when you arrived at work you'd have fresh documentation and the results of your compile (unless you had the luck to work online that day, with more immediate feedback).


Yeah, depending on deadlines that sounds fun to me


You say that like negative articles about comp sci, applied programming, or really any tech company from the NYT are a thing of the past. Ironically, articles denouncing tech are easy, routine clickbait for them now.


Oh yeah, no and low-code is going to put all of us programmers out of work any day now.


> ...in the late 90s, there were so many negative articles in the New York Times, vis-a-vis software

In retrospect, the New York Times is always wrong about everything. Maybe it should be adopted as a useful heuristic.


I have a reference somewhere to a NYT article explaining that stealth technology is impossible


> I almost didn't major in Computer Science because in the late 90s

You missed, by a few years at least, the opportunity to study and earn a degree that is no longer available from CMU, the B.S. in Cognitive Linguistics. I got an early acceptance from CMU in late 1988, my first choice of education because I wanted that degree in particular, but I could not afford CMU tuition let alone housing, and I was ineligible for financial aid. I studied CS at Virginia Tech at about a tenth the cost and never regretted it. Though I never met him, Allen Briggs[1] was an underclassman there while I was an upperclassman. He ported NetBSD to 68k Macs while still an undergraduate at Virginia Tech, which always impressed me. A/UX licenses were not cheap, and MacBSD was free.

[1] https://www.netbsd.org/ports/mac68k/history.html


The backdoor into CMU back then was and maybe still is Pitt. Pitt students had the privilege of signing up for any CMU course and it just meant a slightly longer walk to class.


The Cognitive Linguistics degree at CMU in the 90's was an interdisciplinary combination of cognitive science, neurology, computer science and linguistics, and disappeared when the faculty member that created and sponsored it passed away in the mid-1990s. While Pitt is a quality university, I don't think they offer degrees from CMU. Pitt was on my radar and one of the few places I was accepted to, but out of state tuition at the time iirc was $7K/semester, more reasonable than CMU's ~$12K/semester, but I had moved to Virginia the year before, and Virginia Tech's in-state tuition was about $2K/semester with housing (though I was required to purchase $4500 worth of Mac and A/UX license). Today, Virginia Tech's instate tuition is as much as Pitt's out of state tuition was then, which is now about the same as CMU's private tuition was in 1989, and CMU's annual private tuition today costs a little more than an Audi Q5 Prestige.


Perhaps I wasn’t clear. CMU allowed Pitt students to register for CMU classes. Your degree would say Pitt on it, but you would have attended the exact same classes as the CMU students. As in sat in the same classroom with the same professors at the same time, doing the same assignments and taking the same tests.


Thank you, that is what I understood you to mean, but if Pitt doesn't offer the same degree, how could one accumulate credits for a degree that doesn't exist? While many universities allow non-students to audit courses, and one could take every required course of a subject this way, one cannot claim the degree without actually being awarded it. Also, as I explained, I was out of state, making Pitt tuition expensive. CMU does grant its FT employees and their children free tuition after a token number of years of employment, but of course even a qualified and experienced HS graduate without an undergraduate degree would not make it past HR for an interview. And unfortunately, the Cognitive Linguistics degree only existed for a very short window, about 5-6 years.

Personally, my only option to attend CMU was to take on about $90K worth of college loans, or $60K worth to attend Pitt, but I would have sooner accepted the appointment offered me to Annapolis, an even more selective institution than CMU that costs nothing but a commitment of 5 extra years of military service. What I did instead was study CS at Virginia Tech and graduate only $10K in debt, which was not difficult to get out from under. And though I did not study any linguistics there, I did exhaust my curiosity in cognitive science and neurology via an elective in philosophy of mind. CS itself was 60 credits of CS and Math with a built-in Math minor, and was tricky enough to complete without I don't know how many other credits in proper neurology and linguistics that I missed out on at CMU or Pitt - though fascinating, each is a considerably complicated subject in its own right.


Very interesting. I am from that era, teaching myself to program starting in 1983 (which I thought was quite possibly too late to catch the microcomputer gold rush ;). I was self-taught and learned from popular computer magazines and well-written, carefully selected books. But now that you mention it I remember looking at course catalogs from good schools and being shocked at how retrograde it all was. Those guys at the universities totally did not get microcomputers for years after they should have.


I majored in CS in the late 90s and this wasn’t my experience at all. The Netscape IPO happened in 1995, followed by 5 years of the dot.com gold rush. Computers flew off the shelves, and everyone wanted to get online.

The dot-com crash happened later, in 2001, but if we are talking about the late 90s, then I'd say it was a period of huge energy in the CS field; tech companies were hiring as fast as they could and jobs were plentiful all around.


We're probably about the same age. I decided against comp sci at the turn of the century because of exactly what was being said. The dotcom bust just happened and if the media was to be believed programmers were taking jobs flipping burgers and there were enough programmers without jobs to cover the world's programming needs for the next 50 years.

I wound up going to school for economics and then later found my way into the IT world by circumstance.


When I started uni in 2004 it was still like this. I kind of was ashamed at parties to tell what I study, not to come off as too nerdy. I did a double major, so business was hipper. Just imagine! The status of developers changed so much in two decades. Nowadays people are impressed. And even in my career I see the difference. Not so many VPNs anymore, the move to the cloud made everything much easier.


> Even what came after that, e.g. in C / C++ was considerably tedious compared to what we do today. Folks sometimes had to do objdumps of compiled binaries to debug what was going on.

They used to do objdumps. They still do, but they used to too.


> You used to legit need that CS degree in order to code in your day-to-day

And when people today look back with disdain at ugly VB applications and wonder what simpleton, non-programmer, drag-and-dropper built this piece of excrement (that has somehow been running for 17 years without an update and the replacement project that we hired those consultants for ended up 3x over costs and nobody uses it) as opposed to a Real Software Program, there's the reason.


I was in high school in the early 00s and heard the exact same thing, and that was a major reason why I chose not to major in CS! (The other is that my HS programming curriculum and teacher were inadequate, but at the time I was convinced that I just wasn't wired for programming.) In the end I took the long way around and ended up in the field as a self-taught programmer.


There was also a bad programmer job-market crash in the 80s that changed the market a lot by the 90s. In fact, this was about when the gender ratio became very skewed (men and women dropped out of programming at equal rates, but the recovery was lopsided).

Our computer science department chair (Ed Lazowska) at the time brought this up as a reason to be wary about department expansion in the mid 90s.


What caused the crash?


I’m not really clear. A general recession at the end of the 70s along with an end in a booming rise of programmer jobs. If you look at https://www.geekwire.com/2014/analysis-examining-computer-sc...

You’ll see a huge drop off in computer science graduates after a local peak in 1985 (they wouldn’t get back to that level until 2000, note this article also quotes data from Ed Lazowska).


I majored in Computer Science in the late 90s and honestly don't remember any of what you're saying regarding negative/hostile media.

To me it felt like a golden age. The .com bust hadn't happened yet. If you could turn a computer on there were jobs everywhere. The world was starting to get online. Linux was really gaining traction and Slashdot was all time.


This is something my father told me too. He said he spent some time writing the code on paper, thinking a lot about it; then, when he was somewhat sure about what he had written, it was time to punch the cards. He used to leave the batch on Friday and go back on Monday to ask the "computer operator" about the results, and sometimes the result was "syntax error on line 1."


I enjoyed programming in the 90s and early 2000s but I feel it’s turning again into tedious grunt work with scrum, agile, yaml configuration files and needlessly complex systems.


You should seriously look into changing companies.

I'm not trying to denigrate you in any way, I myself switched from working at $BIG_BANK to a more lithe type of company and 90% of that bullshit went away.

Agile + Scrum stuff are minimal and now consume ~4.5% of my week instead of ~12.5%, I'm not spending half my "dev" time babysitting and maintaining giant applications no one really understands in full, and instead work on a bunch of little serverless applications, maybe half of which I do actually understand and can explain end to end.


This is one industry where reinventing the wheel is quite the norm. It's good for all the developers - it keeps them working. Older devs can work on legacy systems, and newer devs (or devs picking up new skills) can recreate systems with the new tools and languages.


If you really find the agile processes you're using are adding tedium, why don't you do the agile thing and tailor them to your specific context?

But I imagine we're probably using the same word to describe very different things.


The current implementation of Agile in most cases is pretty much the opposite of agile as described in the agile manifesto.

In my team we have reduced the process to having a simple backlog which we work through. But I have seen other teams where you spend enormous amounts of time on planning, yet it's frowned upon if you think any further than the next sprint. Just check off tasks without any thought about long-term architecture or strategy. Basically just a sweatshop with replaceable "resources" (the company doesn't hire "people" anymore, but "resources").


There is no current implementation of Agile, though; Agile was an umbrella term for a variety of different practices. I don't think you can blame it if it's been poorly implemented.

Granted, that is most of what I see: poorly implemented agile everywhere, usually half embracing SCRUM. I think that has more to do with the typical command and control nature of upper management though.


I learned Fortran 4 in high school in 1967-1968. That’s how good the NYC exam schools were — Stuyvesant in this case. We had our own 1130. This came in handy in college, I did the programming in a physics group, immediately. But it seemed too tedious to do as a career. I still feel that way.


So what you gonna do when you grow up? :-)


F* around in the machine shop.


Switch to COBOL and REXX


swoon


My dad is an accountant who took some punch card FORTRAN programming classes in the early 70's as well. After 3 semesters he told his professor he wouldn't be returning to the computing department - his professor was shocked, for he was a star pupil! - for much the same reason. He and my mother still tell stories of Saturdays and Sundays spent organizing his punch cards and applying patches (literal in those days!) in the campus computer labs so he could have a more rapid debug cycle than was available during the week.


I agree with your father that batch processing (cards) is a drag.

I was in school and working in the computer lab when we switched from batch processing (cards) to a time shared system with terminal labs. I was part of the team that wired up the campus and connected the campus to the ARPANET (precursor to the internet).

As we rolled out the terminal labs, each CS class was either assigned to batch processing or time sharing. Since I was part of the lab, I could schedule my classes to be time share only. I was only stuck with 4 classes not in the time share lab. 2 were batch processing. For some reason my LISP and AI classes used a teletype interface. It was not as bad as the cards, but still weird. Some classes allowed me to use my personal computer (TRS80) and work at home.

At the time my uncle was a programmer. He said that there was no future in CS and I should switch majors. I could already see the wave coming and ignored the advice.


This is exactly how I first learned to program. Waiting a whole day to find out you had a bug was just way too frustrating for me so I completely wrote it off, as much as I enjoyed writing code. Once the first PCs came on the scene, though, everything changed and I was all over it. Still am.


I remember going off to college in 1986, and thinking I might major in computer science, and my dad told me, "anyone can learn to program computers, you might be better off with EE." To be fair, at the time, anyone could learn to program computers (and it's probably still true) -- my dad was doing it, and his major was chemistry -- and really, anyone that's really good at programming computers is necessarily self-taught to a great extent. You just don't become a great programmer by dint of tutoring. Anyway, I stayed in EE for a year, then switched my major to computer science with a minor in EE, and no regrets.


I find it astonishing that only a couple years later the basic Unix development environment (ttys and full-screen terminals instead of cards, cc, sh, make, ...) came into existence, and has basically prevailed.


UNIX packaged a lot of ideas that had been invented elsewhere. (Often badly.)

CTSS -> Multics -> UNIX. All had interactive development and scripting to varying degrees.


In one of my early scientist-programmer jobs I was assigned an assistant to keypunch, submit jobs and pick up printouts. The other scientists thought I was odd for wanting to do all this myself. I was much more productive than they were.


It's still tedious grunt work


What is sometimes forgotten is that there was (for men) a severe prejudice against working with a keyboard. The image pre-1985 or so was that keyboards were almost exclusively associated with typing pools, which were, as far as I know, 100% female.

To be honest, this prejudice still exists. I heard a C-suite exec mocking "those guys with the ticky-tacky machines".


My uncle worked for Folgers coffee in the 60s as a general office clerk. They gave everyone some kind of test designed to see if you had aptitude for programming. He scored high, so they asked him if he wanted to learn COBOL, and his career was born.


"Dad, you were a FORTRAN programmer and physicist in the 70's, you could be a very well paid developer anywhere in the developed world... why didn't you?"..."

I'd suggest there is likely another reason, and if your father didn't actively think about it he probably understood it subliminally. Back then, programming was part of mathematics at many universities and, like it or not, everyone doing science and engineering had to study the subject—and for many universities that meant Fortran. Fortran was an essential part of the background culture: if one was doing mathematics or any of the physical sciences, Fortran was just there—thus, one didn't see it as special or exceptional.

I had no option but to study it but I didn't see that as an imposition—diehards like me were regularly chucked out of the punch card room by the university security guards last thing at night when the joint closed.

Moreover, at my university the Fortran lecturer also wrote the Fortran textbooks (well, the ones we used at least), so there was no leniency or excuse: Introduction to FORTRAN IV Programming Using the WATFOR Compiler (1968 & 1971) and Basic FORTRAN IV Programming (IBM 360 version, 1969)—by John M Blatt: https://en.wikipedia.org/wiki/John_M._Blatt.

As I found out later there were better textbooks on the subject, and Blatt was a didactic, forceful character without much charisma, so his lectures were somewhat painful. However, comic relief was not that infrequent. We had Fortran lectures in a large hall which had an upper circle like a picture theater; a certain fraternity would frequent the circle and aim paper airplanes at him when he was facing the blackboard, much to his chagrin. No, I wasn't one of the guilty, but I fully enjoyed the spectacle.


My grandpa told me the same thing; he didn't think computers were gonna last.

Not sure of exact time, but I think it was late-60s-to-early-70s; Gramps was a business professor at a major university in the Midwest. A team of math nerds started working on this new thing called a "computer". He said it was almost the size of a basketball gymnasium, it took weeks to get the thing set up, and was always breaking. In the end, it could only do simple calculations. He said the university viewed the project as pretty much a failure. So, Gramps, the futurist, told all his business students not to get involved with computers, there was no future in them. Good job, Gramps.

He later told me his big hope was that none of his students listened to a word he said. LOL


As recently as the early 90's, making a lifetime career out of software development was considered impossible. When I was starting out in the late 80's all the developers were taking classes or had a side business, with the goal of getting out of "programming" before it was too late. Even those who wanted to stay in the industry took every opportunity to talk directly with clients so they could get into sales or marketing.


One has to understand the social factors as well. It was women's work in the beginning, and the stigma of being seen as a female "computer" was most probably what made it unattractive and tedious to many men at the time. Many highly placed engineering bosses in programming were women for a long time, because those were the people who had the experience. See Margaret Hamilton of Apollo, and also some of the pioneers in cellphone "software".


My father, who was a research physicist from the 70s on, lamented his colleagues getting too distracted with programming their computers! For me it was great: I got to grow up with computers and electronics and a bunch of adults who loved these things!


It was very niche. My dad (also an early FORTRAN programmer) graduated in the very first undergrad CS class at UCLA, around '69 or '70. I think very few universities had a CS course at that time.


It looks like the first CS department was at Purdue (wasn't expecting that); they introduced a CS degree program in '67. UNC was another early adopter.


What is interesting is that the IITs in India (the first 5 at least) were set up a decade prior (late 50s), and some had very heavy support from American and European universities while setting up. So much so that IIT Kanpur actually had a CS department that started in 1963!


Ditto more or less. Dad was a mechanical engineering academic. Learned fortran programming in the late 1960s on punch cards.


I love this so much.


It reminds me of "Mel"in 1983 which was in response to "Real Programmers write in FORTRAN."

https://www.cs.utah.edu/~elb/folklore/mel.html


Which is, of course, included as “The story of Mel” in the jargon file:

http://www.catb.org/jargon/html/


I was fascinated by this story as teenager.

But looking back on it from my current perspective, I would say this Mel guy was not a genius, but one of the worst programmers you could probably hire:

He wrote unmaintainable and even unchangeable "write-once" code that was so complex that nobody else could handle it either. He refused to do what he was paid for and just walked away when he lost interest.

One of these dudes on your engineering team and your company is in real deep trouble…

It's a given that you will need to throw away everything they did and start from scratch should any changes be necessary later on. And there's one fundamental constant in software engineering: your software is going to need to change over time! No matter what somebody told you upfront. So if you've got software built by some "Mel", you're completely screwed at that point, especially as changes to software are usually needed most at some critical period in time for your company.


He can be a genius to be admired while also being one of the worst programmers you could hire, at the same time. Someone to appreciate, but not to emulate. A highly optimized human being, optimized for the "wrong" thing. More in the realm of art than anything else.


Ok, take your up-vote. I think I can agree on that perspective.

Maybe that's even the point that makes me like the story as such very much.


Nah, those were different times when bits and bytes mattered. Everything was written in assembly/machine code. Mel's tricks were just how things were done back then. There was no repo; code didn't need to be maintained or added onto. The lifecycle of software was much, much shorter.


> Nah, those were different times when bits and bytes mattered.

Obviously not. We're talking about mundane business software.

Also the "optimizing compiler" that couldn't reach such levels of "perfection" wouldn't be a thing if this would really matter.

> Mel's tricks were just how things were done back then.

Obviously not. Otherwise there wouldn't be any point in this story.

It points out, with a lot of emphasis, how exceptional Mel's code was!

> There was no repo, code didn't need to be maintained or added onto.

VCS dates back quite some time…

Also, maintaining code was of course no less important for a company then than it is today, simply because companies back then also relied on their software to operate.

> The lifecycle of software was much much shorter.

No, of course not, as nobody would throw away some very expensive asset for no reason.

If anything, lifecycles of software were much longer than today (when you can deploy changes every few minutes if you please). Stuff written in the 70's is still running on some mainframes today!

As changing software was much more dangerous, with a much higher risk of breakage, fewer experts around, and everything more difficult in general, it was more usual to try not to touch an already running system. (Maybe you've even heard a quite similar proverb coined back then ;-)).

But "not touching" it does not work, as there is only one truly constant thing: Change.


>VCS dates back quite some time…

Let's see. From the story:

>I first met Mel when I went to work for Royal McBee Computer Corp... [The firm] had just started to manufacture the RPC-4000

https://en.wikipedia.org/wiki/LGP-30#RPC_4000

> the General Precision RPC 4000, announced in 1960

https://en.wikipedia.org/wiki/Version_control#History

>IBM's OS/360 IEBUPDTE software update tool dates back to 1962, arguably a precursor to version control system tools. A full system designed for source code control was started in 1972, Source Code Control System for the same system (OS/360).

The events of the story predate the precursors of VCSs by two years, and the earliest true VCS by a decade.


My guess is that the earliest code versioning systems were completely manual: "Duplicate your tape, mark it 1.2, store it in a drawer". And these manual processes were brought to and duplicated on computers when code began to be stored on the computers themselves, instead of on cards and paper tape.

Certainly if you were a business, you had an extreme business interest in keeping your "known good" stack of cards in a place, and every revision in code required a new stack of cards.

"Hey, the machine just ate 10 cards from the payroll software, can we get duplicates made?"

"No, those were the originals, guess we're SOL no one gets paid" never happened.

Most likely "Ok, version 1.34 of the payroll software that was updated last week? Cards 1032 to 1042? Duplicate cards will be up to you within the hour"

or "We have to revert to the old payroll processing software, can you create a new fresh copy of 1.33, 1.34 has some bugs and we need to get tonights run in?"


As you like to discuss this detail:

I can't find any definitive info on when this computer was actually manufactured ("announced in 1960" doesn't strictly mean the same thing). But this was the time when the author first met Mel.

The story likely takes place some time thereafter.

I guess some significant time, because it takes time even for a genius to become familiar enough with a machine to do all this kind of trickery described in the story.

I think it may make sense to assume that even some years passed between when the author first met Mel and Mel's departure from said company.

So I wouldn't even be so far off with the VCS statement—which actually doesn't claim any relation between the usage of VCS and the story. I only said that "VCS dates back quite some time". Which is obviously true. ;-)

But, all this actually doesn't matter.

The more important statement was the following. Which is a direct reply to "code didn't need to be maintained", which is in my opinion just not true.

I did not say VCS was used back then for that purpose.

I guess they preferred a sort of solid hard copy. :-)


What are you smoking? :-P


I won't tell you.

But it's quite strong. B-)


Nah, when you're constrained enough, you rarely, if ever, sacrifice anything in the name of future changes. You figure out what needs to be done, then you write a program that does it. If it needs to change, you write a new program. Part of why that's not as bad as it sounds is exactly those constraints: you're not dealing with megabytes of source code.

There are lots of problems that are specific and simple enough that it's easier to write a C program from scratch than it is to find, install, and then learn how to use some existing package. The same concept applies to whole programs: at a certain scale, it's not worth the extra infrastructure/overhead/rigidity/complexity it takes to write software that's optimized for change.

That said, today, in 2022, it's more or less the opposite, codebases are huge enough that most of software "engineering" is about plumbing together existing libraries, and at that scale, it's an entirely different thing.


No, not even given the historical context does this make any sense.

We're not talking about embedded software with special constraints here!

This story is about mundane enterprise software.

Nothing in the story justified this insane level of over-engineering and premature optimization.

Just using the "optimizing compiler" was deemed "good enough" for all other needs of the company, likely…

Also, nobody asked for that over-"optimized" throw-it-away-and-start-over-if-you-need-to-amend-anything crap.

I still have this warm, nostalgic feeling when looking at this story, but thinking about it with quite some experience in real-world software engineering, I'm very sure that this kind of programmer would be one of the worst hires you could run into.

Finding any valid excuse for "write-only" code is hard, very hard. This was also true back in the days when this story takes place.

Sorry for destroying your nostalgic feeling, but please try to look at it from a professional perspective.


I'm pretty sure a card-dealing game is rarely considered mundane enterprise software.


Wasn't it a marketing gimmick? At least that's my understanding.

Otherwise it makes no sense. Computers back then were very expensive. You wouldn't use them for anything that wouldn't yield income in some way.

It quite clearly wasn't a "computer game" in today's meaning.

I would call "marketing support software" indeed "mundane enterprise software".


> Your software is going to need to change over time!

Not so for anything shipped on ROM.


The software running on a specific ROM might not be able to change (assuming we're talking for devices using mask ROM and not EEPROM) but it doesn't mean the code itself is supposed to be disposable. Different devices or revisions of the same device can benefit from code changes. Even on the same device unit, they might want to replace the ROM chip to include some fix if it's important enough and makes financial sense. The software itself transcends the constraints of any particular delivery medium.


Ha, perhaps. I work in game dev, and previously had a stint in integrated display controllers for feature phones. Anecdotally, most of my code is effectively thrown away once shipped.


I was a physics major until I stumbled across the jargon file online. It was an, "aha, my people!" moment. It was already showing its age then—nearly 20 years ago!—but sucked me into CS where I was much happier.



This comes up on here every few months, and I can't help but read it every time. We had a few Mels in the earlier days of my employer's history and I can't help but be a little bit in awe of the stories I've heard from and about them.


> the IBM salesmen stood around talking to each other. Whether or not this actually sold computers was a question we never discussed.


A few years ago in school I had to read a paper that was written by a guy who happens to also be a part of my local electronics hobby group. I mentioned this to a friend and he noted that, unlike a lot of fields, Computer Science is still young enough that many of the pioneers are still around.


They're often on this very website, in fact.


I worked at Microsoft in the late 90s and methodically went around to all of them, from the creator of MS-DOS to the creator of Turbo Pascal/C#/Typescript, and asked them all the questions that I couldn't find in the computer history books.


Would love to read these if you've collected them somewhere.


Never really wrote about it!


Where can we read it?


I didn’t take notes or anything and many of my questions were very specific. For example, how did Turbo Pascal do error handling (easily, because it was a handwritten recursive descent compiler), or did Tim Paterson, who sold DOS to Bill Gates for $50,000, ever regret it (no).
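The answer about error handling makes sense: in a handwritten recursive descent parser, each parse function knows exactly where it is in the input, so reporting an error at the precise point of failure falls out almost for free. A toy sketch of that general technique in Python (illustrative only; this has nothing to do with Turbo Pascal's actual code):

```python
# Toy recursive descent parser for sums like "1+2+3", showing why error
# handling is easy in this style: the failing parse function can report
# the exact input position. A generic sketch, not Turbo Pascal's design.

class ParseError(Exception):
    pass

class Parser:
    def __init__(self, text):
        self.text = text
        self.pos = 0

    def peek(self):
        return self.text[self.pos] if self.pos < len(self.text) else None

    def expect_digit(self):
        ch = self.peek()
        if ch is None or not ch.isdigit():
            # The parser's position pinpoints the error for the user.
            raise ParseError(f"digit expected at position {self.pos}")
        self.pos += 1
        return int(ch)

    def parse_sum(self):
        # sum := digit ('+' digit)*
        total = self.expect_digit()
        while self.peek() == "+":
            self.pos += 1
            total += self.expect_digit()
        return total

print(Parser("1+2+3").parse_sum())   # 6
try:
    Parser("1+*3").parse_sum()
except ParseError as e:
    print(e)                          # digit expected at position 2
```

Because the grammar is encoded directly in ordinary function calls, there's no separate parser-generator machinery to thread error information through, which is presumably what made it "easy".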


My grandfather was like this. Full stint in the Marines and then worked on computers. I remember him telling the story of how exciting it was (and what a big deal it was) when one of their systems got upgraded to 4k of RAM.


> college recruiting materials might well brag that at their institution, there were not one, but two computers on campus.

Our community college highlighted their VAX minicomputer by having a special window that showed all the flashing LEDs to passers-by. But when PCs became the "in thing", they felt embarrassed and covered the window with PC posters. Poor VAX, lots of memories together. It was an early lesson in IT = star-today-washup-tomorrow.


We had one of the first class of HP3000 minicomputers, which was both highly advanced, with its stack architecture and variable-length memory segmentation, and also very disappointing. But on the flashing lights front, the first design class did not disappoint (see console in upper right quadrant - lots of LEDs - which were new and only red in those days - and paddle switches): http://www.hpmuseum.net/images/3000_2615A_1973-25.jpg


Computers are older than you think.

Computers based on integrated circuits were more recent. But the foundations were much older.

For example, take the punch card. Punch cards as a way to work with automatic computing devices go back to Hollerith machines and the 1890 census. That was how IBM got started. The phrase "Super Computing machine" dates back to 1931, and referred to a tabulating machine built for Columbia University. Raytheon was producing and selling analog computers starting in the late 1920s. Many of the calculations for the Manhattan Project were done by machines - Feynman talks about this in Surely You're Joking, Mr. Feynman!

And to give a sense of how much history there is, one of my favorite essays is the 1945 essay As We May Think, which you can find at https://www.theatlantic.com/magazine/archive/1945/07/as-we-m.... It provided the inspiration for both hypertext and the science citation index. The recombining of those ideas in the PageRank patent was the foundation of Google. But how could someone in 1945 understand computing that well? It is simple: its author was the man who designed those computers Raytheon sold in the 1920s, and among other things was in charge of the development of mechanical computers for the Manhattan Project. (OK, he did a lot more than that...)


> Computers are older than you think.

I doubt it. If by computer you want to mean any machine capable of mechanically doing some kind of calculation, then of course there are examples going back hundreds of years, or even millennia - the Antikythera device was without doubt a mechanical, astronomical computer, for example. But I wrote "the first commercial computer," and the first stored-program, Turing-complete computing machine that you could actually write a purchase order for and be invoiced for, the UNIVAC I, was only produced in 1951. So, I cheated a bit with "less than 20 years old in 1973." It was 22 years old. The point being, in terms of my post, that when I started, computers, and programmers, were still very unusual in the employment landscape, and for people like Ray, of the original post, the concept of being a programmer appeared AFTER their formal education in electronics was finished.


There is a distinction between computer and stored program computers, and you're right that the first commercially available stored program computer appeared in 1951. (Though, interestingly, there was a 1936 patent application on the idea in Germany. And a barely functioning one was actually built in 1941.)

But Turing complete computers predated that. In fact Turing's own design for a Turing machine was NOT a stored program computer. And you're right that the modern idea of programming postdated stored program computers.

But computers are older. As you pointed out, arguably thousands of years older.

However I maintain that automatic tabulating machines were on the path to modern computers. Early accounting applications were based on them, as were key parts of the technology used. Like punch cards.


I agree that "automatic tabulating machines were on the path to modern computers" were on the path to modern computers. But that does not in any way negate that there was an inflection point in the 1950s and leading into the 1960s that represents the birth of what we mean by the word computer (in English). The machines that I learned programming on in the 1960s were qualitatively different from those things "on the path" to them, but are recognizably the same species of machine as what we work on today - which is to say, they were Von Neumann architecture, Turing complete, off-the-shelf, generally available computers for which one wrote code - not calculators, tabulating machines, or hardware reconfigurable logic engines, and not laboratory experiments aiming toward that. The only thing missing from them that is inherent in our contemporary approach to information is the notion of ubiquitous networking of devices to create a broad and broadly accessible information and calculation landscape. That was still a laboratory project in 1970.

Does that matter? Depends on your point of view. What I meant by "how new it all seemed" then tells the tale here. Bush could maybe see the future in 1945 because of his experience and vision. But by 1970 or thereabouts, almost anyone who touched it could see it, and millions were increasingly able to touch it, precisely because it was commercially produced and generally available. Still not ubiquitous, as they are today, computers were nevertheless showing up in every corner of life.

So it's not that I think computers, or information processing as a concept, or even as an occasional reality, were new around 1970. But computers as a phenomenon that would define the way we interact with each other and the world, and computer programming as a skill that anyone could acquire, really were.


> even the people who taught it were still just learning it.

Still true today.


And this will likely never change, as the most skilled people eventually land in leadership positions or become entrepreneurs.

To this day we haven't even invented best practices or standard tools that everybody in the field would agree on.

CS is still like electrical engineering around 1850. ;-)


Also many of those guys were EEs who had no degrees. They always seemed cheerful and happy with their jobs. It was one of the things that inspired me to teach myself programming.


11/10! Thanks for sharing!



