I don’t hear much about it these days, but circa 2004 when I was going into computer science I had multiple teachers and a guidance counselor warn me that all the jobs would be going to India and tell me I should pursue something else. This seemed to be very much the prevailing wisdom of the time. I pursued it anyway as it was my passion.
I was one of five students in a program that had been a huge multi-campus program only a couple of years earlier. We were the last cohort before the college discontinued the program entirely, and I was the only one to graduate. What I found, however, for maybe five years after graduation was insanely high demand for developers.
There was genuinely a generation that was so strongly discouraged from becoming developers that there were very few. Seems to me like the folklorists have largely missed this.
My father was very strongly against my decision to get into web development (2012) and he echoed this sentiment heavily. I distinctly remember him yelling at me as a teenager: "you'll be living under a bridge in 5 years if you do this! India has this market!". He eventually kicked me out for not listening, forcing me to borrow money from a friend to pay tuition in my final semester.
Needless to say he was completely wrong. I was out-earning him within 2 years of entering the market and I have probably the greatest job security in my immediate friendship circle.
Despite my efforts, he refuses to talk to me to this day. The man does not like being wrong nor disobeyed. A shame really.
If there’s anything I’ve learned, it is that you should always make your own mistakes. Your parents’ mistakes don’t even make sense anymore 30-40 years down the line.
My father pushed me into EE. I burned 5 years in industry being paid crap and then discovered it was actually a fairly low-paying profession in the UK. Ended up supporting a bunch of Sun machines for other EEs, which was more fun. So I learned Perl one weekend, bailed out, and lied my way into a sysadmin job for 2x the money. Twenty-odd years down the line, I’m glad I made that choice.
When offering PhD studentship positions, I have had several cases of people from developing countries who wanted to come work in my group but didn't, and the reason (by their own accounts) was that their parents wouldn't let them pursue a PhD anywhere other than the US.
They ended up at US groups with much worse scientific output than mine (it's true that well-known US universities wipe the floor with mine in every metric, but that's not true when you go down to the level of individual groups, labs, or fields of research), working unpaid where I would have paid a decent salary.
They were bright people so I think and hope they'll have fruitful careers, but their parents (with their best intentions, I'm sure) surely put a roadblock in front of them due to the prejudice that there are no good opportunities outside the US.
This. I also had an M.Sc. student work at my company, a brilliant guy from China, and he deserved the best Ph.D. -- which in my opinion means working with a professor who has made his mark, has a secure position (so he is no longer in his publish-or-perish period), enjoys what he does, and cares about his Ph.D. students' development. His parents didn't agree; they wanted him to go to the most prestigious lab he could get into. Probably to be worked like a measurement slave for 4 years + the inevitable extension.
I told him to beware, talk to other students and try to gauge the atmosphere in the lab. If you're smart and motivated you'll make it. But will you have fun and be happy? What do you want out of life?
I have a friend who made "Associate Professor" at an impressive age. He works 80-hour weeks and travels 50% of his time. It can be fun and it is an adventure, but is it what you want? This life is pretty incompatible with having a family, for example.
Beware of the fact that your parents may define success very differently from you.
Why is EE so bad? On the surface it looks like a degree with a strong grounding in mathematics, exposure to programming, the discipline of engineering, and a diverse range of applications across industry. It kind of seems like the ideal degree if you want to hire someone. I'm always baffled when I hear stories like this.
I have some hypotheses. Note my degree is in physics, perhaps a similar situation though not as widespread. There are a number of things happening.
Most engineers in all disciplines lose their math ability after graduating. The workplace itself allows this to happen: They get so busy with regular design work that they forget their math and theory. A lot of the analysis work is handled by their CAD tools. The work that does need math or deep domain knowledge is handled by one or two experts within the department.
There are some practical limits to the size and complexity of hardware, that limit the amount of hardware work. An electronic board might be designed and tested once, and then a million copies made. The software for supporting that board is maintained constantly. This is partly due to a conscious choice to move functionality from hardware to software. When hardware is obsolete, it's abandoned. When software is obsolete, it's augmented with new software on top of the old software.
There's a strong message from above that software is more important than hardware. Sparkly software is what management sees when they are shown the product. The people who find that they can program well enough to do it for money have moved into software development.
It's harder for an individual hardware person to capitalize on their own innovation, because they need the infrastructure to test and manufacture new hardware. So we can only move at the pace of the businesses that employ us.
Programming can inflate its own demand through technical debt, and can organize itself to a level just short of full blown collective bargaining.
Note that I'm not talking about pure software businesses, but those businesses don't need hardware engineers at all. ;-)
Though I don't agree with the negative takes on software, like "sparkly software for management", "inflates its own demand through technical debt", or "organizes itself for collective bargaining".
Sparkly software usually gets you only as far as it is useful; less sparkly software sells worse, but you still need it on the hardware to get the job done. Hardware alone is not enough.
Inflated demand: people just suck at organizing big projects. There is no need to artificially inflate demand; it happens on its own as the business needs more and more features.
Software developers are bad at organizing and bargaining because they all think they are better than everyone else, and other people's code always sucks :)
My hypothesis:
Hardware has obvious physical limitations. Even if you build millions of boards, they take storage space, you need copper and aluminum, and you cannot shrink transistors into infinity. You can only sell as many phones as there are buyers.
Software, on the other hand, is now limited mostly by the number of developers in the world. There is an infinite number of programs you can run on a finite amount of hardware, and an infinite amount of software to be built, let alone maintained. That is why software developer salaries are going through the roof.
While I can sell one phone only once, I can build a SaaS solution that brings cash flow and monthly payments -- it's not just that an individual can capitalize on his own innovation. A basically infinite revenue stream from the SaaS model is just that attractive to any businessman.
Brilliant comment.
It applies to most engineering or scientific fields: develop or discover once, maintain or improve forever.
That explains the massive difference in demand between truly imaginative, innovative thinkers and the maintainers.
Come to think of it, other fields too.
Well, for me the degree was of mediocre usefulness. At my job I sat in a windowless office on a factory floor, fully airgapped, on my own seven hours a day, mostly doing test automation and designing test fixtures. That was the entry position for us back then, really. I had to wear full static gear head to toe, which was hot, uncomfortable, and smelled weird. This was broken up twice a day by some shitty machine coffee in the canteen, staring out its two small windows at the people from the company over the road, all outside smoking. All while being paid just about enough to eat for the month and keep up with my games habit. There were also nine layers of bureaucratic nightmare everywhere, and as the company was large and everyone lived in the same area, everyone knew or was related to everyone else, so it was politics galore.
A lot of negative responses here, but my own experience having graduated with an EE degree from an average university (in the UK) 3 years ago has been pretty good so far. I was able to get 3 internships at 3 different semiconductor companies over the course of my degree and got a job at another semiconductor company immediately after graduating, doing digital chip design, so I don't think the job market, at least in my niche, can be quite that bad. The pay is pretty good: well above average for a recent graduate in the low-cost-of-living city I'm based in, and a bit higher than my software engineering friends in the same city. Still considering a masters in computer science to expand my career options, though.
An EE degree IS awesome. It gives a generalizable mathematical foundation which applies in just about any other field of work. It gives a huge competitive edge. For example:
* Circuits are physical implementations of differential equations, and EE gives a unique way to intuitively think about dynamics, which applies to finance, epidemiology, and just a really diverse range of domains.
* With a rigorous EE background, you can rapidly pick up most domains of engineering (think Elon Musk), since you've got all the mathematical foundations. The reverse isn't true. The way math is taught in EE is broader than e.g. mechanical, civil, or other engineering disciplines, where it tends to be more domain-specific. EE gives you a lot of depth in probability, error analysis, signal processing, controls, calculus, linear algebra, etc. I think the only things missing are statistics and discrete math, and I picked those up elsewhere.
High-performance board-level EE is insanely fun. Incredibly creative. You get to build stuff, design stuff, do math, and it's just a huge diversity of intellectually-fulfilling stuff.
IC design is a bit less fun, due to the many-month turn cycles (develop, wait months and spend hundreds of thousands of dollars, then test/debug), but not bad.
However, the EE industry sucks:
- Pay is not bad, but much worse than other jobs you can get with an EE degree.
- Work culture has all the worst excesses of the eighties -- think of Office Space style cubicle farms, dress codes, conservative management, ISO processes, and paperwork.
- Yet it's somehow adopted some of the worst excesses of the 2010s; it no longer feels like work is a family or a community.
- And it's hard to get into. There are virtually no jobs for junior-level EEs (and not just BSes -- in the Great Recession, I knew bright newly-minted Stanford/MIT/Caltech/etc. Ph.D.s who couldn't find jobs).
- Even at the senior-level, there's a bit too much of a boom-and-bust cycle, without the booms ever getting that boomy, but the busts being pretty busty.
I spent maybe five years doing EE work after my EE degree, and I think that was enough. I've been out for a long time now. I still do EE as a hobby, and I enjoy it, but the industry culture isn't one I remember with fondness.
I suspect a lot of this stuff will continue to disappear from the US into Asia; that transition is rapidly in progress. US firms maintain specialized knowledge in some areas (e.g. particular types of MEMS devices), but there are plenty of places we've fallen behind. I don't see us on the path to regain leadership. I think some of this is cyclical. Declining industries don't make for good employers, and poor employers don't make for growth industries.
EE is one of those "engineering is really applied physics" disciplines. There's a slant towards standardised EE-specific solutions for PDEs, but it's still much more abstract and mathematical than any field of CS, apart from maybe cryptography and data science.
But career-wise, it's a mediocre choice in most Western countries. (Possible exception is Germany, where engineers have a similar status to doctors and lawyers.)
Most people have no clue what EE even is, or just how much math and engineering goes into building everyday devices and services.
(A friend of the family said "Great! You'll be able to get a job repairing TVs!" when I got my course offer.)
> There's a slant towards standardised EE-specific solutions for PDEs, but it's still much more abstract and mathematical than any field of CS, apart from maybe cryptography and data science.
Here's the thing, though: solving PDEs is, in general, intractable. There isn't a generalizable way to model dynamics. On the other hand, dynamics come up everywhere:
- How is the pandemic going to evolve?
- How will incentive structures skew cultures?
- How do I build a suspension for my car?
- How does heat leak from my house?
- How does my understanding evolve with learning?
... and so on.
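The first question on that list makes a good toy example of reasoning about dynamics numerically. Below is a minimal SIR epidemic model stepped with forward Euler; all parameter values are invented for illustration, not fitted to any real outbreak.

```python
# Toy SIR epidemic model: three coupled ODEs stepped with forward Euler.
# beta (infection rate) and gamma (recovery rate) are made-up values.

def simulate_sir(beta=0.3, gamma=0.1, i0=0.01, days=200, dt=0.1):
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        new_infections = beta * s * i
        recoveries = gamma * i
        s -= new_infections * dt
        i += (new_infections - recoveries) * dt
        r += recoveries * dt
        peak = max(peak, i)
    return s, i, r, peak

s, i, r, peak = simulate_sir()
# With beta/gamma = 3, the epidemic peaks and burns itself out well before day 200.
```

Even this crude sketch exhibits the qualitative behavior (exponential growth, a peak when susceptibles are depleted, then decay) that the graphical tools discussed below let you reason about without running any simulation at all.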
What EE does -- and I think uniquely -- is give intuitive, graphical tools to think about differential equations: Laplace transforms, Bode plots, Nyquist plots, root locus, and so on.
EE also gives a lot of applied experience with those tools, including in contexts with nonlinearities. An op amp will clip on both sides, which you model as a linear differential equation (easy enough to reason about) plus a memoryless, time-invariant nonlinearity. You squint. You ask yourself how it would work if it /were/ linear and the nonlinearity just cut gain. And at some point, after doing it enough, you have intuition for what it will do.
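That decomposition can be sketched in a few lines: a first-order lag as the linear block, followed by a memoryless clip at the rails. The gain, time constant, and rail values here are arbitrary illustrative numbers, not taken from any real op amp.

```python
# Linear block (first-order lag) followed by a memoryless saturation --
# the "squint and pretend it's linear" decomposition described above.
# Gain, time constant, and rail voltage are arbitrary illustrative numbers.

def saturate(x, rail=1.0):
    return max(-rail, min(rail, x))

def step_response(gain=5.0, tau=0.01, rail=1.0, t_end=0.1, dt=1e-4):
    x = 0.0  # state of the linear lag: dx/dt = (u - x) / tau, with unit step u = 1
    out = []
    for _ in range(int(t_end / dt)):
        x += dt * (1.0 - x) / tau
        out.append(saturate(gain * x, rail))
    return out

y = step_response()
# The linear part alone would settle at gain * 1 = 5; the clipped output
# rises and then sits pinned at the 1.0 rail.
```

The point of the squint is that you can predict this pinned-at-the-rail behavior from the linear model plus the clip level, without simulating anything.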
With the EE-specific toolkit, I can reason intuitively about these things, all the way through to design.
EE is all about modeling -- building simpler equations which approximate more complex ones in ways which give intuition -- so this is usually correct, or almost correct. Indeed, if you go on to grad-level courses in control theory, you'll see formalizations of this intuition, where, for example, a time-variant or nonlinear system is modeled as a linear time-invariant system together with a bounded error.
A lot of the mathy stuff -- which I've learned a fair bit of as well -- is more general in the abstract, but in practice gives much less intuition.
My experience with the real world is that there are rarely actual differential equations handed to me. I kinda get that we've set up some pricing structure, or some incentive design, or whatnot, but I can't model it formally. I know which way things push, and whether those integrate or not. I can draw a block diagram and reason about how it will behave, in a way the math side doesn't let me do.
>Germany, where engineers have a similar status to doctors and lawyers
Errr, no they don't. In terms of pay and status Doctors and Lawyers trump Engineers every day of the week in Germany, the only exception being the engineers with PhDs who are tech leads in some well known research institute or big-brand company like Audi or Porsche.
I think you overstate the utility of the formal methods taught in the degree.
Not once did I use Laplace in EE. It was all cheat sheets, applying data sheets, doing some ad hoc calculations in Excel, or taking a wild guess and iterating.
After twenty years of doing IT related stuff I’ve forgotten how to even differentiate stuff.
1. I did learn them well, and so I did use Laplace quite often in EE.
2. I jumped careers not into IT, but into tracks which leveraged both programming and a mathematical skill set.
I'll mention: I'd be bored out of my wits doing just IT. The intersection of IT and math includes computer graphics, visualizations, robotics, machine learning, fintech, image processing, and a ton of other stuff I find much more fun and fulfilling.
That's not an implicit commentary on your path, by the way, just an explanation of mine. We all have different goals, desires, values, constraints, etc.
Yeah, I have an EE degree but have only worked in software, and I have to say you can bring any kind of mathematics to the job if you have the imagination to find the places where it is an advantage.
I sought areas where I could learn more math and use it to stand out from the crowd, albeit with mixed results: if your manager can’t read your analysis paper, he may not be impressed either -- and sometimes the reverse.
If you know a little bit of math, there's no benefit.
If you know enough math to jump to e.g. medical imaging, robotics, controls, simulators, image processing, ML, or similar, there's a ton of benefit.
An EE degree ought to give enough background to get there, although it may involve a year or two of study in a particular domain, and a side project to prove you have the skills.
I think some of it must be 'sticky' culture. I'm another former EE who left the field for development work with much better pay, so the inter-discipline competition does exist in some form. The EE jobs I saw had neither the dress codes nor the comfortable offices.
Same answer as any underpaid field: an oversupply of labor in that field vs demand.
The absolute worst fields are the “sexy” or trendy ones. Unless you are strongly driven to enter a field for non-monetary reasons always look into employment opportunities.
"The ideal degree if you want to hire someone" is different from "the ideal degree if you want a high salary".
To get a high salary, you need to be in a good negotiating position, such that if company X won't hire you for US$150,000(/year), then company Y will hire you for US$145,000 (a strong BATNA). There are three possible reasons company Y might not hire you for US$145,000:
1. They are badly managed and making irrational decisions.
2. It would be unprofitable for them to hire someone like you for US$145,000; in an engineering position, this is because their revenues minus cost of sales would go up by less than US$145,000, risk-adjusted.
3. They can hire someone else like you for a lower cost, such as US$130,000.
Item #1 can mostly be discounted as a difference across fields; there are badly managed companies in every field, but generally they aren't the ones who hire a lot of people, and they aren't the ones providing your BATNA. However, in the case of programming, company Y might be you and your college roommate setting up Pinboard or Tarsnap, so there is perhaps a relevant difference here.
A thing about items #2 and #3 is that "someone like you" means "like you" from the company's perspective before they hire you. The fact that you can solve hard leetcode puzzles during the interview in ten minutes figures into this, because that's something they can observe before they hire you, unless they go through a recruiter, in which case it doesn't. If you can do a board layout with 166 MHz DDR and it will work right on the first spin with no signal integrity issues, that doesn't figure into "like you", because that takes at least a week, so you can't do it as part of the interviewing process.
The bigger difference, though, about item #2, is that the return on NRE work in either EE or programming depends on volume. If your EE innovations give them a working board that costs $3.80 to produce instead of the other guy's $4.30, then if they're producing 100 units, you've produced $50 of value for the company. But if they're producing 100,000 units, that same amount of work on your part has produced $50,000. And similarly if the product brings a $0.50 higher price instead of having a $0.50 lower cost. And similarly if we're talking about lowering the cost per user of operating a server farm, or increasing the ad spend per user.
So why is that a difference if they're both NRE work? Because producing 100,000 units of an electronic device requires a huge amount of up-front capital investment. You can't produce 10 units one day, then 15 units the next day, then 25 units the day after that. So if Company X is investing $3 million in your project and Company Y is investing $0.3 million, the Company X devices are likely going to get produced in ten times higher volume, so every design decision you make is worth ten times as much money.
Now, of course, if you're working on Google's backend systems that serve, ultimately, 5 billion people, a 0.1% improvement produces more value than a 0.1% improvement in the systems of a company with only 1 million clients. So FAANG can afford to pay hackers more than the average health plan, ISP, or venture-funded satellite imagery startup. But even in those cases, the capital investment needed to get a lot of value out of a programmer's work is fairly small, maybe US$10k to US$100k, rather than the US$1M or more that is common for EE projects. This puts programmers in a much better negotiating position.
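The volume argument reduces to one multiplication. The per-board saving and the two production volumes below are the hypothetical figures from the board example earlier in this comment, not real data.

```python
# Value of a fixed chunk of NRE work = per-unit gain * the volume it amortizes
# over. The $0.50 per-board saving and both volumes are hypothetical.

def design_value(per_unit_gain, units):
    return per_unit_gain * units

small_run = design_value(0.50, 100)      # $50 of value from the same design work
large_run = design_value(0.50, 100_000)  # $50,000 of value
```

The same design decision is worth 1,000x more at the larger volume, which is why the capital backing a project, not the difficulty of the engineering, dominates what the work is worth.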
So, though the two fields have similar diversity of applications, type of work, and intellectual difficulty, electrical and electronic engineers are reduced to begging for scraps from rich employers and then have to sit in bunny suits on factory floors with shitty coffee, while programmers get free massages, incentive stock options, and private offices. Except in countries where companies hire programmers through recruiters.
(Why do they do that? I think it's mostly an industrywide case of #1: companies in England and Australia use recruiters because everybody else uses recruiters, and so the ambitious programmers leave the country, reducing the large advantages obtainable by black-sheep companies who hired directly and could thus hire only competent programmers, to whom they could profitably offer twice as much money. But maybe there are legal reasons or something.)
Same for physics. I earn over 3X what I could make as a top-tier researcher paying myself out of grant money and have had a handsome seven-figure IPO (close to 8 figures, if the market price holds). It's quite shameful, as physics research is almost uniformly more important and relevant to society. Now that I'm set for life, financially, I very well might return to physics.
In America, we value frivolous crap over serious scientific and artistic endeavors, mainly because the masses are terribly dumb. Idiocracy was quite close to the truth, as good satire must be.
I had almost exactly the same career arc. Perl became Python, Sun became Linux and I eventually moved beyond sysadmin work into automation, then API development and now I lead software development teams. Are you still doing sysadmin work?
> I burned 5 years in industry being paid crap and then discovered it was actually a fairly low paying profession in the UK
I was absolutely dumbfounded when I looked up engineering salaries in the UK. I mean, even in the US, I don't consider most engineering fields particularly high paying, but the UK is something else.
In hindsight this may seem wrong, but I don’t completely blame him.
The offshoring of IT was so novel, unlike that of, say, manufactured goods. Governments can’t stop it, although I think they should in some form, if you think it’s important for your country.
I'm really sorry for the relationship you have with your father.
I've had similar experiences, but not as drastic as yours -- my father didn't allow me to use my bike for YEARS as a punishment for smashing my bike as a 10 year old (faulty chain, impulsive child).
I guess this I would call the real toxic masculinity.
Unfortunately there's not much you can do, except build your life around it, keep your boundaries firm, and perhaps seek psychotherapy.
That's a harsh punishment. What kind of effect did it have on you? I remember I biked to school at that age (~10-15 min). If I damaged one of my tools, such as my bike or my shoes, it'd look damaged, and I disliked that. But I had to cope with that. That was a punishment by itself.
I wasn't able to get a job as a system administrator (even though I believe I'd have liked it) because I can't program. I tried various times, and the reason I always quit is that I get frustrated and get migraine symptoms. I tried C, Lua, and Python (multiple times), and all I ended up being able to do is some Unix kung-fu and shell scripting, which seemed to be not good enough for system administration. It kind of feels like my failure in life, to be frank. Though I'm happy I ended up with something else, IT security related.
> I guess this I would call the real toxic masculinity.
HN is the last place I would have expected to see these words.
Which is a slightly interesting take on a parent. I presume they housed you, fed you, and kept you educated and relatively safe... I can't possibly imagine you are writing this from a rehab facility or jail. And yet, for denying you a bicycle, that man is deemed "toxic"? And what does a parent taking away a privilege have to do with masculinity? Which of your human rights was infringed upon, exactly?
>> my father didn't allow me to use my bike for YEARS as a punishment for smashing my bike as a 10 year old
Seriously, being so strict as to punish a 10-year-old kid for years is not a sign of a toxic personality? Parents are supposed to build their kids' character, not stomp on it to make them 'tough'.
Toxic masculinity is not being a manly man. It's being insecure and lashing out at anyone who disagrees with you or is not 'with you' -- any father who thinks his kid misbehaving is somehow a personal attack.
The last bit fits the bill the most in his case -- he took it as a huge personal attack that I smashed the bike he bought, and the tough discipline he was raised in is why I labeled it as toxic masculinity.
Why do you associate this with masculinity though? Yes, your father is man but I'm not sure if his masculinity has anything to do with it?
I recognize the character traits. I know someone raised without a father who also shows them. Perhaps in his case a result of having a tough time financially their entire youth? This put a huge value on stuff (because almost everything is irreplaceable, literally, and there was hardly any time to fix things), leaving little room for error and no reasonable room for breaking stuff, as children do?
As I am getting older I really realize that I do things like my parents did them. It is really difficult to just decide to do it differently because of the emotions involved. It is difficult to be rational the whole time, the monkey brain sometimes leads. What is important is to be able to admit your mistakes and being able to apologize. For all my flaws I hope to show my kids at least that there is no shame in this.
I'm sometimes harsh on the children only to later apologize and tell them I actually do believe they should be able to make mistakes and that whatever they broke can be fixed again, and if not, no lives were lost (ideally).
I wonder why I get so down-voted, perhaps my non-native speaker brain has a wrong understanding of the term "masculinity"? In this context I take it to mean that being unreasonably angry at kids that wreck stuff is a typical thing for men, and that women don't suffer from this? Is this wrong?
Denying a child his bike for YEARS -- and I mean years gathering dust in the shed -- is toxic by my definition. It's obviously not the single thing my father did that I would cite as an example of toxic masculinity; otherwise I'd just call it "odd". My dad also had enough qualities that I don't label him abusive. In my mind this is an example of over-the-top discipline, plus the other cultural stuff, that I would call toxic masculinity.
I am certainly grateful to my father for all the material support he provided, and he did provide a lot and still insists he does, even though I constantly remind him I'm an adult and quite well off myself.
You don't need to be a drunkard to drink alcohol, and you don't need to be permanently toxic to exhibit toxic masculinity. I dislike the term (because people immediately get defensive) but the concept is definitely useful and describes a real phenomenon.
It's not about masculinity being inherently toxic, it's about toxic expectations men are expected to fulfill by the society.
And you don't have to have your human rights infringed, for example something as simple as saying "you are useless" is toxic, yet it doesn't infringe on any rights. Not everything that's legal is good.
I have a lot of respect for your perseverance, it was the right thing to do. Make your own mistakes, making someone else's is just so... Painful. And it makes you feel weak.
Similarly someone once advised me not to do a Masters in Molecular Biology because the market was screaming for Bachelors (back then in the Netherlands a Bachelor meant working in a production lab or something). I didn't listen and this also turned around in about 3-4 years and I love how my career went (and is going).
I had a similar (but much less intense) situation with my father-in-law, by the way. He has some experience in "home improvement". I like to read and watch YouTube before starting any activity. So we had a big discussion/argument on how we would re-do our ceilings. In the end I said: I prefer to do it my way and mess it up than to do it yours and regret that. He agreed, we had a fun time working together after that. He learned a thing or two and now he consults me on many things he does around the house.
"I honestly don't know which way is right. In case something _does_ go wrong, though, I'd rather have only myself to hold it against because I made the wrong decision than also hold it against you and let that make its way into our relationship."
I think this just made its way into my "life rules for dealing with family and close friends."
It's up there with "give money freely, but never lend it," and, "don't hire a friend to perform work for you on a deadline."
It's for sure something I try to keep in mind raising my children now. It's a form of admitting that you never have 100% of the facts + knowing that making mistakes is very good for learning. You can definitely learn things from how even the youngest of kids solve problems.
This of course does not mean you should let them die screaming "yolo!", there is a balance to be found, but having this mantra keeps the balance a bit more on the "open mind" side of things.
Kudos for sticking to it, I'm not sure I could have done that in your place! I'm always horrified at stories of parents who act this way. "Do it because I say so, I'm infallible as far as you are concerned, if you disobey me I'll hurt you."
I was not aware of how much damage he did until my mid 20's and it led to about 2 years in therapy to work through it all. This story is one of countless stories where he has acted out of line. A benefit of my career is that I had enough capital to pay out of pocket for a very effective therapist.
I ended up gaining a far better understanding of who he is as a person and perhaps even why he is the way he is. This has allowed me to come to terms with his actions enough to forgive and forget (for the most part) and move forward with my life. I still call him once a year on Father's Day, an attempt to extend the proverbial olive branch, but he never picks up.
All in all my life goes on and I am happy and healthy. I am extremely grateful for the community I live in, the fantastic friends I have surrounded myself with and (most importantly) my incredible wife who helped me through a lot of the pain over the years. I wouldn't be where I am today were it not for her.
I'm not trying to put everyone on the same level as your situation, but many (most, I might guess?) have had parental situations that are less than good. Sometimes for a short period, sometimes for the entire childhood.
My childhood wasn't bad (as in horrific) but there were enough ... episodes that stuck with me for a long time, and sharing with friends (both at the time and afterwards) raised a lot of eyebrows (mostly saying "that's really not normal/acceptable").
"This has allowed me to come to terms with his actions enough to forgive and forget (for the most part) and move forward with my life."
I'm glad you got there. I've come to a similar place (without a lot of therapy, but over a longer period of time). My parents did the best they could, and on looking back, they were in a situation they didn't want to be in (kids at a young age, money problems, etc). With enough time and distance, and knowing them as an adult, they're generally OK people. They did the best they could, they just didn't know very much at the time, and ... they got a lot of bad advice (imo) during a pre-internet time when it wasn't fast/easy to get a lot of information. I'm still rather fortunate, in that they're both overall good people (just... were not prepared for parenthood) - many folks are bad at parenting and ... are just overall not great people either. I have a sibling that is still struggling to come to terms with some of this, and is still searching for 'answers' to things that I don't really think have 'answers'.
And yes, a good spouse can really help balance you out, give you some grounding and perspective. You can get some of that through therapy as well, I'm sure, but having a spouse with you is a different sort of grounding and perspective.
"If you pursue this career, you'll live under a bridge in 5 years, so I'm kicking you out and giving you no money, so you'll live under a bridge right now."
edit: kudos to you for sticking to it really! You should be proud of yourself!
I know you made the point in jest but you’re right. To be adaptable you need to embrace change and opinion is the hardest to change. I’ve seen people die holding an opinion which has been thoroughly disproven. Embrace change!
To be fair, 2002-2003 would have been a pretty rough time to graduate as a developer. The industry hadn’t yet recovered from the whole dot com thing and there were still a million “converted business major” developers on the market who hadn’t yet got the hint that they should go to law school.
The advice you got might not have applied to you on the way in. But it certainly applied to all the students those professors had just seen graduate. It probably took them a few years to notice the tide coming back in.
Incidentally, the jobs had all been “going to India” for a long time by then. I’ve personally heard about them going there for at least 30 years now and I don’t doubt they will continue to do so in the future. At this point I’m not overly convinced they will all make it.
> The industry hadn’t yet recovered from the whole dot com thing
You are completely right about that and it slipped my mind. Either way, it was kind of nice how easy it was to get a job back then. Getting my foot in the door was a little difficult, actually, but once I had something on my résumé I got an interview for basically every job I applied to.
My résumé is much more impressive these days, and it may be in part ageism or higher salary expectations, but it's much rarer to hear back from a job application.
It's ageism. Sorry for being so blunt, but I know too many people who were just uninteresting to human resources after hitting 45-50. I know more than a few homeless people in my life, and over half were once programmers. It makes me sick. I do know this: if you are in tech, save up that money.
(I have never understood this industry's way of hiring. Long drawn-out interviews, and candidates who are just average getting the jobs.
Why not hire a guy for two weeks? See how they do. And of course, be upfront. Tell them this is probationary. You don't send a guy to relocate, just to be let go in a few weeks.
There are some really great older workers that some company will scoop up when hiring is upgraded to this decade. Sorry about the sentence fragments. Too lazy to fix.)
As someone who's been in charge of recruitment, I've made an effort to hire people who are less likely to get a chance at other places (minorities, older candidates).
One thing that stood out to me though is that a lot of the older candidates I interviewed that had been struggling had deep knowledge of a technology that's no longer used and had had very little interest during their career in learning new things outside their job. So once the thing they were expert in was no longer fashionable, they had a hard time catching back up.
I know it's unfashionable to say this, and I know that a lot of people have full-time jobs and don't want a side project on top of their full-time job, but if you want to keep a career as a software developer past 40 without going into management then you need to do side projects, you need to keep up to date with some of the latest fads (even if some of those fads are cyclical and recycle concepts from 20 years ago). Doing a job well and becoming an expert in an obsolete technology is no path to career growth.
> … "if you want to keep a career as a software developer past 40 without going into management then you need to do side projects, you need to keep up to date with some of the latest fads (even if some of those fads are cyclical and recycle concepts from 20 years ago). Doing a job well and becoming an expert in an obsolete technology is no path to career growth."
Sad thing about this is that some of us (like me, for one example) have known this since the days before personal computers were even a thing, but it doesn't really help to stay on top of modern tech when all the people in charge of hiring in the tech industry are twenty-somethings who automatically, instantly hate anyone with even a single gray hair or visible wrinkle, and instantly dismiss a lifetime of knowledge based entirely on ageism.
Even sadder is that the "decision makers" in many tech jobs still to this day actively ignore good advice that people like me have been giving since the early days re: things like network security being more important than they want to admit to themselves (just to name a simple and obvious example), and then when their ignorance of such issues comes back to bite them in the ass they always scapegoat the "new guy" somehow and out he goes.
Yes, I should have elaborated. The two best engineers on our team are in their mid-40s and our QA lead is in his late 50s. So, I'm well aware that even great candidates who are really good at what they do are passed over just because of age.
But I did want to explain the typical issue I have seen when interviewing older software engineers that had a long career but shot themselves in the foot by being too tied to a specific technology. At no point did I want to say that it was the case of every engineer I interviewed. And I fully agree that it's often tough dealing with the ageism in this industry.
A track record of being right is useless for having your advice be seen as "good advice".
It takes some politics, and interpersonal trust to get your advice taken seriously. Sadly, this also means looking past the technical details and considering organizationally why certain decisions need to be made. The company might prefer the occasional breach and "we are sorry" PR over spending a lot of money on network security, for example.
> "The company might prefer the occasional breach and "we are sorry" PR over spending a lot of money on network security, for example."
Sure, but what about when the choice presented is more along the lines of "Hey, let's spend a little money on network security now so that we don't have to spend millions later cleaning up a huge mess", and is met with the mentality of "You're worrying about nothing." Ummm, no? I'm sharing the expertise for which I was hired?
This is true for anyone. I've interviewed young people with 5 years experience only to find out they had 5 years experience doing the same thing over and over that they learned in the first 3-6 months on the job.
While the pace of change has slowed down in some software areas in my last 20+ years of working, I never saw it as a field where a person could learn something in school and simply apply it for an entire career. For better or worse, it requires learning outside one's job, or at the very least, constantly moving jobs to keep building one's skillset.
For me, this has been fine because I've always loved tinkering on the computer. Learning new tech is something I've always done as a hobby as far back as I can remember.
Those going in only thinking programming is an easy way to make money, may struggle later in their career. Though, this is less of the case now as there is so much more older code out there to be maintained.
I just want to add that technology is political. The directions specific entities push for become de facto standards. I have heard and seen many technical hiring staff express the idea that some technology or practice is wrong or obsolete when they are simply unaware of either the full picture or how the underlying tech works. There was a Medium article recently that suggested developer competence be measured by a number of standards that are quite obviously the result of the author's misperception of how web browsers, JavaScript, modern front end coding idioms, and modern front end build tool chains work. What was worse was the condescending, disdainful tone the author took while clearly having no idea what they were talking about.
Developers who understand the fundamentals of a particular area may not be ignorant of so-called developments in that area, but rather may be able to see where trends and modern approaches fail or fall short. I know a Swiss developer who has never used React in his life, but he coded up a rough approximation in an hour in front of me. He has his own tool belt, naturally. Would he be a good front end factory grunt? No. Smart companies use his services to design them tools.
Closing in on some ageism inflection points, I find I have the opposite problem: everything is still interesting! Maybe more so: things I thought were boring at 20 are things I now know more about the relevance of. And the thing that's terrifying is knowing I won't have time to learn it all.
Paradoxically, this does have the side effect that I'm more skeptical of learning some new ways of doing the same thing, preferring to prioritize new learning for actual new capabilities vs different arrangements of the same deck chairs. I see that behind some of the inertia in my experienced cohort -- just-in-time can be a reasonable approach to picking up specific tech.
OTOH, I've certainly worked with people who are aggressively disinterested in picking up anything new, even when offered the chance to do so on the job, and that's certainly no fun to work with. Or manage.
And that's the reason I would stop my younger self from doing software.
When compared to other careers, after, say, 15 years of experience you are not in a well-established, high-paying position like a lawyer or an engineer (the actual kind :) )
After 15 years you are an out-of-date dev who didn't find the time to do their job, manage family and friends, and keep up side projects to stay current with technology.
This is a bit overdramatic, but sw devs are expected to ride the wave all the way through their careers while other career paths offer stability and the protectionism of well-established fields.
What about the half million total comp and the part where you can retire in your thirties? If anything, I bet younger me would do even better than I did if he were to start today.
Sure. I bet those vocal people would be happy to explain in detail what they did to get those jobs.
It’s not easy, and you may need to actually go to the place where those jobs are. But each of those companies will hire thousands of people at those compensation level in the next year. And the people they hire were in no way exceptional (or different to you) before they started down the path to getting where they are now.
No need to buy "their book on how to get those jobs" or anything like that.
To get the interview, you just gotta have some years of experience, some side projects, and that's about it. If you went to a top-tier school or something like that, you don't even need the years of experience. Getting an interview at those companies is the easy part; pretty much everyone I went to college with got an interview with one of those companies before graduating (and I didn't go to some famous Ivy League college, it was just a pretty good public school in Georgia). The issue was that most people didn't pass those interviews.
To pass the interview, read up on systems design, read up on algorithms, practice some leetcode/hackerrank/etc., and you are good to go. All of those resources can be easily obtained for free. There is no secret or trick to any of that.
You mentioned "fashion" twice. I think as an industry we need to re-evaluate why it is so important for candidates to be "fashionable" compared to being smart, reliable, experienced, or efficient. Why is a new grad with 1-2 years of experience and only knows React so much more successful finding work than someone with 20 years of experience programming in eight different languages and in companies ranging from a 10 person startup to a 10K person mega-company?
You are absolutely right. Would you like your brain surgery to be done by a young kid straight out of college or a 20-year veteran who has experienced and seen it all before?
I have had the same experience when hiring people. However the problem was the young developers not the experienced ones. Very few young developers I interviewed showed any interest in learning anything new. They just wanted to find a job that matched the limited tools they had. The best developers I have ever hired were seniors.
Cue resume-driven development, and people asking "why do we hype that new worse-than-useless tool so much when it's costing so much of people's time and reducing our software quality?"
In the Netherlands, permanent employment contracts are much harder for companies to break than in the US, so they often hire new people as consultants or temporary employees for a few months, then offer them a permanent contract if they work out.
It worked out well for me at TomTom: I kicked ass during the 3 month probationary period as a contractor, then when they decided to offer me a full time contract, I was in a very good position to negotiate a salary (and even a hiring bonus), much better than if they were hiring me cold out of the blue.
No, they don't. You'll generally be paid less and have fewer benefits. It truly only serves as a "pay your dues" period.
GP also doesn't mention that having a perm contract on its own grants an individual benefits you wouldn't otherwise have. Invert this, and it means a perm contract becomes a requirement for reaching a lot of milestones in life. Most notable are mortgages and even free-market rent.
> Why not hire a guy for two weeks? See how they do. And of course, be upfront. Tell them this is probationary. You don't send a guy to relocate, just to be let go in a few weeks.
By doing so, that cuts out anybody who currently has a job. Looking for a new job is much less stressful when your fallback option is "keep working at the current job and continue the search". It also gives you a much stronger negotiating position, since you don't have the deadline of bankruptcy hanging over your head.
If I heard that a company wanted me to work for two weeks on a probationary manner before deciding, that would be a very hard pass. That's long enough that it couldn't really be done without leaving a current job first. Best-case scenario, the transition becomes immensely more stressful for no good reason. More likely scenario, the hiring company decides to play hardball after the two weeks, and I'm over a barrel because I already left the safety net.
I graduated in 2003; did volunteering for 9 months and then got a very low-paid job in a startup in the Midlands in the UK.
The assumption I was given was that almost all the thinking and innovation had been done, and it was just a matter of working your way up the Java coder/senior/architect hierarchy over time.
I feel a bit cheated in retrospect - almost all other times the industry's felt amazingly exciting/fun. At that time it was a bit deflated. Luckily I just wanted to work with computers, so at the time I wasn't bothered.
I graduated in 2002. My peers had been boasting about how we wouldn't even entertain offers below $75k when we graduated in 2000. There was the general .com bust, but I was also literally working on my resume the morning of 9/11 which essentially froze up the market entirely.
Most offers which were "locked up" were rescinded. I was fairly fortunate that the small software consulting firm I had interned at landed a new contract and made me an offer- for $37k. I swallowed my pride, the market was absolutely flooded with java devs with a few years experience that I was competing against. It seemed far better to be working and building a resume even if for a non living wage.
I received only one hit on my resume my entire first year working, despite sending it out to multiple places each week.
After about a year, things picked up a bit and I landed a job with a group that was impressed by my distributed systems undergraduate research work and curiosity and was building what would become known as a high frequency trading system. My salary was still lower than my 2000 self thought acceptable, but at 55k I could at least afford to move out of my parents house.
Things took off from there a bit, but it was a rough start. And it's a post 2010 thing that engineers can retire in their 30s. My expectation was that this would lead to a comfortable middle class lifestyle but the eye popping salaries of today were beyond the expectations of any of us. Tbh, when one of my coworkers left to go to Facebook in 2007, I looked at it with disdain- a website? How disappointing... I imagine his net worth is deep in the 8 figures- he is still there.
Oh yeah, 2001! 6 months and dozens of applications to get a job offer. I would have done 100s of applications had there been that many to apply to.
Low points included being rejected from a job maintaining Access databases in a small town; being rejected from another job because I hadn't suggested manually unrolling a loop as an alternative solution to a problem and said "oh yeah, you could do that I guess" (feedback: bad communicator, withholds information); and a 2-day hoop-jumping affair for a dev job involving fake meetings where you had to make up bullshit.
Fuck graduating 2001. Also probably heavily affected my lifetime earnings as I started on £18k.
Just an FYI to the Brits on this post - I've found US startups are happy to hire remote engineers from the UK. Might take a little bit of effort but it's surprisingly possible to get a six figure job quickly.
Set up a UK company and they'll pay that to you as a contractor; you sort it out with HMRC yourself.
The Americans seem to be much higher paying, greater variety of roles, happy to let you remote.
Now is the time by the way, I was blown away by how hot the market is. Maybe jump in there before WFH starts to get retracted, you can do more interviews before you're back in the office.
Yeah, America seems to have the nice combo of reasonably priced places to live and high tech salaries. Since "remote, but US" is common I don't see why you can't take your $150k, live off a quarter of that somewhere, and save most!
I recently got an offer that was $175k base, full remote, and felt comfortable enough declining it (though most of that is because I'm optimizing for liquid total compensation, so…). $200k total compensation liquid is still mediocre for a mid-level engineer, but at least I don't pay London real estate prices.
It's because some areas where you can get a house for $1,500 per month instead of $4,000 come with caveats like neighbors with guns and hellish weather
Isn't it deadly ironic that the country that has such terrible health care is also the same country where people get shot all the time?
I love it that San Jose is going to require gun owners to carry liability insurance if they want to carry weapons. It's about time they started paying for the health care, rehabilitation, and funerals of all the people they shoot, and stopped whining that they're victims of government oppression while being perfectly content with everyone paying for licensing, registration, and insurance of cars.
>San Jose to Require Gun Owners to Carry Liability Insurance
>San Jose officials have passed the first law in the country that requires gun owners to carry liability insurance and pay a fee to cover taxpayers’ costs associated with gun violence.
Bearing in mind that 30k a year goes a lot further given that there's no need for health insurance and student loans are effectively just a 6% tax on earnings above 27k until either your loan is repaid or it's been 30 years since graduating.
While it's true that UK salaries are quite a bit lower than other countries if I'm honest if I earned £150k a year the only difference in how I live my life would be that I'd be driving a Tesla. 30k is more than enough to live comfortably on, even in areas with a fairly high cost of living.
Health insurance is generally covered by your company as a benefit, and if it's not, when you're in your early 20s it's generally only a few hundred dollars a month. On a salary of $150k it's pretty insignificant. Average student loan debt in the US for a bachelor's degree is $30k (too high of course) but when paid out over decades is also insignificant against $150k.
I’m not making a value judgement about whether health insurance or student loans should exist but just pointing out that your comment seems like a false rationalization about why a much lower salary in a different country is okay.
Outside London, sure, but a London one-bed apartment 40 minutes away from the city will run you ~£1,300/month; that's £18,000 a year with council tax and bills.
I didn’t have student loans because state schools are cheap. I pay like $75 out of pocket for my health insurance per month and most of that is my HSA contribution.
For what it’s worth I can’t afford a Tesla either making $200k now and live in a 450sqft studio.
>which is on the upper end of graduate starting salaries even for today
Do you have a source for this? Anecdotally, my impression is that graduates in any degree can reach £30k with reasonable effort. Specifically for graduates in tech, the top end is something like £100k nowadays, and the average would be more like £45k.
I don't beat myself up. I had 3 offers. 2 for 18k and one for 17k for the government. The fact I had 3 offers but nothing before that for 6 months was probably an artifact of getting better at interviewing. The job I eventually got I was accused of being "too polished".
The highest starting salary I applied for was 25k, and the highest I heard anyone get was 40k, but working on a trading desk for 12 hours a day.
>6 months and dozens of applications to get a job offer.
I honestly can't tell if this is supposed to be good or bad. Graduated in 2020 and spent over a year looking for work with several hundred applications sent. Ended up enrolling in a masters because of it, only got a job last month.
Back in those days developers still made most of the hiring decisions, so with a sane-looking resume you could get an interview, and if you could pass for knowing programming you could get a job (apart from the post-dotcom bust and 2008, when firms just didn't hire).
At some point during the 20 years since, orgs decided that HR was to handle hiring. HR tries to add value by holding out for the BEST candidate in their mind: nice-to-haves are promoted to requirements, anyone over 55 is culled (because they supposedly wouldn't stay long enough, even while younger hires change jobs within 2 years), as is anyone with less than 5 years of experience (unless, possibly, you're below 25).
Then orgs go around complaining that it's hard to recruit people. Go figure.
I thought it was bad, since most applied for jobs while studying (the big companies that come to the uni type of thing) and had jobs lined up. I decided not to disrupt my studies with this so waited until I graduated to look for work.
My boss loves to play with the idea of outsourcing, but he seems to be aware of the issues to some degree. In our situation, the idea is pretty hilarious - we as developers have to make sure our products don't become unmaintainable, product management doesn't like to think things through when specifying new features, and we already don't get time for refactoring. I think outsourcing (especially when done to save money) requires pretty fantastic product management and my intuition is that it's fundamentally incompatible with a codebase that is growing along with the company.
Programming used to be getting a computer to do something.
Now it's refining the idea someone has until it's code. People think they know what they want, but have no clue about the masses of details required. If the person with the idea and the programmer are far apart (in space, time, or culturally), it's pointless.
Most ideas don't require 100s of programmers, so the cost saving is minimal.
> Now it's refining the idea someone has until it's code.
This is one of the most succinct ways I've seen this said. Thank you.
For the most part the days of handing a programmer a spec who then codes it up are gone. For many problems, programming itself is the easy part. Figuring out what to program, and the complete solution is where the majority of the work resides.
> the decision makers that are convinced GUI data engineering tools are the future.
The same train of thought leads to the idea that you can produce good software with mediocre developers by using processes, coding rules and checklists and "development methodology" like agile.
To be fair, Agile is not about leveraging mediocre developers - it's about enabling a handful of your best, most experienced hackers and architects (the fabled "two-pizza team") to work under the best possible conditions, because the proponents of Agile had internalized the lesson that trying to scale up large teams of mediocre devs is a fool's errand.
Agile is what managers make it to be. To A LOT OF THEM it is about not doing any planning, writing zero specifications, using vague language for requirements, comparing the performance of apples and oranges, "self-help"/motivational fluff, justification to pressure those below and not bother those above, and so on and on.
But to most, it is just putting lipstick on the same old pig.
Did I mention the whole training, coaching, and lingo-flinging industry?
Agile in its original form - yes. Many everyday biz versions of agile et al - no.
Fundamentally, dev is something to negotiate with, cooperatively, versus commanding "do what I say"-style. Many businesses are unable or unwilling to do away with this top-down hierarchy, hence agile will never succeed there.
The agile they practise instead is "Agile, but without actually relinquishing top-down control".
TBH, I think what we are really missing is for the development team to be a contract team (a la Accenture) that can implement agile via official mediums, e.g. pushback via contractual negotiation, backlog prioritisation via contract pricing ("we can do that, but this is what it'll cost").
The missing piece is that people forget the bar for software also rises with the ease of creation.
We do have very capable drag and drop CRUD generators now (not quite the same, but I still have a soft spot for VB5). But, what's expected, even out of a simple CRUD form, has risen.
I experienced the exact same thing (graduating in 2005) to the extent that it dissuaded me from getting a CS degree at all. Outsourcing was supposed to be the end of programming jobs in America.
My career took a 6 year detour until I finally made my way to programming professionally. I wish I’d had your conviction - the jobs I did in between weren’t half as fun and my career started that much later.
Interesting. I was just starting a CS course at university in 2003 in Poland and it was obvious to everybody it was the future and there would be enough jobs for everybody.
There were 20 candidates for each place in the course so you had to pass the entry exams really well (and it wasn't the best university in Poland, far from it).
I guess that's the difference between being the outsourcing source and destination :)
> I guess that's the difference between being the outsourcing source and destination :)
Personally I'd agree that this is good for individual employment prospects, but not necessarily good for the country itself.
For example, here in Latvia I've heard that there's a shortage of developers and hence many government projects can't be fully staffed easily, because they're having problems being competitive in regards to wages, when compared to outsourcing companies like Accenture.
Of course, there's no reason why any developer should work against their interests in regards to being able to save and earn money, but it does seem interesting - maybe I'm wrong and it's just a regional thing?
Government development jobs just don't pay enough to be competitive for talent, so instead talent goes to outsourcers. Here in the Netherlands that has meant most big government development projects are done by outsourcers, which leads to bad incentive alignment: cost overruns, projects that are DOA, and projects with way too large a scope. And at the bottom line, the consultancies are better off keeping it this way than improving things.
But government is not agile enough to pay fair wage for development work. Besides, it would be unfair to all other government employees if these developers made so much more money with less time on the job and the same level of education. (sarcasm)
Government employee in Canada here. The pay itself is not necessarily the best, but top notch benefits, generous vacation (maybe a moot point in Netherlands but definitely not here), a rock solid pension make it a lot closer than the salary number alone.
There's also the fact that the number of hours worked is always fair, and that you can move within the public service very easily.
How many big projects are handled mostly "in house" by the government in Canada? And for the outsourced ones, how much input / leadership do the internal government people have?
Because it is largely the outsourced big projects where our Government fails.
The situation is not very different here: the outsourced projects often fail, and lots of the big ones are outsourced. Definitely not a good use of public funds (my opinion), especially since the funds spent for such projects are often stratospheric. Also, the amount of talent in government is not THAT small, so often these decisions seem very political.
Exactly. Even in Malta, most government IT projects are partially outsourced to local or international companies for many of those reasons. Developers don't usually want to work for the government at the start of their career.
I think for almost all countries the taxes developers working for foreign companies pay outweigh the higher costs of hiring some of them to build software for the government.
Let's say government hires 1% of the developers in the country, foreign companies pay them 3 times more and if not for outsourcing 50% of these people would have to emigrate.
If X is base salary ignoring outsourcing then the country would earn
0.5 * X * tax_rate - 0.01 * X without outsourcing and
3 * X * tax_rate - 0.01 * 3 * X with outsourcing
So the change is (3 - 0.5) * X * tax_rate - 0.01 * X * (3-1) =
2.5 * X * tax_rate - 0.02 * X = X * (2.5 * tax_rate - 0.02)
So in this simplistic example for the country to benefit from outsourcing it suffices that tax_rate > 0.008 which is pretty much always true.
Let's say government hires 10% of developers, nobody would migrate and the salary difference would be 500%. Even with such unrealistic assumptions the tax_rate would just need to be higher than 10% for the country to benefit from outsourcing.
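The back-of-the-envelope model above can be sketched in a few lines of Python. The parameter names (`gov_share`, `pay_multiple`, `stay_fraction`) are my own labels for the commenter's assumptions, and everything is expressed in units of the base salary X:

```python
# Toy model from the comment above: does a country come out ahead when
# foreign companies outsource work to its developers?
# Hypothetical parameters, matching the comment's assumptions:
#   gov_share     - fraction of developers the government employs
#   pay_multiple  - foreign pay as a multiple of the base salary X
#   stay_fraction - fraction of developers who stay absent outsourcing

def net_change(tax_rate, gov_share=0.01, pay_multiple=3.0, stay_fraction=0.5):
    """Change in the country's net income (in units of X) due to outsourcing."""
    without = stay_fraction * tax_rate - gov_share            # 0.5*X*t - 0.01*X
    with_os = pay_multiple * tax_rate - gov_share * pay_multiple  # 3*X*t - 0.03*X
    return with_os - without

def break_even_tax_rate(gov_share=0.01, pay_multiple=3.0, stay_fraction=0.5):
    """Tax rate above which outsourcing is a net gain for the country."""
    return gov_share * (pay_multiple - 1) / (pay_multiple - stay_fraction)

# First scenario: break-even at roughly 0.008 (0.8%)
print(break_even_tax_rate())
# Second scenario: gov hires 10%, 5x pay, nobody emigrates -> roughly 0.1 (10%)
print(break_even_tax_rate(0.1, 5.0, 1.0))
```

Plugging the first scenario's numbers back in, `net_change` crosses zero exactly at the 0.8% break-even rate, matching the arithmetic in the comment.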
Depends how you define 'good for the country'. Upwards pressure on skilled wages due to increased demand rather than brain drain sounds like a decent place to be.
I remember a national best seller around that time was "The World is Flat" [1]. It talked about globalisation and made convincing arguments about outsourcing. Maybe your advisors were influenced by it.
Plus, there's also the "pork cycle" (https://en.wikipedia.org/wiki/Pork_cycle) aspect to consider: few CS graduates -> better employment chances/higher salaries for developers -> more people study CS -> (after a few years) more people graduate (if you're unlucky, coinciding with a recession) -> worse employment chances -> fewer people study CS and so on...
The failure of software outsourcing is really quite interesting, especially since telework is working decently well. That seems to eliminate the “remote work doesn’t work” objection.
It was not for lack of trying. The management class pushed it very hard. There are still many firms trying but it’s not really panning out.
From what I have seen the answer is that good software engineers are globally scarce and the good ones in India or Eastern Europe make enough that outsourcing isn’t that profitable once all overhead is included.
“Insourcing” is becoming a bit of a thing. You can get some salary benefits outsourcing to lower cost domestic places like Nebraska or Kentucky and not dealing with language, time zone, international currency exchange, legal differences, or other headaches. These folks can also travel to your office for an in person meeting in hours not days. Oh and their power grid and broadband are reliable too.
The jobs that have been most successfully outsourced are low to moderate skill manual jobs where basically what is needed is huge numbers of cheap hands.
Also a lot of the good software engineers in India end up relocating elsewhere, like the US, to get better pay. I've worked with a lot of really talented Indian software engineers, and some really terrible Indian developers (I hesitate to call them engineers).
All of the talented ones had relocated and were living in the US. I'm sure there are some talented software engineers that stay in India also, I just never end up working with them (probably because the companies I worked for were trying to save a few bucks by outsourcing to begin with, and went with the lowest bidder).
That’s strange, I started studying in 2005 and had planned to go into Comp. Sci. for at least 2 years, yet I had never heard similar sentiment expressed. Germany.
Same here (2005), but in Austria. I think Europe still had to catch up to the US‘ pre-2000 level of „digitalisation“ such that the bubble burst didn’t do that much to our industries.
Good point. I guess the language barrier was another issue, with English obviously being more popular than German, so "Indians in India taking our IT jobs" wasn’t really a thing.
Except that Germany and Austria have an obsession with outsourcing, or more accurately nearshoring, everything to Poland, Romania, Hungary, Ukraine, the Czech Republic, etc., so dev wages there can't hold a candle to the ones in the US or even the UK, and the only path to decent pay is going into management.
I started university in 2000 in Spain, and I remember at that particular point in time studying computer science wasn't seen as desirable. When people had that interest, parents typically pushed their children to study telecommunications engineering instead. It was the fad at the time: they were all employed, had a strong lobby, and made a lot of money.
It didn't last long. By the time I graduated from computer science, everyone in computer science had a decent job (not paying a lot, but enough for a living) while telecommunications engineers were out of work in their specific field and mostly taking second pick in the CS job market. And to my knowledge, it has been mostly like that since then.
It really was the prevailing wisdom of the time. I entered the job market in '92, a few months after CERN released their browser. I knew what was coming when I saw the web for the first time, but the only jobs at the time were custom desktop software for businesses. It was the same after the .com bust, which is right around where you picked up. '04 and '05 were really bad years to be in the industry, as the only jobs left were internal corporate software gigs. Which meant most of the employment opportunities were not out in CA, but rather in sporadic areas around the country.
As well, there were waves of offshoring, so again the common wisdom was: do not go into computing. With that being said, I had the fortune of working with a company that was early to mobile, and was working with Palm on mobile web apps around 2000. So when the crash came, though I lost that job due to the company closing, I knew mobile was right around the corner and went back to grinding out internal corporate software for a few years.
The offshore projects started to fail to deliver, and then Steve came back and launched the iPhone. My experience is that software is a boom-bust economy. It's been more robust this run, and everyone learned that outsourcing was not the win-win they thought it was going to be. In the meantime software ate the world, so there are not enough hands to do the work that is out there and, to your point, a whole generation was discouraged from entering the market. If any market has a bad track record of employment forecasts, software dev has to be top of the heap.
I went to uni in 2004 and always had a crush on civil engineering. A few days before signing up my dad told me: "I feel construction is stalling, I think you should go for Electrical Engineering." Best advice ever. By the time I finished, construction was indeed dead and there were too many programming jobs.
Similarly to other users here from Europe, in 2010 the market was actually starting to take off. My dad was such a visionary... or I just had a total lack of it!
I graduated Electronic Eng in the UK in 1998, my best friend did Civil. I went into computers, he could only find low paid jobs in the UK so went to Hong Kong where construction was booming and now owns a significant part of a multimillion dollar company with hundreds of employees. I haven't really followed the money at all, so I'm not too sad.
Weird, around that time (a few years earlier actually) I was warned by the IT school we went to that they might try and pick us up to work for them before graduation. Didn't work out like that though, and plenty of my classmates couldn't find a job in IT and had to go for something else or move on to college.
I went on to college anyway because I wanted to do software development. Most of my classmates that joined didn't make it, or needed more time. A friend of mine ended up working in garbage collection and construction (remote crane operator) for a while, but he landed a job at a shipyard, and now a very cushy one at a generic IT supplier, traveling the country sometimes to go to jobs.
I had the exact same thing when I arrived at university in 2004.
There was a big Compsci building that had been completed a year or two before I arrived (started being built before the dotcom crash). It was scoped for the massive growth in students that they were expecting in computing.
By the time I arrived the thing was practically a ghost town. Student enrollment had cratered. I remember the huge fields of desktop computers available for students on multiple floors, and it was basically empty the whole time.
Actually, I was trying to find out the number of programmers worldwide the other day, and many US-based analyses were saying the same thing: "programmer jobs in the US will drop in the next 10 years as jobs move abroad".
For example [1], but many more with the same gist.
The BLS statistics have US-based "programmer" jobs declining by 9% but "software developer" jobs expanding by 23% on a higher base. The job description is largely the same, although software developers have more design responsibility.
Yes and this thinking still pervades the engineers that came of age in that era and did avoid programming. One of my coworkers is obsessed with outsourcing and thinks of any programming task as being something that should be done in India. He asks why I should do something when we should hire a team in India, which is a ridiculous thing to say for so many reasons.
I went to university a couple years after that and also felt a similar sentiment that it was basically too late to get in on a good career before the industry got outsourced.
Of course then social media happened and everyone and their grandma needed developers so it didn't matter that people were flooding in because there simply wasn't enough labor to go around.
I think there are also qualitative reasons why outsourcing didn't prove as successful as many companies thought.
One being, that business requirements are often complex, and contain ambiguity, which requires communication between developers and the business, which is hampered by differences in timezones and culture.