Hacker News

As a former dropout myself [1]: You might find some of the courses utterly stupid now, and you might have many doubts about the usefulness of a degree, but I think ultimately a degree is very, very much worth it, both intellectually and for logistical reasons. A degree will open doors for you; for example, most jobs will throw your resume out right away if you don't have a degree. Some countries won't allow you to immigrate without one. There are many dreams that require a degree.

The only reasons that can justify dropping out are that either (1) you think you can't possibly learn anything useful from the professors teaching you and you'll rebuild/repay what you didn't learn one day (and you'd better have a good answer right now for when you'll do that), or (2) you have a grand startup like Bill Gates or Mark Zuckerberg did. But I don't think that's why you want to drop out now. So don't drop out. Keep pushing.

For me, I wish I had learned assembly, kernel development, stats, and machine learning. The first two because I love them, the latter two because they are useful.

I am now almost finished with grad school, and I feel like I know nil. But in a very Lao-tzu way, I think my biggest enemy is myself (the willingness to sit my ass down and learn), not that these things can't be learned on my own. Lately, I think I somehow overcame that problem and was able to read, learn, and make a lot of stuff on my own. I think the same can be said about anyone who had the patience to get a degree: it means they are willing to deal with things they don't totally enjoy to get what they want. As Lao-tzu said, patience is a good virtue by itself...

1: If you need to verify, read the entry called crankshaft #2 on my blog on my profile.



I remember signing up for a course in mass transfer that we joked would have been better titled bubble science. It seemed so obvious at first, but as we progressed, the math and concepts became more vast and inscrutable to me, and by the end I'd learned so much about something outwardly so simple that I felt that I'd barely scratched the surface of the topic. But that's not really an unusual outcome for learning, right?

Pursuing a technical education is tricky because those interested often have an elevated baseline knowledge and want to jump ahead without relearning fundamentals, but it's often those missing fundamentals that cause growth to suffer later on. Realizing that you're actually struggling with algebra while you're taking multivariable calc is a big eye-opener. And realizing that you grasp the basics and applications of certain technologies in the first two weeks of a course can feel redundant and insulting, but that's because it's hard to gauge or trust that there's more to things beyond the limits of our understanding, not because the material is unworthy.

Overcoming that is humbling. It can put people who are eager to start making money (because they're already slightly better at something than the population at large) at odds with the goals of higher learning, but it's a necessary part of our growth and perspective.

People who grew up being told how smart or special they are can have a harder time with this, and I know it was pretty embarrassing for me when I realized early in my adulthood I was much closer to the "kid who's good with computers" category than an actual "IT professional", despite being able to successfully complete contracts and make money from what I was doing. Those experiences helped me re-evaluate my approach and get out of the "I'm already awesome, why would I need to do more" mindset. Had I not realized that, I might have stubbornly stalled out thinking I didn't need anyone else while the world passed me by.


>despite being able to successfully complete contracts and make money from what I was doing.

I think making money is like cashing out investments (knowledge/experiences). There's nothing wrong with cashing out investments if we enjoy the cash and continuously reinvest some of the money we cashed out. I think it is dangerous when we have little investment and no plan for what to do when that investment runs out - when our knowledge becomes completely obsolete.

Which is basically what "stop learning and just work now to make money" does.

>Those experiences helped me re-evaluate my approach and get out of the "I'm already awesome, why would I need to do more" mindset. Had I not realized that, I might have stubbornly stalled out thinking I didn't need anyone else while the world passed me by.

Happened to me as well. At one point recently, I figured out that being able to make money is not always the result of being able to enrich life. I could gather a lot of money while stalling, and a person who is growing immensely might be very poor. I feel that as we get older, we tend to cling to a few metrics that we do well on to judge others, and money is a popular one.


> Those experiences helped me re-evaluate my approach and get out of the "I'm already awesome, why would I need to do more" mindset. Had I not realized that

What helped you get out of that mindset? What did you do?


I hit a number of walls professionally, interviewed more ambitiously (and unsuccessfully), and in general discovered the delineation between my perceived and actual values. It was more "look at me doing what people go to school for and make careers out of just because I can" novelty instead of realizing what kind of role I was serving, understanding my market, and trying to have good business sense. I thought of what I did as a series of problems to solve for cash instead of a mutual relationship, and I was just good at getting my foot in the door. Pretty obvious mistakes, really.

Even more generally, I left my comfort zone and put my experience to the test against people with a lot more resources and education. I either was unable to complete these larger-scale jobs, wasn't able to negotiate effectively (or else let them run all over me with pay or feature creep), or had to say (read: admit) that I couldn't actually do or understand the work as I was.


"you think you can't possibly learn anything useful from the professors that are teaching you"

Of course, it is possible that you will learn from the professor, but also that you would learn _more_ by doing other things with four years of your life. University has no monopoly on knowledge.

Also it's disturbing what starting life with a gigantic, crushing pile of debt can do. (It doesn't have to be this way, but I've seen plenty of even quite intelligent individuals go this route and regret it).

If you can, take advantage of things like junior colleges. They're amazing. I owe my career not to my degree (pointless, though people at least tend to say "well you can't be an idiot" when they see a physics degree), but rather to what I learned taking a few night classes at Santa Monica College (including assembly, which was good fun) while working a day job and doing some projects for fun.


>Of course, it is possible that you will learn from the professor, but that you would learn _more_ by doing other things with four years of your life. University has no monopoly on knowledge.

Lao-tzu said that you don't know what you don't know. I think staying in college is among the best ways to fill the holes that I am not aware I have. I wouldn't have learned compilers, music, arts, game theory, automata, microecon, macroecon, accounting, matrices, physics, or biology if I never had to take them, never had them offered to me. Maybe everyone could learn everything on their own faster and more effectively. But I would have been much less likely to be aware of those subjects, and even less likely to endure the pain of going through them, that's for sure. But that's my pro-liberal-arts view.

In the case of OP, they said they wanted to learn "brainwave sensors, VR, and that jawbone thing from MIT that was posted a few weeks back..." Maybe in a few more weeks he/she will jump ship to the next thing that MIT gets in the news, once the brainwave VR jawbone thing isn't in the news anymore. That's the tragedy of many "independent researchers," myself included sometimes. Having the patience to study something really well is really, really hard. Colleges are good at making you do that. At the very least, they guarantee that whoever got through can spend at least one semester studying various subjects well, and many semesters studying the one subject their degree is in very well. Whom do you trust more when you read their resume, given you know nothing else about them: a person who claims to know AI/ML on their resume, or a person who has a degree in AI/ML?

>Also it's disturbing what starting life with a gigantic, crushing pile of debt can do. (It doesn't have to be this way, but I've seen plenty of even quite intelligent individuals go this route and regret it).

>If you can, take advantage of things like junior colleges. They're amazing.

Agreed. I attended a cheap college too. It was amazing.

>I owe my career not to my degree (pointless, though people at least tend to say "well you can't be an idiot" when they see a physics degree), but rather to what I learned taking a few night classes at Santa Monica College (including assembly, which was good fun) while working a day job and doing some projects for fun.

I totally agree with you. I don't think I owe what I became today to exactly what I learned in college. But I still think that I am capable of what I do today thanks to my years in college. Lao-tzu said colleges may not give you the fish directly, but they teach you how to fish for yourself.


>Lao-tzu said colleges may not give you the fish directly but it teaches you how to fish for yourself.

And I say fuck Lao-Tzu and what he said.

Someone living in 500 BC should not dictate your life's direction. You have something called the Internet now, which he didn't. You have nearly unlimited access to knowledge - more than you could possibly consume in a lifetime. You have courses from top-tier colleges for free. You can learn anything.

Yeah, some fields require a degree if you're not at the top of the field and want to join a good company (ML, neurotech, and stuff like that). Most don't. College isn't just expensive. It's passive, and it has a terrible system of teaching people stuff that doesn't interest them just because someone thought it would be useful. And it's boring as fuck.


>You have something called the Internet now

The Internet hasn't fundamentally changed the way we think about and study many subjects. There are no internet forums or MOOCs that can deliver many aspects of what you'll learn in an art class or in a lab. Replacing college with MOOCs on the Internet is the new learning to paint from Bob Ross.

>And it's boring as fuck.

Of course, college courses are boring. Bob Ross painting sessions are more fun than taking a painting course in college.

>You have courses from top tier colleges free. You can learn anything.

I don't dispute the fact that we have very good courses online. But if they actually solved the knowledge and skill gap, you'd have to ask yourself why the pay gaps between people in many industries are getting bigger, not smaller.

Please don't get me wrong, I think MOOCs are a very good thing. I was among the first people to receive a course completion certificate from one of the first MOOCs (if not the first MOOC ever) back in 2012, MITx 6002x. As a person who took MOOCs and advocated for them, I would say the idea that the average person can learn just as well from MOOCs/YouTube/Hacker News/"the Internet" and skip college shows a shallow understanding of the matter.


Sorry if that came off a bit harsh; I had one of those days and vented it out on you.

I feel like you have a narrow view of college since you seem to be from the US, and your system is wildly different from ours here. What college offers here is a narrow approach that teaches you bland theory, with professors mostly being people who weren't competent in their fields and so remained in academia (before someone says that isn't true: I said mostly; there are ones who are amazing, and I'd love to be part of their classes any time). You can't explore outside of your curriculum - you can't take an art class, a philosophy class, or a sociology class if you're in CS. You're locked into a course that rarely changes; the material is usually outdated and provides about as much info as a week of googling would. There are no textbooks either, so no material to reference.

College (in the format we have here) stepped on one too many toes for me, and I despise it, so I tend to overreact.


I think part of the ability to succeed is rooted in how invested you actually are in school. I don't think it's great, but people definitely put more effort in when they have some skin in the game...


> And I say fuck Lao-Tzu and what he said

Is this because you don't like him? Don't like that quote? Or do you not like what is being stated? I don't see what is so incendiary about the supplied quote. I actually believe it has quite a bit of merit, and I didn't know who it was attributed to before this post. You can learn anything, sure. Maybe university sucks for learning (I don't entirely agree). But don't you think there are many cases where you don't know how much you don't know about a given topic? In my experience, that's extremely common in this field... just my 2 cents.


> For me, I wish I learned assembly, kernel development, stats and machine learning.

Did you take compilers, learning e.g. parsing theory? If so, are you happy you did or did you feel it's skippable? If not, do you wish you had taken it?


Compilers is definitely my #1 favorite course. Data structures and algorithms is my #2. I kinda have a love-hate relationship with data structures later on because I never feel I am good enough to say I love it. With compilers, I found only love (perhaps because I don't know enough about it to fear it). Compilers was taught by a very competent professor in my college. It was an undergrad 1-semester course in a liberal arts school, so I think it wasn't as hard as courses offered at other schools. Nevertheless, I enjoyed it immensely, it made sense of everything that I learned in those boring theoretical courses like automata and formal languages. It showed that little machines with very little memory and power can do amazing things. It showed why the ancient calculators with practically no memory can parse a very complicated math function correctly. I really found my fascinations being unleashed in compilers. I still remember at the very end of the course, with the people who survived the last assignment, our professor handed out to us the paper "Reflections on Trusting Trust" by Ken Thompson.
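That calculator point can be made concrete. Below is a minimal sketch (my own illustration, not anything from the course) of the recursive-descent technique that lets a tiny machine evaluate a nested arithmetic expression with almost no memory beyond the call stack:

```python
# A minimal recursive-descent evaluator for arithmetic expressions,
# in the spirit of what a memory-starved calculator does.
# Grammar:
#   expr   := term (('+' | '-') term)*
#   term   := factor (('*' | '/') factor)*
#   factor := NUMBER | '(' expr ')'

def evaluate(src: str) -> float:
    # Crude tokenizer: pad operators/parens with spaces, then split.
    for ch in '()+-*/':
        src = src.replace(ch, f' {ch} ')
    tokens = src.split()
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def take():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def expr():
        value = term()
        while peek() in ('+', '-'):
            op = take()
            value = value + term() if op == '+' else value - term()
        return value

    def term():
        value = factor()
        while peek() in ('*', '/'):
            op = take()
            value = value * factor() if op == '*' else value / factor()
        return value

    def factor():
        if peek() == '(':
            take()              # consume '('
            value = expr()
            take()              # consume ')'
            return value
        return float(take())

    return expr()
```

Precedence falls out of the grammar for free (`term` binds tighter than `expr`), which is exactly why a calculator with a few bytes of stack can get `2 + 3 * (4 - 1)` right. This toy skips error handling and unary minus.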

I don't think it needs to be mandatory, because I can see why some people don't like it. People who get out of school to be web devs, for example, won't need it to be competent. But I really think the ideas in the course are useful in real life in many cases. Later on, I even used what I learned in that course to make a poor man's HTML parser that translates rudimentary HTML into instructions for an Adafruit thermal printer. It basically turns the thermal printer into a wireless one with an easy-to-use API that you can interface with from a phone app [1]. The code is for the Raspberry Pi, but it was intended to run on an extremely limited uC embedded in the printer itself. I never had time to build the actual hardware, but it works well enough on a Raspberry Pi right now. Without the stuff I learned in compilers, that would have been impossible.

1: https://github.com/htruong/html-bt-printer
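For a flavor of what such a translator might look like, here's a toy sketch of my own (not the code in the linked repo; the tag set and escape codes are made-up ESC/POS-style placeholders):

```python
# Toy illustration: translate a tiny subset of HTML (<b>, <u>, <br>)
# into control bytes for a hypothetical thermal printer.
# The escape sequences below are placeholders, not a real printer spec.
import re

CODES = {
    ('b', True):  b'\x1b\x45\x01',   # bold on
    ('b', False): b'\x1b\x45\x00',   # bold off
    ('u', True):  b'\x1b\x2d\x01',   # underline on
    ('u', False): b'\x1b\x2d\x00',   # underline off
}

# Matches an open/close <b> or <u> tag, a <br>, or a run of plain text.
TOKEN = re.compile(r'<(/?)(b|u)>|<br\s*/?>|([^<]+)')

def html_to_printer(html: str) -> bytes:
    out = bytearray()
    for close, tag, text in TOKEN.findall(html):
        if text:
            out += text.encode('ascii', errors='replace')
        elif tag:
            out += CODES[(tag, not close)]   # open tag -> "on" code
        else:                                # the <br> alternative matched
            out += b'\n'
    return bytes(out)
```

Even this regex-token-plus-loop shape is the "lexer feeding a translator" pattern from a compilers course, just radically cut down; a real version would need nesting checks and error recovery.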


I took compilers. It was a lot of work, but I think it takes away a good bit of the magic (and adds some more) around what makes everyday programming possible. I don't think it was vital, but if I were designing a CS degree I would definitely not axe it.


For the sake of registering a counterpoint: I disagree entirely that things like kernel development or assembly (and let’s throw in architecture, computability theory, and all that jazz for good measure) are even remotely useful in software engineering. You’ll forget most of it and personally I don’t think it will even meaningfully alter your performance over the long term.

Knowledge that is acquired but not routinely recalled or applied will atrophy.

Sometimes you can make the argument that it’s worth your time to satisfy your own intellectual curiosity and I can understand that. Where people misstep is in thinking all knowledge is created equal.

I used to rationalize forays into theoretical material as holistically improving my capability as a thinker. In hindsight, it’s obvious that was bullshit. There are much more efficient ways of turning yourself into a good thinker that are more directly relevant to how things work in the real world.

The other thing I realized (and this is more specific to me), is that if I were to give myself the luxury of diving into knowledge for its own sake, I would choose a topic in the natural sciences, like physics or astronomy. Computers are interesting, but the theory surrounding them doesn't do much to help explain the nature of our reality, which I personally find much more fascinating.

If I could go back and redo my education, I would try my best to focus on a combination of:

(1) The most pragmatic courses in CS. IMO, the most useful ones beyond the intro courses were data structures and algos, distributed systems (project-driven), OS design (writing a simple OS), basic prob/stat, and intro ML (you do not and never will need deep anything, unless you decide to specialize). You could cover all of that in about a semester and a half tops.

(2) Projects out the wazoo. Real ones. Ideally motivated by a real problem and birthed into the world with all the messiness that entails, and iterated upon until they create real value for someone. You'll learn a stupid amount along the way.

(3) Through some combination of courses, reading, and projects: scripting/automation, API design (easy), modern web dev (project plus lots of Googling and learning to accelerate learning by relying on others), mobile app design (same approach as web dev), PaaS via AWS or GCP (or bespoke), basic security, AMQs, orchestration (at least Docker; maybe Kubernetes), proxying (uses of Nginx) and UNIX/Linux networking fundamentals, metrics and analytics (with an emphasis on learning the value of instrumenting a system/product/business and using the feedback to improve it), databases (Postgres at least; become super proficient at SQL), basic UI/UX design principles, software engineering best practices (from simple things like KISS, coupling, testing, all the way up to reliability, availability, maintainability, scalability, and good decision-making, particularly with respect to knowing how to achieve a sensible balance between time, cost, and quality).

I’m missing a lot, but in short you should know every technology function required in a modern company at least at a basic level. Some people call this "full stack".

If you want a lasting career in tech and you don’t plan to specialize, then this is the way to go. The merits of being a specialist vs a generalist are debated all over the place. Thiel will tell you to relentlessly focus on one thing and ‘vertically integrate’. Scott Adams will tell you to get very good at two or more things and then combine them, since becoming the best at any one thing is extremely hard.

If it’s not obvious, I chose to be a generalist. If I had to explain why, it would be because: (a) I don’t like the risk of committing to one thing (“blockchain engineer" seems like a dubious track, for example); (b) I get bored easily; (c) specialization often, but not always, seems to lead to myopia, which is cancer in any enterprise; this is hard to explain but you’ll know it if you ever see it: everyone operating in their own silos, incapable of cross-disciplinary thinking, lacking empathy for the nature of what other people do, pervasive groupthink, arrogance; (d) if you’re not good enough to be a top-tier specialist (I'm not), then the way you maximize the value you can create, and that you can get paid for, is to be an exceedingly useful generalist who can think across organizational concerns and boundaries effectively.

(4) What Charlie Munger calls “remedial worldly wisdom”.

The most appalling failure of our education system is that it produces people who can take a test but can’t think independently, let alone innovate.

Some of us software engineers get to thinking we’re hot shit. We're not. For one simple reason: what we do is almost always deterministic. Someone has done it before and written it down so that you can do it too. At worst, you have to tweak something a bit to make it work for your situation.

In the real world, nondeterminism drives novel value. In other words, everything wrapped around the lines of code you write is what's important. That means you're going to be hard-pressed to make a dent in anything if all you can do is write code.

Thinking well is a broad subject and you’re going to have to tackle it on multiple fronts, probably for the rest of your life. The most important thing by far is behavioral psychology. Do whatever you possibly can to grasp it. Additionally: systems thinking, philosophy, basic accounting, very basic economics (as soon as they say “Solow Model” run away; ideally well before that), some history. Poor Charlie's Almanack is a good starting point for much of this. It'll help you appreciate why this is important.

You should also know how to apply math to solve any problem you run into that falls short of involving calculus or advanced prob/stat. In a perfect world, you would know how to apply calculus as well, but the opportunities to do so are so few and far between that you likely won’t remember it beyond basic differentiation/integration in the long run (or at least I didn't since I have a poor memory, but that may not apply to you).

(5) Read The Lean Startup. And expand out. Be careful since there’s a lot of garbage in the business genre. Others I can recommend: The Phoenix Project, Lean Analytics, the first part of The Startup Owner’s Manual (the latter two parts only if you ever get past the first stage of building a company). Even if you never choose to work on a startup, it’s the same kind of thinking that will enable you to generate outsized value in any organization. Good decision-making offers at least an order of magnitude better value per unit time than writing code. You will get in the door by writing code. You will get up the ladder by making good decisions.

When you read books, get paper copies and write in them: underline, take notes in the margins, drop in some Post-Its to mark really good sections, etc. If you read a book that really resonates with you, then go further and write up notes on it afterward. Even just underlining a book is ridiculously useful. Underlining alone can allow future you to skim through what you understood to be the most important parts of a 300 page book in roughly 10 minutes.

All of the above may seem like a lot. And honestly, it would all fit in easily if I could swap out the less useful required parts of my CS degree. But that won't be viable until universities offer that option and companies stop thinking a complete CS degree is the qualification they should be targeting. Until that happens, the onus is on you to not let your "schooling interfere with [your] education."


>For the sake of registering a counterpoint: I disagree entirely that things like kernel development or assembly (and let’s throw in architecture, computability theory, and all that jazz for good measure) are even remotely useful in software engineering. You’ll forget most of it and personally I don’t think it will even meaningfully alter your performance over the long term.

I said I want to learn them because I want to (i.e., they are fun to me personally), not because they are useful. I think that's the same reason many people went deep into a field: they found it fun to work on problems in that field. Sure, we're not hot shit, and it might not teach us many useful skills. But somehow the idea of satisfaction in one's field of study and work goes a long way for me. It makes me stay up late at night working on things that matter instead of smoking weed every night, wondering about my life choices and thinking about dropping out. I've been there, done that; I know how it feels to be in a noble place but dead inside. I'd choose to be creative and inspired to work any day of the year.

>Some of us software engineers get to thinking we’re hot shit. We're not. For one simple reason: what we do is almost always deterministic. Someone has done it before and written it down so that you can do it too. At worst, you have to tweak something a bit to make it work for your situation.

>In the real world, nondeterminism drives novel value. In other words, everything wrapped around the lines of code you write is what's important. That means you're going to be hard-pressed to make a dent in anything if all you can do is write code.

I totally agree. Personally, I think of myself as closer to a creative person than a procedural person. I think creativity is very, very important, perhaps as much as competency. That's why inspiration is great, and that's why studying something I find fun is important.


That blog entry is well worth reading. It put a smile on my face! :-)



