Hacker News

You have to meta it one level. C was taught because it was close to the machine without actually being assembly language. That's pedagogically useful because it exposes the student to the properties of the underlying machine without being too specific about implementation (not that MIPS assembly isn't also a good choice). Similarly, the reason to teach Smalltalk is to teach OO, and the reason to teach Lisp is to teach lambda calculus. The only reason Java is taught is because students demand it: they've heard that if you know Java you can get a good job.

Now, that doesn't imply that Java shouldn't ever be taught. But the reasons for choosing any language should be academic ones. In particular, it is bizarre to see academia trailing industry in language adoption.



You claim that "the only reason Java is taught is because students demand it," and it appears your objection rests on this claim. Do you have any support for it beyond your personal belief? My undergrad program switched to Java right before I graduated, and it is a counterexample to your claim.

Also, I doubt your claim about C is true. C was used because it was close to the machine without actually being assembly. I suspect C was then taught because it was used everywhere.


I think his claim is based on the belief that Java has no academic merit (in the context of the other languages he mentioned) - it's just useful for development.


Well exactly. What aspect of theoretical computer science is Java better suited to teaching than any other language?

Kids these days don't understand that computer science and software development are distinct disciplines.


One step further: if you think that learning a specific language in university is going to get you a job, you are not studying computer science; you should be going to a trade school.

A CS degree is universal; it should be language-agnostic.

One computer language or another, it doesn't matter one bit; they're all functionally equivalent. Just like a chef has 30 knives to choose from, you have a palette of languages to choose from to solve a given problem.

If you really understand computers then the languages are just a means to an end.


I disagree with your statement that choice of language "doesn't matter one bit." Some languages are better at some tasks than others. Appealing to functional equivalence ignores the relative cost (in time and characters typed) of expressing the same idea in different languages.

See PG's essay where he talks about the Blub Paradox: http://www.paulgraham.com/avg.html

Note that I think this point is different than how this thread started, which was about teaching.


I'm sorry if I wasn't clear enough, I thought I covered the 'some languages are better at some tasks than others' with the 30 knives analogy.

The right tool for the right task.


I suspect you're misusing the term "theoretical computer science," which is surprisingly common in this crowd. I assume you actually mean basic computer science concepts relevant to programming.

To address your question: Java abstracts away memory management, and not just dynamic allocation. A common off-by-one mistake results in a runtime exception. In C, it's possible, but unlikely, that you'll get a segfault: you probably own the memory just past your array, so you're more likely to get strange errors because you're invisibly overwriting values.

If you want to focus on algorithmic thinking, and not the realities of a computer, this is a win. As another poster pointed out above, I think other languages are better suited for this, but Java is still valid.
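A minimal sketch of the off-by-one point above (the class name and array size are my own, for illustration): in Java, a write one past the end of an array is caught at runtime as an exception, where the equivalent C code would most likely scribble over adjacent memory silently rather than segfault.

```java
public class BoundsDemo {
    public static void main(String[] args) {
        int[] a = new int[4];   // valid indices: 0..3
        try {
            a[4] = 42;          // classic off-by-one: one past the end
        } catch (ArrayIndexOutOfBoundsException e) {
            // Java's bounds check turns the bug into an immediate, local error.
            System.out.println("caught: " + e.getMessage());
        }
        // In C, `a[4] = 42;` would typically not fault: it would silently
        // overwrite whatever happens to live just past the array.
    }
}
```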


It's quite possible to argue that lambda calculus is not relevant to software development - I think that's the sort of thing gaius was talking about.



