I think comments like these are borderline trolling in a community like HN.
The languages used in various parts of a CS curriculum are one component to the whole. Holding up that component as the reason that the sky is falling is, at best, disingenuous. I can construct terrible CS curriculums that start with C, and terrific ones that start with Java.
If those who construct the curriculum want the beginning courses to focus on algorithmic thinking, I think it's fair to use a language that abstracts away much of the physical machine. The abstractions can be peeled away in later courses.
If, instead, they want the beginning courses to focus on the realities and difficulties of dealing with computer systems, it makes sense to start with something like C. They can then introduce the abstractions that let people manage those difficulties in later courses.
I think both approaches are valid, as long as a student gets a view of the important points of the field. I can even see arguments why one approach might be better than the other. But claiming that one approach represents the failure of our CS academic system is zealotry.
You have to meta it one level. The reason C was taught is that it's close to the machine without actually being assembly language. That's pedagogically useful because it exposes the student to the properties of the machine without having to be too specific about the implementation (not that MIPS assembly isn't also a good choice). Similarly, the reason to teach Smalltalk is to teach OO, and the reason to teach Lisp is to teach the lambda calculus. The only reason Java is taught is because students demand it - they've heard that if you know Java you can get a good job.
Now, that doesn't imply that Java shouldn't ever be taught. But the reasons for choosing any language should be academic reasons. Particularly, it is bizarre to see academia trailing industry in language adoption.
You claim that "the only reason Java is taught is because students demand it," and it appears your objection rests on this claim. Do you have any support for it beyond your personal belief? My undergrad program made the switch to Java right before I graduated, and it is a counter-example to your claim.
Also, I doubt your claim about C is true. C was used because it was close to the machine without actually being assembly. I suspect C was then taught because it was used everywhere.
I think his claim is based on the belief that Java has no academic merit (in the context of the other languages he mentioned) - it's just useful for development.
One step further: if you think that learning a specific language in university is going to get you a job, you are not studying computer science; you should be going to a trade school.
A CS degree is universal; it should be language-agnostic.
One computer language or another, it doesn't matter one bit; they're all functionally equivalent. Just as a chef has 30 knives to choose from, you have a palette of languages to choose from to solve a given problem.
If you really understand computers, then the languages are just a means to an end.
I disagree with your statement that choice of language "doesn't matter one bit." Some languages are better at some tasks than others. Appealing to functional equivalence ignores the relative cost (in time and characters typed) of expressing the same idea in different languages.
I suspect you're abusing the term "theoretical computer science," which is surprisingly common in this crowd. I assume you actually mean basic computer science concepts relevant to programming.
To address your question: Java abstracts away memory management - not just dynamic memory allocation; even common off-by-one mistakes result in a runtime exception. In C, it's possible, but unlikely, that an off-by-one will segfault. You probably own the memory just past your array, so you're more likely to get strange errors because you're invisibly overwriting values.
If you want to focus on algorithmic thinking, and not the realities of a computer, this is a win. As another poster pointed out above, I think other languages are better suited for this, but Java is still valid.
"If those who construct the curriculum want the beginning courses to focus on algorithmic thinking, I think it's fair to use a language that abstracts away much of the physical machine. The abstractions can be peeled away in later courses."
I don't know that Java does this all that much better than C. The problem with C for a beginning student isn't so much that you have to manage memory manually - it generally takes a few weeks to even get to malloc() in a C-based introductory course - but that C gets in your way with explicit typing, #includes, etc. Java does away with some of that but introduces its own OO scaffolding to get in your way instead. Instead of having to write main() and #includes, the Java student has to enclose their functions in a class and so forth. Let's compare Hello World in C, Java, and C#.
C:
#include <stdio.h>
int main(void)
{
printf("Hello, World!\n");
return 0;
}
Java:
class HelloWorldApp
{
public static void main(String[] args)
{
System.out.println("Hello World!"); // Display the string
}
}
C#:
using System;
class HelloWorldApp
{
    public static void Main(string[] args)
    {
        Console.WriteLine("Hello World!"); // Display the string
    }
}
The Java and C# examples are even more cluttered than the C example when it comes to superfluous tokens: each has a class declaration, an unnecessarily elaborate method signature, and a print command with three levels of object drill-down. When you get to the simple procedural programs that a beginning student will write, this mysterious crud stays unexplained for longer. It's not enough to explain typing as you would in C or Pascal; you have to talk about object-oriented programming before you get to problems complex enough to justify that level of abstraction.
If you really want an abstract language to enforce algorithmic thinking, pick one that doesn't have all that extra mental burden when you first approach it.
Perl 5.8
print "Hello World!\n"
Perl 5.10
use v5.10;
say "Hello World!"
Python
print "Hello World!"
Ruby
print "Hello World!"
The cool thing is that these languages still have subroutines and classes and so forth, but they don't force you to declare a class, declare a subroutine, and call an object method just to code "hello world".
Java has advantages over C. These advantages don't include "letting beginning programmers focus on algorithmic thinking by using high level abstractions". Java's higher level than C in that it protects you from naked pointers and lets you do OOP, but that's not the type of high-level abstraction that helps a beginning programmer, especially not when it comes at the cost of forcing them to put everything in classes and methods.
If those who construct the curriculum want the beginning courses to focus on algorithmic thinking, I think it's fair to use a language that abstracts away as much as possible. We have no shortage of good interpreted languages to accomplish this.
Actually, I'd argue that with C you have to start managing memory manually before you even get to malloc. The abstraction advantage of Java over C (not that I think Java is necessarily a better intro language) is that you can generally explain the syntax in abstract concepts and then use it like you'd expect. With C, you're far more likely to encounter scenarios that don't fit a simple mental model.
For example, unless you're concerned with specific performance issues, you're not likely to care how a string is implemented in Java. It's difficult to use strings in C without understanding memory. Without understanding when strings are mutable and when they aren't, what null-terminated means, how "%s" works, and so on, you will quickly run into unexpected behavior and will likely just trial-and-error until you get something that seems to work. When you understand that C is a syntax for allocating and manipulating memory, it tends to make a lot more sense.
I purposely phrased my statement to allow for dynamic languages like Python - which is what I would probably choose as a starting language. I also constructed my comment to address the same arguments that popped up in the "MIT switched to Python" discussions.
I've taught intro-to-Java labs to undergrads, and I did get questions about the required scaffolding. I answered them, but also told the students they didn't need to understand it yet. I'd rather not have to do that.
I would argue that Scheme fulfills the same purpose we discussed--getting out of the way and letting students focus on algorithms--except it emphasizes expressing those algorithms in a functional style.