
Oxford and Cambridge both begin with an ML, the same way I was taught last century (although of course it's a different ML today; I believe one of them uses OCaml).

Unavoidably I am biased, but I nevertheless feel confident that this is the correct way to teach programming to CS students in particular. One reason to use an ML, and especially a less common ML, is to create a level playing field. Teaching Python, Java (as they do today at his institution), or C (yes, that happens) means some of the fresh students arrive with months, years, or occasionally decades of prior practical experience in the "first language" before day one. These students will find most or all of the exercises trivial, so they won't actually learn what you were teaching, which may surprise them too. They're also likely to be disruptive; after all, they're bored because this is easy.

Now, for just a "level playing field" you could choose almost anything unpopular these days. Pascal? Scheme? Fortran? But that's not the only reason. MLs have a nice type system, which is great because, probably simultaneously on the theory side of their course, your students are learning about types, and Hindley–Milner takes you from theory to the real world: from "why this is a good idea" to "huh, the machine did exactly what I meant, that's neat". If your colleagues are teaching them a sane approach to types, but you're teaching a language which says YOLO, everything is just a bag of bits, don't worry about it, that's not a cohesive message. Likewise for function composition. It'd be nice if the first language you teach has actual bona fide mandatory TCO, so that when the elegant explanation in the theory class shows tail calls, the result in your practical class isn't a room full of stack overflow errors (or worse).
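To make the stack-overflow point concrete, here's a small sketch in Python (chosen only because the thread discusses it; the function name is made up): a perfectly tail-recursive function still exhausts the stack, because CPython does not perform TCO.

```python
import sys

def count_down(n):
    """Tail-recursive, but CPython keeps a stack frame per call anyway."""
    if n == 0:
        return "done"
    return count_down(n - 1)  # a tail call, yet not eliminated

# Small inputs are fine...
assert count_down(10) == "done"

# ...but anything deeper than the interpreter's recursion limit overflows.
try:
    count_down(sys.getrecursionlimit() + 100)
    overflowed = False
except RecursionError:
    overflowed = True
```

In a language with mandatory TCO, the equivalent definition runs in constant stack space for any depth.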

This is reasonable for a First Language because you're expecting to teach these students several languages. The same doesn't apply to say, an optional module for Geographers that gets them enough Python to use a GIS competently. But we are (hopefully) not relying on Geographers to invent the next Python.



I don't know if I agree with ML as the first language to teach students.

The syntax is vastly different from most of the "C like" languages out there, and functional-style code is miles away from what actually gets executed by the CPU.

The fact that you even mentioned TCO means you are aware of that, since TCO is a compiler-level optimisation: it effectively replaces your recursion with an imperative sequence of instructions simulating a loop, or even precomputes everything in the compiler.
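That transformation is easy to show by hand. A sketch in Python (hypothetical names), doing manually what a TCO-capable compiler does mechanically: the tail-recursive accumulator version and the imperative loop compute the same thing.

```python
def fact_rec(n, acc=1):
    """Tail-recursive factorial: the recursive call is the last thing done."""
    if n == 0:
        return acc
    return fact_rec(n - 1, acc * n)

def fact_loop(n):
    """What TCO effectively turns the above into: rebind the arguments, jump back."""
    acc = 1
    while n != 0:
        n, acc = n - 1, acc * n
    return acc

assert fact_rec(10) == fact_loop(10) == 3628800
```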

Python also does not treat types as "a bag of bits"; it has a strong concept of types, and you can make them explicit using annotations.

What would be the difference between running a compiler vs running something like mypy? You as the programmer still have to reason about the code all the same.
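For what it's worth, both halves of that claim are easy to demonstrate in a minimal sketch: CPython refuses ill-typed operations at runtime (strong, if dynamic, typing), and annotations give a checker such as mypy the information to flag the same mistake before the program runs.

```python
def greet(name: str) -> str:
    """Annotations don't change runtime behaviour, but mypy can check them."""
    return "hello, " + name

assert greet("world") == "hello, world"

# Strongly typed: Python won't silently coerce a str and an int together.
try:
    "1" + 1          # mypy would also reject this statically
    mixed_ok = True
except TypeError:
    mixed_ok = False
```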

I would absolutely also argue that multiple languages should be taught, since in the real world that will be expected of you, so you might as well get a head start.

Why not make the languages taught be languages you are very likely to use? I would be all for teaching C/C++, JS/TS, Python, C#/Java, Go, and then sure, add in an ML-style language as well.

The arguments for Python -> C -> ML are pretty strong to me: you get a very broad exposure to a large collection of programming paradigms at many abstraction levels using just that. But now we are delving into territory where I begin to argue about what should be taught, so let's leave it there.


I think you're making roughly the same mistake I described in the grandparent comment, where people confuse the C abstract machine (roughly a caricature of the PDP-11) for their very different real computer. Yes, C works very like this abstract machine, as well it might, but that's not how your computer actually works, and so if you tell students that's what's really going on you're presenting them a fiction.

If you don't like TCO because it's "effectively replacing your recursion with an imperative set of instructions simulating a loop", then you're really going to hate the fact that the machine doesn't have loops anyway. The machine just has several flavours of go-to and computed go-to, and none of the languages you're proposing to teach instead provide these hard-to-use primitives (no, C's goto is not the full go-to complained of in the famous letter).
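This is easy to check. A sketch using Python's `dis` module (Python again only because the thread uses it): even a plain `for` loop is compiled down to conditional and unconditional jumps, i.e. flavours of go-to.

```python
import dis

def total_to_three():
    """A perfectly ordinary loop, as written by the programmer."""
    total = 0
    for i in range(3):
        total += i
    return total

# The "loop" in the compiled bytecode is really just jump instructions
# (e.g. JUMP_BACKWARD on recent CPythons, JUMP_ABSOLUTE on older ones).
opnames = [ins.opname for ins in dis.get_instructions(total_to_three)]
assert any("JUMP" in name for name in opnames)
```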

What you wrote will be transformed, whether it's a tail call, a for-each loop, or C-style pointer iteration; none of these are actually what the machine does. We should not prefer to write the code that most closely resembles what it'll get transformed into. Knowing how the transformation works is interesting, probably enough to justify showing some of it in an undergraduate degree, but the purpose of writing high-level languages like C is to express ourselves more clearly, something we certainly do want these students to learn. Yes, it needs to be transformed into machine code, but it's more important that humans, including us and often future maintenance programmers, can figure out what is intended.




