>it feels a bit like cheating. Because you know that somewhere, somehow, the code you write is translated to assembly instructions. To the machine language.
I think you should go all the way and show the machine code along with the assembly language. Show them that in the end it's all numbers.
Well, philosophically it's not even "numbers" per se, it's some physical tokens that are moved around in a manner that is isomorphic to a kind of arithmetic and logic.
[edit] And yes, by "physical tokens" I mean voltages and not atoms, which you can argue are not "really" what is going on ... etc
From a pedagogical standpoint, it's a shame we've effectively lost self-modifying code. It's a very good way to show students that everything the computer does is driven by data, and the only difference between "code" and "data" is interpretation; this is the heart of the stored-program computer, as opposed to things like plugboard systems or adding machines, where the instructions were in hardware and were, therefore, a different kind of thing from the data.
It's also a good introduction to pipelines and caches, when some bright spark tries to modify the next instruction and it "doesn't take" in the way the student expects.
Why do you think we've lost it? You can still write self-modifying code today. You may need to change permissions on the pages if you have a system that doesn't allow both execution and writing at the same time, but that's easy.
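To make it concrete, here's a minimal sketch (assuming x86-64 Linux; the mmap/mprotect dance is the permission change mentioned above). Hand-encoded bytes are written into memory as ordinary data, the page is flipped to executable, and the CPU runs them:

    /* Minimal sketch, assuming x86-64 Linux: machine code written as plain
       bytes (data), then the page is made executable and the bytes are run. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* mov eax, 42 ; ret  -- encoded by hand */
        unsigned char code[] = { 0xb8, 0x2a, 0x00, 0x00, 0x00, 0xc3 };

        /* Start with a writable page; W^X systems won't grant write+exec at once. */
        void *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                          MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (page == MAP_FAILED)
            return 1;

        memcpy(page, code, sizeof code);              /* the "data"...            */
        mprotect(page, 4096, PROT_READ | PROT_EXEC);  /* ...reinterpreted as code */

        int (*fn)(void) = (int (*)(void))page;
        printf("%d\n", fn());                         /* prints 42 */
        return 0;
    }

On architectures whose instruction caches aren't coherent with the data path (ARM, for instance) you'd also need to flush the instruction cache after writing, which is exactly the "doesn't take" surprise from the earlier comment; x86 largely handles that coherence in hardware.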
>I avoided using any TypeScript- or JavaScript-specific language features in the [...]
I don't understand this part. The hardest part of compiling TypeScript to machine language is that you don't know what an operator does, as you don't know the exact type.
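Concretely: if both operands are effectively `any`, the compiler can't statically choose between a numeric add and a string concatenation, so the generated code has to call into a runtime helper that dispatches on type tags. A rough sketch of what such a helper might look like (hypothetical names, assuming a tagged-value representation; not from the book):

    /* Hypothetical runtime helper: with no static type for `a + b`, compiled
       code can't use a single add instruction; it calls something like this,
       which dispatches on a type tag at run time. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    typedef enum { TAG_NUM, TAG_STR } Tag;

    typedef struct {
        Tag tag;
        union { double num; const char *str; } as;
    } Value;

    Value rt_add(Value a, Value b) {
        if (a.tag == TAG_NUM && b.tag == TAG_NUM)
            return (Value){ TAG_NUM, { .num = a.as.num + b.as.num } };

        if (a.tag == TAG_STR && b.tag == TAG_STR) {
            char *s = malloc(strlen(a.as.str) + strlen(b.as.str) + 1);
            strcpy(s, a.as.str);
            strcat(s, b.as.str);
            return (Value){ TAG_STR, { .str = s } };
        }

        /* A real JS runtime would coerce mixed operands (number -> string, etc.);
           omitted to keep the sketch short. */
        abort();
    }

    int main(void) {
        Value n = rt_add((Value){ TAG_NUM, { .num = 1 } },
                         (Value){ TAG_NUM, { .num = 2 } });
        Value s = rt_add((Value){ TAG_STR, { .str = "foo" } },
                         (Value){ TAG_STR, { .str = "bar" } });
        printf("%g %s\n", n.as.num, s.as.str);   /* 3 foobar */
        return 0;
    }

Every untyped `+` then costs a call plus branches instead of one add instruction, which is part of why JITs go to such lengths to specialize on observed types.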
To me, the book doesn't seem to have a specific, limited scope; it doesn't seem to choose between interpretation, JIT compilation techniques (optimizing/deoptimizing), and static compilation.
For a modern language I would probably use Rust instead (and I'd stay with the Intel architecture to make trying things out easier).
Looks cool! My first impression from the title, though, was that it was a book about building a compiler for the graphical MIT Scratch language [0], and I was very interested to see how someone would do that.
For clarity, it's "from Scratch" like making an apple pie from scratch. I'm looking forward to the book!