I am just finishing The Innovators by Walter Isaacson (the guy who also wrote Jobs' biography) and it does a good job of tracing the threads of theoretical information management systems through to their practical manifestation in Berners-Lee's invention of the web. It's a decent read and I'd recommend it if you have an interest in the history of computing as a whole, and in how the various ideas and concepts that came to shape how computers exist today were first born and later realised.
interrupts in out-of-order processors were the topic of last week's coursera comparch course https://www.coursera.org/learn/comparch - nice to see the discussion in the context of a smaller machine
really nice project - i was playing with compilers and javascript a while ago (see http://embeddednodejs.com/compiler) as a simple experiment to learn about javascript's role in compilers
Octal (in the current context) simply refers to the fact that the opcodes are expressed in octal (base 8), as opposed to hex (base 16) or binary (base 2).
Octet, on the other hand, is simply an alternative word for byte (it is no longer much used and is somewhat archaic). Octets are in no way different from bytes and are just another way to refer to 8 bits.
Octal used to be a lot more popular before the 80s, and most languages still interpret numbers with a leading zero as octal values.
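To make that concrete, here's a minimal C sketch (the value 0755 is just illustrative) showing the leading-zero convention and how the same number looks in each base:

    #include <stdio.h>

    int main(void) {
        int n = 0755;          /* leading zero makes this octal: 7*64 + 5*8 + 5 = 493 */
        printf("%d\n", n);     /* prints 493 (decimal) */
        printf("%o\n", n);     /* prints 755 (octal) */
        printf("%x\n", n);     /* prints 1ed (hex) */
        return 0;
    }

Writing 0755 when you meant 755 is a classic source of bugs for exactly this reason.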
"Octet" is still commonly used and in no way archaic. In fact, most standards use "octet" instead of "byte" because, despite IEC having said otherwise in IEC 80000-13, most other references still define the byte as the smallest addressable unit that can represent any member of the host system's character set.
Notably, C99 (and its predecessors) follow this convention. Owing to history, and to the curiosities and resolution constraints of fixed-point arithmetic, a lot of devices with a DSP in them consider a byte to be anything but 8 bits.
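If you want to check what your toolchain thinks a byte is, a quick C sketch (nothing target-specific assumed):

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* CHAR_BIT is the width of a byte in bits; C only guarantees it's >= 8 */
        printf("bits per byte: %d\n", CHAR_BIT);
        return 0;
    }

On a PC this prints 8, but on some DSP toolchains it can legitimately be 16, 24, or 32.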
There used to be systems that used 5, 6 or 7 bits for a byte, too, but as far as I know, most of those really have gone the way of the Dodo. However, since most of those systems were used in fields like telecom (and they weren't all what we'd call "computers" nowadays), "octet", rather than "byte", is still commonly used in virtually every networking-related context.
I think I was a bit too careful not to hyperbolize and ended up doing the opposite. Non-8-bit bytes are still very common in absolute numbers. For instance, SHARC DSPs, (unfortunately...) one of the most popular DSP families, operate with 32-bit bytes, if my (repressed) memory serves correctly. I wouldn't be surprised if there were a lot, lot more deployments of such systems than iPhones. Same goes for telecom and friends. There are a lot of such systems; I'm mentioning SHARC because it's the one I programmed most recently and can probably still do a decent job answering questions about it.
Given that these systems have far fewer programmers per unit manufactured, I'm not surprised that they see less exposure in the programming community. But it's not a matter of pedantry; it's really a matter of correct usage, and of not being creeped out when the compiler insists that sizeof(char) and sizeof(int) are both 1 on a 32-bit machine.
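For the curious, this is what that looks like; a sketch assuming a word-addressed target where CHAR_BIT is 32 (the comments show what you'd get there versus on a typical PC):

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        printf("CHAR_BIT     = %d\n", CHAR_BIT);       /* 32 on such a DSP, 8 on a PC */
        printf("sizeof(char) = %zu\n", sizeof(char));  /* always 1, by definition */
        printf("sizeof(int)  = %zu\n", sizeof(int));   /* 1 there too: int fits in one 32-bit byte */
        return 0;
    }

sizeof counts bytes, not 8-bit units, so a compiler reporting 1 for both is being perfectly standards-conformant, not broken.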
"Octet" is also used extensively in non-English documents, French being a notable example, where "To", "Go", "Mo", and "Ko" are encountered instead of "TB", "GB", "MB", and "KB" respectively.
indeed, my first thought was: how does it differ from classic d3 charts? i like d3 a lot, but it usually takes some boilerplate code to set up charts. when working with multiple datasets the d3 commands look a bit confusing at times, so higher-level abstractions are interesting to see
Can't comment on the OpenOCD issue; I just use the tools Nordic and SEGGER provide, since they're cross-platform these days. Even the GDB server works as expected!