The article doesn't understand programmers. People will stay because they are passionate about OCaml and there are not a lot of OCaml jobs.
When hiring for a permanent position, I expect a programmer to be able to learn a new language and environment. An OCaml programmer applying for a Python or C position would be looked on very favorably. Far more attention-getting than “full-stack programmer”.
If your only professional experience is OCaml and you want to look elsewhere for work, your opportunities shrink noticeably, especially if you're looking for a position that requires experience. It's much more palatable for a company to hire someone out of college and invest in training them on tooling. But many companies won't get past the resume if a senior developer would need extra time to onboard.
This is likely true for many companies. However, it is also a signal of what type of working environment it will be. I value the ability to learn quickly, and creativity, over pre-training.
`JSON.parse` actually does give you that option via the `reviver` parameter, which gives you access to the original string of digits (to pass to `BigInt` or the number type of your choosing), so per this conversation it fits the "good parser" criteria.
To be specific (if anyone was curious), you can force BigInt with something like this:
// MAX_SAFE_INTEGER is 9007199254740991, which is 16 digits.
// You could instead check for exactly 16 digits and compare against that
// string digit by digit if absolute precision is desired.
const bigIntReviver = (key, value, context) =>
  typeof value === 'number' && Math.floor(value) === value && context.source.length > 15
    ? BigInt(context.source)
    : value
const jsonWithBigInt = x => JSON.parse(x, bigIntReviver)
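For example (note that the reviver's third `context` argument comes from the newer JSON.parse source-text-access feature, so this needs a recent engine; in older runtimes `context` is undefined):

// 9007199254740993 (2^53 + 1) is not representable as a 64-bit float,
// so a plain number here would silently round to ...992.
const parsed = jsonWithBigInt('{"id": 9007199254740993}')
console.log(parsed.id)        // 9007199254740993n (a BigInt)
console.log(typeof parsed.id) // "bigint"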
Generally, I'd rather throw if a number is unexpectedly too big; otherwise you will mess up the types throughout the system (the field may not be monomorphic), and you will outright fail if you try to use Math functions, which don't accept BigInts.
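A minimal sketch of that throwing approach (the `strictIntReviver` name is mine, not from the thread; it doesn't need the context argument at all):

// Throw on any whole number that has already lost precision, rather than
// silently changing its type to BigInt. Caveat: every float above 2^53 is
// a whole number in IEEE 754, so legitimately huge floats throw here too.
const strictIntReviver = (key, value) => {
  if (typeof value === 'number' && Math.floor(value) === value && !Number.isSafeInteger(value)) {
    throw new RangeError(`integer outside the safe range: ${value}`)
  }
  return value
}
const parseStrict = x => JSON.parse(x, strictIntReviver)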
Sorry, yes, I was thinking of the context object with the source parameter.
The issue it solves is a big one, though: without it, JSON.parse cannot faithfully parse numbers larger than what a 64-bit float can represent exactly (e.g. big integers).
Hard rules are the problem. There is a lot of "it depends."
After over 40 years of programming, I continue to reduce the size of my functions and find them easier to write and to understand when I return to them. Ten lines is now my personal guideline.
However, a linear function with only tiny loops or conditionals can be easily understood even when it is hundreds of lines long; not so with nested conditionals and loops, where there is a natural decomposition into functions.
I observed the same guidelines-become-rules problem when test coverage became popular. Coverage soon became a metric rather than a tool for thinking about code and tests. People became reluctant to add sanity checks for things that should never happen, because the never-taken branches brought down code coverage.
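A contrived example of the kind of check people started leaving out (the function and its cases are made up for illustration):

function applyDiscount(kind, price) {
  switch (kind) {
    case 'none':   return price
    case 'member': return price * 0.9
    default:
      // Sanity check for a case that should never happen. Tests (rightly)
      // never reach this branch, so it shows up as uncovered code.
      throw new Error(`unknown discount kind: ${kind}`)
  }
}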
There are certainly functions written so cleverly that it isn't apparent how they manage to work at all in a few lines. Some by my own hand six months ago. The solution is an unsexy one, but it always works: write a book's worth of comments near that function explaining absolutely everything, and why it was done that way.
Circos is useful when the data is sparse; otherwise it quickly becomes extremely time-consuming to near impossible to interpret. The most you can say is "wow, there is a lot going on".
I know within five seconds whether a Circos plot is worth looking at.