> ...but, this 'dont use anything but the standard library' movement that keeps popping up here and on reddit is just ridiculous. It's just fall out from people who can't be bothered figuring out how to use 3rd party dependencies.
Unlikely. I myself slowly arrived at this philosophy after 15 years in programming. Nothing lasts forever except bit rot! Look, if I want to revisit my youth in old age, that will mean looking at, running, and playing with old code.
Do I want to spend my time discovering and documenting obscure bugs, in rare contexts and situations, in other people's code, or in my own? Guess which.
Has a 3rd-party framework/lib been fine-tuned and painstakingly optimized for many months for my actual use case, or for its authors' and their audience's? Guess which.
Is what these actually offer any kind of rocket science, or just pretty basic, easily grokked stuff? In my experience, 80% of it I don't even need, and the 10% I really do need, which nobody else covers, turns out to be nigh on impossible to integrate with their own peculiar idiomatic ways of describing things, ways I would never have any trouble parsing had they used my peculiarities instead. Guess.
Now don't get me wrong: am I going to reinvent CUDA, OpenGL, the .NET/JVM runtimes, or other crucial infrastructure layers? Of course not.
But most helper libraries and bootstrapping frameworks are at best fluff for an MVP. After that, get that time bomb the hell out of the codebase you're building to last the ages. Do it for your golden years.
(I make an exception for 3rd-party Haskell code, for now. First, you usually need just the functions, together with the insights that drove their design and workings, not to copy-paste but to adopt and adapt, and thereby learn from most efficiently. Second, the core language desugars to just a handful of primitives; nearly everything else in Haskell is technically syntactic sugar and language extensions, so writing "pure" core Haskell would probably be as nightmarish as coding in raw Lisp s-expressions (then again, I know there's some delight in that). Third, with referential transparency, and thus equational reasoning, guaranteed for every piece of Haskell code that compiles, the fragility is far lower and comes mostly from IO interactions with the outside world, which I'm ultimately going to take back control over myself anyway. Further, most Haskellers are, as of now, far more seasoned than I am, and I can only learn from their work. What's more, much of even the built-in code is written for learners and for clarity, not for efficiency, robustness, or all-edge-cases correctness, and will need custom replacements over time, on a case-by-case basis. Lastly, it will be a comparative breeze to just keep older compiler versions around; the toolchain is fairly self-contained, Stack excels at this, and one could do it manually should Stack itself get stuck decades later.)
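To make the equational-reasoning point concrete, here's a minimal sketch (the names are mine, purely illustrative): because pure Haskell functions are referentially transparent, algebraic laws like `map f . map g == map (f . g)` hold for all pure `f` and `g`, so you can rewrite code by substituting equals for equals and know the behavior is preserved, something no amount of testing gives you in an effectful language.

```haskell
-- Two pure pipeline stages; hypothetical names for illustration.
doubleAll :: [Int] -> [Int]
doubleAll = map (* 2)

incAll :: [Int] -> [Int]
incAll = map (+ 1)

-- The "obvious" composition: two traversals of the list.
pipelineNaive :: [Int] -> [Int]
pipelineNaive = incAll . doubleAll

-- By the map-fusion law (map f . map g == map (f . g)), this single
-- traversal is provably equal to pipelineNaive for every input list.
pipelineFused :: [Int] -> [Int]
pipelineFused = map ((+ 1) . (* 2))

main :: IO ()
main = do
  print (pipelineNaive [1, 2, 3]) -- [3,5,7]
  print (pipelineFused [1, 2, 3]) -- [3,5,7]
```

The same substitution in code that mixed the mapping with, say, logging would silently reorder effects; that's the fragility-lives-at-the-IO-boundary point above.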