Fair! I'd definitely agree with that! I don't really know the author's intentions here, but my read of this article is that it's aimed at the people who ARE skipping the thinking entirely and handing it off to the LLM. I agree completely: to me, LLMs are effectively a slightly more useful (sometimes vastly more useful) search engine. They help me discover features or mechanisms I didn't know existed and show me why they're worth using. I'm still the one doing the thinking.
I'd argue we're using them "right," though.