These models are simply much more powerful than a traditional search engine and stackoverflow, so people use them for a reason. A friend of mine who had never tried ChatGPT until very recently used GPT-4o to solve a problem he couldn't find a solution to on stackoverflow; next time he'll probably ask the model directly.
I don't know what your friend's prompts were, but this probably speaks to the conversational aspect. I've found success in using LLMs to "search" for things I don't know how to search for - a 'tip of my tongue' type scenario.
"How do I do a for loop" though is a waste of time and energy and should be put into a search engine. There is no need to use the inefficient power needs of an LLM to answer that question. The search engine will have cached the results of that question, leading to a much faster discovery of the answer, and less power draw to do it, whereas an LLM needs to ponder your question EVERY. SINGLE. TIME. A huge waste.