I remember one of the books in my algorithms class actively discouraged debuggers and advocated for harder reasoning and inserting print statements. Their reasoning was that programmers are too quick to reach for a debugger rather than really understanding their code. I'm still not sure how I feel about that.
If you were trying to learn math, you shouldn't be reaching for a calculator every time you hit a difficult arithmetic problem.
A debugger is the same. Use it once you've mastered the skill of desk-checking code, and you understand how to form hypotheses about where bugs might be and binary-search with print statements to narrow them down.
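To make the binary-search-with-prints idea concrete, here's a minimal Python sketch. The pipeline stages and the data are made up for illustration; the point is that a single print at the midpoint of the flow tells you which half to inspect next, and you repeat until you've cornered the bug.

```python
# Hypothetical pipeline where some stage corrupts the data.
# Binary-search debugging: print the intermediate value near the midpoint
# to decide which half of the pipeline to inspect next.

def normalize(xs):
    return [x / 100 for x in xs]

def transform(xs):
    return [x * x for x in xs]

def aggregate(xs):
    return sum(xs) / len(xs)

def pipeline(xs):
    a = normalize(xs)
    b = transform(a)
    # Midpoint check: if 'b' already looks wrong, the bug is in
    # normalize/transform; if it looks right, look from aggregate onward.
    print("after transform:", b[:5])
    return aggregate(b)

if __name__ == "__main__":
    print(pipeline([10, 20, 30, 40, 50]))
```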
When actually developing software, you then reach for the debugger for convenience. But the skills learned above are still very useful, and building them is in fact a good way to learn how to problem-solve.
If you only ever relied on a debugger, it'd be like only knowing how to use a calculator and having no mental arithmetic.