If water is just used for cooling and the output is warmer water, then it's not really "used up" at all. It may need to be cooled back to ambient and filtered before someone else can use it, but it's still there.
If it were being used for evaporative cooling the argument would be stronger. But I don't think it is - not least because most data centres don't have massive evaporative cooling towers.
Even then, whether we consider it a bad thing depends on the location. If the data centre is in an area with plenty of water, it's no great loss that some of it evaporates. If it's in a desert, it obviously is.
If you discharge water into a river, there are environmental limits on the outlet temperature (which is a good thing, btw). The water can't be very hot, so you have to pump a large volume through, because you can only put a small amount of energy into each kg of water.
If you evaporate the water instead, not only is there no outlet temperature limit, but each kg also absorbs the latent heat of vaporisation. The downsides are that it's a lot more complex, and the water is truly consumed rather than just warmed up.
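To put rough numbers on the difference, here's a back-of-the-envelope sketch for rejecting 1 MW of heat. The 10 K allowable temperature rise is an assumption (real discharge limits depend on the permit and the river), and the latent heat figure is the standard value for water:

    # Rough comparison: water needed to reject 1 MW of heat via
    # once-through (sensible) cooling vs. evaporative cooling.
    # Assumed numbers - actual limits vary by site and permit.

    HEAT_LOAD_KW = 1_000      # 1 MW of heat to reject
    CP_WATER = 4.186          # kJ/(kg*K), specific heat of water
    DELTA_T = 10              # K, assumed allowable outlet temperature rise
    LATENT_HEAT = 2_257       # kJ/kg, latent heat of vaporisation
                              # (at 100 C; a bit higher at cooling-tower temps)

    # Once-through: each kg only carries cp * dT of heat away with it.
    once_through_kg_s = HEAT_LOAD_KW / (CP_WATER * DELTA_T)

    # Evaporative: each kg absorbs the full latent heat as it evaporates.
    evaporative_kg_s = HEAT_LOAD_KW / LATENT_HEAT

    print(f"Once-through: {once_through_kg_s:.1f} kg/s pumped (all returned, just warmer)")
    print(f"Evaporative:  {evaporative_kg_s:.2f} kg/s actually consumed")
    print(f"Ratio: ~{once_through_kg_s / evaporative_kg_s:.0f}x more water moved, but none of it is lost")

With those assumptions you move roughly 24 kg/s for once-through cooling versus evaporating about 0.44 kg/s, i.e. the evaporative route needs ~50x less water flow but that water is genuinely gone.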