I did a $ find . -type f | wc -l in my ~/www, which I've been adding to for 24 years, and I have somewhere around 8,476,585 files (not counting the ~250 million 30 KB PNG tiles I've been generating since 2014 for 24/7/365 zoomable radio spectrogram maps). I get about 2-3k bot hits per day.
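Excluding the tile tree from a count like that is just an extra -not -path, something like this (the directory name here is made up, mine is different):

    find . -type f -not -path './tiles/*' | wc -l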
Those are the good bots, which say who they are, probably respect robots.txt, and appear on various known bot lists. They are easy to deal with if you really want to. But in my experience it's the bad bots you're more likely to want to deal with, and those can be very difficult: pretending to be browsers, coming from residential IP proxy farms, mutating their fingerprints too fast to appear on any known bot list, etc.
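If you do want to turn the self-identified ones away, a robots.txt is usually all it takes; a minimal sketch (the path and the bot names are just examples, not a recommendation of which to block):

    cat > ~/www/robots.txt <<'EOF'
    User-agent: GPTBot
    Disallow: /

    User-agent: ClaudeBot
    Disallow: /
    EOF

The bad ones ignore it, of course, which is the whole problem.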
Right, if you add up the named bots in my list it only comes to about 1.5k. But there are another 1-2k hits per day from bots pretending to be browsers, and I'm okay with that.
It's just the malicious ones I ban. And indeed I've banned nearly every hosting service in Wyoming (where shady companies don't have to disclose their beneficial owners and it's all fronts for malicious actors) and huge ranges of Russian and Chinese IP space. My list of banned IP ranges is too long for an HN comment.
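The ranges themselves won't fit here, but one way to handle a long ban list is ipset + iptables; a minimal sketch (203.0.113.0/24 is just the documentation placeholder network, not one of my actual ranges):

    # one set holds all the banned CIDR ranges
    ipset create banned_nets hash:net
    ipset add banned_nets 203.0.113.0/24
    # drop anything sourced from a banned range
    iptables -I INPUT -m set --match-set banned_nets src -j DROP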
Today's named bots: GPTBot => 726, Googlebot => 659, drive.google.com => 340, baidu => 208, Custom-AsyncHttpClient => 131, MJ12bot => 126, bingbot => 88, YandexBot => 86, ClaudeBot => 43, Applebot => 23, Apache-HttpClient => 22, semantic-visions.com crawler => 16, SeznamBot => 16, DotBot => 16, Sogou => 12, YandexImages => 11, SemrushBot => 10, meta-externalagent => 10, AhrefsBot => 9, GoogleOther => 9, Go-http-client => 6, 360Spider => 4, SemanticScholarBot => 2, DataForSeoBot => 2, Bytespider => 2, DuckDuckBot => 1, SurdotlyBot => 1, AcademicBotRTU => 1, Amazonbot => 1, Mediatoolkitbot => 1.
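Something along these lines produces that kind of tally from a combined-format access log, where the user agent is the last quoted field (the log path is illustrative and the bot list here is abbreviated):

    awk -F'"' '{print $6}' /var/log/nginx/access.log \
      | grep -oE 'GPTBot|Googlebot|bingbot|ClaudeBot|YandexBot|MJ12bot' \
      | sort | uniq -c | sort -rn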