They may have missed that one, because at the same time they introduced both pagination and search to the repository membership page. And boy did that help us on one of our repos with a few hundred direct collaborators: by the end we could only manage access through the API, because the page didn't even load most of the time.
I see why web pages need pagination so the server or browser doesn't OOM, but there really ought to be 10000 entries per page, not the 25 that most sites seem to like.
Ctrl+F on a list of 10000 entries is far easier than clicking through 400 ajaxy pages and trying to figure out some custom and buggy filtering system that probably doesn't allow regex.
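The click-count arithmetic is simple but damning. A throwaway illustration using the numbers above (function name is mine):

```python
import math

def pages_to_browse(total_entries: int, page_size: int) -> int:
    """How many page loads it takes to see every entry."""
    return math.ceil(total_entries / page_size)

# 10000 entries at 25 per page: 400 round trips through an ajaxy UI.
# 10000 entries at 10000 per page: one load, then Ctrl+F does the rest.
```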
Past 10000 records, most sites probably ought to just let you export in something BigQuery-compatible anyway - Regular Joe isn't going to have more than 10000 of anything, and anyone who does can learn how to use proper data tools.