True, but that still leaves other search engines in the dark about which links should be ignored. I thought disavow.txt might be a better solution to help us avoid The Destruction of the Web.
Good point. There are indeed other search engines, and in a way a webmaster/search-engine interface that requires webmasters to contact each engine directly is less elegant when the same problem could be solved by something any crawler could pick up on its own. I hadn't thought this through when I wrote my reply; you're absolutely right.
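Just to sketch the idea: a crawler-readable disavow.txt (this format is purely hypothetical, loosely modeled on robots.txt) might sit at the site root and list link sources the site does not vouch for:

```
# disavow.txt — hypothetical, served at the site root like robots.txt
# Each line names a link source this site asks crawlers to ignore.
Domain: spammy-linkfarm.example
URL: https://blog.example/comments/4123
```

Any search engine's crawler could then fetch and honor the file, with no per-engine webmaster interface required.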