Hacker News

Also, re the GP's mention of caching: I'm fairly sure I read in an earlier discussion that Kagi had tried that and it didn't change the economics.

For me, ~50% of searches are just lazy ways to get to a website I know, like I literally search for hacker news regularly. I'm curious whether some kind of triage is possible, where the lazy site lookups get a simple, cost-efficient search and fail over to a costly full search only if they find nothing. I'd be surprised if something like this isn't already in use. Modern search is really just a "portal" with fuzzy matching for most queries, as opposed to a genuine "show me a site I don't already know about" tool, and I've never seen that distinction reflected in any discussion about search.
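The triage idea above could be sketched roughly like this. This is a minimal, hypothetical illustration, not anything Kagi or any search engine is known to do; the index, function names, and the `example.com` fallback URL are all invented for the example:

```python
# Hypothetical sketch of a two-tier "triage" search: try a cheap
# navigational lookup against a small index of known sites first,
# and only fall back to the full (costly) search when it misses.

KNOWN_SITES = {  # toy navigational index; a real one would be far larger
    "hacker news": "https://news.ycombinator.com",
    "python docs": "https://docs.python.org",
}

def cheap_lookup(query):
    """Exact, then crude substring, match against the navigational index."""
    q = query.strip().lower()
    if q in KNOWN_SITES:
        return [KNOWN_SITES[q]]
    return [url for name, url in KNOWN_SITES.items() if q in name]

def full_search(query):
    """Placeholder for the expensive backend search (invented URL)."""
    return ["https://example.com/search?q=" + query.replace(" ", "+")]

def triage_search(query):
    """Cheap navigational lookup first; fail over to full search on a miss."""
    results = cheap_lookup(query)
    return results if results else full_search(query)
```

The interesting design question is where to draw the line: too aggressive a fuzzy match in the cheap tier and you trap queries that really wanted discovery; too strict and almost everything falls through to the costly tier anyway.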



> For me, ~50% of searches are just lazy ways to get to a website I know, like I literally search for hacker news regularly.

I see many people saying the same. What's the explanation? Do you not use browser history, or do you use a browser that doesn't suggest URLs?



