Hacker News | maltelandwehr's comments

Looks interesting! Which LLM are you using under the hood?

Thanks. For voice and video we use two-layered models (Whisper + fine-tuning), while for documents, images, and other content users can choose from multiple models. For example CLIP ViT-B/32, all-MiniLM-L6-v2, multilingual variants, etc. Everything runs fully offline.
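Whichever model produces the vectors (CLIP ViT-B/32, all-MiniLM-L6-v2, a multilingual variant, etc.), the offline matching step usually comes down to cosine similarity between embeddings. A minimal sketch with made-up toy vectors (real models emit 384-d or 512-d vectors; everything here is illustrative):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-d embeddings standing in for real model output
# (all-MiniLM-L6-v2 actually emits 384-d vectors, CLIP ViT-B/32 512-d).
query = [0.9, 0.1, 0.0, 0.2]
doc_a = [0.8, 0.2, 0.1, 0.1]   # similar content
doc_b = [0.0, 0.1, 0.9, 0.0]   # unrelated content

scores = {"doc_a": cosine_similarity(query, doc_a),
          "doc_b": cosine_similarity(query, doc_b)}
```

Since everything is plain vector math, this step runs fully offline once the model weights are on disk.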

The author is referring to the concept of "authority" as it is used in SEO circles.

Since a lot of websites link to MSN, Forbes, and the NYT, they have a high PageRank. This is interpreted as "high authority".
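The "authority" intuition maps directly onto PageRank. A toy power-iteration sketch (link graph, damping factor, and iteration count are all illustrative, not Google's production setup):

```python
# Minimal PageRank power iteration on a toy link graph.
# Node names and link structure are invented for illustration.
links = {
    "blog": ["nyt", "forbes"],
    "forum": ["nyt", "msn"],
    "nyt": ["msn"],
    "msn": [],
    "forbes": ["nyt"],
}

def pagerank(links, damping=0.85, iterations=50):
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, outgoing in links.items():
            if outgoing:
                share = rank[node] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += damping * share
            else:
                # Dangling node: spread its rank mass evenly.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

ranks = pagerank(links)
```

Heavily linked-to pages ("nyt", "msn" in the toy graph) end up with the highest scores, which is exactly what SEO folks read as "high authority".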


In the past, you would type a question into Google. To get the full, detailed answer you had to click on the Wikipedia search result.

With ChatGPT or Google AI Mode, you get all answers directly in the chat. And you can even ask follow-up questions. There is no need to click on a link.

From the data I have seen, about 40% of searches on Google used to lead to a click on another website. In ChatGPT and Google AI Mode, this number is below 5%. One study (with a small N) even concluded that the number is 0%.


> people having AI talk to their partner about deep relationship stuff

I have read stories about people using AI to write their Tinder messages, eulogies, etc.

Gives me a weird feeling.


Many things contributors are doing could be handled by AI agents.

We would just need a good system to a) keep humans in the loop and b) sort out bad actors.


Even if all this is true, LLMs would still need to create a repository of such aggregated "summarizations of secondary sources".

LLMs cannot find and read thousands of secondary sources in real time. Especially not if some of these have already disappeared or were never digitized.

I can see a future where LLM labs a) donate to Wikipedia and b) contribute to it with agents that suggest edits and review facts.


I think LLM companies will need to donate and contribute to more direct sources like newspapers, journalists, bloggers, coders posting on Stack Overflow, etc., and not Wikipedia (which is more of a tertiary source).


I like the idea of using AI to make Wikipedia better.

There are many small tasks simple AI agents could handle.

For example: Go to every article in the English Wikipedia about a Spanish city. Check data like inhabitants, mayor, etc., and compare it to the Spanish Wikipedia article. (The assumption here is that the Spanish Wikipedia will be more up-to-date for Spanish cities than the English Wikipedia.) Double-check what is written in the Spanish article. Update the English article accordingly.

If such an agent is only allowed to create drafts, human editors could review them and we would get a lot of small updates in.
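The core comparison step of such an agent could be sketched like this (the city, the figures, and the helper names are all invented; fetching the articles from the MediaWiki API and actually submitting the draft are omitted):

```python
import re

def parse_population(raw):
    """Strip thousands separators like '54,876' or '53.988' to an int."""
    return int(re.sub(r"[.,\s]", "", raw))

def draft_population_update(city, en_raw, es_raw):
    """Compare a population figure between the English and Spanish
    articles for the same city; if they diverge, emit a draft edit
    for human review rather than editing directly."""
    en_pop, es_pop = parse_population(en_raw), parse_population(es_raw)
    if en_pop == es_pop:
        return None  # nothing to do
    return {
        "article": f"{city} (English Wikipedia)",
        "field": "population",
        "current": en_pop,
        "proposed": es_pop,   # assumes es.wikipedia is fresher for Spanish cities
        "status": "draft",    # a human editor must approve before anything goes live
    }

# Made-up numbers, different thousands-separator conventions:
edit = draft_population_update("Cuenca", "54,876", "53.988")
```

Keeping the output as a draft object instead of a live edit is what keeps the human in the loop.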


Definitely, these kinds of processes will eventually be fully AI-driven.

Everyone benefits!

Similar to when Google pushed people to make mobile-friendly websites, fast websites, secure websites, etc.


What does Ranksmith do different from Profound, Peec AI, Athena and the 100+ other tools in this space?


It's a new space, so everyone is building tools to surface analytics and increase visibility in AI search. One of the hard problems we are solving is contextual bias when querying the LLMs: getting the context as close to the actual searcher as possible. Beyond that, we are working closely with beta testers to build tools that help brands not only monitor but also optimize their website performance in AI search. We are just getting started, and I think no other platform has solved the AI visibility problem well yet.


What does Peec AI do differently than Profound, Athena, and the 100+ other tools in this space?

I guess we can all be a little more supportive instead of putting our competitors down; the market is big enough.


> putting our competitors down

I did not put anybody down. I am surprised people still start to build in this very crowded space. I was genuinely hoping for a unique twist/idea - like a focus on local, on non-western LLMs, etc.

But it seems that is not the case.


I am sure you meant well, but it was your word choice. Show some grace to the founders trying to build stuff. It's easy to nitpick ideas and execution. You could have asked the same thing politely.

A focus on local, non-western LLMs is not a unique differentiation; I am sure Peec could implement it tomorrow if you guys wanted to.

Or were you only looking for differentiation so you can build it at Peec?

If two products are solving the same problem, eventually they will converge on the same feature set. You should know that better than anyone.


Google is using a special version of Gemini (fast, small) and a special version of their internal ranking API (faster, fewer anti-spam/quality measures).

That makes them very fast. But it also leads to a ton of hallucinations. If you ask for nonexistent things (like the cats.txt protocol), AI Overviews consistently fabricate facts. AI Overviews can pull the content of the potential source URLs directly from Google's cache.

ChatGPT is slow because they have to make an external API call to Bing or - even worse - to a scraping provider like SerpApi/DataForSEO/Oxylabs to crawl regular Google search results. That introduces two delays. OpenAI then has to fetch some of these potential source URLs in real time. That introduces another delay. And then OpenAI also uses a better (but slower) model than Google to generate the answer.

Over time, OpenAI should be able to catch up in terms of speed with their own web/search index.
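As a rough illustration, the speed gap described above is mostly additive stage latency (all timings below are invented for illustration, not measurements):

```python
# Back-of-the-envelope latency model for the delay chains described above.
# Stage names and every number are made up purely to show how sequential
# stages add up; none of this reflects measured production behavior.

def total_latency(stages):
    """Sequential pipeline: stage latencies simply sum."""
    return sum(stages.values())

google_aio = {
    "internal_ranking_api": 0.1,   # in-house, stripped-down ranking call
    "fetch_from_cache": 0.1,       # sources served from Google's own cache
    "small_fast_model": 0.5,       # fast, small Gemini variant
}

chatgpt_search = {
    "external_search_api": 1.5,    # Bing or a scraping provider
    "live_url_fetches": 3.0,       # fetching candidate sources in real time
    "larger_slower_model": 5.0,    # bigger model generating the answer
}
```

Owning the index and the cache removes the two biggest terms from the sum, which is why an in-house web/search index would let OpenAI catch up.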

If you try more complex questions, you might find AI Overviews less to your liking.

Google gets away with this because users are used to typing simple queries - often just a few keywords. Any kind of AI answer feels like magic.

OpenAI cannot do the same. Their users are used to having multi-turn conversations and receiving thoughtful answers to complex questions.


Interesting. I am still defaulting to ChatGPT when I anticipate having a multi-turn conversation.

But for questions where I expect a single response to do, Google has taken over.

Here's an example from this morning:

It's my first autumn in a new house, and my boiler (forced hot water heating) kicked on for the first time. The kickboards in the kitchen have Quiet-One Kickspace brand radiators with electric fans. I wanted to know what controls these fans (are they wired to the thermostat, do they sense radiator temp, etc.?).

I searched "When does a quiet-one kickspace heater turn on". Google Overview answered correctly [1] in <1 second. Tried the same prompt to ChatGPT. Took 17 seconds to get the full (also correct, and similarly detailed) answer.

Both answers were equally detailed and of similar length.

[1] Confirmed correct by observing the operation of the unit.

