Hacker News

Yeah, more or less! I still use OpenAI for their embeddings (translating text into vector space):

- Your question -> vectors with OpenAI embeddings
- Text you uploaded before -> vectors with OpenAI embeddings

Get the most similar chunks above a certain threshold, then add them to the prompt, saying:

"From these articles / paragraphs, answer the user's question: What are the three foobars"

So yep! I preprocess it
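The retrieval step described above can be sketched roughly like this. Note this is a minimal illustration, not the commenter's actual code: the embedding call itself is left out (it would be an OpenAI embeddings API request), so chunks are assumed to arrive as `(text, vector)` pairs embedded ahead of time, and the `0.8` threshold is an arbitrary placeholder.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(question_vec, chunks, threshold=0.8):
    # chunks: list of (text, vector) pairs, embedded ahead of time.
    # Return the texts whose similarity to the question clears the
    # threshold, most similar first.
    scored = [(cosine_similarity(question_vec, vec), text)
              for text, vec in chunks]
    hits = sorted((s, t) for s, t in scored if s >= threshold)
    return [t for _, t in reversed(hits)]

def build_prompt(question, passages):
    # Prepend the retrieved passages to the user's question.
    context = "\n\n".join(passages)
    return (f"From these articles / paragraphs, answer the "
            f"user's question:\n{context}\n\nQuestion: {question}")
```

The key design point is that nothing is retrained: the model stays frozen, and only the prompt changes per question.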



I think that’s a really clever idea.

It feels like training is analogous to years of growing up and going to school. And what you’ve done is taken that educated “mind” and said, “Here’s a document. I’m going to quiz you on it.”

That seems really practical compared to sending an AI back to school to learn all about some specific topic.

I wonder if we will end up with a library of trained models, each of which went to different schools to obtain different focuses of knowledge. Maybe LiteratureGPT is better suited for this than General GPT or ProgrammerGPT.

Okay I think I’ve stretched the analogy far enough.



