I'm using https://github.com/ggerganov/llama.cpp, currently with Mistral 7B (on an M1 MacBook Pro). I'm sure that with a few prompt examples you can get pretty good results out of a smaller model.
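The prompt-example approach looks roughly like this. It's just a sketch using the llama-cpp-python bindings rather than the raw llama.cpp CLI, and the model path, fields and example postings are placeholders, but it shows the shape of a few-shot extraction prompt:

    # Rough sketch: few-shot metadata extraction via llama-cpp-python.
    # (I drive llama.cpp directly; this is the same idea through the bindings.
    #  Model path, field names and example postings are placeholders.)
    from llama_cpp import Llama

    llm = Llama(model_path="./mistral-7b-instruct.Q4_K_M.gguf", n_ctx=4096)

    FEW_SHOT = """Extract job metadata as JSON with keys: title, company, location, remote.

    Posting: "Senior Rust Engineer at Acme Corp - Berlin (hybrid)"
    JSON: {"title": "Senior Rust Engineer", "company": "Acme Corp", "location": "Berlin", "remote": false}

    Posting: "Fully remote: Data Analyst, Globex Inc."
    JSON: {"title": "Data Analyst", "company": "Globex Inc.", "location": null, "remote": true}

    Posting: "%s"
    JSON:"""

    def extract(posting_text: str) -> str:
        # Greedy decoding keeps the JSON output as deterministic as possible.
        out = llm(FEW_SHOT % posting_text, max_tokens=256, temperature=0.0, stop=["\n\n"])
        return out["choices"][0]["text"].strip()

    print(extract("Junior iOS Developer, Initech - Austin, TX (on-site)"))

The two inline examples do most of the work; Mistral 7B is usually happy to keep emitting the same JSON shape after seeing them.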
At the moment it isn't open sourced, because it's part of a larger project I'm working on that contains Tailwind UI licensed components.
A cool feature I'm working on is a Firefox plugin so you can save/index job postings from other sites and extract meta information via an LLM, very similar to this Chrome plugin:
https://chromewebstore.google.com/detail/huntr-job-search-tr...
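The save/index side is basically: the plugin grabs the posting text and URL, the LLM extraction (like the sketch above) turns it into a dict, and that gets stored locally. Hypothetical schema and helper name, just to show the shape of it:

    # Rough shape of the save/index step (hypothetical table/column names).
    # `meta` is the dict parsed from the LLM's JSON output for raw_text.
    import sqlite3

    db = sqlite3.connect("jobs.db")
    db.execute("""CREATE TABLE IF NOT EXISTS postings (
        url TEXT PRIMARY KEY,
        raw_text TEXT,
        title TEXT,
        company TEXT,
        location TEXT,
        remote INTEGER
    )""")

    def save_posting(url: str, raw_text: str, meta: dict) -> None:
        db.execute(
            "INSERT OR REPLACE INTO postings VALUES (?, ?, ?, ?, ?, ?)",
            (url, raw_text, meta.get("title"), meta.get("company"),
             meta.get("location"), 1 if meta.get("remote") else 0),
        )
        db.commit()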
What open-source LLM did you use? Anything that would run on the CPU of a 5-year-old MacBook Air (which I'm using as my home server)?
Do you have the code for your project? Or maybe some features/ideas that you could share?
Thank you