
What laptop are you using to run which model, and what tooling are you using for that?


The easiest way is to use ollama: mistral 7b, zephyr 7b, and openhermes 2 are all decent models, and I believe openhermes 2 can even do function calling.
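
If you'd rather hit ollama from code than the CLI, here's a minimal sketch against its local HTTP API, assuming the server is running on the default port 11434 and you've already pulled mistral (the prompt text is just a placeholder):

  # minimal sketch: query a locally running ollama server (default port 11434)
  # assumes you've already run `ollama pull mistral`
  import json
  import urllib.request

  payload = json.dumps({
      "model": "mistral",
      "prompt": "Explain function calling in one sentence.",
      "stream": False,
  }).encode()

  req = urllib.request.Request(
      "http://localhost:11434/api/generate",
      data=payload,
      headers={"Content-Type": "application/json"},
  )
  with urllib.request.urlopen(req) as resp:
      print(json.loads(resp.read())["response"])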

If you want something even smaller, stablelm-zephyr 3b is a good option and also runs with ollama.
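
Pulling the smaller model works the same way over the API; a rough sketch, where the tag "stablelm-zephyr" is my assumption, so check the exact name in the ollama model library:

  # rough sketch: pull a model through ollama's HTTP API instead of the CLI
  # the tag "stablelm-zephyr" is an assumption -- verify it in the ollama library
  import json
  import urllib.request

  req = urllib.request.Request(
      "http://localhost:11434/api/pull",
      data=json.dumps({"name": "stablelm-zephyr", "stream": False}).encode(),
      headers={"Content-Type": "application/json"},
  )
  with urllib.request.urlopen(req) as resp:
      print(resp.read().decode())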


M2 MacBook Pro. I run many different models, mainly mistral, zephyr, and deepseek, using Ollama and LM Studio.
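
If you go the LM Studio route, its local server speaks the OpenAI-style chat completions format; a hedged sketch, assuming the server is started in the app on its default port 1234 with some model already loaded (the "model" value and prompt are placeholders):

  # sketch: query LM Studio's local OpenAI-compatible server (default port 1234)
  # assumes you started the server in LM Studio and loaded a model in the UI
  import json
  import urllib.request

  body = json.dumps({
      "model": "local-model",  # placeholder -- LM Studio serves whichever model is loaded
      "messages": [{"role": "user", "content": "Say hi in five words."}],
  }).encode()

  req = urllib.request.Request(
      "http://localhost:1234/v1/chat/completions",
      data=body,
      headers={"Content-Type": "application/json"},
  )
  with urllib.request.urlopen(req) as resp:
      print(json.loads(resp.read())["choices"][0]["message"]["content"])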



