Thanks. Yes, I've seen airoboros; if I recall correctly, it uses a mixture of fine-tunes of the base model rather than a truly pre-trained MoE, but it could still be useful.
Yes, it's fine-tuned models; hopefully the community finds use-cases where it will shine. Regarding Hydra, yes, that's the one. To stay updated, join the Discord mentioned in the repo.
2. The only two I know of are airoboros [1] and Hydra, which is still in progress.
[0] https://x.com/ggerganov/status/1698667093711880687?s=46&t=Jp...
[1] https://github.com/jondurbin/airoboros#lmoe
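To make the "mixture of fine-tunes" idea concrete, here is a minimal routing sketch. It is an illustration of the general LMoE-style pattern (route each prompt to whichever fine-tuned expert best matches it), not airoboros' actual code: the expert names, their domain descriptions, and the toy word-overlap similarity (standing in for a real embedding similarity) are all assumptions made up for this example.

```python
def similarity(a: str, b: str) -> float:
    """Toy stand-in for embedding similarity: Jaccard word overlap."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

# Hypothetical experts, each a fine-tune of the same base model,
# described by a few keywords from its training domain.
EXPERTS = {
    "code": "write debug python code function programming",
    "creative": "story poem fiction creative writing",
    "reasoning": "math logic proof step reasoning",
}

def route(prompt: str) -> str:
    """Pick the expert whose domain description best matches the prompt."""
    return max(EXPERTS, key=lambda name: similarity(prompt, EXPERTS[name]))
```

For example, `route("write a python function to debug this")` returns `"code"`. A real system would swap the toy similarity for embeddings (or a classifier) and load the chosen expert's weights/adapter before generating.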