cheese_greater@lemmy.world to Ask Lemmy@lemmy.world · 27 days ago

What's a good local and free LLM model for Windows?
Toes♀@ani.social · 27 days ago
Try it with this model, using the Q4_K_S version:
https://huggingface.co/bartowski/mlabonne_gemma-3-12b-it-abliterated-GGUF
You’ll probably need to play with the context window size until you get an acceptable level of performance (4096 is a likely starting point).
Ideally you’d have more RAM, but this smaller model should still work. KoboldCpp will use both your GPU and CPU to run the model.
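If it helps, launching it from the command line looks roughly like this. A minimal sketch, assuming KoboldCpp's standard flags; the model filename and the `--gpulayers` count are placeholders you'd adjust to match your actual download and VRAM:

```
:: Hypothetical example: the .gguf filename and --gpulayers value are guesses.
:: More GPU layers = more of the model on your GPU; lower it if you run out of VRAM.
koboldcpp.exe --model mlabonne_gemma-3-12b-it-abliterated-Q4_K_S.gguf --contextsize 4096 --gpulayers 20
```

You can also just double-click koboldcpp.exe and set the same options in the launcher window instead.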