This is a 2.0 bpw quantized version of Qwen/Qwen2.5-Coder-32B-Instruct made with exllamav2.
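
For reference, here is a minimal usage sketch with the exllamav2 Python library. This is not an official loader for this repo; the class and method names (ExLlamaV2Config, ExLlamaV2DynamicGenerator, etc.) reflect recent exllamav2 releases and may differ in older versions, and the model path is a placeholder for wherever you download the weights.

```python
# Minimal sketch: load the 2.0 bpw EXL2 weights with exllamav2 and generate text.
# Assumes a recent exllamav2 release; API names may differ in older versions.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "/path/to/Qwen2.5-Coder-32B-Instruct-2.0bpw-exl2"  # local download of this repo

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate the cache lazily, then autosplit across GPUs
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
output = generator.generate(
    prompt="Write a Python function that reverses a string.",
    max_new_tokens=200,
)
print(output)
```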

License

This model is available under the Apache 2.0 License.

Discord Server

Join our Discord server here.

Feeling Generous? 😊

Eager to buy me a cup of $2 coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note about which one you'd like me to drink.

