medmekk/SmolLM2-1.7B-Instruct.GGUF
Tags: GGUF · Inference Endpoints · imatrix · conversational
Branch: main · 1 contributor · History: 2 commits
Latest commit: "Upload quantized models" (01d16a0, verified) by medmekk (HF staff), 10 days ago
All files are marked Safe, and every file was added in the "Upload quantized models" commit 10 days ago.

| File | Size | Storage |
|---|---|---|
| .gitattributes | 2.95 kB | |
| README.md | 1.4 kB | |
| SmolLM2-1.7B-Instruct-IQ3_M_imat.gguf | 810 MB | LFS |
| SmolLM2-1.7B-Instruct-IQ3_XXS_imat.gguf | 680 MB | LFS |
| SmolLM2-1.7B-Instruct-IQ4_NL_imat.gguf | 991 MB | LFS |
| SmolLM2-1.7B-Instruct-IQ4_XS_imat.gguf | 940 MB | LFS |
| SmolLM2-1.7B-Instruct-Q2_K.gguf | 675 MB | LFS |
| SmolLM2-1.7B-Instruct-Q3_K_L.gguf | 933 MB | LFS |
| SmolLM2-1.7B-Instruct-Q3_K_M.gguf | 860 MB | LFS |
| SmolLM2-1.7B-Instruct-Q3_K_S.gguf | 777 MB | LFS |
| SmolLM2-1.7B-Instruct-Q4_0.gguf | 991 MB | LFS |
| SmolLM2-1.7B-Instruct-Q4_K_M.gguf | 1.06 GB | LFS |
| SmolLM2-1.7B-Instruct-Q4_K_M_imat.gguf | 1.06 GB | LFS |
| SmolLM2-1.7B-Instruct-Q4_K_S.gguf | 999 MB | LFS |
| SmolLM2-1.7B-Instruct-Q4_K_S_imat.gguf | 999 MB | LFS |
| SmolLM2-1.7B-Instruct-Q5_0.gguf | 1.19 GB | LFS |
| SmolLM2-1.7B-Instruct-Q5_K_M.gguf | 1.23 GB | LFS |
| SmolLM2-1.7B-Instruct-Q5_K_M_imat.gguf | 1.23 GB | LFS |
| SmolLM2-1.7B-Instruct-Q5_K_S.gguf | 1.19 GB | LFS |
| SmolLM2-1.7B-Instruct-Q5_K_S_imat.gguf | 1.19 GB | LFS |
| SmolLM2-1.7B-Instruct-Q6_K.gguf | 1.41 GB | LFS |
| SmolLM2-1.7B-Instruct-Q8_0.gguf | 1.82 GB | LFS |
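
Any one of these quantized files can be fetched on its own rather than cloning the whole repository. Below is a minimal sketch, assuming the `huggingface_hub` Python package is installed and the repository is publicly accessible; the Q4_K_M variant is used only as an example, and any filename from the table above works the same way.

```python
# Minimal sketch: download a single quantized GGUF file from this repository.
# Assumption: `pip install huggingface_hub` has been run and the repo is public.
from huggingface_hub import hf_hub_download

# Fetch one quantization variant (Q4_K_M here, ~1.06 GB) into the local HF cache.
gguf_path = hf_hub_download(
    repo_id="medmekk/SmolLM2-1.7B-Instruct.GGUF",
    filename="SmolLM2-1.7B-Instruct-Q4_K_M.gguf",
)

# The returned path points at the cached .gguf file, which can then be loaded
# by any llama.cpp-compatible runtime.
print(gguf_path)
```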