Part of work associated with sentientsimulations.com


MythoMax L2 13B - SqueezeLLM

Description

This repo contains SqueezeLLM model files for Gryphe's MythoMax L2 13B.

About SqueezeLLM

SqueezeLLM is a post-training quantization framework for large language models: https://github.com/SqueezeAILab/SqueezeLLM

This model was quantized following the steps described here: https://github.com/SqueezeAILab/SqueezeLLM/tree/main/quantization
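To use these files with the SqueezeLLM inference scripts, you first need the checkpoint locally. Below is a minimal sketch that downloads this repo with the huggingface_hub library; the local_dir value is an arbitrary example, and the actual inference commands are documented in the SqueezeLLM repository linked above.

```python
# Minimal sketch: fetch the quantized checkpoint files locally with huggingface_hub.
# The repo id is this model's repo; local_dir is a hypothetical target directory.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="GusPuffy/sq-MythoMax-L2-13b-w4-s0",
    local_dir="sq-mythomax-l2-13b",  # adjust to wherever you keep checkpoints
)
print(f"Model files downloaded to: {local_path}")
```

Once downloaded, point the SqueezeLLM repo's inference/evaluation scripts at this directory as described in its README.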


License: other
