
mlx-community/Mixtral-8x7B-Instruct-v0.1

The model mlx-community/Mixtral-8x7B-Instruct-v0.1 was converted to MLX format from mistralai/Mixtral-8x7B-Instruct-v0.1 using mlx-lm version 0.12.0.
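A conversion of this kind can be reproduced with mlx-lm's convert entry point. The command below is a sketch: it assumes the mlx_lm.convert module and its --hf-path flag, exact flags can differ between mlx-lm versions, and no quantization option is passed because this repository stores FP16 weights.

# Sketch: convert the original Hugging Face weights to MLX format (FP16, unquantized)
python -m mlx_lm.convert --hf-path mistralai/Mixtral-8x7B-Instruct-v0.1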

Use with mlx

pip install mlx-lm

from mlx_lm import load, generate

# Load the MLX weights and tokenizer from the Hugging Face Hub
model, tokenizer = load("mlx-community/Mixtral-8x7B-Instruct-v0.1")

# Generate a completion; verbose=True prints the output as it is produced
response = generate(model, tokenizer, prompt="hello", verbose=True)
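Mixtral-8x7B-Instruct expects its [INST] ... [/INST] instruction format. Below is a minimal sketch that builds the prompt with the tokenizer's chat template, assuming the tokenizer returned by load exposes apply_chat_template as the underlying Hugging Face tokenizer does; the example message is illustrative only.

from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mixtral-8x7B-Instruct-v0.1")

# Wrap the user message in the model's chat format via the tokenizer's template
messages = [{"role": "user", "content": "Write a haiku about Apple silicon."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, verbose=True)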
Model size: 46.7B parameters, tensor type FP16 (Safetensors).