# Roberta Zinc Compression Head

This model is trained to compress embeddings generated by the roberta_zinc_480m model from their native size of 768 dimensions down to compressed sizes of 512, 256, 128, 64, and 32.

The compressed embeddings are trained to preserve the pairwise cosine similarities computed from the native embeddings.
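The objective above can be sketched as follows. This is a hypothetical illustration, not the card's actual training code: the head's architecture and loss are not specified here, so a single linear projection and an MSE between cosine-similarity matrices are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for native roberta_zinc_480m embeddings (batch of 8, 768-dim).
native = rng.standard_normal((8, 768))

# Hypothetical compression head: one linear projection from 768 to 256 dims.
W = rng.standard_normal((768, 256)) / np.sqrt(768)
compressed = native @ W

def cosine_sim_matrix(x):
    """Pairwise cosine similarity between the rows of x."""
    normed = x / np.linalg.norm(x, axis=1, keepdims=True)
    return normed @ normed.T

# Training signal: compressed pairwise similarities should match the
# native ones; an MSE between the two matrices is one natural loss.
loss = np.mean((cosine_sim_matrix(compressed) - cosine_sim_matrix(native)) ** 2)
```

In practice the head would be trained to drive this loss toward zero, so that similarity search over the compressed embeddings approximates search over the native 768-dimensional ones.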


License: MIT

Model size: 2.64M parameters (Safetensors, F32)
