~~Full model card soon. Early release;~~ Spherical Hexa-Merge of hand-picked Mistral-7B models.
This is the successor to Naberius-7B, building on its findings.
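
The exact hexa-merge procedure isn't published here, but the name points at spherical linear interpolation (SLERP) of model weights. As a rough sketch only, the snippet below folds several same-architecture Mistral-7B checkpoints together pairwise; the function names, the pairwise folding order, and the 0.5 interpolation factor are illustrative assumptions, not the actual Hexoteric recipe.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors of the same shape."""
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    # Angle between the two weight vectors on the hypersphere.
    cos_omega = torch.dot(v0_flat, v1_flat) / (v0_flat.norm() * v1_flat.norm() + eps)
    omega = torch.acos(cos_omega.clamp(-1.0, 1.0))
    if omega.abs() < eps:
        # Nearly parallel weights: plain linear interpolation is numerically safer.
        return ((1.0 - t) * v0.float() + t * v1.float()).to(v0.dtype)
    sin_omega = torch.sin(omega)
    scale0 = torch.sin((1.0 - t) * omega) / sin_omega
    scale1 = torch.sin(t * omega) / sin_omega
    return (scale0 * v0.float() + scale1 * v1.float()).to(v0.dtype)

def fold_merge(state_dicts: list[dict], t: float = 0.5) -> dict:
    # NOTE: a pairwise left-fold with a fixed t is an assumption for illustration,
    # not the published six-way merge order or weighting.
    merged = state_dicts[0]
    for sd in state_dicts[1:]:
        merged = {name: slerp(t, merged[name], sd[name]) for name in merged}
    return merged
```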

[11 Dec 2023 UPDATE] The original compute resources for this experiment are no longer accessible. Long story:

https://huggingface.co/CalderaAI/Hexoteric-7B/discussions/2#6576d3e5412ee701851fd567

The Stanford Alpaca format works best for instruct-mode test drives of this enigma.
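
For reference, the Alpaca instruct template wraps requests as shown below (the helper name is just illustrative):

```python
def alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Assemble a prompt in the Stanford Alpaca instruct format."""
    if user_input:
        header = ("Below is an instruction that describes a task, paired with an input "
                  "that provides further context. Write a response that appropriately "
                  "completes the request.\n\n")
        body = f"### Instruction:\n{instruction}\n\n### Input:\n{user_input}\n\n"
    else:
        header = ("Below is an instruction that describes a task. Write a response "
                  "that appropriately completes the request.\n\n")
        body = f"### Instruction:\n{instruction}\n\n"
    return header + body + "### Response:\n"

# Example: alpaca_prompt("Summarize the plot of Moby-Dick in one sentence.")
```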
