KULLM project

  • base model: Upstage/SOLAR-10.7B-Instruct-v1.0

Datasets

  • KULLM dataset
  • hand-crafted instruction data

Implementation Code

from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
)
import torch

repo = "heavytail/kullm-solar"

# Load the model in half precision and spread it across available devices.
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)
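
The snippet below is a minimal generation sketch using the model and tokenizer loaded above. It assumes the tokenizer ships a chat template inherited from the SOLAR-10.7B-Instruct base model; the prompt and sampling parameters are illustrative only.

# Minimal generation sketch (illustrative); assumes the tokenizer provides a chat template.
messages = [{"role": "user", "content": "Introduce yourself briefly."}]

# Build the prompt with the tokenizer's chat template and move it to the model's device.
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

with torch.no_grad():
    outputs = model.generate(
        input_ids,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.7,
    )

# Decode only the newly generated tokens, skipping special tokens.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))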

Initial upload: 2024/01/28 21:10
