Before fine-tuning: কম্পিউটাৰ এন্ড এন্ডিং এন্ডিং এন্ডিং �

After fine-tuning: কম্পিউটাৰ সংবাদ এই সমূহ এই সংবাদ এই সমূ

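For reference, below is a minimal generation sketch using the standard transformers API. The prompt and sampling settings are illustrative assumptions, since the exact prompt behind the samples above is not documented.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tamang0000/tinyllama-assamese-v0.2.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype = torch.float16,  # the uploaded weights are stored in FP16
    device_map = "auto",          # requires the accelerate package
)

# Hypothetical Assamese prompt; swap in your own text.
prompt = "কম্পিউটাৰ"
inputs = tokenizer(prompt, return_tensors = "pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens = 64, do_sample = True, temperature = 0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens = True))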
import torch
from transformers import TrainingArguments

max_seq_length = 512  # maximum sequence length passed to the trainer

training_args = TrainingArguments(
    per_device_train_batch_size = 2,
    gradient_accumulation_steps = 4,
    warmup_steps = 2,
    max_steps = 10,
    learning_rate = 0.0005,
    fp16 = not torch.cuda.is_bf16_supported(),  # fall back to fp16 only when bf16 is unavailable
    bf16 = torch.cuda.is_bf16_supported(),
    logging_steps = 1,
    optim = "adamw_8bit",
    weight_decay = 0.01,
    lr_scheduler_type = "linear",
    seed = 3407,
    output_dir = "outputs",
)
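For context, the sketch below shows one way these arguments could be wired into TRL's SFTTrainer on top of the Unsloth base model listed further down. The dataset name, text column, and LoRA settings are illustrative assumptions rather than the exact training recipe, and the keyword arguments follow the older TRL API used in the Unsloth notebooks.

from datasets import load_dataset
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load the 4-bit base model this card lists under "Finetuned from model".
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "unsloth/tinyllama-bnb-4bit",
    max_seq_length = max_seq_length,  # reuses the value defined above
    load_in_4bit = True,
)

# Attach LoRA adapters; rank and target modules here are illustrative defaults.
model = FastLanguageModel.get_peft_model(
    model,
    r = 16,
    target_modules = ["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_alpha = 16,
)

# Hypothetical Assamese text dataset with a "text" column.
dataset = load_dataset("your_assamese_corpus", split = "train")

trainer = SFTTrainer(
    model = model,
    tokenizer = tokenizer,
    train_dataset = dataset,
    dataset_text_field = "text",
    max_seq_length = max_seq_length,
    args = training_args,  # the TrainingArguments defined above
)
trainer.train()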

Uploaded model

  • Developed by: tamang0000
  • License: apache-2.0
  • Finetuned from model: unsloth/tinyllama-bnb-4bit

This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
