---
base_model: unsloth/gemma-2-2b-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- gemma2
- gguf
license: apache-2.0
language:
- en
---
Final training loss should be 2.493672.

Training run: https://wandb.ai/paul-stansifer/huggingface/runs/argv2m1l
# Uploaded model

- **Developed by:** paul-stansifer
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-2-2b-bnb-4bit
This gemma2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
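
The exact training script is not part of this card, but a minimal sketch of an Unsloth + TRL fine-tune starting from the same base model might look like the following. The LoRA settings, placeholder dataset, and hyperparameters are illustrative assumptions rather than the configuration used for this run, and the `SFTTrainer` arguments assume an older TRL release where `dataset_text_field` and `max_seq_length` are passed directly.

```python
from datasets import Dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load the 4-bit base model named in the metadata above.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-2-2b-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights is trained.
# These ranks and target modules are typical defaults, not the run's actual settings.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder training data; the actual run used its own dataset
# (the card above reports a final loss of about 2.4937 for that run).
train_dataset = Dataset.from_dict(
    {"text": ["Example training text one.", "Example training text two."]}
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=train_dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=10,
        learning_rate=2e-4,
        output_dir="outputs",
        report_to="wandb",  # logs curves like the wandb run linked above
    ),
)
trainer.train()
```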