Model Description

Erya4FT is based on Erya and further fine-tuned on our dataset, enhancing its ability to translate Ancient Chinese into Modern Chinese.

Example

>>> from transformers import BertTokenizer, CPTForConditionalGeneration

>>> tokenizer = BertTokenizer.from_pretrained("RUCAIBox/Erya4FT")
>>> model = CPTForConditionalGeneration.from_pretrained("RUCAIBox/Erya4FT")

>>> inputs = tokenizer("竖子不足与谋。", return_tensors='pt')
>>> # CPT's generate() does not accept token_type_ids, so drop them
>>> inputs.pop("token_type_ids")

>>> pred_ids = model.generate(max_new_tokens=256, **inputs)
>>> print(tokenizer.batch_decode(pred_ids, skip_special_tokens=True))
    ['这 小 子 不 值 得 与 他 商 量 。']
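
To translate several sentences in one call, the same tokenizer and model can be reused with batched, padded inputs. The snippet below is a minimal sketch, not part of the original card: the example sentences are arbitrary Ancient Chinese lines chosen for illustration, and padding=True is assumed to be enough for batching the encoder inputs.

>>> sentences = ["学而时习之，不亦说乎？", "温故而知新，可以为师矣。"]
>>> # Pad to a common length so the batch forms a single tensor
>>> batch = tokenizer(sentences, return_tensors='pt', padding=True)
>>> batch.pop("token_type_ids")
>>> pred_ids = model.generate(max_new_tokens=256, **batch)
>>> print(tokenizer.batch_decode(pred_ids, skip_special_tokens=True))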