wubingheng committed on
Commit
224d7bf
verified
1 Parent(s): 6f73ea3

Update README.md

Files changed (1): README.md +2 -2
README.md CHANGED
@@ -30,8 +30,8 @@ In addition, Doge uses Dynamic Mask Attention as sequence transformation and can
  ```python
  >>> from transformers import AutoTokenizer, AutoModelForCausalLM, GenerationConfig, TextStreamer

- >>> tokenizer = AutoTokenizer.from_pretrained("wubingheng/Doge_medical_chat-197M")
- >>> model = AutoModelForCausalLM.from_pretrained("wubingheng/Doge_medical_chat-197M", trust_remote_code=True)
+ >>> tokenizer = AutoTokenizer.from_pretrained("wubingheng/Doge-197M-Medical-SFT")
+ >>> model = AutoModelForCausalLM.from_pretrained("wubingheng/Doge-197M-Medical-SFT", trust_remote_code=True)

  >>> generation_config = GenerationConfig(
  ...     max_new_tokens=256,