t1101675 committed on
Commit f59a278 · verified · 1 Parent(s): 00cd08c

Update README.md

Files changed (1): README.md (+5, −1)
README.md CHANGED
@@ -7,4 +7,8 @@ sdk: static
 pinned: false
 ---
 
-Edit this `README.md` markdown file to author your organization card.
+## Training Small Language Models with Knowledge Distillation
+
+Official pre-trained models and baselines in
++ [MiniLLM](https://github.com/microsoft/LMOps/tree/main/minillm): Knowledge distillation of LLMs during instruction tuning.
++ [MiniPLM](https://github.com/thu-coai/MiniPLM): Knowledge distillation of LLMs during pre-training.
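For readers unfamiliar with the objective behind these projects: the simplest word-level distillation loss matches the student's next-token distribution to the teacher's via a temperature-scaled KL divergence. The sketch below is an illustrative toy in pure Python, not code from MiniLLM or MiniPLM (MiniLLM in particular proposes a reverse-KL variant); all function names here are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature softens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=2.0):
    # Forward KL(teacher || student) over one next-token distribution:
    # the classic word-level knowledge-distillation term.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)
```

When the student matches the teacher exactly the loss is zero, and it grows as the two distributions diverge; in practice this term is averaged over tokens and combined with the standard cross-entropy loss on ground-truth data.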