Update README.md
README.md (changed)
```diff
@@ -7,4 +7,8 @@ sdk: static
 pinned: false
 ---
 
-
+## Training Small Language Models with Knowledge Distillation
+
+Official pre-trained models and baselines in
+
++ [MiniLLM](https://github.com/microsoft/LMOps/tree/main/minillm): Knowledge distillation of LLMs during instruction tuning.
++ [MiniPLM](https://github.com/thu-coai/MiniPLM): Knowledge distillation of LLMs during pre-training.
```
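As background for the two linked repos: knowledge distillation trains a small student model to match a large teacher's output distribution. The sketch below is the classic temperature-softened forward-KL loss in plain Python, purely illustrative — it is not the repos' actual training code (MiniLLM, for instance, argues for reverse KL instead of forward KL), and `softmax`/`kd_loss` are names chosen here for the example.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of logits.
    # Higher temperature -> softer (more uniform) distribution.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    # Forward KL(teacher || student) between temperature-softened
    # distributions: the student is pushed to cover every mode the
    # teacher assigns probability to.
    p = softmax(teacher_logits, temperature)  # teacher distribution
    q = softmax(student_logits, temperature)  # student distribution
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# If the student already matches the teacher, the loss is zero.
print(kd_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
```

In practice this term is computed per output token and combined with the ordinary cross-entropy loss on the ground-truth labels; swapping the arguments of the KL (reverse KL) instead makes the student mode-seeking, which is the design choice MiniLLM motivates for generative LLMs.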