google/switch-base-256 • Text2Text Generation
This release included various MoE (Mixture of Experts) models based on the T5 architecture. The base models use from 8 to 256 experts.
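As a minimal sketch (assuming a `transformers` version with Switch Transformers support), a checkpoint from this release can be loaded like any other T5-style seq2seq model; the `google/switch-base-256` name below matches the listing above, and the sentinel-token prompt follows the T5 span-corruption pretraining objective:

```python
# Minimal sketch: loading a Switch Transformers MoE checkpoint with the
# Hugging Face `transformers` library and running a short text2text generation.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/switch-base-256"  # any of the 8- to 256-expert base checkpoints

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# The base checkpoints were pretrained on the T5 span-corruption objective,
# so prompts with sentinel tokens (<extra_id_0>, ...) work without fine-tuning.
inputs = tokenizer("A <extra_id_0> walks into a bar.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```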