Discrepancy in max tokens
#101
opened by KennethEnevoldsen
The model card mentions truncation at 256 word pieces, but the config states 512. Which one is correct?
The maximum sequence length imposed by Sentence Transformers takes priority: the 256 set in https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2/blob/main/sentence_bert_config.json#L2. The 512 in the config is the architecture's limit; 256 was chosen based on the training data.
- Tom Aarsen
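In practice you can see both numbers with the standard sentence-transformers API; a minimal sketch (the attribute names below are the library's usual ones, not quoted from this thread):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Loaded from sentence_bert_config.json -> 256 (the effective truncation length)
print(model.max_seq_length)

# The underlying transformer architecture supports up to 512 positions
print(model[0].auto_model.config.max_position_embeddings)

# The Sentence Transformers limit can be raised toward the architecture's
# maximum, but quality beyond 256 tokens is not guaranteed, since the model
# was trained on shorter inputs.
model.max_seq_length = 512
```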
Thanks for the (fast) clarification - will close this
KennethEnevoldsen changed pull request status to closed