Entailment Detection by Fine-tuning BERT


  • The model in this repository is obtained by fine-tuning BERT, Google's encoder-only transformer model.
  • New York University's Multi-NLI dataset is used for fine-tuning.
  • Accuracy achieved: ~74%


  • Notebook used for fine-tuning: here
  • N.B.: Due to computational resource constraints, only 11K samples were used for fine-tuning. There is room for accuracy improvement if the model is trained on all 390K samples available in the dataset (a minimal sketch of the fine-tuning setup is shown after this list).
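
The fine-tuning notebook itself is linked above; the snippet below is only a minimal sketch of how such a run can be set up with Hugging Face `transformers` and `datasets`. The `bert-base-uncased` checkpoint, the hyperparameters, and the 2K-example evaluation slice are illustrative assumptions, not the exact configuration used for this model.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

# MultiNLI: premise/hypothesis pairs labelled 0 = entailment,
# 1 = neutral, 2 = contradiction.
raw = load_dataset("multi_nli")
train_ds = raw["train"].shuffle(seed=42).select(range(11_000))  # ~11K subset, as in the card
eval_ds = raw["validation_matched"].select(range(2_000))        # assumed evaluation slice

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed base checkpoint

def tokenize(batch):
    # Encode each premise/hypothesis pair as a single BERT input.
    return tokenizer(
        batch["premise"], batch["hypothesis"], truncation=True, max_length=128
    )

train_ds = train_ds.map(tokenize, batched=True)
eval_ds = eval_ds.map(tokenize, batched=True)

# Three-way classification head on top of the BERT encoder.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=3
)

# Hyperparameters here are illustrative, not the card's actual settings.
args = TrainingArguments(
    output_dir="bert-mnli-entailment",
    per_device_train_batch_size=16,
    learning_rate=2e-5,
    num_train_epochs=3,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    data_collator=DataCollatorWithPadding(tokenizer),  # pads each batch dynamically
)

trainer.train()
print(trainer.evaluate())  # reports eval loss; accuracy requires a compute_metrics callback
```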