Live demo results generated by GitHub Actions
Models: bert-base-uncased, distilbert-sst-2 (HuggingFace Transformers) — Generated: 2026-03-07 11:23 UTC
BERT predicts the most likely words for a [MASK] token using bidirectional context.
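A minimal sketch of how such masked-word predictions could be produced with the Transformers `fill-mask` pipeline, assuming the `transformers` library is installed and the `bert-base-uncased` checkpoint named above can be downloaded:

```python
from transformers import pipeline

# Fill-mask pipeline: BERT scores candidate tokens for the [MASK]
# position using context from both the left and the right.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Example sentence is illustrative, not taken from the demo output.
results = fill_mask("The capital of France is [MASK].")

# Each result carries the predicted token and its probability score.
for r in results:
    print(f"{r['token_str']!r}: {r['score']:.3f}")
```

The pipeline returns the top candidates sorted by score, which is what a demo like this would render for each masked sentence.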
DistilBERT (fine-tuned on SST-2) classifies each sentence as POSITIVE or NEGATIVE.
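The sentiment step could be sketched the same way with a `sentiment-analysis` pipeline; here I assume the "distilbert-sst-2" label above refers to the standard `distilbert-base-uncased-finetuned-sst-2-english` checkpoint on the Hugging Face Hub:

```python
from transformers import pipeline

# DistilBERT fine-tuned on SST-2 maps each input sentence to a
# POSITIVE or NEGATIVE label with a confidence score.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Example sentences are illustrative, not the demo's actual inputs.
outputs = classifier([
    "I loved every minute of this movie.",
    "The plot was a complete mess.",
])

for sentence_result in outputs:
    print(sentence_result["label"], f"{sentence_result['score']:.3f}")
```

Each element of `outputs` is a dict with a `label` (POSITIVE/NEGATIVE) and a `score`, matching the per-sentence classification described above.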