e) BART (Bidirectional and Auto-Regressive Transformers)

BART, from Facebook AI Research (FAIR), is a sequence-to-sequence variant of the Transformer. It keeps the standard Transformer encoder-decoder architecture but pairs it with a distinctive pretraining strategy.

BART is pretrained as a denoising autoencoder: input text is corrupted with noising functions (for example, masking spans of text, deleting tokens, or permuting sentences), and the model is trained to reconstruct the original text. This differs from both BERT's masked-token prediction and GPT's left-to-right language modeling. Like BERT, BART's encoder is bidirectional, attending to context on both sides of the corrupted span; like GPT, its decoder is autoregressive, generating the reconstructed tokens one at a time.
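
To make the denoising idea concrete, here is a minimal sketch using the Hugging Face transformers library (an assumption; the original text mentions no code). It marks a span with BART's `<mask>` token and asks the pretrained `facebook/bart-large` checkpoint to regenerate the full sentence.

```python
# Sketch: BART as a denoising autoencoder, reconstructing a masked span.
# Assumes `transformers` and PyTorch are installed; "facebook/bart-large"
# is the publicly available pretrained checkpoint.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# "<mask>" marks the corrupted span the model must fill in.
text = "BART is pretrained to <mask> corrupted input text."
inputs = tokenizer(text, return_tensors="pt")

# The encoder reads the whole corrupted sentence bidirectionally;
# the decoder regenerates the clean text one token at a time.
output_ids = model.generate(inputs["input_ids"], max_length=30, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```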

Therefore, BART can be seen as a hybrid of BERT and GPT, combining BERT-style bidirectional encoding with GPT-style autoregressive decoding, while still operating within the broader family of Transformer models.
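
This encoder-decoder structure also makes BART a natural fit for sequence-to-sequence tasks such as summarization. The sketch below, again assuming the Hugging Face transformers library, uses `facebook/bart-large-cnn`, a BART checkpoint fine-tuned for summarization; the input text is only sample data.

```python
# Sketch: summarization with a BART model fine-tuned on news articles.
# Assumes `transformers` and PyTorch are installed; "facebook/bart-large-cnn"
# is a public checkpoint fine-tuned for summarization.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "BART is a denoising autoencoder for pretraining sequence-to-sequence "
    "models. It is trained by corrupting text with an arbitrary noising "
    "function and learning a model to reconstruct the original text. It "
    "uses a standard Transformer-based architecture with a bidirectional "
    "encoder and an autoregressive decoder."
)

# The bidirectional encoder reads the full article; the autoregressive
# decoder generates the summary one token at a time.
summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```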
