A transformer-based model pre-trained with three types of language modeling objectives: unidirectional (like GPT), bidirectional (like BERT), and sequence-to-sequence prediction. Jointly pre-training on all three allows the model to be fine-tuned for a wide variety of NLP tasks.
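In unified pre-training schemes of this kind (e.g., UniLM), the three objectives are typically realized in a single shared transformer by switching the self-attention mask rather than the architecture. The sketch below, with a hypothetical helper `unilm_attention_mask` and a PyTorch dependency as assumptions, shows one way the three masks could be constructed:

```python
import torch

def unilm_attention_mask(seq_len: int, mode: str, src_len: int = 0) -> torch.Tensor:
    """Return a (seq_len, seq_len) mask; entry (i, j) = 1 means token i may attend to token j.

    A minimal sketch of the three masking styles:
    - "unidirectional": each token sees only itself and its left context (GPT-style).
    - "bidirectional": every token sees every token (BERT-style).
    - "seq2seq": source tokens attend bidirectionally within the source segment;
      target tokens see the full source plus their own left context.
    """
    if mode == "unidirectional":
        return torch.tril(torch.ones(seq_len, seq_len))
    if mode == "bidirectional":
        return torch.ones(seq_len, seq_len)
    if mode == "seq2seq":
        mask = torch.zeros(seq_len, seq_len)
        # All positions may attend to the source segment; source rows
        # never gain access to target columns (they stay zero).
        mask[:, :src_len] = 1.0
        # Target rows additionally see their own left context (causal).
        tgt = seq_len - src_len
        mask[src_len:, src_len:] = torch.tril(torch.ones(tgt, tgt))
        return mask
    raise ValueError(f"unknown mode: {mode}")

# Example: a 6-token sequence where the first 4 tokens are the source segment.
print(unilm_attention_mask(6, "seq2seq", src_len=4))
```

Because only the mask changes between objectives, all parameters are shared across the three tasks, which is what lets one pre-trained model serve both understanding-style and generation-style downstream tasks.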