TinyBERT is another model-distillation method that compresses large-scale pre-trained language models (like BERT) into smaller ones while preserving most of the original model's capabilities.
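At its core, this kind of distillation trains the small student to match the teacher on two signals: the teacher's softened output distribution and its intermediate hidden states (projected to the student's width when the dimensions differ). Below is a minimal NumPy sketch of such a combined objective; the function name, the temperature `T`, the weighting `alpha`, and the learned projection matrix are illustrative assumptions, not TinyBERT's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits,
                      student_hidden, teacher_hidden,
                      proj, T=2.0, alpha=0.5):
    """Combined distillation objective (illustrative sketch).

    proj: learned matrix mapping student hidden size -> teacher hidden size.
    """
    # Soft-target loss: cross-entropy of the student against the
    # teacher's temperature-softened distribution (scaled by T^2,
    # a common convention to keep gradient magnitudes comparable)
    p_teacher = softmax(teacher_logits / T)
    log_p_student = np.log(softmax(student_logits / T))
    soft_ce = -(p_teacher * log_p_student).sum(axis=-1).mean() * T * T

    # Hidden-state loss: MSE between projected student states
    # and the teacher's states
    hidden_mse = ((student_hidden @ proj - teacher_hidden) ** 2).mean()

    return alpha * soft_ce + (1 - alpha) * hidden_mse

rng = np.random.default_rng(0)
s_logits = rng.normal(size=(4, 10))   # student logits for 4 tokens
t_logits = rng.normal(size=(4, 10))   # teacher logits
s_hidden = rng.normal(size=(4, 8))    # student hidden states (width 8)
t_hidden = rng.normal(size=(4, 16))   # teacher hidden states (width 16)
P = rng.normal(size=(8, 16)) * 0.1    # projection: student -> teacher width

loss = distillation_loss(s_logits, t_logits, s_hidden, t_hidden, P)
print(loss)
```

In a real training loop both terms are backpropagated into the student (and the projection), while the teacher stays frozen; TinyBERT itself additionally matches attention matrices and embeddings layer by layer.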