Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models
An Open-Source Framework for Prompt-Learning.
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand-new neural operator
Pre-trained Chinese ELECTRA models
[ICLR'23 Spotlight] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
[MICCAI 2019] Implementation and Pre-trained Models for Models Genesis
PyTorch code for "Prototypical Contrastive Learning of Unsupervised Representations"
Official PyTorch implementation of Global Context Vision Transformers
PERT: Pre-training BERT with Permuted Language Model
Official Repository for the Uni-Mol Series Methods
Eden AI: simplifies the use and deployment of AI technologies by providing a single API that connects to the best available AI engines
Searching prompt modules for parameter-efficient transfer learning.
A collection of Audio and Speech pre-trained models.
Exploring Visual Prompts for Adapting Large-Scale Models
Code of the CVPR 2021 Oral paper: A Recurrent Vision-and-Language BERT for Navigation
Official repository of the AAAI'2022 paper "GALAXY: A Generative Pre-trained Model for Task-Oriented Dialog with Semi-Supervised Learning and Explicit Policy Injection"
A work in progress to build out MLOps solutions in Rust
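Most of the repositories above publish checkpoints that can be loaded in a few lines of code. Below is a minimal sketch, assuming the Hugging Face transformers library and the public albert-base-v2 checkpoint (both assumptions for illustration, not tied to any specific repository above), showing how such a pre-trained model can be loaded and used for feature extraction.

```python
# Minimal sketch: load a pre-trained checkpoint and extract contextual features.
# Assumes `transformers` and `torch` are installed and the `albert-base-v2`
# checkpoint is available from the Hugging Face Hub.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("albert-base-v2")
model = AutoModel.from_pretrained("albert-base-v2")

# Tokenize a sentence and run a forward pass without tracking gradients.
inputs = tokenizer("Pre-trained models encode reusable representations.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# `last_hidden_state` holds one contextual vector per input token.
print(outputs.last_hidden_state.shape)
```

The same AutoTokenizer/AutoModel pattern generally applies to other checkpoints hosted on the Hub (for example the Chinese ELECTRA or PERT models listed above), by swapping in the corresponding model identifier.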