Here are 12 public repositories matching this topic.
End-to-End recipes for pre-training and fine-tuning BERT using Azure Machine Learning Service
Updated Jul 7, 2020 · Jupyter Notebook
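For a rough picture of the fine-tuning step such recipes automate, here is a minimal sketch using the Hugging Face transformers library rather than this repo's Azure ML pipeline code; the model name, toy batch, and labels are illustrative assumptions.

```python
# Minimal BERT fine-tuning sketch (Hugging Face transformers, not this
# repo's Azure ML code). Model name and the two-example batch are
# illustrative assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

batch = tokenizer(["great movie", "terrible movie"],
                  padding=True, truncation=True, return_tensors="pt")
labels = torch.tensor([1, 0])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
loss = model(**batch, labels=labels).loss  # cross-entropy computed internally
loss.backward()
optimizer.step()
```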
Recent Advances in Vision and Language PreTrained Models (VL-PTMs)
OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
Updated May 16, 2020 · Python
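The core of GPT-2 pre-training is a next-token (autoregressive) objective. A minimal sketch of that loss in plain tf.keras follows; the function name and shapes are assumptions, not this repo's API.

```python
# Sketch of the next-token objective used in GPT-2 pre-training, written
# against plain tf.keras; lm_loss and the shapes are illustrative.
import tensorflow as tf

def lm_loss(logits, token_ids):
    """logits: (batch, seq, vocab); token_ids: (batch, seq).
    Each position predicts the *next* token, so targets shift by one."""
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
    return loss_fn(token_ids[:, 1:], logits[:, :-1, :])
```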
Paddle Distributed Training Extended (the PaddlePaddle distributed training extension package)
Updated Sep 10, 2020 · Shell
Parameter-Efficient Transfer from Sequential Behaviors for User Modeling and Recommendation
Updated Sep 8, 2020 · Python
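One common way to make transfer parameter-efficient, in the spirit of this paper, is to insert small adapter-style "patch" modules and train only those while the pretrained backbone stays frozen. A hedged tf.keras sketch of such a module (layer names and sizes are assumptions, not the paper's exact architecture):

```python
# Adapter-style "patch" sketch: a small bottleneck with a residual
# connection is trained while the pretrained backbone stays frozen.
# Layer names and sizes are illustrative assumptions.
import tensorflow as tf

class Patch(tf.keras.layers.Layer):
    def __init__(self, dim, bottleneck=8):
        super().__init__()
        self.down = tf.keras.layers.Dense(bottleneck, activation="relu")
        self.up = tf.keras.layers.Dense(
            dim, kernel_initializer="zeros")  # starts as an identity mapping

    def call(self, h):
        # Residual form: with the zero-initialized "up" projection, the
        # patched network initially computes exactly what the backbone does.
        return h + self.up(self.down(h))
```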
AAAI-20 paper: Cross-Lingual Natural Language Generation via Pre-Training
Updated Jul 27, 2020 · Python
Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training". Improves existing models such as BERT.
Updated Jul 20, 2020 · Python
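In the paper's formulation, content-to-content and position-to-position attention terms are computed with separate ("untied") projections and summed. A minimal single-head numpy sketch of that score, with simplifications (no multi-head split, no special [CLS] handling) taken as assumptions:

```python
# Single-head sketch of TUPE-style untied attention logits. The
# simplifications (no heads, no [CLS] untying) are assumptions.
import numpy as np

def tupe_logits(x, p, Wq, Wk, Uq, Uk):
    """x: (n, d) token embeddings; p: (n, d) absolute position embeddings.
    Content and position correlations use separate projection matrices
    and are summed, scaled so the total keeps unit variance."""
    d = x.shape[-1]
    content = (x @ Wq) @ (x @ Wk).T    # content-to-content term
    position = (p @ Uq) @ (p @ Uk).T   # position-to-position term
    return (content + position) / np.sqrt(2 * d)
```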
Papers on pretraining and self-supervised learning for Graph Neural Networks (GNNs).
Dynamic Transfer Learning for Low-Resource Neural Machine Translation
Updated Aug 4, 2020 · Python
Pretraining on the 2015, 2019, and IDRiD datasets with ResNet-101 and ResNet-152, then fine-tuning on the 2019 dataset only
Updated Dec 9, 2019 · Python
Understanding "A Lite BERT" (ALBERT), a Transformer approach for learning self-supervised language models. (Work in progress.)
Updated Mar 16, 2020 · Python
A flexible class for training specific layers of deep neural nets in an online manner. Supports Keras models.
Updated Aug 6, 2020 · Python
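A minimal sketch of the idea in stock tf.keras: freeze every layer except the chosen ones, then update online one batch at a time. The model, layer names, and random data are illustrative assumptions, not this repo's class.

```python
# Sketch: train only selected layers of a Keras model, one batch at a
# time. Model, layer names, and random data are illustrative assumptions.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(16,), name="feat"),
    tf.keras.layers.Dense(32, activation="relu", name="hidden"),
    tf.keras.layers.Dense(1, activation="sigmoid", name="head"),
])

for layer in model.layers:
    layer.trainable = (layer.name == "head")  # freeze all but the head
model.compile(optimizer="adam", loss="binary_crossentropy")

for _ in range(10):  # online loop: one small batch per incoming chunk
    x = np.random.randn(8, 16).astype("float32")
    y = np.random.randint(0, 2, size=(8, 1)).astype("float32")
    model.train_on_batch(x, y)
```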