# gpt

Here are 229 public repositories matching this topic.
An implementation of model-parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
Updated Feb 25, 2022 - Python
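Model parallelism of the kind this repository implements splits individual weight matrices across devices so that no single device has to hold the full model. A minimal numpy sketch of the idea, splitting one linear layer column-wise across two hypothetical shards (the sizes and the two-way split are illustrative assumptions, not this repository's actual sharding scheme):

```python
import numpy as np

# Toy tensor (model) parallelism: split a linear layer's weight matrix
# column-wise across two "devices" and combine the partial outputs.
rng = np.random.default_rng(0)

d_model, d_ff = 8, 16                 # illustrative sizes
x = rng.normal(size=(4, d_model))     # a batch of 4 token vectors
W = rng.normal(size=(d_model, d_ff))

# Each shard holds half of the columns of W.
W_shard0, W_shard1 = np.split(W, 2, axis=1)

# Each "device" computes its partial activation independently...
y0 = x @ W_shard0
y1 = x @ W_shard1

# ...and the full output is recovered by concatenating along features.
y_parallel = np.concatenate([y0, y1], axis=1)
assert np.allclose(y_parallel, x @ W)  # same result as the unsharded matmul
```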
LightSeq: A High-Performance Library for Sequence Processing and Generation.
Topics: training, cuda, inference, transformer, accelerate, bart, beam-search, sampling, gpt, bert, multilingual-nmt, diverse-decoding
Updated Oct 28, 2022 - C++
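Beam search, one of the decoding strategies LightSeq accelerates, keeps only the k highest-scoring partial sequences at each generation step. A minimal, framework-free sketch of the algorithm (the toy `log_probs` scorer is a stand-in for a real model, not LightSeq's API):

```python
def beam_search(log_probs, vocab_size, beam_size, max_len, bos=0, eos=1):
    """Keep the `beam_size` best partial sequences at every step.

    `log_probs(seq)` must return `vocab_size` next-token log-scores;
    here it is a stand-in for a real model.
    """
    beams = [([bos], 0.0)]  # (token sequence, cumulative log-score)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == eos:            # finished beams are carried over
                candidates.append((seq, score))
                continue
            lp = log_probs(seq)
            for tok in range(vocab_size):
                candidates.append((seq + [tok], score + lp[tok]))
        # Prune to the top `beam_size` hypotheses.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
    return beams

# Toy scorer: always prefers token (len(seq) % vocab_size).
def toy_log_probs(seq, vocab_size=5):
    favored = len(seq) % vocab_size
    return [0.0 if t == favored else -2.0 for t in range(vocab_size)]

print(beam_search(toy_log_probs, vocab_size=5, beam_size=3, max_len=4))
```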
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo.
Topics: natural-language-processing, model-zoo, pytorch, classification, bart, chinese, gpt, pegasus, ner, clue, albert, bert, fine-tuning, roberta, elmo, pre-training, gpt-2, t5, unilm, xlm-roberta
Updated Oct 20, 2022 - Python
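For the BERT-style models in a framework like this, pre-training typically uses masked language modeling: roughly 15% of tokens are selected, and of those, 80% are replaced with [MASK], 10% with a random token, and 10% left unchanged. A minimal sketch of that standard masking recipe (the token IDs and the -100 "ignore" label are illustrative conventions, not this framework's internals):

```python
import random

def mask_tokens(tokens, vocab_size, mask_id, mask_prob=0.15):
    """BERT-style masking: select ~15% of positions; replace 80% of those
    with [MASK], 10% with a random token, and leave 10% unchanged."""
    inputs = list(tokens)
    labels = [-100] * len(tokens)  # -100: a common "ignore in the loss" label
    for i, tok in enumerate(tokens):
        if random.random() < mask_prob:
            labels[i] = tok                   # predict the original token here
            r = random.random()
            if r < 0.8:
                inputs[i] = mask_id           # 80%: [MASK]
            elif r < 0.9:
                inputs[i] = random.randrange(vocab_size)  # 10%: random token
            # else: 10% keep the original token unchanged
    return inputs, labels

random.seed(0)
ids, labels = mask_tokens([5, 17, 42, 99, 7, 23], vocab_size=100, mask_id=103)
print(ids, labels)
```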
Transformer-related optimization, including BERT and GPT.
Updated Oct 27, 2022 - C++
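A key inference optimization for autoregressive transformers in this space is the key/value cache: keys and values for past tokens are computed once and reused, so each decoding step projects only the newest token instead of re-encoding the whole prefix. A minimal numpy sketch of single-head cached attention (the head size and random weights are illustrative assumptions, not this library's kernels):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # head dimension (illustrative)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

k_cache, v_cache = [], []  # grow by one entry per generated token

def decode_step(x):
    """Attend from the newest token over all cached keys/values."""
    q = x @ Wq
    k_cache.append(x @ Wk)   # only the new token's K/V are computed
    v_cache.append(x @ Wv)
    K, V = np.stack(k_cache), np.stack(v_cache)
    attn = softmax(q @ K.T / np.sqrt(d))
    return attn @ V

for _ in range(5):                 # five decoding steps
    out = decode_step(rng.normal(size=d))
print(out.shape, len(k_cache))     # (8,) 5
```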
(No description provided.)
Topics: machine-learning, deep-learning, clustering, tensorflow, scikit-learn, keras, transformers, pytorch, gan, neural-networks, convolutional-neural-networks, gpt, gans, albert, dbscan, bert, keras-tensorflow, pytorch-tutorial, pytorch-implementation, huggingface-transformers
Updated Jun 15, 2022
Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT-2, ...).
Topics: nlp, rust, machine-learning, translation, deep-learning, sentiment-analysis, transformer, rust-lang, question-answering, bart, gpt, ner, bert, language-generation, electra, roberta, gpt-2
Updated Oct 29, 2022 - Rust
KakaoBrain KoGPT (Korean Generative Pre-trained Transformer).
Updated Oct 23, 2022 - Python
RWKV is an RNN with transformer-level performance. It can be trained directly like a GPT (parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
Topics: deep-learning, transformers, pytorch, transformer, lstm, rnn, gpt, language-model, attention-mechanism, gpt-2, gpt-3, linear-attention, rwkv
Updated Oct 20, 2022 - Python
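The "RNN with transformer-level performance" claim rests on linear attention: when the softmax is replaced by a factorizable kernel, attention over the entire past collapses into two running sums, so the model can run step by step in constant memory like an RNN. A minimal numpy sketch of that recurrence (the elu+1 feature map and tiny sizes are illustrative assumptions; RWKV's actual time-mixing differs in its details):

```python
import numpy as np

def phi(x):
    """A positive feature map (elu(x) + 1), as in kernelized linear attention."""
    return np.where(x > 0, x + 1.0, np.exp(x))

d = 4
S = np.zeros((d, d))   # running sum of outer(phi(k), v)
z = np.zeros(d)        # running sum of phi(k)

def rnn_step(q, k, v):
    """One recurrent step: fold the new (k, v) into the state, then read out."""
    global S, z
    S += np.outer(phi(k), v)
    z += phi(k)
    return (phi(q) @ S) / (phi(q) @ z + 1e-9)

rng = np.random.default_rng(0)
for t in range(6):      # constant state size per step, unlike softmax attention
    q, k, v = rng.normal(size=(3, d))
    y = rnn_step(q, k, v)
print(y)
```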
Simple implementations of NLP models. Tutorials are written in Chinese on my website at https://mofanpy.com
Updated Jun 22, 2022 - Python
PatrickStar enables Larger, Faster, Greener pre-trained models for NLP and democratizes AI for everyone.
Updated Jun 17, 2022 - Python
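PatrickStar's approach centers on chunk-based heterogeneous memory management: parameters live in fixed-size chunks kept in CPU memory and are moved to the GPU only while needed, so models larger than GPU memory can still be trained. A minimal, dependency-free sketch of such an eviction policy (the LRU scheme, capacity, and device labels here are illustrative assumptions, not PatrickStar's implementation):

```python
from collections import OrderedDict

class ChunkManager:
    """Keep at most `gpu_capacity` chunks 'on the GPU'; evict least recently used."""

    def __init__(self, num_chunks, gpu_capacity):
        self.location = {i: "cpu" for i in range(num_chunks)}
        self.on_gpu = OrderedDict()          # chunk id -> None, in LRU order
        self.gpu_capacity = gpu_capacity

    def fetch(self, chunk_id):
        """Ensure `chunk_id` is resident on the GPU before it is used."""
        if chunk_id in self.on_gpu:
            self.on_gpu.move_to_end(chunk_id)      # refresh LRU position
            return
        if len(self.on_gpu) >= self.gpu_capacity:  # evict the coldest chunk
            victim, _ = self.on_gpu.popitem(last=False)
            self.location[victim] = "cpu"
        self.on_gpu[chunk_id] = None
        self.location[chunk_id] = "gpu"

mgr = ChunkManager(num_chunks=6, gpu_capacity=2)
for cid in [0, 1, 0, 2, 3, 0]:   # access pattern of a forward pass (illustrative)
    mgr.fetch(cid)
print(mgr.location)              # only two chunks end up on the 'gpu'
```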
Super UEFIinSecureBoot Disk: boot any OS or .efi file without disabling UEFI Secure Boot.
Updated Jun 20, 2022
An easy-to-use Natural Language Processing library and framework for predicting, training, fine-tuning, and serving state-of-the-art NLP models.
Topics: nlp, docker, machine-learning, natural-language-processing, deep-learning, gpu, transformers, pytorch, api-rest, easy, gpt, language-models, deep-learning-tutorial, bert, fine-tuning, ulmfit, xlnet
Updated Nov 30, 2021 - Jupyter Notebook
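Libraries in this space commonly wrap Hugging Face transformers under the hood. For comparison, a minimal text-generation call against the underlying transformers pipeline API looks like this (the gpt2 checkpoint and sampling parameters are illustrative; this is not this repository's own API):

```python
from transformers import pipeline

# Build a ready-to-use text-generation pipeline around a pre-trained GPT-2.
generator = pipeline("text-generation", model="gpt2")

# Generate one continuation; sampling parameters are illustrative.
result = generator(
    "Natural language processing makes it easy to",
    max_length=40,
    do_sample=True,
    top_p=0.9,
)
print(result[0]["generated_text"])
```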