# gpt

Here are 196 public repositories matching this topic.

An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
Updated Feb 25, 2022 - Python

Updated May 10, 2022 - Rust

LightSeq: A High Performance Library for Sequence Processing and Generation
Topics: training, cuda, inference, transformer, accelerate, bart, beam-search, sampling, gpt, bert, multilingual-nmt, diverse-decoding
Updated May 10, 2022 - Cuda

Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Topics: natural-language-processing, model-zoo, pytorch, classification, bart, chinese, gpt, pegasus, ner, clue, albert, bert, fine-tuning, roberta, elmo, pre-training, gpt-2, t5, unilm, xlm-roberta
Updated May 12, 2022 - Python

Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2, ...)
Topics: nlp, rust, machine-learning, translation, deep-learning, sentiment-analysis, transformer, rust-lang, question-answering, bart, gpt, ner, bert, language-generation, electra, roberta, gpt-2
Updated May 13, 2022 - Rust

Transformer-related optimization, including BERT and GPT
Updated May 13, 2022 - C++

Simple implementations of NLP models. Tutorials are written in Chinese on my website: https://mofanpy.com
Updated Feb 10, 2022 - Python

KakaoBrain KoGPT (Korean Generative Pre-trained Transformer)
Updated Apr 27, 2022 - Python

PatrickStar enables Larger, Faster, Greener Pretrained Models for NLP and democratizes AI for everyone.
Updated May 11, 2022 - Python

Super UEFIinSecureBoot Disk: Boot any OS or .efi file without disabling UEFI Secure Boot
Updated May 8, 2022

An easy-to-use Natural Language Processing library and framework for predicting, training, fine-tuning, and serving up state-of-the-art NLP models.
Topics: nlp, docker, machine-learning, natural-language-processing, deep-learning, gpu, transformers, pytorch, api-rest, easy, gpt, language-models, deep-learning-tutorial, bert, fine-tuning, ulmfit, xlnet
Updated Nov 30, 2021 - Jupyter Notebook

Topics: machine-learning, deep-learning, clustering, tensorflow, scikit-learn, keras, transformers, pytorch, gan, neural-networks, convolutional-neural-networks, gpt, gans, albert, dbscan, bert, keras-tensorflow, pytorch-tutorial, pytorch-implementation, huggingface-transformers
Updated Apr 5, 2022

API for the GPT-J language model 🦜, including a FastAPI backend and a Streamlit frontend
Updated Oct 25, 2021 - Python

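The entry above wraps a generative language model in a FastAPI backend. As a rough illustration of that serving pattern (not this repository's actual code), the sketch below exposes a Hugging Face `transformers` text-generation pipeline behind a single FastAPI route; the `/generate` path, the `Prompt` payload shape, and the `EleutherAI/gpt-j-6B` model name are assumptions made for the example.

```python
# Minimal sketch of a text-generation API, assuming FastAPI plus the Hugging Face
# `transformers` pipeline. Illustrative only: the route, payload, and model name
# below are assumptions, not the actual interface of the repository above.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# GPT-J 6B is very large; swap in a smaller checkpoint (e.g. "gpt2") to try this locally.
generator = pipeline("text-generation", model="EleutherAI/gpt-j-6B")

class Prompt(BaseModel):
    text: str
    max_length: int = 50

@app.post("/generate")
def generate(prompt: Prompt):
    # The pipeline returns a list of dicts with a "generated_text" field.
    result = generator(prompt.text, max_length=prompt.max_length, do_sample=True)
    return {"generated_text": result[0]["generated_text"]}
```

Saved as `main.py`, a file like this could be served with `uvicorn main:app` and queried via POST requests to `/generate`; a Streamlit frontend would then call that endpoint.
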
OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
Topics: nlp, tensorflow, text-generation, transformer, openai, gpt, implementation, pre-training, tensorflow2, gpt-2, gpt2, pretraining
Updated Feb 10, 2022 - Python

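As a point of reference for the TensorFlow 2.0 implementation above, the sketch below shows sequence prediction with a pretrained GPT-2 through the Hugging Face `transformers` TensorFlow classes. It is a generic illustration under that assumption, not this repository's own training or prediction code, and the prompt string is arbitrary.

```python
# Minimal sketch of GPT-2 sequence prediction in TensorFlow 2, assuming the
# Hugging Face `transformers` TF classes (not the repository's own code).
from transformers import GPT2Tokenizer, TFGPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = TFGPT2LMHeadModel.from_pretrained("gpt2")

# Encode an arbitrary prompt and sample a continuation.
inputs = tokenizer("The GPT topic on GitHub collects", return_tensors="tf")
output_ids = model.generate(
    inputs["input_ids"],
    max_length=40,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
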
Discord AI Chatbot using DialoGPT, trained on the game transcript of The World Ends With You
Updated Jun 20, 2021 - Jupyter Notebook

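The chatbot above is built on DialoGPT. For orientation only, the sketch below runs a single chat turn with the base `microsoft/DialoGPT-small` checkpoint via Hugging Face `transformers`; the actual bot uses its own fine-tuned weights and Discord integration, which are not shown here.

```python
# Minimal sketch of one DialoGPT chat turn, assuming the base
# microsoft/DialoGPT-small checkpoint (not the bot's fine-tuned weights).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

user_input = "Hey, what's up?"
# DialoGPT expects conversation turns separated by the EOS token.
input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")
reply_ids = model.generate(input_ids, max_length=100, pad_token_id=tokenizer.eos_token_id)
# Decode only the newly generated tokens after the prompt.
print(tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True))
```
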
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Topics: machine-learning, tabular-data, pytorch, artificial-intelligence, transformer, gpt, bert, fraud-detection, icassp, huggingface, credit-card-dataset, prsa-dataset, credit-card-transaction, icassp2021
Updated Feb 3, 2022 - Python

QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art Machine Learning models.
Topics: python, nlp, fast, research, ai, deep-learning, neural-network, ml, pytorch, artificial-intelligence, yolo, easy-to-use, object-detection, gpt, dl, bert, tensorflow2, huggingface-transformers, gpt-neo, quickai
Updated Apr 16, 2022 - Python

A React implementation of the Google DFP/GPT API. https://react-dfp.surge.sh
Updated Apr 27, 2022 - JavaScript

A paper list of pre-trained language models (PLMs).
Topics: multilingual, machine-learning, natural-language-processing, deep-learning, gpt, representation-learning, multi-modal, bert, pre-training
Updated Nov 30, 2021

Annotations of interesting ML papers I read
Topics: nlp, machine-learning, deep-learning, transformers, gpt, research-paper, bert, gpt-2, xlnet, annotated-paper, megatron-lm, papers-annotations
Updated Dec 12, 2021