The Reliable USB Formatting Utility
An implementation of model-parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
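As a conceptual illustration of what "model parallel" means here, the sketch below splits a single linear layer's weight matrix column-wise across hypothetical devices and concatenates the partial results. It uses plain NumPy; the function and variable names are illustrative and this is not the repository's mesh-tensorflow implementation.

```python
import numpy as np

# Tensor (model) parallelism, conceptually: a linear layer y = x @ W is
# split column-wise across N "devices", each holding one shard of W.
# All names here are illustrative, not from the repo.

def column_parallel_linear(x, weight_shards):
    """Each shard computes a slice of the output; concatenating the
    slices reproduces the full, un-sharded result."""
    partial_outputs = [x @ w for w in weight_shards]  # one matmul per device
    return np.concatenate(partial_outputs, axis=-1)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 512))       # (batch, d_model)
W = rng.normal(size=(512, 2048))    # full weight matrix
shards = np.split(W, 4, axis=1)     # 4-way column split

assert np.allclose(column_parallel_linear(x, shards), x @ W)
```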
LightSeq: A High Performance Library for Sequence Processing and Generation
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Tensor search for humans.
Transformer-related optimizations, including BERT and GPT.
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" context length (ctx_len), and free sentence embeddings.
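To make the "RNN with transformer-level performance" claim concrete, below is a minimal NumPy sketch of the WKV recurrence from the RWKV paper, the linear-attention-style update that lets inference run step by step with constant-size state. The variable names and the omission of numerical-stability rescaling are simplifications; this is not the repository's optimized CUDA kernel.

```python
import numpy as np

def wkv_recurrent(k, v, w, u):
    """Sequential (RNN-mode) WKV, simplified from the RWKV paper:
        wkv_t = (a_{t-1} + e^{u + k_t} * v_t) / (b_{t-1} + e^{u + k_t})
        a_t   = e^{-w} * a_{t-1} + e^{k_t} * v_t
        b_t   = e^{-w} * b_{t-1} + e^{k_t}
    k, v: (T, C) key/value sequences; w, u: (C,) learned decay/bonus.
    The numerical-stability rescaling used in real kernels is omitted.
    """
    T, C = k.shape
    a = np.zeros(C)
    b = np.zeros(C)
    out = np.empty((T, C))
    for t in range(T):               # O(1) state per step: this is why
        ek = np.exp(k[t])            # RWKV inference is cheap
        eu = np.exp(u + k[t])
        out[t] = (a + eu * v[t]) / (b + eu)
        a = np.exp(-w) * a + ek * v[t]
        b = np.exp(-w) * b + ek
    return out

T, C = 8, 4
rng = np.random.default_rng(0)
out = wkv_recurrent(rng.normal(size=(T, C)), rng.normal(size=(T, C)),
                    np.abs(rng.normal(size=C)), rng.normal(size=C))
print(out.shape)  # (8, 4)
```

Because the state (a, b) summarizes the whole past, generation needs no growing key/value cache, which is the "saves VRAM, fast inference" part of the description; the same computation can be unrolled in parallel over time for GPT-style training.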
Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT-2, ...).
A ChatGPT implementation with support for Bing's GPT-4 version of ChatGPT, plus the official ChatGPT model via OpenAI's API. Available as a Node.js module, REST API server, and CLI app.
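The project itself is a Node.js module; as a hedged, conceptual counterpart in Python (the language used for the sketches in this list), calling a chat model through OpenAI's official Python SDK looks roughly like this. The model name and prompt are placeholders, and this is not the repository's own code.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```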
KakaoBrain KoGPT (Korean Generative Pre-trained Transformer)