# transformers

Here are 874 public repositories matching this topic...
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
Updated Oct 6, 2021 - Python
Topics: machine-learning, reinforcement-learning, deep-learning, transformers, pytorch, transformer, gan, neural-networks, deep-learning-tutorial, optimizers
Updated Oct 9, 2021 - Jupyter Notebook
Updated Oct 7, 2021 - Rust
Simple command line tool for text to image generation using OpenAI's CLIP and Siren (Implicit neural representation network). Technique was originally created by https://twitter.com/advadnoun
Topics: deep-learning, transformers, artificial-intelligence, siren, text-to-image, multi-modality, implicit-neural-representation
Updated Sep 12, 2021 - Python
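The CLIP+Siren entry above pairs CLIP's image/text similarity scoring with a SIREN network (an MLP with sinusoidal activations) as the implicit image representation. A minimal NumPy sketch of one SIREN-style layer, purely illustrative and not the repo's code (the `w0=30.0` frequency scale follows the SIREN paper's common default; weights here are random placeholders):

```python
import numpy as np

def siren_layer(x, w, b, w0=30.0):
    """One SIREN layer: sinusoidal activation applied to an affine map."""
    return np.sin(w0 * (x @ w + b))

# Map 2-D pixel coordinates to features, as an implicit image representation would.
rng = np.random.default_rng(0)
coords = rng.uniform(-1, 1, size=(16, 2))   # 16 (x, y) coordinates in [-1, 1]
w = rng.uniform(-1, 1, size=(2, 8)) / 2     # first-layer weights (placeholder init)
b = np.zeros(8)
features = siren_layer(coords, w, b)
print(features.shape)                        # (16, 8)
```

Stacking such layers and training against a CLIP similarity loss is the essence of the text-to-image technique the entry describes.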
Implementation / replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
Updated Sep 29, 2021 - Python
A PyTorch-based Speech Toolkit
Topics: audio, transformers, pytorch, voice-recognition, speech-recognition, speech-to-text, language-model, speaker-recognition, speaker-verification, speech-processing, audio-processing, asr, speaker-diarization, speechrecognition, speech-separation, speech-enhancement, spoken-language-understanding, huggingface, speech-toolkit, speechbrain
Updated Oct 13, 2021 - Python
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Topics: nlp, natural-language-processing, tutorial, sentiment-analysis, word-embeddings, transformers, cnn, pytorch, recurrent-neural-networks, lstm, rnn, fasttext, bert, sentiment-classification, pytorch-tutorial, pytorch-tutorials, cnn-text-classification, lstm-sentiment-analysis, pytorch-nlp, torchtext
Updated Jul 15, 2021 - Jupyter Notebook
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Updated Sep 12, 2021 - Python
Transformers for Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
Updated Oct 4, 2021 - Python
State of the Art Natural Language Processing
Topics: nlp, natural-language-processing, spark, sentiment-analysis, text-classification, tensorflow, machine-translation, transformers, language-detection, pyspark, named-entity-recognition, seq2seq, lemmatizer, spell-checker, albert, bert, part-of-speech-tagger, entity-extraction, spark-ml, xlnet
Updated Oct 13, 2021 - Scala
Chinese Language Understanding Evaluation Benchmark (CLUE): datasets, baselines, pre-trained models, corpus and leaderboard
Topics: benchmark, tensorflow, nlu, glue, corpus, transformers, pytorch, dataset, chinese, pretrained-models, language-model, albert, bert, roberta, chineseglue
Updated Oct 13, 2021 - Python
A super-easy library for BERT-based NLP models
Updated Sep 23, 2021 - Python
Reformer, the efficient Transformer, in PyTorch
Updated Aug 25, 2021 - Python
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
Topics: nlp, machine-learning, topic, transformers, topic-modeling, bert, topic-models, sentence-embeddings, topic-modelling, ldavis
Updated Oct 13, 2021 - Python
jiant is an NLP toolkit
Updated Jul 26, 2021 - Python
MLeap: Deploy ML Pipelines to Production
Updated Oct 13, 2021 - Scala
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
Updated Oct 13, 2021 - Python
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Updated Oct 4, 2021 - Python
Research and applications of three core technologies: natural language processing, knowledge graphs, and dialogue systems.
Updated Jan 3, 2021
bradennapier commented Mar 11, 2020
Hey! Thanks for the work on this.
I'm wondering how we can use this with mocha. tsconfig-paths ships its own tsconfig-paths/register to make this work: https://github.com/dividab/tsconfig-paths#with-mocha-and-ts-node
Basically, with mocha we have to run mocha -r ts-node/register, but that wouldn't include the compiler flag. It would be worthwhile to have the ability to do this.
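For context, the tsconfig-paths README linked in the comment suggests registering both hooks for Mocha. A hedged sketch of what such a `.mocharc.json` might look like (the `spec` glob is an assumption; `require` and `extension` are standard Mocha config keys):

```json
{
  "require": ["ts-node/register", "tsconfig-paths/register"],
  "extension": ["ts"],
  "spec": "test/**/*.spec.ts"
}
```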
Generative Adversarial Transformers
Topics: transformers, attention, image-generation, gans, generative-adversarial-networks, compositionality, scene-generation
Updated Sep 25, 2021 - Python
This repository contains demos I made with the Transformers library by HuggingFace.
Updated Sep 27, 2021 - Jupyter Notebook
Korean BERT pre-trained cased (KoBERT)
Updated Jul 22, 2021 - Jupyter Notebook
An implementation of Performer, a linear attention-based transformer, in PyTorch
Updated Oct 3, 2021 - Python
This Word Does Not Exist
Topics: machine-learning, natural-language-processing, transformers, natural-language-generation, natural-language-understanding, gpt-2
Updated Jan 12, 2021 - Python
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Topics: machine-learning, deep-learning, machine-learning-algorithms, transformers, artificial-intelligence, transformer, attention, attention-mechanism, self-attention
Updated Sep 14, 2021 - Python
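The entry above catalogs self-attention variants for vision; the common core they all build on is scaled dot-product self-attention. A minimal single-head NumPy sketch, purely illustrative (random projection matrices, no masking or multi-head split, not the repo's code):

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over a sequence x of shape (seq_len, d_model)."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])            # scale by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ v, weights

rng = np.random.default_rng(0)
d_model, seq_len = 8, 4
x = rng.normal(size=(seq_len, d_model))
wq, wk, wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out, attn = self_attention(x, wq, wk, wv)
print(out.shape, attn.shape)  # (4, 8) (4, 4)
```

Each row of `attn` sums to 1: every output position is a convex combination of the value vectors, which is the property the listed variants (linear attention, axial attention, etc.) approximate more cheaply.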
This repo contains a PyTorch implementation of a pretrained BERT model for multi-label text classification.
Topics: nlp, text-classification, transformers, pytorch, multi-label-classification, albert, bert, fine-tuning, pytorch-implmention, xlnet
Updated Jun 2, 2021 - Python
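Multi-label classification, as in the BERT repo above, differs from multi-class classification in the output head: each label gets an independent sigmoid and threshold instead of one softmax over classes, so a text can receive several labels at once. A minimal NumPy sketch of that decision step (the 0.5 threshold and example logits are illustrative, not from the repo):

```python
import numpy as np

def multilabel_predict(logits, threshold=0.5):
    """Per-label sigmoid + threshold; labels are predicted independently."""
    probs = 1.0 / (1.0 + np.exp(-logits))
    return (probs >= threshold).astype(int)

# Two documents, three candidate labels.
logits = np.array([[2.0, -1.0, 0.3],
                   [-3.0, 4.0, 0.0]])
print(multilabel_predict(logits))  # [[1 0 1]
                                   #  [0 1 1]]
```

Training uses a binary cross-entropy loss per label for the same reason: label probabilities are not forced to compete for mass.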
Huggingface Transformers + Adapters = ❤️
Updated Oct 12, 2021 - Python
Problem
Some of our transformers & estimators are not thoroughly tested, or not tested at all.
Solution
Use the OpTransformerSpec and OpEstimatorSpec base test specs to provide tests for all existing transformers & estimators.