# albert

Here are 166 public repositories matching this topic...
A Lite BERT for Self-Supervised Learning of Language Representations, with a large collection of Chinese pretrained ALBERT models

Updated Jul 5, 2022 - Python
State of the Art Natural Language Processing
nlp, natural-language-processing, spark, sentiment-analysis, text-classification, tensorflow, machine-translation, transformers, language-detection, pyspark, named-entity-recognition, seq2seq, lemmatizer, spell-checker, albert, bert, part-of-speech-tagger, entity-extraction, spark-ml, xlnet

Updated Jul 9, 2022 - Scala
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard

benchmark, tensorflow, nlu, glue, corpus, transformers, pytorch, dataset, chinese, pretrained-models, language-model, albert, bert, roberta, chineseglue

Updated Jul 5, 2022 - Python
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
natural-language-processing, model-zoo, pytorch, classification, bart, chinese, gpt, pegasus, ner, clue, albert, bert, fine-tuning, roberta, elmo, pre-training, gpt-2, t5, unilm, xlm-roberta

Updated Jul 4, 2022 - Python
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard

Updated Dec 1, 2021 - Python
Chinese text classification with Keras: long-text and short-sentence classification, multi-label classification, and sentence-pair similarity; base classes for building embedding layers and network graphs; includes FastText, TextCNN, CharCNN, TextRNN, RCNN, DCNN, DPCNN, VDCNN, CRNN, BERT, XLNet, ALBERT, Attention, DeepMoji, HAN, CapsuleNet (capsule networks), Transformer-encoder, Seq2seq, SWEM, LEAM, and TextGCN

nlp, text-classification, keras, embeddings, transformer, fasttext, albert, bert, capsule, han, rcnn, dcnn, textcnn, crnn, dpcnn, vdcnn, charcnn, xlnet, keras-textclassification, leam

Updated Jun 22, 2022 - Python
Chinese NER (Named Entity Recognition) using BERT (Softmax, CRF, Span)

Updated May 29, 2021 - Python
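NER taggers like the one above typically emit per-token BIO labels that must then be decoded into entity spans. The helper below is a minimal sketch of that decoding step, written from scratch for illustration; it is not taken from the repository above.

```python
# Minimal sketch (not from the repo above) of turning per-token BIO labels,
# as produced by a BERT+Softmax or BERT+CRF tagger, into entity spans.
def bio_to_spans(labels):
    """Convert a BIO label sequence into (entity_type, start, end) spans.

    `end` is exclusive; labels look like "B-PER", "I-PER", "O".
    """
    spans = []
    start, ent_type = None, None
    for i, label in enumerate(labels):
        if label.startswith("B-"):
            if start is not None:          # close the previous entity
                spans.append((ent_type, start, i))
            start, ent_type = i, label[2:]
        elif label.startswith("I-") and ent_type == label[2:]:
            continue                       # extend the current entity
        else:                              # "O" or an inconsistent "I-" tag
            if start is not None:
                spans.append((ent_type, start, i))
            start, ent_type = None, None
    if start is not None:                  # entity running to the end
        spans.append((ent_type, start, len(labels)))
    return spans

print(bio_to_spans(["B-PER", "I-PER", "O", "B-LOC"]))
# [('PER', 0, 2), ('LOC', 3, 4)]
```

The span-based head mentioned in the description predicts start/end positions directly instead, skipping this label-sequence decoding.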
CLUENER2020: Chinese fine-grained named entity recognition

dataset, named-entity-recognition, chinese, seq2seq, sequence-to-sequence, ner, albert, bert, sequence-labeling, chinese-ner, roberta, fine-grained-ner, chinesener

Updated Jul 5, 2022 - Python
machine-learning, deep-learning, clustering, tensorflow, scikit-learn, keras, transformers, pytorch, gan, neural-networks, convolutional-neural-networks, gpt, gans, albert, dbscan, bert, keras-tensorflow, pytorch-tutorial, pytorch-implementation, huggingface-transformers

Updated Jun 15, 2022
This repo contains a PyTorch implementation of a pretrained BERT model for multi-label text classification.
nlp, text-classification, transformers, pytorch, multi-label-classification, albert, bert, fine-tuning, pytorch-implmention, xlnet

Updated Jun 2, 2021 - Python
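Multi-label classifiers like this one differ from single-label ones in the decision rule: each label gets an independent sigmoid probability and a threshold, rather than a softmax argmax over mutually exclusive classes. A minimal sketch of that rule (the logits, label names, and threshold are illustrative assumptions, not taken from the repository):

```python
import numpy as np

# Sketch of a multi-label decision rule: independent sigmoids per label with
# a threshold, instead of a single softmax argmax. Inputs are made up.
def predict_labels(logits, label_names, threshold=0.5):
    probs = 1.0 / (1.0 + np.exp(-np.asarray(logits, dtype=float)))
    return [name for name, p in zip(label_names, probs) if p >= threshold]

labels = ["sports", "politics", "tech"]
print(predict_labels([2.0, -1.0, 0.3], labels))
# ['sports', 'tech'] -- more than one label can fire at once
```

During training this pairs with a per-label binary cross-entropy loss rather than categorical cross-entropy.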
A collection of high-quality Chinese pretrained models: state-of-the-art large models, the fastest small models, and models specialized for semantic similarity

text-classification, corpus, dataset, chinese, semantic-similarity, pretrained-models, sentence-classification, albert, bert, sentence-analysis, distillation, sentence-pairs, roberta

Updated Jul 8, 2020 - Python
Macropodus, an NLP toolkit based on an ALBERT + BiLSTM + CRF deep-learning architecture: Chinese word segmentation (CWS), part-of-speech tagging (POS), named entity recognition (NER), new-word discovery, keyword extraction, text summarization, text similarity, a scientific calculator, Chinese-numeral to Arabic-numeral (and Roman-numeral) conversion, traditional/simplified Chinese conversion, and pinyin conversion

Updated Jun 22, 2022 - Python
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
nlp, flask, machine-learning, vuejs, sentiment-analysis, pytorch, transformer, stanford-sentiment-treebank, albert, bert, pytorch-implementation, bert-model, huggingface, distilbert, huggingface-transformer, huggingface-transformers

Updated Jan 29, 2022 - Python
1. Use BERT, ALBERT, and GPT-2 as TensorFlow 2.0 layers. 2. Implement GCN, GAN, GIN, and GraphSAGE based on message passing.

nlp, tf2, gin, gan, albert, bert, message-passing, graph-convolutional-networks, gcn, textcnn, graphsage, bilstm-attention, gnn, tensorflow2, gpt2, bert-ner, bert-cls, albert-ner, graph-classfication, textgcn

Updated Jun 9, 2020 - Python
Multi-label (and single-label) text classification with BERT and ALBERT

text-classification, tensorflow, cnn, multi-label-classification, albert, bert, multi-label, textcnn, text-classifier, classifier-multi-label

Updated Oct 19, 2021 - Python
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
machine-learning, text-summarization, summarization, albert, extractive-summarization, automatic-summarization, bert, roberta, transformer-models, pytorch-lightning, distilbert, summarization-dataset

Updated Mar 26, 2022 - Python
MONPA is a multi-task model providing Traditional Chinese word segmentation, part-of-speech tagging, and named entity recognition

nlp, named-entity-recognition, pos, ner, word-segmentation, albert, bert, pos-tagging, chinese-word-segmentation

Updated Sep 13, 2021 - Python
ALBERT model pretraining and fine-tuning using TF 2.0

classifier, glue, tf2, mlm, albert, squad, machine-comprehension, cola, tensoflow, fine-tuning, xla, multi-gpu-training, tf-hub, albert-tf2, weights-conversion

Updated May 26, 2022 - Python
A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank

Updated Oct 7, 2021 - Python
A one-stop solution for encoding sentences into fixed-length vectors using various embedding techniques

nlp, ai, deep-learning, tensorflow, encoder, word2vec, embeddings, transformer, glove, fasttext, albert, embedding, bert, word-embedding, roberta, ulmfit, sentence-encoding, bert-as-service, xlnet, embedding-as-service

Updated Jun 22, 2022 - Python
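Getting a fixed-length sentence vector out of a variable-length token sequence usually comes down to a pooling step. The sketch below shows masked mean pooling over token embeddings, a common choice on top of BERT/ALBERT outputs; it is an illustration with made-up numbers, not the library's actual API.

```python
import numpy as np

# Masked mean pooling: average the token embeddings, ignoring padding,
# to get one fixed-length vector per sentence. Numbers are illustrative.
def mean_pool(token_embeddings, attention_mask):
    """token_embeddings: (seq_len, dim); attention_mask: (seq_len,) of 0/1."""
    emb = np.asarray(token_embeddings, dtype=float)
    mask = np.asarray(attention_mask, dtype=float)[:, None]
    return (emb * mask).sum(axis=0) / np.maximum(mask.sum(), 1.0)

tokens = [[1.0, 2.0], [3.0, 4.0], [0.0, 0.0]]  # last position is padding
print(mean_pool(tokens, [1, 1, 0]))
# [2. 3.]
```

The output dimension depends only on the embedding size, not the sentence length, which is what makes the vectors comparable across sentences.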
Pytorch-Named-Entity-Recognition-with-transformers
crf, transformers, pgd, pytorch, span, ner, albert, bert, softmax, fgm, electra, xlm, roberta, adversarial-training, distilbert, camembert, xlmroberta

Updated Jun 1, 2020 - Python