# squad
Here are 112 public repositories matching this topic...
Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization.
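The two attention directions described above can be sketched in plain NumPy. This is an illustrative outline, not code from any listed repo: the paper's trainable similarity function is replaced by a dot product, and all names and shapes are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidaf_attention(H, U):
    """Bi-directional attention between context H (T x d) and query U (J x d).

    Returns a query-aware context representation G (T x 4d), following the
    concatenation layout described for BiDAF.
    """
    T, d = H.shape
    # Similarity matrix S[t, j]; BiDAF uses a trainable function of
    # [h; u; h*u] -- a plain dot product stands in for it here.
    S = H @ U.T                          # (T, J)
    # Context-to-query: each context word attends over all query words.
    a = softmax(S, axis=1)               # (T, J)
    U_tilde = a @ U                      # (T, d)
    # Query-to-context: attend over context words via the max similarity.
    b = softmax(S.max(axis=1))           # (T,)
    H_tilde = np.tile(b @ H, (T, 1))     # (T, d)
    # Merge both directions; the full context H flows through unsummarized.
    return np.concatenate([H, U_tilde, H * U_tilde, H * H_tilde], axis=1)
```

Note that the context matrix `H` is carried through to the output rather than pooled into a single vector, which is what "without early summarization" refers to.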
A Tensorflow implementation of QANet for machine reading comprehension
Updated May 30, 2018 - Python
Tensorflow Implementation of R-Net
Updated Aug 8, 2018 - Python
A Tensorflow implementation of R-Net: Machine Reading Comprehension with Self-Matching Networks
Updated Oct 18, 2019 - Python
The official implementation of ICLR 2020, "Learning to Retrieve Reasoning Paths over Wikipedia Graph for Question Answering".
Updated Apr 8, 2020 - Python
Open R-NET implementation and detailed analysis: https://git.io/vd8dx
Updated Dec 26, 2017 - Python
ALBERT model Pretraining and Fine Tuning using TF2.0
Topics: classifier, glue, tf2, mlm, albert, squad, machine-comprehension, cola, tensorflow, fine-tuning, xla, multi-gpu-training, tf-hub, albert-tf2, weights-conversion
Updated Aug 18, 2020 - Python
A PyTorch implementation of Mnemonic Reader for the Machine Comprehension task
Updated Nov 15, 2018 - Python
A PyTorch implementation of Match-LSTM, R-NET and M-Reader for Machine Reading Comprehension
Updated Jul 5, 2018 - Python
R-NET implementation in TensorFlow.
Updated Dec 30, 2017 - Python
TensorFlow Models for the Stanford Question Answering Dataset
Updated Dec 14, 2018 - Python
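Most repos in this listing train against the official SQuAD JSON layout, in which answers are stored as character offsets into a paragraph's context. A minimal reader for that layout might look like the sketch below; the tiny inline record is hand-made for illustration, not taken from the dataset.

```python
def iter_squad_examples(squad_dict):
    """Yield (context, question, answer_text, answer_start) tuples from a
    dict in the official SQuAD v1.1 JSON layout."""
    for article in squad_dict["data"]:
        for paragraph in article["paragraphs"]:
            context = paragraph["context"]
            for qa in paragraph["qas"]:
                for answer in qa["answers"]:
                    yield (context, qa["question"],
                           answer["text"], answer["answer_start"])

# A tiny hand-made record in the same layout, for illustration only.
sample = {
    "data": [{
        "title": "Example",
        "paragraphs": [{
            "context": "SQuAD was released by Stanford in 2016.",
            "qas": [{
                "id": "q1",
                "question": "Who released SQuAD?",
                "answers": [{"text": "Stanford", "answer_start": 22}],
            }],
        }],
    }]
}

for context, question, text, start in iter_squad_examples(sample):
    # answer_start is a character offset into the context string.
    assert context[start:start + len(text)] == text
```

Because answers are character offsets, models that operate on tokens must map those offsets to token indices during preprocessing.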
BERT, which stands for Bidirectional Encoder Representations from Transformers, is the state of the art in transfer learning for NLP.
Topics: download, pytorch, question-answering, pretrained-models, squad, bert, bert-model, bert-questionandanswering, bert-qna-pretrained-models, huggingface, bert-models, bert-pytorch
Updated Mar 27, 2020 - Python
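Extractive QA models such as BERT-based ones reduce SQuAD answering to scoring candidate start and end token positions. A framework-free sketch of the decoding step is below; the logit values are made up, and `max_len` is an illustrative constraint, not a parameter from any listed repo.

```python
import numpy as np

def best_span(start_logits, end_logits, max_len=30):
    """Pick the (start, end) token pair maximizing start + end score,
    subject to end >= start and a maximum answer length."""
    best, best_score = (0, 0), -np.inf
    for s, s_score in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_score + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

# Made-up per-token scores for a 4-token passage.
start = np.array([0.1, 2.0, 0.3, 0.0])
end = np.array([0.0, 0.2, 1.5, 0.1])
print(best_span(start, end))  # (1, 2): answer spans tokens 1 through 2
```

Constraining `end >= start` is what rules out the invalid spans a naive independent argmax over the two logit vectors could produce.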
An Implementation of Bidirectional Attention Flow
Updated Sep 6, 2017 - Python
A PyTorch implementation of Dynamic Coattention Networks (DCN+) for Question Answering on SQuAD 2.0
Updated Jul 14, 2018 - Python
A PyTorch implementation of "Dynamic Coattention Networks For Question Answering"
Updated Oct 21, 2018 - Python
Machine Reading Comprehension in Tensorflow
Topics: machine-learning, natural-language-processing, deep-learning, recurrent-neural-networks, artificial-intelligence, convolutional-neural-networks, squad, reading-comprehension, bidaf, natural-language-understanding, r-net, attention-model, machine-reading-comprehension, qanet
Updated Jul 14, 2019 - Python
Code and datasets of "Multilingual Extractive Reading Comprehension by Runtime Machine Translation"
Topics: multilingual, nlp, natural-language-processing, pytorch, question-answering, nmt, squad, reading-comprehension
Updated Jan 2, 2019 - Python
Author implementation of "Learning Recurrent Span Representations for Extractive Question Answering" (Lee et al. 2016)
Updated Apr 6, 2017 - Python
Question
Hi, I have been experimenting with the QA capabilities of Haystack. I was wondering whether it is possible for the model to generate paragraph-like contexts.
Additional context
So far, when a question is asked, the model outputs an answer and the context the answer can be found in. The context output by the model is oftentimes fragments of a sentence or fragments of a