Highlights
- Arctic Code Vault Contributor
810 contributions in the last year
Contribution activity
August 2020
Created an issue in arXivTimes/arXivTimes that received 1 comment
Axial-DeepLab: Stand-Alone Axial-Attention for Panoptic Segmentation
In a word: self-attention is used on images as well, to capture correlations between local features, but it suffers from poor computational efficiency. The paper therefore proposes a so-called factorized operation: self-attention within each column (i.e., 1D self-attention) followed by self-attention within each row. In addition, relative positions/features are more ref…
1 comment
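The factorization described in the summary above can be sketched as follows. This is a toy NumPy illustration of axial attention (column-wise 1D self-attention followed by row-wise), not the paper's implementation: it omits the learned Q/K/V projections, multi-head structure, and the relative-position terms the paper adds.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_1d(x):
    # Plain 1D self-attention over a sequence x of shape (L, d).
    # For simplicity Q = K = V = x (no learned projections).
    scores = x @ x.T / np.sqrt(x.shape[-1])  # (L, L) similarity matrix
    return softmax(scores, axis=-1) @ x      # (L, d) attended output

def axial_attention(feat):
    # feat: (H, W, d) feature map.
    # Step 1: self-attention within each column (along the height axis).
    # Step 2: self-attention within each row (along the width axis).
    H, W, _ = feat.shape
    col = np.stack([attention_1d(feat[:, w]) for w in range(W)], axis=1)
    row = np.stack([attention_1d(col[h]) for h in range(H)], axis=0)
    return row

x = np.random.rand(8, 8, 16)
y = axial_attention(x)
print(y.shape)  # (8, 8, 16)
```

The point of the factorization is cost: full 2D self-attention over an H×W map compares all HW positions with each other, O((HW)²), while the two axial passes cost O(HW·(H+W)), since each position attends only along its own column and row.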
- AMBERT: A Pre-trained Language Model with Multi-Grained Tokenization
- Photon: A Robust Cross-Domain Text-to-SQL System
- Large-scale Pretraining for Visual Dialog: A Simple State-of-the-Art Baseline
- Attention-Based Query Expansion Learning
- Whitening and second order optimization both destroy information about the dataset, and can make generalization impossible
- Hopfield Networks is All You Need
- Theoretical Limitations of Self-Attention in Neural Sequence Models
- Analysing Deep Reinforcement Learning Agents Trained with Domain Randomisation
- A Comprehensive Analysis of Preprocessing for Word Representation Learning in Affective Tasks
- Human Attention Maps for Text Classification: Do Humans and Neural Networks Focus on the Same Words?
- Headline Generation: Learning from Decomposable Document Titles
- Spatiotemporal Contrastive Video Representation Learning
- Learning Distributed Representations of Sentences from Unlabelled Data
- Learning Company Embeddings from Annual Reports for Fine-grained Industry Characterization
- Hyperparameter Selection for Offline Reinforcement Learning
- Neural Generation Meets Real People: Towards Emotionally Engaging Mixed-Initiative Conversations
- Data-efficient Hindsight Off-policy Option Learning
- From Characters to Words to in Between: Do We Capture Morphology?
- Byte Pair Encoding is Suboptimal for Language Model Pretraining
- SummEval: Re-evaluating Summarization Evaluation