A trend starts from "Chain of Thought Prompting Elicits Reasoning in Large Language Models".
Updated Feb 9, 2023
PyTorch implementation of VALL-E (zero-shot text-to-speech)
This repository contains a collection of papers and resources on Reasoning in Large Language Models.
[ICLR 2023] Code for the paper "Binding Language Models in Symbolic Languages"
Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
Collection of papers and resources on Reasoning using Language Models
Official implementation and model release of the paper "What Makes Good Examples for Visual In-Context Learning?"
[ICLR 2023] Code for our paper "Selective Annotation Makes Language Models Better Few-Shot Learners"
Experiments and code to generate the GINC small-scale in-context learning dataset from "An Explanation for In-context Learning as Implicit Bayesian Inference"
SMASHED is a toolkit designed to apply transformations to samples in datasets, such as field extraction, tokenization, prompting, batching, and more. Supports datasets from Hugging Face, torchdata iterables, or simple lists of dictionaries.
Code for our paper "Compositional Exemplars for In-context Learning"
Experiments on GPT-3's ability to fit numerical models in-context.
[EMNLP 2022 Findings] Code for paper "ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback"
Week 4 - Prompt Engineering: In-context learning with GPT-3 and other Large Language Models
Diverse Demonstrations Improve In-context Compositional Generalization
A repository to demonstrate some of the concepts behind large language models, transformer (foundation) models, in-context learning, and prompt engineering using open source large language models like Bloom and co:here.
Codebase for In-Context Learning for Dialogue State Tracking
Official implementation of paper "Improving Few-Shot Performance of Language Models via Nearest Neighbor Calibration"
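Many of the repositories above revolve around the same core idea: in-context learning conditions a language model on a handful of input/output demonstrations placed directly in the prompt, with no gradient updates. A minimal sketch of that prompt construction is below; the demonstrations, template, and function name are illustrative placeholders, not drawn from any specific repository listed here.

```python
# Minimal sketch of few-shot in-context prompt construction.
# The "Input:/Output:" template and the demos are illustrative
# assumptions, not the format used by any particular repo above.

def build_prompt(demonstrations, query):
    """Concatenate labeled demonstrations, then the unlabeled query."""
    blocks = [f"Input: {x}\nOutput: {y}" for x, y in demonstrations]
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

demos = [("2 + 2", "4"), ("3 + 5", "8")]
prompt = build_prompt(demos, "7 + 6")
print(prompt)
```

The resulting string would be sent to a model such as GPT-3; the model is expected to continue the pattern and complete the final `Output:` line. Selecting *which* demonstrations to include is exactly what several of the papers above (e.g. selective annotation, compositional exemplars, diverse demonstrations) study.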