🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
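The core technique behind libraries like PEFT is LoRA: freeze the pretrained weight and learn a low-rank additive update. A minimal NumPy sketch of that idea (shapes and names are illustrative, not PEFT's actual API):

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 2                    # hidden size, adapter rank (r << d)
W = rng.normal(size=(d, d))    # frozen pretrained weight

# Trainable low-rank factors; B starts at zero so the adapted
# layer initially behaves exactly like the pretrained one.
A = rng.normal(size=(r, d)) * 0.01
B = np.zeros((d, r))
alpha = 4.0                    # LoRA scaling hyperparameter

def adapted_forward(x):
    # y = x W^T + (alpha / r) * x (B A)^T
    return x @ W.T + (alpha / r) * (x @ (B @ A).T)

x = rng.normal(size=(1, d))
# With B = 0 the adapter contributes nothing yet:
assert np.allclose(adapted_forward(x), x @ W.T)
```

Only A and B (2·r·d parameters) are trained, versus d² for full fine-tuning; this is why adapter methods are called parameter-efficient.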
Build, personalize and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our discord community: https://discord.gg/TgHXuSJEk6
[ICLR 2023 Spotlight] Vision Transformer Adapter for Dense Predictions
Adapting Meta AI's Segment Anything to Downstream Tasks with Adapters and Prompts
A lightweight adapter bridges SAM with medical imaging [MedIA]
[VINT 2026] SAM2-UNet: Segment Anything 2 Makes Strong Encoder for Natural and Medical Image Segmentation
[NeurIPS 2022] Implementation of "AdaptFormer: Adapting Vision Transformers for Scalable Visual Recognition"
Design patterns illustrated in Python; this is the source code for the book "Everybody Know Design Patterns".
Official repository for the ICLR 2024 paper "Towards Seamless Adaptation of Pre-trained Models for Visual Place Recognition".
A generalized framework for subspace tuning methods in parameter efficient fine-tuning.
Official repository for the CVPR 2024 paper "CricaVPR: Cross-image Correlation-aware Representation Learning for Visual Place Recognition".
codelab_adapter extensions
[Pattern Recognition 2025] Cross-Modal Adapter for Vision-Language Retrieval
Train an adapter for any embedding model in under a minute
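Projects in this vein typically learn a small transform on top of a frozen embedding model. A hedged sketch of the pattern with synthetic data (any real project's API will differ): fit a linear adapter by closed-form least squares so the frozen model's embeddings map toward target embeddings.

```python
import numpy as np

rng = np.random.default_rng(1)

n, d = 100, 16
# Pretend these came from a frozen embedding model:
src = rng.normal(size=(n, d))
# Targets the adapted embeddings should match
# (e.g. another model's embeddings or task-specific vectors):
true_map = rng.normal(size=(d, d))
tgt = src @ true_map

# The "adapter" is just a learned linear map; solve it in closed form:
adapter, *_ = np.linalg.lstsq(src, tgt, rcond=None)

adapted = src @ adapter
assert np.allclose(adapted, tgt, atol=1e-6)
```

Because the base model stays frozen and only a d×d matrix is fit, training really can take well under a minute even on CPU.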
[CVPR 2024] Memory-based Adapters for Online 3D Scene Perception
ACM MM'23 (oral). SUR-adapter lets pre-trained diffusion models acquire the powerful semantic understanding and reasoning capabilities of large language models, building high-quality textual semantic representations for text-to-image generation.
A versatile sequenced read processor for nanopore direct RNA sequencing
[AAAI 2025] MambaPro: Multi-Modal Object Re-Identification with Mamba Aggregation and Synergistic Prompt