Adapter

An adapter is a lightweight, parameter-efficient finetuning method that inserts a small number of learnable parameters into a pre-trained model. During finetuning, only these adapter parameters are updated while the original weights stay frozen, which preserves the model's pre-trained knowledge and allows it to be adapted to new tasks at a fraction of the cost of full finetuning, often with comparable quality.
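
For intuition, here is a minimal sketch of one common adapter form: a bottleneck module with a residual connection, inserted into a frozen pre-trained model. The class name, dimensions, and zero-initialization choice are illustrative assumptions, not the exact design used in LLaMA-Adapter.

```python
import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual add."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()
        # Zero-init the up-projection so the adapter starts as an identity mapping
        # and finetuning begins from the unmodified pre-trained behavior.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))


# During finetuning, only the adapter parameters receive gradients; the
# pre-trained model weights remain frozen (pretrained_model is a placeholder):
#
# for param in pretrained_model.parameters():
#     param.requires_grad = False
# for param in adapter.parameters():
#     param.requires_grad = True
```

Because only the small adapter modules are trained, the number of updated parameters is typically a small fraction of the full model, which keeps memory and compute requirements low.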

Related content

LLaMA-Adapter pseudo-code
Understanding Parameter-Efficient Finetuning of Large Language Models: From Prefix Tuning to LLaMA-Adapters
How To Finetune GPT Like Large Language Models on a Custom Dataset
Falcon – A guide to finetune and inference
Finetuning Falcon LLMs More Efficiently With LoRA and Adapters