Attention

An attention mechanism is a component of large language models that lets the model focus on specific parts of the input sequence during processing, assigning each element a weight based on its relevance to the element currently being processed. These weights are typically computed by comparing queries against keys, and the output is a weighted sum of value vectors. This helps the model capture dependencies across the sequence and generate contextually informed predictions or outputs.
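
As a rough illustration, the sketch below implements scaled dot-product attention, the form of attention used in Transformer-based LLMs: each query is compared against all keys to produce relevance scores, a softmax turns those scores into weights, and each output is the weighted sum of the value vectors. The shapes and variable names here are illustrative assumptions, not tied to any particular model.

```python
# Minimal sketch of scaled dot-product attention using NumPy.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d_k) arrays of queries, keys, and values.
    d_k = Q.shape[-1]
    # Relevance scores: how strongly each query attends to each key.
    scores = Q @ K.T / np.sqrt(d_k)        # (seq_len, seq_len)
    # Softmax turns scores into attention weights that sum to 1 per query.
    weights = softmax(scores, axis=-1)
    # Each output row is a weighted sum of the value vectors.
    return weights @ V, weights

# Toy example: 4 tokens, 8-dimensional vectors.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
output, weights = scaled_dot_product_attention(Q, K, V)
print(weights.round(2))  # each row sums to 1
```

In practice, Transformer implementations compute Q, K, and V as learned linear projections of the token embeddings and run many such attention "heads" in parallel, but the weighting step above is the core idea.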