
Attention Mechanism

アテンション機構 (atenshon kikō) — the Japanese term for "attention mechanism"

Advanced · Models & Architecture

A technique that allows AI models to focus on the most relevant parts of the input when producing output, like how humans focus on key words in a sentence.
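As a minimal sketch of how this "focusing" works in practice (the function and variable names here are illustrative, not from any particular library), scaled dot-product attention scores each input token's relevance to each query, turns those scores into weights that sum to 1, and returns a weighted mix of the inputs:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return a relevance-weighted mix of the values V, plus the weights."""
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled to keep scores stable
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax: each row becomes a set of attention weights summing to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 3 tokens, each a 4-dimensional vector
rng = np.random.default_rng(0)
Q = K = V = rng.standard_normal((3, 4))
output, weights = scaled_dot_product_attention(Q, K, V)
# Each row of `weights` shows how strongly one token attends to the others
print(weights.round(2))
```

The rows of `weights` are the "focus": a large entry means the model is drawing heavily on that token when producing output for the corresponding position.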

Why It Matters

Attention is the core innovation behind the transformer architecture and is what enables LLMs to track context across long passages of text.

Example in Practice

When translating 'The bank by the river,' attention lets the model weight 'river' heavily while processing 'bank,' so it reads 'bank' as a riverbank rather than a financial institution.
