BERT
Intermediate · Models & Architecture
Bidirectional Encoder Representations from Transformers — Google's model that understands context by reading text in both directions simultaneously.
Why It Matters
When released by Google in 2018, BERT set new state-of-the-art results on language understanding benchmarks such as GLUE and SQuAD, and Google later deployed it in Search to better interpret conversational queries.
Example in Practice
Google Search understanding that 'bank' in 'I sat by the river bank' refers to land, not a financial institution.
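The disambiguation above can be seen directly in BERT's output: because the model reads the whole sentence in both directions, the vector it produces for 'bank' changes with the surrounding words. A minimal sketch, assuming the Hugging Face `transformers` library and the public `bert-base-uncased` checkpoint (chosen here for illustration):

```python
# Sketch: BERT gives the same word different vectors in different contexts.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def bank_embedding(sentence: str) -> torch.Tensor:
    """Return BERT's final-layer hidden state for the token 'bank'."""
    inputs = tokenizer(sentence, return_tensors="pt")
    # Locate the position of the token 'bank' in the tokenized input.
    bank_id = tokenizer.convert_tokens_to_ids("bank")
    idx = inputs.input_ids[0].tolist().index(bank_id)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    return hidden[0, idx]

river = bank_embedding("I sat by the river bank.")
money = bank_embedding("I deposited money at the bank.")

# The two vectors differ because each encodes its sentence's context;
# a static (non-contextual) embedding would give identical vectors.
similarity = torch.cosine_similarity(river, money, dim=0).item()
print(f"cosine similarity: {similarity:.2f}")
```

A static word embedding such as word2vec would assign 'bank' one fixed vector regardless of context; here the cosine similarity between the two contextual vectors is noticeably below 1, showing that BERT has encoded the two senses differently.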