
MoE (Mixture of Experts)

Japanese: 混合エキスパート (kongō ekisupāto)

Advanced | Models & Architecture

A model architecture that splits capacity across multiple specialized sub-networks (experts); a lightweight router activates only the most relevant experts for each input, improving efficiency.
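
The sketch below shows the idea in code: a small router scores the experts for each token, and only the top-scoring experts actually run. This is a minimal illustration assuming a PyTorch-style setup; the class name `MoELayer`, the dimensions, and the top-2 routing are illustrative choices, not the implementation of any particular model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # The router (gating network) scores every expert for each token.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is an ordinary feed-forward sub-network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                     # x: (n_tokens, d_model)
        scores = self.router(x)               # (n_tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)  # normalize over the chosen experts only
        out = torch.zeros_like(x)
        # Only the top_k experts run for each token; the rest are skipped entirely.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

layer = MoELayer()
tokens = torch.randn(4, 512)                  # 4 tokens with a hypothetical hidden size of 512
print(layer(tokens).shape)                    # torch.Size([4, 512])
```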

Why It Matters

MoE lets a model hold far more parameters without a proportional increase in per-token computation, because only a small subset of experts runs for any given input.
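
As a rough worked example (with hypothetical layer sizes, not taken from any specific model), here is how a layer's total parameter count compares with the parameters actually used per token when only 2 of 8 experts are active:

```python
# Hypothetical sizes to illustrate the scaling: 8 experts, top-2 routing.
d_model, d_ff, n_experts, top_k = 4096, 14336, 8, 2

expert_params = 2 * d_model * d_ff            # one expert's FFN weights (up + down projection)
total_params  = n_experts * expert_params     # parameters stored in the MoE layer
active_params = top_k * expert_params         # parameters actually used per token

print(f"total: {total_params/1e9:.1f}B, active per token: {active_params/1e9:.1f}B")
# total: 0.9B, active per token: 0.2B  ->  4x the capacity for roughly the same per-token compute
```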

Example in Practice

GPT-4 reportedly uses MoE, routing different types of questions to different specialized sub-networks.
