Why the Newest LLMs Use a MoE (Mixture of Experts) Architecture