By Park Chan
2026.03.30
#AI #Gemma #Google #LLM

Key Points

  • Google's Gemma 4 model is currently undergoing testing.
  • The evaluation is being conducted on the Chatbot Arena platform.
  • The models under test notably include a 120B Mixture-of-Experts (MoE) variant.

Google's 'Gemma 4' is currently being tested on the Chatbot Arena platform. Notably, the lineup under test includes a 120-billion-parameter Mixture-of-Experts (MoE) variant. An MoE model is a neural network built from multiple specialized "expert" sub-networks together with a gating network that selects only a small subset of those experts for each input. This lets the model carry a very high total parameter count while keeping the compute cost of training and inference low, since only the selected experts actually run.
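The routing idea can be illustrated with a minimal sketch. This is not Gemma's actual architecture; the dimensions, the linear experts, and the gating function below are illustrative assumptions, showing only how a gate picks top-k experts so that compute scales with k rather than with the total expert count.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only -- not Gemma's real configuration
d_model, n_experts, top_k = 8, 4, 2

# Each "expert" here is just a linear map; the gate is another linear map.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
gate_w = rng.normal(size=(d_model, n_experts))

def moe_forward(x):
    """Route x through only the top-k experts chosen by the gating network."""
    logits = x @ gate_w                  # one gating score per expert
    top = np.argsort(logits)[-top_k:]    # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()             # softmax over the selected experts only
    # Only k of the n experts execute, so inference cost tracks k, not n.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.normal(size=d_model)
y = moe_forward(x)
print(y.shape)  # (8,)
```

The key property is that the parameter count grows with `n_experts` while per-token compute grows only with `top_k`, which is why a 120B MoE can be far cheaper to run than a dense model of the same size.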