Meet AntAngelMed: A 103B-Parameter Open-Source Medical Language Model Built on a 1/32 Activation-Ratio MoE Architecture

MedAIBase has released AntAngelMed, a 103B-parameter open-source medical language model built on a Mixture-of-Experts (MoE) architecture with a 1/32 expert activation ratio, activating 6.1B parameters per token at inference. The model was trained through continual pre-training, supervised fine-tuning, and GRPO-based reinforcement learning, and ranks highly on several medical benchmarks.
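
For intuition about what a 1/32 activation ratio means, here is a minimal sketch of a top-k MoE feed-forward layer in PyTorch. All names and sizes below (TopKMoE, d_model, d_ff, 64 experts, top-2 routing) are illustrative assumptions, not AntAngelMed's published configuration; the point is that routing each token to 2 of 64 experts touches only 1/32 of the expert parameters, which is how a 103B-parameter model can run inference with roughly 6.1B parameters active (attention and embedding layers stay dense and always on).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal top-k Mixture-of-Experts feed-forward layer (illustrative only).

    With num_experts=64 and top_k=2, each token activates 2/64 = 1/32 of the
    expert parameters -- the activation ratio attributed to AntAngelMed.
    Dimensions are toy values, not the model's real configuration.
    """

    def __init__(self, d_model=512, d_ff=2048, num_experts=64, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):                      # x: (batch, seq, d_model)
        logits = self.router(x)                # (batch, seq, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # renormalize over selected experts
        out = torch.zeros_like(x)
        # Loop over experts; real systems use batched/sparse dispatch instead.
        for e, expert in enumerate(self.experts):
            mask = (idx == e)                  # (batch, seq, top_k)
            if mask.any():
                tok = mask.any(dim=-1)         # tokens routed to expert e
                w = (weights * mask).sum(dim=-1)[tok].unsqueeze(-1)
                out[tok] += w * expert(x[tok])
        return out

moe = TopKMoE()
y = moe(torch.randn(2, 16, 512))
print(y.shape)  # torch.Size([2, 16, 512])
```

Production MoE implementations replace the per-expert Python loop with batched sparse dispatch and add an auxiliary load-balancing loss so tokens spread evenly across experts; the routing math is the same.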

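GRPO (Group Relative Policy Optimization) is only name-checked in the announcement; as a quick refresher, its core trick is to drop the learned value/critic baseline and instead normalize each sampled response's reward against its own sampling group. The sketch below shows just that advantage computation, with assumed tensor shapes and made-up reward values; in full GRPO training these advantages feed a PPO-style clipped objective with a KL penalty to a reference policy.

```python
import torch

def grpo_advantages(rewards: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Group-relative advantages: normalize each response's reward against
    the mean/std of its group (all samples drawn for the same prompt).

    rewards: (num_prompts, group_size) scalar rewards, e.g. from a verifier.
    """
    mean = rewards.mean(dim=-1, keepdim=True)
    std = rewards.std(dim=-1, keepdim=True)
    return (rewards - mean) / (std + eps)

# Toy example: 2 prompts, 4 sampled responses each (reward values made up).
rewards = torch.tensor([[1.0, 0.0, 0.0, 1.0],
                        [0.2, 0.9, 0.4, 0.5]])
print(grpo_advantages(rewards))
```
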
Read the full story at MarkTechPost →