
ZurichNLP #20

Wed 01 Apr | ETH AI Center

Fabian Schaipp (Inria) on recent trends in training algorithms for ML and Valentina Pyatkin (Allen Institute, ETH) on lessons from training open-source LLMs.


Time & Location

01 Apr 2026, 18:00 – 20:00

ETH AI Center, OAT ETH Zurich (19th floor), Andreasstrasse 5, 8050 Zürich, Switzerland

About the Event

Fabian Schaipp (Inria): All Quiet on the Optimization Front? Recent Trends in Training Algorithms for ML

Carefully tuned training recipes remain a core ingredient of state-of-the-art machine learning models. In this talk, we give an overview of recent developments in optimization and training algorithms. In particular, we cover (i) new optimization methods (Muon, SOAP, ...) challenging the reign of AdamW and (ii) hyperparameter tuning and scaling laws (learning rates and schedules, muP, ...) for efficient training at scale.


Valentina Pyatkin (Allen Institute, ETH): Open LLM Post-Training: Lessons from Training OLMo and Tülu

Drawing on the OLMo and Tülu projects, which openly release not just model weights but data, code, and training decisions, this talk shares lessons about what works, what doesn't, and what we still don't understand about the post-training of language models.


