Announcing the ICML 2026 Tutorials
April 2, 2026 · Gautam Kamath

By ICML 2026 Tutorial Chairs Claire Vernade and Adam White.

Tutorials are a critical part of the conference experience. They give graduate students the opportunity to learn in depth from topic experts, and they give practitioners and theoreticians the chance to learn both foundational background and the dark arts of getting things working in practice. As our community grows, tutorials often provide a much-needed smaller community to connect with.

This rapid growth also brings challenges: a huge diversity of topics and a large number of proposed tutorial submissions. At the same time, we want to channel the community's input while relying on careful, human evaluation of submitted proposals. In this post, we outline the review process we used for this year's tutorials and formally announce the selected tutorials.

This year's process

This year we used a three-pronged approach: invited tutorials, suggestions from the community, and a rigorous review process.

In late December we reached out to and confirmed three invited tutorial presenters:

- Is numerical optimization theory irrelevant to machine learning practice in 2026? by Mark Schmidt (UBC)
- Probabilistic Numerics — Computation is Machine Learning, by Philipp Hennig, Marvin Pförtner, and Tim Weiland (U Tübingen)
- Calibration: From Predictions to Decisions, Collaboration, and Alignment, by Aaron Roth and Natalie Collina (UPenn)

After we confirmed these speakers, we reached out to the ICML community to solicit nominations, both for topic suggestions and for members of the community who could deliver tutorials. We received 118 suggestions from a healthy diversity of folks, from undergraduates all the way to industry professionals.

From these initial seeds, we created a call for proposals. We distilled the community suggestions into six suggested topics (as well as "Other," of course):

- Beyond-Transformer sequence models (state-space, S4, etc.)
- Deep learning and deep RL: theory and applications
- Diffusion models: quantitative and theoretical understanding
- LLM post-training and test-time training
- Safety, machine unlearning, watermarking, and fingerprinting
- Theorem proving and Lean

We also reached out to 14 individuals who had been nominated by the community in phase two, personally inviting them to submit a proposal. The call resulted in 52 submissions! All proposals were evaluated in the same way, even the invited ones. Both chairs independently read and reviewed every single proposal and made a short list for each of the seven topic areas (including "Other"). In the final calibration, the chairs agreed almost unanimously across all topic areas. The main criteria were: 1) quality of the proposal, 2) established topic expertise of the presenters, and 3) teaching experience. The last criterion did not exclude industry researchers, but it did require them to explain their teaching experience.

In the end, we selected an additional seven tutorials for ICML 2026 (for a total of ten), and we are thrilled to announce them below. All tutorials will take place on the first day of the conference, Monday, July 6.

- Proving Theorems with Lean and Machine Learning, by Rémy Degenne and Wenda Li
- Evaluating and Training LLMs for Math Copilots and Theorem Proving, by Simon Frieder and Philip Vonderlind
- Adaptive Reasoning in LLMs: From Post-Training to Test-Time Learning, by Akhil Arora and Nouha Dziri
- New Techniques for Sequence Prediction: Spectral Filtering and Preconditioning, by Elad Hazan and Annie Marsden
- Unifying Attention and Diffusion with Kan Extension Transformers: Structured Deep Learning with Diagrammatic Backpropagation, by Sridhar Mahadevan
- Unlearning Data at Scale, by Vinith M. Suriyakumar, Gautam Kamath, and Ashia Wilson
- Diffusion and Flow-Matching: From Memorization to Generalization & Beyond, by Mathurin Massias and Quentin Bertrand
Executive Summary
The article describes the review process and the selected tutorials for the ICML 2026 conference. The tutorial chairs used a three-pronged approach: invited tutorials, community suggestions, and a rigorous review process. An open call, seeded by 118 community suggestions, drew 52 submissions, which were evaluated on proposal quality, topic expertise, and teaching experience; seven were selected alongside three invited tutorials, for ten in total. The selected tutorials span topics including theorem proving, LLM reasoning, sequence prediction, diffusion models, machine unlearning, and optimization. The article emphasizes the value of tutorials as a venue for graduate students, practitioners, and theoreticians to learn from topic experts and connect with the community.
Key Points
- ▸ A three-pronged approach was used to select tutorials for ICML 2026: invited tutorials, community suggestions, and a rigorous review process
- ▸ The open call, distilled from 118 community suggestions into six topic areas, drew 52 submissions, all reviewed independently by both chairs
- ▸ Ten tutorials in total (three invited, seven selected from the call) will run on the first day of the conference, Monday, July 6
Merits
Inclusive Review Process
Combining community nominations with a rigorous review process ensured that a diverse range of topics and presenters was considered, yielding an inclusive and representative selection of tutorials.
Expert Evaluation
Both chairs independently read and reviewed every proposal, ensuring that each submission was thoroughly assessed before shortlisting.
Clear Criteria
The selection criteria of quality of proposal, established topic expertise by presenters, and teaching experience provided a clear and transparent framework for evaluating submissions.
Demerits
Complex Process
The multi-step review process is time-consuming and resource-intensive: both chairs personally read all 52 proposals.
Subjective Evaluation
While the selection criteria were clear, the evaluation process still involved some degree of subjectivity, which may have led to varying opinions among reviewers.
Limited Transparency
The article does not provide detailed information on the evaluation process or the selection of invited tutorials, which may limit transparency and accountability.
Expert Commentary
The article gives a detailed account of the review process and the selected tutorials for ICML 2026, highlighting the value of a rigorous and inclusive approach. While such a process is complex and time-consuming, the payoff is a more representative and diverse slate of tutorials. The combination of invited speakers, community nominations, and openly stated selection criteria is a model other venues could borrow. Overall, the article offers useful insights for conference organizers, educators, and researchers interested in machine learning education and community engagement.
Recommendations
- ✓ Conferences should prioritize inclusive review processes and rigorous evaluation to ensure a diverse and representative selection of tutorials.
- ✓ Educators and researchers should consider the importance of community engagement and providing opportunities for graduate students, practitioners, and theoreticians to learn from topic experts.
Sources
Original: ICML