Switzerland to Release Open, Multilingual LLM

Insider Brief

  • Switzerland will release its first fully public large language model (LLM) in late summer 2025, developed by ETH Zurich, EPFL, and the Swiss National Supercomputing Centre (CSCS) to promote transparency, multilingualism, and open AI innovation.
  • Funded by the ETH Board and trained on the “Alps” supercomputer powered by over 10,000 NVIDIA Grace Hopper Superchips, the Swiss LLM is designed for sovereign AI infrastructure and will be released under an Apache 2.0 license with full access to code, training data, and model weights.
  • Featuring multilingual capabilities across more than 1,000 languages and two model sizes (8B and 70B parameters), the model targets broad adoption in science, government, education, and industry, with researchers emphasizing compliance with Swiss and EU regulations, ethical data sourcing, and transparent documentation.

Switzerland is preparing to release its first large language model built entirely on public infrastructure, a move intended to promote transparency, multilingualism, and open innovation in artificial intelligence. Scheduled for public release in late summer 2025, the model is the result of collaboration between researchers at ETH Zurich, EPFL, and the Swiss National Supercomputing Centre (CSCS), according to an announcement from the institutions.

“Fully open models enable high-trust applications and are necessary for advancing research about the risks and opportunities of AI. Transparent processes also enable regulatory compliance,” noted Imanol Schlag, a research scientist at the ETH AI Center.

Funded by the ETH Board as part of the Swiss AI Initiative, the project leverages the “Alps” supercomputer at CSCS, powered by over 10,000 NVIDIA Grace Hopper Superchips and carbon-neutral electricity. The initiative is part of Switzerland’s effort to create sovereign AI infrastructure, supported by partnerships with NVIDIA and HPE/Cray over the past 15 years. The project aims to develop AI technology that can serve scientific, governmental, and industrial applications without reliance on closed-source commercial models, researchers said.

A defining feature of the Swiss LLM is its multilingual capacity. The model is trained on data representing more than 1,000 languages, aiming to improve global accessibility and inclusivity. Training included approximately 15 trillion tokens, roughly 60% English and 40% non-English content, as well as code and mathematical data. The release will include two model sizes—8 billion and 70 billion parameters—positioning the larger model among the most powerful open-source LLMs available globally.

The model, designed for open access, will be available under an Apache 2.0 license, with its code, training data, and model weights fully published. According to EPFL and ETH Zurich, this approach will support widespread adoption across scientific research, education, government services, and private industry. Full documentation will accompany the release, detailing architecture, training methodology, and responsible use guidelines to ensure transparency and safe deployment.

Researchers say the project adheres to Swiss data protection laws, Swiss copyright regulations, and the transparency requirements of the EU AI Act. They also report that respecting web crawling opt-outs did not significantly affect model performance, demonstrating the feasibility of more ethical data sourcing.

The forthcoming release reflects Switzerland’s strategy to foster AI research through open science and multinational cooperation, as detailed by the Swiss AI Initiative. Launched in December 2023, the initiative involves more than 10 Swiss academic institutions and over 800 researchers, with access to 20 million GPU hours annually from CSCS’s supercomputer. As part of the European Laboratory for Learning and Intelligent Systems (ELLIS), ETH Zurich and EPFL aim to strengthen Europe’s role in trustworthy and transparent AI.

“By embracing full openness — unlike commercial models that are developed behind closed doors — we hope that our approach will drive innovation in Switzerland, across Europe, and through multinational collaborations. Furthermore, it is a key factor in attracting and nurturing top talent,” EPFL professor Martin Jaggi pointed out.

Greg Bock

Greg Bock is an award-winning investigative journalist with more than 25 years of experience in print, digital, and broadcast news. His reporting has spanned crime, politics, business, and technology, earning multiple Keystone Awards and Pennsylvania Association of Broadcasters honors. Through the Associated Press and Nexstar Media Group, his coverage has reached audiences across the United States.
