Inephany Raises $2.2M to Automate and Optimize AI Model Training

Insider Brief

  • Inephany has raised $2.2 million in pre-seed funding led by Amadeus Capital Partners to advance its AI-powered optimisation platform, which reduces the cost, time, and compute required to train large neural networks like LLMs.
  • The company’s real-time optimisation system improves training efficiency by avoiding brute-force trial-and-error methods, offering cost savings of at least 10x and broad applicability across AI models in fields such as finance, vision, and autonomous systems.
  • Founded by alumni from Apple Siri and Wluper, and chaired by AI pioneer Professor Steve Young, Inephany aims to democratize access to advanced AI through scalable, sustainable model training and future inference-time optimization.

PRESS RELEASE — Inephany, an AI optimization startup, has announced the close of a $2.2M pre-seed funding round to accelerate development of its AI-powered optimisation platform, which promises to revolutionise how neural networks — including Large Language Models (LLMs) — are trained and fine-tuned. The round was led by Amadeus Capital Partners, with participation from specialist AI VC Sure Valley Ventures and Professor Steve Young, the acclaimed AI pioneer and serial entrepreneur, who joins Inephany as both an angel investor and company chair.

As generative AI continues its rapid ascent, the soaring compute and energy costs of training cutting-edge models have emerged as a major bottleneck. Training GPT-4 is estimated to have cost between $60 million and $100 million, with next-generation models edging towards the $1 billion mark, according to industry leaders such as Anthropic. AI compute demands are now doubling roughly every six months — outpacing Moore’s Law and rendering traditional training and optimisation methods increasingly unsustainable.

Inephany addresses this challenge head-on with a novel AI-driven optimisation system that intelligently controls the training process in real time. Unlike traditional “brute force” approaches that rely on exhaustive trial-and-error tuning, Inephany’s technology dramatically improves sample efficiency, accelerates training, reduces overall development time, and enhances final model performance, all while slashing compute costs. This breakthrough holds the potential to unlock scalable, sustainable AI development that is at least 10x more cost-effective.

While the company’s initial focus is on training-time optimisation for LLMs, its technology has broad applicability across the AI landscape — from Recurrent Neural Networks used in financial time-series forecasting to Convolutional Neural Networks powering computer vision in autonomous systems. The company also plans to extend its AI-powered optimisation approach to inference-time compute. By slashing the cost and compute burden of training and deploying these models, the Inephany team aims to democratise access to advanced AI and accelerate innovation across industries.

Founded in July 2024 by Dr John Torr (formerly of Apple Siri’s machine learning team), Hami Bahraynian, and Maurice von Sturm (co-founders of conversational AI startup Wluper), Inephany brings together deep technical expertise in neural network optimisation. The funding will be used to grow the core engineering team, advance its optimisation platform, and onboard its first enterprise customers.

Professor Steve Young, renowned for his foundational contributions to speech recognition and dialogue systems — including key work behind Apple’s Siri — joins as chair to help guide the company’s next phase of growth.

John Torr, CEO at Inephany, said: “We are thrilled to be backed by such experienced investors, and having a seasoned entrepreneur and AI pioneer like Professor Steve Young as our chair is a true privilege. Current approaches to training LLMs and other neural networks are extremely wasteful across multiple dimensions. Our unique solution tackles this inefficiency head-on, with the potential to radically reduce both the cost and time required to train and optimise state-of-the-art models. As we prepare to deliver our first products later this year, we are incredibly excited to embark on the next chapter of our journey — and to help shape the ongoing AI revolution by transforming AI optimisation.”

Amelia Armour, Partner at Amadeus Capital Partners, said: “We very much look forward to backing John, Hami, and Maurice as they tackle key efficiency challenges in current AI training. Their innovative approach to automating and optimising neural network training has the potential to reduce costs by an order of magnitude and accelerate advancements across AI applications. If rolled out at scale, the impact of this on what models can deliver will be very substantial.”

Professor Steve Young said: “As the use of AI spreads ever wider, moving beyond the traditional applications of speech, language and vision into new and diverse areas such as weather prediction, healthcare, drug discovery and materials design, the need for very efficient training of accurate neural models is becoming critical. The groundbreaking new approach being developed by Inephany marks a step change in neural model training technology and I am delighted to join the team as chair and investor.”


