“We’re The Only People Who’ve Ever Done This”, Claims Groq CEO Jonathan Ross

In the world of artificial intelligence (AI), a new challenger has emerged to take on industry titan NVIDIA. Groq, a startup founded in 2016, is armed with $640 million in fresh funding and a bold vision to reshape the AI computing landscape.

Groq CEO Jonathan Ross believes his company’s approach sets it apart from the competition.

“We build the LPU, you’ve heard of the GPU, but the LPU is a language processing unit, and the difference is GPUs are built for highly parallel programs, things where you can do a lot of tasks at the same time, but they’re not sequential, they don’t rely on each other,” he explained in an interview with Forbes this week.

This focus on sequential processing, rather than parallelism, is crucial for applications like natural language processing, where each word depends on the previous ones.

“If you want to write a story, you need a coherent arc, you need to know the beginning, the end, and everything’s going to depend on what else happens,” Ross noted.
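To see why that sequential dependency resists parallelization, consider a toy sketch of autoregressive text generation. It is purely illustrative and not Groq’s code; the hard-coded continuation table stands in for a real language model, but the shape of the loop is the same: each new word is computed from everything generated so far, so the steps cannot run side by side.

```python
# Minimal sketch (not Groq's code): why text generation is inherently sequential.
# Each new token is chosen from the tokens generated so far, so step N cannot
# start until step N-1 has finished -- the dependency Ross describes.

def next_token(context: list[str]) -> str:
    """Toy 'model': picks the next word from a hard-coded continuation table."""
    table = {
        ("once",): "upon",
        ("once", "upon"): "a",
        ("once", "upon", "a"): "time",
    }
    return table.get(tuple(context), "<end>")

def generate(prompt: list[str], max_tokens: int = 10) -> list[str]:
    tokens = list(prompt)
    for _ in range(max_tokens):
        tok = next_token(tokens)   # depends on the full history so far
        if tok == "<end>":
            break
        tokens.append(tok)         # the history grows; the next step waits on it
    return tokens

print(" ".join(generate(["once"])))  # prints: once upon a time
```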

Groq delivers this capability through a cloud-based service that lets developers integrate its specialized chips into their applications without standing up any hardware of their own.

“We don’t require that you buy these servers and put them in data centers yourself, we handle all that for you, makes it super easy,” said Ross.
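In practice, “super easy” means a hosted API rather than on-premises servers. The sketch below is a minimal illustration rather than official sample code: it assumes Groq’s Python SDK and its OpenAI-style chat-completions interface, and the model identifier is an assumption that may have changed, so consult Groq’s current documentation.

```python
# Minimal sketch of calling a hosted, LPU-backed model through Groq's cloud API.
# Assumes the `groq` Python SDK and an API key in the GROQ_API_KEY environment
# variable; the model name below is an assumption and may differ today.
import os
from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])

response = client.chat.completions.create(
    model="llama3-8b-8192",  # assumed model identifier; check the current catalog
    messages=[
        {"role": "user", "content": "Summarize why low latency matters for chat apps."},
    ],
)

print(response.choices[0].message.content)
```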

The company’s strategy has paid off, with a surge in user adoption in the wake of the ChatGPT boom.

“In the last 14 weeks, we’ve gone from fewer than 7 developers to over 260,000 developers, and that’s because we made it super easy,” Ross revealed.

NVIDIA has long dominated the AI hardware market, but Groq believes it can disrupt the industry with its laser focus on performance.

Ross said that improving latency by 100 milliseconds typically yields roughly an 8% increase in engagement. Rather than a single 100-millisecond improvement, Groq cut latency from 10 seconds to 1 second, a 9,000-millisecond reduction equal to 90 of those 100-millisecond increments, or, as he framed it, 90 instances of that 8% gain.
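The short sketch below simply restates that arithmetic using the numbers from the interview; it makes no assumption about how the individual 8% gains would combine.

```python
# Back-of-the-envelope check of Ross's latency arithmetic (illustrative only).
old_latency_ms = 10_000   # 10 seconds
new_latency_ms = 1_000    # 1 second
increment_ms = 100        # the step Ross ties to ~8% more engagement

improvement_ms = old_latency_ms - new_latency_ms
increments = improvement_ms // increment_ms
print(increments)         # 90 -- "90 instances of that 8% increase"
```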

Looking ahead, Groq’s biggest challenge is scaling its hardware deployment.

“We have to get to about 1,300 racks by the end of the year, so that’s 200 to 1,300, and so everything that we’re doing is about scaling,” he said.

As the AI revolution continues to unfold, Groq is positioning itself as a formidable challenger to NVIDIA’s dominance. With its unique architecture, cloud-based approach, and rapid growth, the startup is poised to reshape the future of AI computing.

“We’re the only people who’ve ever done this, typically chips are designed by hardware engineers and hardware architects, and so they start with the chip and then they figure out the software later,” Ross said, before adding: “This was a little bit like having a driver design a car and then it caused all sorts of headaches in terms of how do we like fit the engine into this weird thing because it wasn’t what a mechanic would design, it wasn’t what a hardware engineer would design, but it actually works much better for the end user.”

Featured image credit: Forbes

