Groq, a leader in ultra-fast inference technology, and HUMAIN, a PIF-backed company and Saudi Arabia’s premier AI services provider, have announced the immediate availability of OpenAI’s latest open models on GroqCloud. Both models, gpt-oss-120b and gpt-oss-20b, are live now with full 128K-token context support, real-time response capabilities, and integrated server-side tools on Groq’s optimized inference platform.
This launch marks a significant step in accelerating the adoption of open-source AI models. Groq has a history of supporting OpenAI’s open-source initiatives, including large-scale deployments such as Whisper. The latest integration builds on that foundation, delivering enhanced global accessibility paired with localized support from HUMAIN in Saudi Arabia.
“OpenAI is setting a new high-performance standard for open-source models,” said Jonathan Ross, CEO of Groq. “Groq is designed to run such models quickly and cost-effectively, so developers can use them from day one. Working with HUMAIN strengthens local access and support in the Kingdom of Saudi Arabia, enabling developers in the region to build smarter and faster.”
“Groq provides the unmatched inference speed, scalability, and cost-effectiveness we need to bring cutting-edge AI to the Kingdom,” said Tareq Amin, CEO of HUMAIN. “Together, we are enabling a new wave of Saudi innovation powered by the best open-source models and the infrastructure to scale them globally. We are proud to support OpenAI’s leadership in open-source AI.”
Full Model Capability from Day One
GroqCloud enables developers to take full advantage of OpenAI’s newest models by offering extended context length and integrated tools such as code execution and real-time web search. These features enhance logical reasoning, streamline complex workflows, and deliver up-to-date information instantly, all from day one. With a 128K-token context window, developers can build more sophisticated, large-scale AI applications efficiently.
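For developers who want to try the models right away, GroqCloud exposes an OpenAI-compatible endpoint, so a standard chat-completions call is all that is needed. The snippet below is a minimal sketch rather than official sample code: the model identifier openai/gpt-oss-120b and the GROQ_API_KEY environment variable are assumptions based on Groq’s usual conventions, not details confirmed in this announcement.

```python
# Minimal sketch: calling gpt-oss-120b on GroqCloud via its OpenAI-compatible API.
# Assumptions (not confirmed by the announcement): the model id "openai/gpt-oss-120b"
# and an API key supplied in the GROQ_API_KEY environment variable.
import os

from openai import OpenAI

# GroqCloud's OpenAI-compatible endpoint lets the standard OpenAI client work
# with only the base_url and api_key changed.
client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)

response = client.chat.completions.create(
    model="openai/gpt-oss-120b",  # assumed model id on GroqCloud
    messages=[
        {"role": "user", "content": "Summarize the key idea of mixture-of-experts models."}
    ],
)

print(response.choices[0].message.content)
```

Because the endpoint is OpenAI-compatible, existing applications can in principle switch over by changing only the base URL and model name, which is what makes day-one adoption of the new models practical.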