Monday, December 23, 2024

H2O.ai Releases New Language Model H2O-Danube-1.8B for Mobile


H2O-Danube-1.8B is a super-tiny LLM designed to run on smartphones, laptops, desktops and IoT devices, spurring growth in natural language applications and further democratizing AI

H2O.ai, the open source leader in Generative AI and machine learning and the maker of Enterprise h2oGPTe, has announced the release of H2O-Danube-1.8B, an open source natural language model with 1.8 billion parameters. Despite being trained on significantly less data than comparable models, benchmark results show that H2O-Danube-1.8B achieves highly competitive performance across a wide range of natural language tasks.

“We are excited to release H2O-Danube-1.8B as a portable LLM on small devices like your smartphone. The proliferation of smaller, lower-cost hardware and more efficient training now allows modestly sized models to be accessible to a wider audience. With an Apache 2.0 license for commercial use and versatile capabilities, we believe H2O-Danube-1.8B will be a game changer for mobile offline applications,” said Sri Ambati, CEO and co-founder of H2O.ai.


As detailed in the arXiv technical report, H2O-Danube-1.8B was trained on 1 trillion tokens collected from diverse web sources, with techniques refined from models like Llama 2 and Mistral. Despite the relatively limited training data, benchmark results show H2O-Danube-1.8B performs on par with or better than other models in the 1-2 billion parameter size class across tasks like common sense reasoning, reading comprehension, summarization and translation.

H2O.ai also announced the release of H2O-Danube-1.8B-Chat, a version of the model fine-tuned specifically for conversational applications. Building on the base H2O-Danube-1.8B model, the chat version was tuned using supervised learning on dialog datasets, followed by reinforcement learning from human preferences. Initial benchmark results show state-of-the-art performance among chat models with fewer than 2 billion parameters.
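The announcement does not name the exact preference-tuning method; one common choice for chat models in this size class is a direct-preference-style objective, sketched below in plain PyTorch. The function, tensor values and beta setting are illustrative assumptions, not H2O.ai's published recipe.

import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    # Log-probability ratios of the policy against a frozen reference model,
    # for the human-chosen and human-rejected responses.
    chosen_logratio = policy_chosen_logps - ref_chosen_logps
    rejected_logratio = policy_rejected_logps - ref_rejected_logps
    # Maximizing the margin between the two ratios trains the policy to
    # prefer the responses humans preferred.
    return -F.logsigmoid(beta * (chosen_logratio - rejected_logratio)).mean()

# Toy batch of two preference pairs with made-up log-probabilities.
loss = dpo_loss(
    policy_chosen_logps=torch.tensor([-12.0, -15.0]),
    policy_rejected_logps=torch.tensor([-14.0, -15.5]),
    ref_chosen_logps=torch.tensor([-13.0, -15.2]),
    ref_rejected_logps=torch.tensor([-13.5, -15.0]),
)
print(f"preference loss: {loss.item():.4f}")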

Both the base H2O-Danube-1.8B model and the chat-tuned version are available immediately on Hugging Face. H2O.ai will release additional tools to simplify using the models in applications and is exploring potential future model scaling.
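For readers who want to try the chat model, here is a minimal sketch using the Hugging Face transformers library. The model id "h2oai/h2o-danube-1.8b-chat" is assumed from H2O.ai's naming and should be checked against the actual Hugging Face repository.

import torch
from transformers import pipeline

# Load the chat-tuned checkpoint; "h2oai/h2o-danube-1.8b-chat" is an assumed id.
pipe = pipeline(
    "text-generation",
    model="h2oai/h2o-danube-1.8b-chat",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Chat-tuned models typically ship a chat template with their tokenizer;
# apply it to format the conversation before generating.
messages = [{"role": "user", "content": "Summarize why small on-device LLMs matter."}]
prompt = pipe.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

out = pipe(prompt, max_new_tokens=200, do_sample=False)
print(out[0]["generated_text"])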

SOURCE: BusinessWire
