GenAI TODAY NEWS

Red Hat Expands AI Horizons with RHEL AI 1.3

By Greg Tavarez

The adoption of open-source foundation models for generative AI has gained traction. In fact, according to an IDC report, "Market Analysis Perspective: Open GenAI, LLMs, and the Evolving Open Source Ecosystem," a majority of organizations plan to use open-source foundation models for their GenAI initiatives. Furthermore, the report revealed that over half of currently deployed foundation models are already open-source.

This trend reflects growing recognition of the flexibility, cost-effectiveness and potential for innovation that open source brings to the GenAI landscape.

Red Hat sees this trend as validating the company’s vision for enterprise GenAI: smaller, open source-licensed models that can run wherever needed across the hybrid cloud; fine-tuning capabilities that let organizations customize LLMs with their private data for specific use cases; and more efficient AI models driven by inference performance engineering expertise.

To that end, the company announced the latest release of Red Hat Enterprise Linux AI (RHEL AI), Red Hat’s foundation model platform for developing, testing and running GenAI models for enterprise applications.

RHEL AI forms a key pillar for Red Hat’s AI vision by bringing together the open source-licensed Granite model family and InstructLab model alignment tools, based on the Large-scale Alignment for chatBots methodology. These components are then packaged as an optimized, bootable Red Hat Enterprise Linux image for individual server deployments anywhere across the hybrid cloud.

RHEL AI 1.3 expands its capabilities in GenAI with the integration of Granite 3.0 8b, a converged language model supporting English and multiple other languages, code generation and function calling. This enhanced model, along with the inclusion of Docling, a tool for parsing and converting documents into AI-friendly formats, will allow users to streamline data preparation and improve the quality of AI-generated content.
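The announcement doesn't show what "AI-friendly formats" look like in practice. As a rough, hypothetical sketch (not Docling's actual API), the snippet below splits a plain-text document into paragraph-aligned JSONL chunks of the kind typically fed to fine-tuning or retrieval pipelines; the function name and chunk size are illustrative.

```python
import json

def chunk_document(text: str, max_words: int = 120) -> list[dict]:
    """Split a document into paragraph-aligned chunks small enough
    for embedding or fine-tuning pipelines (illustrative only)."""
    chunks, current, count = [], [], 0
    for para in text.split("\n\n"):
        words = para.split()
        # Start a new chunk when the next paragraph would overflow it.
        if count + len(words) > max_words and current:
            chunks.append({"id": len(chunks), "text": " ".join(current)})
            current, count = [], 0
        current.extend(words)
        count += len(words)
    if current:
        chunks.append({"id": len(chunks), "text": " ".join(current)})
    return chunks

doc = "First paragraph about Granite models.\n\nSecond paragraph about Docling."
records = chunk_document(doc, max_words=5)
print("\n".join(json.dumps(r) for r in records))  # one JSON record per chunk
```

Real tools like Docling also handle PDF layout, tables and OCR; the point here is only the output shape: small, uniform records a model pipeline can consume.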

The platform's versatility is further extended with support for Intel Gaudi 3 accelerators, expanding the range of hardware options available to users. Additionally, RHEL AI's availability on major cloud providers and as optimized solutions on marketplaces like Azure Marketplace and AWS Marketplace provides flexibility and convenience for deployment.

To address the growing demand for efficient LLM serving, Red Hat OpenShift AI introduces parallelized serving capabilities for faster response times and improved user experiences. By supporting runtime techniques such as model sharding and quantization, Red Hat OpenShift AI also helps users optimize model performance and resource utilization.
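The article names sharding and quantization without explaining them, and OpenShift AI's serving stack handles both internally. As a generic, framework-agnostic illustration of the two ideas, the sketch below splits a weight matrix column-wise across two simulated devices (tensor parallelism) and stores weights as int8 with a per-tensor scale; all names and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8)).astype(np.float32)  # full weight matrix
x = rng.standard_normal(4).astype(np.float32)       # input activation

# Model sharding (tensor parallelism): split W's columns across two
# "devices"; each computes a partial output, then concatenate.
W_dev0, W_dev1 = np.hsplit(W, 2)
y_sharded = np.concatenate([x @ W_dev0, x @ W_dev1])
y_full = x @ W  # sharding preserves the result exactly

# Quantization: store weights as int8 with a per-tensor scale and
# dequantize on the fly; output is approximate, memory ~4x smaller.
scale = np.abs(W).max() / 127.0
W_q = np.round(W / scale).astype(np.int8)
y_quant = x @ (W_q.astype(np.float32) * scale)
print(np.max(np.abs(y_quant - y_full)))  # small quantization error
```

Sharding spreads one model's weights and compute across accelerators so large models fit and serve faster; quantization trades a small accuracy loss for lower memory and bandwidth. Production servers apply both at far larger scale.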

Finally, RHEL AI and Red Hat OpenShift AI complement each other across hybrid cloud environments. RHEL AI provides the foundation for individual server deployments, while Red Hat OpenShift AI offers a distributed Kubernetes platform with integrated MLOps capabilities. Together, they accelerate time to market and reduce operational costs for AI initiatives.

“To harness the transformative power of gen AI, we believe that smaller, optimized models are a necessity, and that these models need to be deployed anywhere and everywhere across the hybrid cloud,” said Joe Fernandes, Vice President and General Manager, Artificial Intelligence Business Unit, Red Hat. “Our enhancements to RHEL AI build on this belief, making it easier to prepare organizational data for private model training with Docling and incorporating the latest advancements in the Granite family of open source-licensed LLMs.”

RHEL AI 1.3 is now generally available.

Be part of the discussion about the latest trends and developments in the Generative AI space at Generative AI Expo, taking place February 11-13, 2025, in Fort Lauderdale, Florida. Generative AI Expo covers the evolution of GenAI and will feature conversations on its potential across industries and how the technology is already helping businesses improve operations, enhance customer experiences, and unlock new growth opportunities.




Edited by Alex Passett