Red Hat Details Vision For AI, Casts Foundations In Granite

An OS is smart. Your computer or mobile device's operating system is designed to act as the central management layer inside your machine, managing memory and processes and controlling the status of all hardware and software components. With enough depth to oversee the running of the central processing unit and the storage layer, an OS also works upwards to help deliver information to the graphical user interface, while also managing security, input/output channels, connectivity to other machines and the state of installed applications.

But the humble old OS is getting smarter.

In the open source universe driven by Red Hat, the firm has now engineered the release of Red Hat Enterprise Linux AI (known as RHEL AI), Red Hat's foundation model platform for developing, testing and running generative artificial intelligence models for enterprise applications.

RHEL AI 1.3 (pronounced "rel-aye-eye") now has support for the new features in the Granite large language model family. Emanating from research and development carried out at Red Hat parent company IBM, Granite LLMs can be applied to generative AI use cases that span both human language and software application code. Their open source nature means data engineers and software developers are free to modify and experiment with them. IBM says that this makes Granite models a "good choice" for organizations that deal with sensitive data and want to run their own LLM rather than relying on an outside service.
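For a concrete sense of what "running your own LLM" means in practice, the sketch below loads one of the openly published Granite instruct checkpoints with the Hugging Face transformers library and asks it a question. The model ID, library choice and generation settings are illustrative assumptions rather than anything Red Hat prescribes; RHEL AI packages and serves these models through its own tooling.

```python
# A minimal sketch of running a Granite model locally, assuming the openly
# published checkpoints and the standard Hugging Face transformers library.
# The model ID and settings are illustrative, not Red Hat's recommended setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ibm-granite/granite-3.0-8b-instruct"  # assumed/illustrative checkpoint name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,   # half precision to fit on a single accelerator
    device_map="auto",            # place layers on whatever hardware is available
)

# Granite instruct models use a chat template, applied via the tokenizer.
messages = [{"role": "user", "content": "Summarise our refund policy in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```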

Red Hat's work with Granite incorporates these open source advancements for data preparation (organizing information streams into a managed state where they are ready to be put to work inside a given AI model, database or other destination) while maintaining choice for hybrid cloud deployments, including the underlying accelerated compute architecture. Essentially, this positions the technology as an open foundation model offering. According to IDC's "Market Analysis Perspective: Open GenAI, LLMs and the Evolving Open Source Ecosystem" report, almost two-thirds of organizations say that they plan to use open source foundation models for generative AI use cases.

As a welcome reminder, Red Hat tells us that, "Foundation models are trained to function with a general understanding of patterns, structures and representations of language. This 'foundational' training teaches the model how to communicate and identify those patterns. The IBM Granite AI models have this baseline of knowledge that can be further fine-tuned to perform specific tasks for almost any industry. Granite family models are trained on curated data and provide transparency into the data that's used for training."

Red Hat says there's a key trend to pick up on here in terms of the way enterprise generative AI can be used productively. The company says its vision for this space sees the wider use of smaller, open source-licensed models that can run anywhere and everywhere needed across the hybrid cloud. It also identifies fine-tuning capabilities that enable organizations to more easily customize LLMs to private data and specific use cases. Also important is the ability to optimize for more efficient AI models, driven by inference performance engineering expertise.

"To harness the transformative power of gen AI, we believe that smaller, optimized models are a necessity and that these models need to be deployed anywhere and everywhere across the hybrid cloud. Our enhancements to RHEL AI build on this belief, making it easier to prepare organizational data for private model training with Docling [Docking an upstream open source project and tool for parsing and converting documents, from .pdf and .docx to .pptx and HTML etc.] and incorporating the latest advancements in the Granite family of open source-licensed LLMs.," said Joe Fernandes, vice president and general manager for the AI business units at Red Hat.

Future RHEL AI releases will continue to support and refine Docling components, including additional document formats as well as integration with retrieval-augmented generation (RAG) pipelines, in addition to InstructLab knowledge tuning. Through Docling, RHEL AI 1.3 now also includes "context-aware chunking", which takes into account the structure and semantic elements of the documents used for generative AI training. This helps the resulting gen AI applications maintain better coherence and deliver contextually appropriate responses to questions and tasks, which would otherwise require further tuning and alignment.
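To make "context-aware chunking" a little less abstract, the sketch below uses Docling's chunking classes to split a parsed document along its structural elements (sections, tables, list items) rather than at arbitrary character counts. The chunker class and file path are assumptions for illustration; RHEL AI wires this step into its own training and RAG pipelines.

```python
# A minimal sketch of context-aware chunking with Docling's chunker classes,
# which follow the document's structure rather than fixed character windows.
# The chunker choice and path are illustrative assumptions.
from docling.document_converter import DocumentConverter
from docling.chunking import HybridChunker

converter = DocumentConverter()
doc = converter.convert("policies/employee_handbook.pdf").document

chunker = HybridChunker()  # structure- and token-aware chunking
for i, chunk in enumerate(chunker.chunk(doc)):
    # Each chunk carries its text plus metadata about where it came from
    # (headings, captions), which is what keeps downstream answers coherent.
    print(f"--- chunk {i} ---")
    print(chunk.text[:200])
```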

Fernandes says that RHEL AI forms "a key pillar" for Red Hat's AI vision, bringing together the open source-licensed Granite model family and the InstructLab model alignment tools, which are based on the Large-scale Alignment for chatBots (LAB) methodology. These components are then packaged as an optimized, bootable Red Hat Enterprise Linux image for individual server deployments anywhere across the hybrid cloud.

Red Hat's vision here (smaller, more customizable and more tightly controllable generative AI that functions within the realms of an organization's own data estate and inside a business-specific set of firewalls, guardrails and policy controls) falls well in line with what we've been hearing from the rest of the technology industry for the last eighteen months, if not longer.

Red Hat OpenShift AI also allows users to dynamically alter an LLM's parameters while it is being served, such as sharding the model across multiple GPUs or quantizing the model to a smaller footprint. These improvements are aimed at speeding up response times for users, increasing customer satisfaction and lowering churn. So you're looking at a more eminently controllable evolution of AI, which is something the naysayers (and those who are afraid of the rise of the robots) have wanted from the very start. What (arguably) matters here is the sheer size, scope and solidity of Red Hat, given the company's (positive) influence on Linux and enterprise open source at a higher level.
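Under the hood, this kind of serving is typically handled by an inference engine such as the open source vLLM project, which Red Hat's AI platforms support as a serving runtime. The sketch below shows what sharding and quantization choices look like at that layer; the model ID, GPU count and quantization scheme are illustrative assumptions rather than Red Hat's documented defaults.

```python
# A minimal sketch of the serving-time controls described above, using the
# open source vLLM engine as an illustrative stand-in for the serving layer;
# model ID, GPU count and quantization scheme are assumptions.
from vllm import LLM, SamplingParams

llm = LLM(
    model="ibm-granite/granite-3.0-8b-instruct",  # illustrative checkpoint
    tensor_parallel_size=2,     # shard the model's weights across two GPUs
    dtype="bfloat16",
    # quantization="awq",       # optionally serve a quantized, smaller-footprint build
)

params = SamplingParams(temperature=0.2, max_tokens=128)
outputs = llm.generate(["What does our standard warranty cover?"], params)
print(outputs[0].outputs[0].text)
```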

Let's remember, Red Hat is a defiantly driven firm: most companies that get acquired by a parent as large as IBM end up being a branded product or service; very few retain their original brand, identity and cultural and operational ethos in quite this way.
