Red Hat Delivers Accessible, Open Source Generative AI Innovation with Red Hat Enterprise Linux AI

  • The offering is the first to deliver supported, indemnified and open source-licensed IBM Granite LLMs under Red Hat’s flexible and proven enterprise subscription model
  • Adds open source InstructLab model alignment tools to the world’s leading enterprise Linux platform to simplify generative AI model experimentation and alignment tuning
  • Provides a supported, enterprise-ready model runtime environment across AMD, Intel and NVIDIA platforms for fueling AI innovation built on open source

Red Hat, Inc., the world's leading provider of open source solutions, today announced the launch of Red Hat Enterprise Linux AI (RHEL AI), a foundation model platform that enables users to more seamlessly develop, test and deploy generative AI (GenAI) models. RHEL AI brings together the open source-licensed Granite large language model (LLM) family from IBM Research, InstructLab model alignment tools based on the LAB (Large-scale Alignment for chatBots) methodology and a community-driven approach to model development through the InstructLab project. The entire solution is packaged as an optimized, bootable RHEL image for individual server deployments across the hybrid cloud and is also included as part of OpenShift AI, Red Hat’s hybrid machine learning operations (MLOps) platform, for running models and InstructLab at scale across distributed cluster environments.

The launch of ChatGPT generated tremendous interest in GenAI, with the pace of innovation only accelerating since then. Enterprises have begun moving from early evaluations of GenAI services to building out AI-enabled applications. A rapidly growing ecosystem of open model options has spurred further AI innovation and illustrated that there won’t be “one model to rule them all.” Customers will benefit from an array of choices to address specific requirements, all of which stands to be further accelerated by an open approach to innovation.

Implementing an AI strategy requires more than simply selecting a model; technology organizations need the expertise to tune a given model for their specific use case, as well as the ability to manage the significant costs of AI implementation. The scarcity of data science skills is compounded by substantial financial requirements, including:

  • Procuring AI infrastructure or consuming AI services
  • The complex process of tuning AI models for specific business needs
  • Integrating AI into enterprise applications
  • Managing both the application and model lifecycle

To truly lower the entry barriers for AI innovation, enterprises need to be able to expand the roster of who can work on AI initiatives while simultaneously getting these costs under control. With InstructLab alignment tools, Granite models and RHEL AI, Red Hat aims to apply the benefits of true open source projects (freely accessible and reusable, transparent and open to contributions) to GenAI in an effort to remove these obstacles.

Building AI in the open with InstructLab

IBM Research created the Large-scale Alignment for chatBots (LAB) technique, an approach for model alignment that uses taxonomy-guided synthetic data generation and a novel multi-phase tuning framework. This approach makes AI model development more open and accessible to all users by reducing reliance on expensive human annotations and proprietary models. Using the LAB method, models can be improved by specifying skills and knowledge attached to a taxonomy, generating synthetic data from that information at scale to influence the model and using the generated data for model training.
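
To make the mechanism concrete, the sketch below shows in Python how a taxonomy leaf of contributor-written seed examples can drive synthetic data generation. It is an illustration of the concept only, not InstructLab’s actual code: the TaxonomyLeaf layout and the generate_with_teacher() stub are hypothetical placeholders for a real taxonomy node and a real call to a teacher model.

    # Illustrative sketch of taxonomy-guided synthetic data generation (not the
    # InstructLab implementation). generate_with_teacher() is a placeholder for
    # a real call to a teacher LLM.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class TaxonomyLeaf:
        """A leaf in the skills taxonomy: a description plus contributor-written seed Q&A pairs."""
        path: str
        description: str
        seed_examples: List[Tuple[str, str]]

    def build_generation_prompt(leaf: TaxonomyLeaf, num_new: int) -> str:
        """Turn the seed examples into a prompt asking a teacher model for more of the same."""
        lines = [
            f"You are generating training data for the skill: {leaf.description}",
            f"Write {num_new} new question/answer pairs in the same style as these examples:",
        ]
        for q, a in leaf.seed_examples:
            lines.append(f"Q: {q}\nA: {a}")
        return "\n\n".join(lines)

    def generate_with_teacher(prompt: str) -> List[Tuple[str, str]]:
        """Placeholder: a real pipeline would send the prompt to a teacher LLM and parse its output."""
        return [("<synthetic question>", "<synthetic answer>")]

    if __name__ == "__main__":
        leaf = TaxonomyLeaf(
            path="compositional_skills/writing/summarization",
            description="Summarize a short passage in one sentence.",
            seed_examples=[("Summarize: 'The cat sat on the mat all day.'",
                            "A cat spent the whole day sitting on a mat.")],
        )
        prompt = build_generation_prompt(leaf, num_new=3)
        synthetic_pairs = generate_with_teacher(prompt)
        # After filtering, pairs like these would feed the multi-phase tuning of the student model.
        print(prompt)
        print(synthetic_pairs)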

After seeing that the LAB method could help significantly improve model performance, IBM and Red Hat decided to launch InstructLab, an open source community built around the LAB method and the open source Granite models from IBM. The InstructLab project aims to put LLM development into the hands of developers by making building and contributing to an LLM as simple as contributing to any other open source project.

As part of the InstructLab launch, IBM has also released a family of select Granite English language and code models in the open. These models are released under an Apache license with transparency on the datasets used to train these models. The Granite 7B English language model has been integrated into the InstructLab community, where end users can contribute the skills and knowledge to collectively enhance this model, just as they would when contributing to any other open source project. Similar support for Granite code models within InstructLab will be available soon.
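
As a rough illustration of what such a contribution looks like, the sketch below writes a minimal skill file in the spirit of the taxonomy’s qna.yaml entries. The schema has evolved across taxonomy versions, so the field names used here (task_description, seed_examples, question, answer) and the skill path are approximations for illustration, not the authoritative format.

    # Rough sketch of an InstructLab-style skill contribution file. Field names and
    # the skill path are approximate; check the InstructLab taxonomy repository for
    # the exact schema used by the release you are contributing to.
    from pathlib import Path
    import textwrap

    QNA_YAML = textwrap.dedent("""\
        task_description: Summarize a short passage in one sentence.
        created_by: example-contributor
        seed_examples:
          - question: Summarize this - "The launch introduced a new foundation model platform."
            answer: A new foundation model platform was introduced at the launch.
          - question: Summarize this - "Open source lets many contributors improve one project."
            answer: Many contributors can improve a single open source project together.
    """)

    def write_contribution(taxonomy_root: Path) -> Path:
        """Place the qna.yaml under a hypothetical skill path in a local taxonomy checkout."""
        skill_dir = taxonomy_root / "compositional_skills" / "writing" / "summarization"
        skill_dir.mkdir(parents=True, exist_ok=True)
        target = skill_dir / "qna.yaml"
        target.write_text(QNA_YAML, encoding="utf-8")
        return target

    if __name__ == "__main__":
        print(write_contribution(Path("./taxonomy")))

A contributor would then open a pull request against the community taxonomy; accepted skills feed the synthetic data generation and tuning passes that fold the contribution into future model builds.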

Open source AI innovation on a trusted Linux backbone

RHEL AI builds on this open approach to AI innovation, incorporating an enterprise-ready version of the InstructLab project and the Granite language and code models along with the world’s leading enterprise Linux platform to simplify deployment across a hybrid infrastructure environment. This creates a foundation model platform for bringing open source-licensed GenAI models into the enterprise. RHEL AI includes:

  • Open source-licensed Granite language and code models that are supported and indemnified by Red Hat.
  • A supported, lifecycled distribution of InstructLab that provides a scalable, cost-effective solution for enhancing LLM capabilities and making knowledge and skills contributions accessible to a much wider range of users.
  • Optimized bootable model runtime instances with Granite models and InstructLab tooling packages as bootable RHEL images via RHEL image mode, including optimized PyTorch runtime libraries and accelerators for AMD Instinct™ MI300X, Intel and NVIDIA GPUs and NeMo frameworks.
  • Red Hat’s complete enterprise support and lifecycle promise that starts with a trusted enterprise product distribution, 24x7 production support and extended lifecycle support.

As organizations experiment with and tune new AI models on RHEL AI, they have a ready on-ramp for scaling these workflows with Red Hat OpenShift AI, which will include RHEL AI. There, they can leverage OpenShift’s Kubernetes engine to train and serve AI models at scale, and OpenShift AI’s integrated MLOps capabilities to manage the model lifecycle. IBM’s watsonx.ai enterprise studio, which is built on Red Hat OpenShift AI today, will benefit from the inclusion of RHEL AI in OpenShift AI upon availability, bringing additional capabilities for enterprise AI development, data management, model governance and improved price performance.
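
Once a tuned model is being served, applications typically reach it over an HTTP inference endpoint. The sketch below assumes an OpenAI-compatible chat completions route, a common convention for LLM serving runtimes; the endpoint URL, token and model name are placeholders for illustration, not actual RHEL AI or OpenShift AI values.

    # Minimal client sketch for a model served behind an OpenAI-compatible
    # chat completions endpoint (a common convention for LLM serving runtimes).
    # The URL, token and model name below are placeholders for illustration.
    import requests

    ENDPOINT = "https://models.example.com/v1/chat/completions"  # hypothetical route
    API_TOKEN = "changeme"                                        # hypothetical credential
    MODEL_NAME = "granite-7b-lab"                                 # placeholder model id

    def ask(question: str) -> str:
        """Send a single-turn chat request and return the model's reply text."""
        response = requests.post(
            ENDPOINT,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            json={
                "model": MODEL_NAME,
                "messages": [{"role": "user", "content": question}],
                "temperature": 0.2,
            },
            timeout=60,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        print(ask("Summarize what a foundation model platform is in one sentence."))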

The cloud is hybrid. So is AI.

For more than 30 years, open source technologies have paired rapid innovation with greatly reduced IT costs and lowered barriers to innovation. Red Hat has been leading this charge for nearly as long, from delivering open enterprise Linux platforms with RHEL in the early 2000s to driving containers and Kubernetes as the foundation for open hybrid cloud and cloud-native computing with Red Hat OpenShift.

This drive continues with Red Hat powering AI/ML strategies across the open hybrid cloud, enabling AI workloads to run where data lives, whether in the datacenter, multiple public clouds or at the edge. More than just the workloads, Red Hat’s vision for AI brings model training and tuning down this same path to better address limitations around data sovereignty, compliance and operational integrity. The consistency delivered by Red Hat’s platforms across these environments, no matter where they run, is crucial in keeping AI innovation flowing.

RHEL AI and the InstructLab community further deliver on this vision, breaking down many of the barriers to experimenting with and building AI models while providing the tools, data and concepts needed to fuel the next wave of intelligent workloads.

Availability

Red Hat Enterprise Linux AI is now available as a developer preview. IBM Cloud, whose GPU infrastructure is already used to train the Granite models and support InstructLab, will add support for RHEL AI and OpenShift AI. This integration will allow enterprises to deploy generative AI more easily into their mission-critical applications.

Red Hat Summit

Join the Red Hat Summit keynotes to hear the latest from Red Hat executives, customers and partners.

Supporting Quotes

Ashesh Badani, senior vice president and chief product officer, Red Hat

“GenAI presents a revolutionary leap forward for enterprises, but only if technology organizations are able to actually deploy and use AI models in a way that matches their specific business needs. RHEL AI and the InstructLab project, coupled with Red Hat OpenShift AI at scale, are designed to lower many of the barriers facing GenAI across the hybrid cloud, from limited data science skills to the sheer resources required, while fueling innovation both in enterprise deployments and in upstream communities.”

Ramine Roane, corporate vice president, AI Group, AMD

"AI is one of the most important shifts in technology in the past 50 years. To help accelerate broader adoption of AI, the models and tools used to build AI applications need to be made accessible to the enterprise. Built on a trusted Linux backbone with open source tooling and open source-licensed models, RHEL AI is one of the platforms that can provide this, and we’re pleased to support Red Hat’s effort in driving AI in the enterprise forward with our AMD technologies including Instinct AI accelerators.”

Jeremy Foster, senior vice president and general manager, Networking - Compute, Cisco

“The AI movement represents a seismic shift for enterprises, and many organizations are grappling with the best path forward. Cisco continues to work closely with Red Hat to advance AI adoption and RHEL AI will accelerate innovation by providing open-source enabled LLM models as part of an enterprise-ready Linux platform.”

Gil Shneorson, senior vice president, Solution Platforms, Dell Technologies

“Dell Technologies has been a pioneer in delivering reliable and consistent lifecycle management for infrastructure; we believe that consistent, reliable and safe system updates for enterprise IT operations are imperative as systems continue to evolve. New technologies offering organizations the ability to extend code updates and reduce deployment times will be important to maintaining progress in innovation.”

Frances Guida, director, Compute Solutions and AI, Hewlett Packard Enterprise

"Hewlett Packard Enterprise has collaborated with Red Hat for two decades to provide industry-leading solutions that combine the HPE Compute platforms with RHEL. An open environment is critical to innovation in fast-growing technology categories like generative AI and we are looking forward to exploring new areas of collaboration with RHEL AI and Red Hat to help HPE customers find success.”

Darío Gil, senior vice president and director, Research, IBM

“Bringing true open source innovation to AI model development and harnessing the power of a broad community will change how enterprises think about their plans for AI adoption and scale. IBM has been a strong supporter of open source, backing influential communities like Linux, Apache, and Eclipse, and our collaboration with Red Hat represents a step forward in our open approach to building safe, responsible, and effective AI. RHEL AI and InstructLab, combined with IBM’s open source Granite family of models, will deliver new value and choice for clients who are looking to build fit for purpose models that address their use cases with their own data while minimizing cost across a diverse hybrid cloud environment.”

Hillery Hunter, CTO and general manager of innovation, IBM Infrastructure

“Many enterprises are looking to infuse AI into their mission-critical applications. The availability of RHEL AI and OpenShift AI on IBM Cloud will help transform how the community and companies build and leverage generative AI. We are enabling open collaboration, simplifying model customization and providing enterprise-quality supported models and tools for building AI into every application.”

Justin Hotard, executive vice president and general manager, Data Center & AI Group, Intel

“For AI adoption to scale, it needs to be fully open source, where community members contribute and create new applications and use cases. By incorporating the open source Granite models and the InstructLab project, RHEL AI is poised to bring a significant change in how we interact with, tune and ultimately use AI models in production.”

Kirk Skaugen, president, Lenovo ISG

“Customers deploying AI-driven applications need to be able to test and experiment with potential models on a trusted and familiar but still innovative platform. Lenovo believes the future of AI is hybrid, and we see RHEL AI as a critical pathway for widespread hybrid AI innovation, and through Red Hat OpenShift AI, production scalability. We’re pleased to work with Red Hat in driving this hybrid AI innovation forward through joint testing on our ThinkSystem 8-GPU servers, providing a trusted hardware foundation for powering open source AI.”

Justin Boitano, vice president, Enterprise Products, NVIDIA

“Red Hat and NVIDIA have a long history of close collaboration, and Red Hat Enterprise Linux AI demonstrates our shared focus on bringing full-stack computing and software to the developers and researchers building the next wave of AI technology and applications.”

Additional Resources

Connect with Red Hat

About Red Hat, Inc.

Red Hat is the world’s leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

Forward-Looking Statements

Except for the historical information and discussions contained herein, statements contained in this press release may constitute forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements are based on the company’s current assumptions regarding future business and financial performance. These statements involve a number of risks, uncertainties and other factors that could cause actual results to differ materially. Any forward-looking statement in this press release speaks only as of the date on which it is made. Except as required by law, the company assumes no obligation to update or revise any forward-looking statements.

Red Hat, Red Hat Enterprise Linux, the Red Hat logo and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Linux® is the registered trademark of Linus Torvalds in the U.S. and other countries.

Contacts

Data & News supplied by www.cloudquote.io
Stock quotes supplied by Barchart
Quotes delayed at least 20 minutes.
By accessing this page, you agree to the following
Privacy Policy and Terms and Conditions.
 
 
Copyright © 2010-2020 Burlingame.com & California Media Partners, LLC. All rights reserved.