Teradata Makes Real-World GenAI Easier, Speeds Business Value
New bring-your-own LLM capability enables cost-effective use of small and mid-sized open-source language models for practical, high-value GenAI use cases
New integration with the full-stack NVIDIA AI platform delivers accelerated computing for LLM inferencing and fine-tuning at scale
As GenAI moves from idea to reality, enterprises are increasingly interested in a more comprehensive AI strategy that prioritizes practical use cases known for delivering more immediate business value – a critical benefit when 84 percent of executives expect ROI from AI initiatives within a year. With rapid innovation in large language models (LLMs) and the emergence of small and mid-sized models, AI providers can offer fit-for-purpose open-source models that deliver significant versatility across a broad spectrum of use cases, without the high cost and complexity of large models.
By adding bring-your-own LLM (BYO-LLM), Teradata customers can take advantage of small or mid-sized open LLMs, including domain-specific models. In addition to these models being easier to deploy and more cost-effective overall, Teradata’s new features bring the LLMs to the data (versus the other way around) so that organizations can also minimize expensive data movement and maximize security, privacy and trust.
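To make the “bring the model to the data” idea concrete, here is a minimal, generic sketch of running a small open-source LLM in-process with the Hugging Face transformers library. The model name, task, and sample records are illustrative assumptions, not Teradata’s API or implementation.

```python
# Illustrative sketch only: scoring records locally with a small open-source
# model, in the spirit of bring-your-own-LLM. Model and data are placeholders.
from transformers import pipeline

# A compact sentiment model, small enough to run on CPU.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

records = [
    "My card was charged twice and support never responded.",
    "The new statement layout is much easier to read.",
]

# Score each record where it lives instead of exporting it to an external service.
for text in records:
    result = classifier(text)[0]
    label, score = result["label"], result["score"]
    print(f"{label:>8}  {score:.2f}  {text}")
```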
Teradata also now provides customers with the flexibility to strategically leverage either GPUs or CPUs, depending on the complexity and size of the LLM. If required, GPUs can be used to offer speed and performance at scale for tasks like inferencing and model fine-tuning, both of which will be available on the Teradata platform.
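As a generic illustration of that CPU-or-GPU flexibility (not Teradata-specific code; the model and prompt are placeholders), a PyTorch workload can simply prefer a GPU when one is present and fall back to CPU otherwise:

```python
# Generic sketch: run a small open-source model on GPU if available, else CPU.
# Model choice and prompt are placeholder assumptions, not Teradata's API.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"

model_name = "distilgpt2"  # small open model used purely for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

prompt = "Customer complaint: I was billed twice. Summary:"
inputs = tokenizer(prompt, return_tensors="pt").to(device)

# Inference only; the same device selection applies to fine-tuning loops.
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```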
“Teradata customers want to swiftly move from exploration to meaningful application of generative AI,” said
Real-World GenAI with Open-Source LLMs
Organizations have come to recognize that larger LLMs aren’t suited for every use case and can be cost-prohibitive. BYO-LLM allows users to choose the best model for their specific business needs, and according to Forrester, 46 percent of AI leaders plan to leverage existing open-source LLMs in their generative AI strategy. With Teradata’s implementation of BYO-LLM, organizations can bring the open-source model of their choice directly to their data, rather than moving sensitive data to the model.
Smaller LLMs are typically domain-specific and tailored for valuable, real-world use cases, such as:
- Regulatory compliance: Banks use specialized open LLMs to identify emails with potential regulatory implications, reducing the need for expensive GPU infrastructure.
- Healthcare note analysis: Open LLMs can analyze doctor’s notes to automate information extraction, enhancing patient care without moving sensitive data.
- Product recommendations: Utilizing LLM embeddings combined with in-database analytics from Teradata ClearScape Analytics, businesses can optimize their recommendation systems (see the sketch after this list).
- Customer complaint analysis: Open LLMs help analyze complaint topics, sentiments, and summaries, integrating insights into a 360° view of the customer for improved resolution strategies.
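For the recommendation use case above, the following is a rough sketch of the embedding-plus-similarity pattern. The embedding model and catalog are illustrative assumptions, and an in-database version would be expressed with Teradata ClearScape Analytics rather than this standalone Python.

```python
# Illustrative sketch only: open-source LLM embeddings plus cosine similarity
# for product recommendations. Model and catalog are placeholder assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # small open embedding model

catalog = [
    "Waterproof hiking boots with ankle support",
    "Lightweight trail-running shoes",
    "Insulated winter parka",
    "Stainless-steel 1-liter water bottle",
]
query = "comfortable shoes for long mountain hikes"

# Normalized embeddings make the dot product equal to cosine similarity.
item_vecs = model.encode(catalog, normalize_embeddings=True)
query_vec = model.encode(query, normalize_embeddings=True)
scores = item_vecs @ query_vec

# Rank catalog items by similarity to the query, best match first.
for idx in np.argsort(scores)[::-1]:
    print(f"{scores[idx]:.2f}  {catalog[idx]}")
```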
Teradata’s commitment to an open and connected ecosystem means that as more open LLMs come to market, Teradata’s customers will be able to keep pace with innovation and use BYO-LLM to switch models freely, with less vendor lock-in.
GPU Analytic Clusters for Inferencing and Fine-Tuning
By adding full-stack NVIDIA accelerated computing support to its platform, Teradata gives customers GPU-powered analytic clusters for LLM inferencing and fine-tuning at scale.
Availability
ClearScape Analytics BYO-LLM for
About Teradata
At Teradata, we believe that people thrive when empowered with trusted information. We offer the most complete cloud analytics and data platform for AI. By delivering harmonized data and trusted AI, we enable more confident decision-making, unlock faster innovation, and drive the impactful business results organizations need most.
See how at Teradata.com.
The Teradata logo is a trademark, and Teradata is a registered trademark of Teradata Corporation and/or its affiliates in the U.S. and worldwide.
View source version on businesswire.com: https://www.businesswire.com/news/home/20241008518850/en/
MEDIA CONTACT
jennifer.donahue@teradata.com
Source: Teradata