IT Brief UK - Technology news for CIOs & IT decision-makers

Teradata upgrades ModelOps for scalable enterprise AI use

Wed, 30th Jul 2025

Teradata has introduced ModelOps updates to its ClearScape Analytics offering, targeting streamlined integration and deployment for Agentic AI and Generative AI applications as organisations transition from experimentation to production at scale.

ModelOps platform

The updated ModelOps platform aims to support analytics professionals and data scientists with native compatibility with open-source ONNX embedding models and with large language model (LLM) APIs from leading cloud service providers, including Azure OpenAI, Amazon Bedrock, and Google Gemini. With these enhancements, organisations can deploy, manage, and monitor AI models without relying on custom development, with newly added LLMOps capabilities designed to simplify workflows.
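The core idea of managing several model providers behind one interface can be sketched with a simple registry pattern. This is a minimal, hypothetical illustration, not Teradata's actual API: the `ModelRegistry` class and `stub_llm` function are invented here, and a real deployment would register clients for Azure OpenAI, Amazon Bedrock, or Google Gemini instead of a stub.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class ModelRegistry:
    """Minimal registry mapping model names to provider-specific call functions.

    Hypothetical sketch of a unified-access layer; not Teradata's implementation.
    """
    _providers: Dict[str, Callable[[str], str]] = field(default_factory=dict)

    def register(self, name: str, call: Callable[[str], str]) -> None:
        # Each provider is just a callable taking a prompt and returning text,
        # so swapping LLM back-ends does not change calling code.
        self._providers[name] = call

    def invoke(self, name: str, prompt: str) -> str:
        if name not in self._providers:
            raise KeyError(f"model '{name}' is not registered")
        return self._providers[name](prompt)

# Stub provider standing in for a real cloud LLM client.
def stub_llm(prompt: str) -> str:
    return f"echo: {prompt}"

registry = ModelRegistry()
registry.register("stub-llm", stub_llm)
print(registry.invoke("stub-llm", "hello"))  # prints "echo: hello"
```

The point of the pattern is that governance concerns (monitoring, spend tracking, access control) can be attached once at the registry layer rather than per provider.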

For less technical users such as business analysts, ModelOps also integrates low-code AutoML tools, providing an interface that facilitates intuitive access for users of different skill levels. The platform's unified interface is intended to reduce onboarding time and increase productivity by offering consistent interactions across its entire range of tools.

Challenges in AI adoption

Many organisations encounter challenges when progressing from AI experimentation to enterprise-wide implementation. According to Teradata, the use of multiple LLM providers and the adoption of various open-source models can cause workflow fragmentation, limited interoperability, and steep learning curves, ultimately inhibiting wider adoption and slowing innovation. Unified governance frameworks are often lacking, making it difficult for organisations to maintain reliability and meet compliance requirements as they scale their AI capabilities.

These issues may cause generative and agentic AI projects to remain in isolation, rather than delivering integrated business insights. As a result, organisations could lose value if they are unable to effectively scale AI initiatives due to operational complexity and fragmented systems.

Unified access and governance

"The reality is that organisations will use multiple AI models and providers - it's not a question of if, but how, to manage that complexity effectively. Teradata's ModelOps offering provides the flexibility to work across combinations of models while maintaining trust and governance. Companies can then move confidently from experimentation to production, at scale, realising the full potential of their AI investments," said Sumeet Arora, Teradata's Chief Product Officer.

Teradata's ModelOps strategy is designed to provide unified access to a range of AI models and workflows, while maintaining governance and ease of use. This is intended to allow business users to deploy AI models quickly and safely, supporting both experimentation and production use.

An example scenario described by Teradata involved a bank seeking to improve its digital customer experience and retention rates by analysing customer feedback across channels. The unified ModelOps platform would allow the bank to consolidate multiple AI models - such as LLMs for sentiment analysis, embedding models for categorisation, and AutoML for predictive analytics - within one environment. The aim is to equip both technical and non-technical teams to act on customer intelligence at greater speed and scale.
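The bank scenario above amounts to fanning each piece of feedback through several models in one pipeline. The following sketch uses stub functions in place of the real models; `stub_sentiment`, `stub_categorise`, and `analyse_feedback` are hypothetical names invented for illustration, where a production pipeline would call a hosted LLM and an embedding-based classifier.

```python
def stub_sentiment(text: str) -> str:
    """Stand-in for an LLM sentiment-analysis call (hypothetical)."""
    return "negative" if "slow" in text.lower() else "positive"

def stub_categorise(text: str) -> str:
    """Stand-in for an embedding-model categoriser (hypothetical)."""
    return "mobile app" if "app" in text.lower() else "branch"

def analyse_feedback(items):
    # Run every feedback item through both models, returning one
    # consolidated record per item - the "single environment" idea.
    return [
        {"text": t, "sentiment": stub_sentiment(t), "category": stub_categorise(t)}
        for t in items
    ]

results = analyse_feedback([
    "The app is slow to load",
    "Friendly staff at my branch",
])
print(results[0])  # {'text': 'The app is slow to load', 'sentiment': 'negative', 'category': 'mobile app'}
```

The consolidated records could then feed an AutoML churn-prediction model, which is the third component Teradata describes in the scenario.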

Key features

The updated ModelOps capabilities in ClearScape Analytics include:

  • Seamless Integration with Public LLM APIs: Users can connect with APIs from providers such as Azure OpenAI, Google Gemini, and Amazon Bedrock for a variety of LLMs, including models from Anthropic, Mistral, DeepSeek, and Meta. This integration supports secure registration, monitoring, observability, autoscaling, and usage analytics. Administrative options are available for retry policies, concurrency, and health or spend tracking at the project or model level.
  • Managing and monitoring LLMs with LLMOps: The platform supports rapid deployment of NVIDIA NIM LLMs within GPU environments. Features include LLM Model Cards for transparency, monitoring, and governance, as well as full lifecycle management - covering deployment, versioning, performance tracking, and retirement.
  • ONNX Embedding Model Deployment: ClearScape Analytics natively supports ONNX embedding models and tokenisers, including support for Bring-Your-Own-Model workflows and unified deployment processes for custom vector search models.
  • Low-Code AutoML: Teams can create, train, monitor, and deploy models through an accessible low-code interface with performance monitoring and visual explainability features.
  • User Interface Improvements: The upgrade provides a unified user experience across all major tools, such as AutoML, Playground, Tables, and Datasets, with guided wizards and new table interaction options aimed at reducing skill barriers.
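Two of the administrative options listed above, retry policies and spend tracking, are standard patterns that can be sketched independently of any vendor. The code below is a minimal, hypothetical illustration of exponential-backoff retries and per-project spend limits; none of the names reflect Teradata's actual configuration surface.

```python
import time
from dataclasses import dataclass, field

@dataclass
class SpendTracker:
    """Per-project usage limits (hypothetical sketch, not Teradata's API)."""
    limits: dict
    usage: dict = field(default_factory=dict)

    def record(self, project: str, tokens: int) -> None:
        # Accumulate usage and fail fast once the project's budget is exceeded.
        self.usage[project] = self.usage.get(project, 0) + tokens
        if self.usage[project] > self.limits.get(project, float("inf")):
            raise RuntimeError(f"spend limit exceeded for project '{project}'")

def call_with_retry(fn, retries=3, base_delay=0.01):
    """Retry a flaky call with exponential backoff between attempts."""
    for attempt in range(retries):
        try:
            return fn()
        except RuntimeError:  # stand-in for a transient API error
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

tracker = SpendTracker(limits={"churn-model": 1000})
tracker.record("churn-model", 400)  # within budget, no error

attempts = {"n": 0}
def flaky():
    # Fails twice, then succeeds - simulating a transient provider outage.
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient error")
    return "ok"

print(call_with_retry(flaky))  # prints "ok" after two retries
```

In a managed platform these policies would be set declaratively per project or per model rather than coded by hand, which is the convenience the feature list is describing.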

Availability of the updated ModelOps in ClearScape Analytics is anticipated in the fourth quarter for users of AI Factory and VantageCloud platforms.
