GenAI TODAY NEWS

Datadog Launches LLM Observability to Enhance GenAI Monitoring and Security

By Stefania Viscusi

Datadog, a provider of monitoring and security solutions for cloud applications, has introduced LLM Observability, a new offering designed to streamline the monitoring, optimization, and security of generative AI applications. The launch gives AI application developers and machine learning (ML) engineers tooling to deploy generative AI features more efficiently and reliably in production environments.

As AI technologies rapidly evolve, organizations across industries are looking to integrate generative AI capabilities into their operations. However, deploying and managing these AI models poses significant challenges due to their intricate nature, non-deterministic behavior, and inherent security risks.

Datadog’s LLM Observability addresses these challenges by providing end-to-end visibility into LLM chains, so users can pinpoint errors, anomalies, and security vulnerabilities at the step where they occur.
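The idea of chain-level visibility can be illustrated with a minimal, self-contained sketch. This is not Datadog's SDK; the in-memory span recorder, the `span` context manager, and the toy two-step chain are all invented for illustration:

```python
import time
import uuid
from contextlib import contextmanager

# Hypothetical in-memory trace store; a real observability agent
# would export spans to a backend rather than keep them in a list.
SPANS = []

@contextmanager
def span(name, trace_id):
    """Record the duration and outcome of one step in an LLM chain."""
    record = {"trace_id": trace_id, "name": name, "error": None}
    start = time.perf_counter()
    try:
        yield record
    except Exception as exc:
        record["error"] = repr(exc)
        raise
    finally:
        record["duration_ms"] = (time.perf_counter() - start) * 1000
        SPANS.append(record)

def run_chain(prompt):
    """A toy two-step chain: retrieval, then generation."""
    trace_id = uuid.uuid4().hex
    with span("retrieval", trace_id):
        context = f"docs relevant to: {prompt}"
    with span("generation", trace_id):
        answer = f"answer based on {context}"
    return answer

run_chain("What is LLM observability?")
for s in SPANS:
    print(s["name"], s["error"], round(s["duration_ms"], 2))
```

Because each span carries its own error field and timing, a failure in any single step of the chain surfaces with the step's name attached, which is the property the product description above relies on.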

"The Datadog LLM Observability solution helps our team understand, debug and evaluate the usage and performance of our GenAI applications," said Kyle Triplett, VP of Product at AppFolio. “With it, we are able to address real-world issues, including monitoring response quality to prevent negative interactions and performance degradations, while ensuring we are providing our end users with positive experiences.”

Features and benefits of the observability tools include:

  • Enhanced Visibility and Monitoring: LLM Observability offers detailed insights into each step of the LLM chain, allowing developers to identify and resolve issues such as errors and fabricated or unexpected responses, commonly known as hallucinations. Operational metrics like latency and token usage can be monitored in real time to optimize performance and control costs.
  • Quality and Safety Evaluations: Users can also evaluate the quality of AI applications based on criteria like topic relevance and toxicity. This helps in maintaining the integrity of AI-generated content while mitigating security and privacy risks.
  • Integration and Scalability: Unlike traditional tools, Datadog's solution integrates seamlessly with its Application Performance Monitoring (APM) capabilities. It supports major platforms including OpenAI, Anthropic, Azure OpenAI, and Amazon Bedrock, providing a unified dashboard for monitoring and optimizing AI applications across diverse environments.
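As a rough illustration of the operational metrics named above, the sketch below rolls per-call latency and token usage up into the kind of aggregates a monitoring dashboard might display. The call records, the `LLMCall` structure, and the per-token price are invented for the example and do not reflect Datadog's data model or any provider's pricing:

```python
from dataclasses import dataclass

# Assumed flat per-1K-token price, for illustration only.
PRICE_PER_1K_TOKENS = 0.002

@dataclass
class LLMCall:
    model: str
    latency_ms: float
    prompt_tokens: int
    completion_tokens: int

    @property
    def total_tokens(self):
        return self.prompt_tokens + self.completion_tokens

    @property
    def estimated_cost(self):
        return self.total_tokens / 1000 * PRICE_PER_1K_TOKENS

def summarize(calls):
    """Aggregate per-call records into dashboard-style totals."""
    n = len(calls)
    return {
        "calls": n,
        "avg_latency_ms": sum(c.latency_ms for c in calls) / n,
        "total_tokens": sum(c.total_tokens for c in calls),
        "estimated_cost_usd": round(sum(c.estimated_cost for c in calls), 6),
    }

calls = [
    LLMCall("model-a", 420.0, 150, 50),
    LLMCall("model-a", 380.0, 200, 100),
]
print(summarize(calls))
```

Tracking tokens alongside latency is what makes the cost-control claim concrete: token counts map directly to spend, while latency maps to user experience, and both degrade independently.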

"There's a rush to adopt new LLM-based technologies, but organizations of all sizes and industries are finding it difficult to do so in a way that is both cost effective and doesn't negatively impact the end user experience," said Yrieix Garnier, VP of Product at Datadog. "Datadog LLM Observability provides the deep visibility needed to help teams manage and understand performance, detect drifts or biases, and resolve issues before they have a significant impact on the business or end-user experience."

Edited by Erik Linask