New metrics capabilities for OpenTelemetry on Azure Monitor

Microsoft has released a series of preview updates to its Azure Monitor OpenTelemetry Exporter packages for .NET, Node.js, and Python applications. New features include: exporting OpenTelemetry metrics to Azure Monitor Application Insights (AMAI), improved sampling control for traces and spans, and caching and retrying of telemetry during transient connectivity disruptions to Azure Monitor Application Insights.

Azure Monitor is a suite of tools for collecting, analyzing, and responding to infrastructure and application telemetry from the cloud and on-premises environments. AMAI is one of the tools in Azure Monitor and provides Application Performance Monitoring (APM) to its users. Additionally, Azure Monitor Application Insights supports distributed tracing, one of the pillars of the observability paradigm, across multiple applications.

OpenTelemetry is a framework that provides vendor-agnostic APIs, SDKs, and tools to collect, transform, and export telemetry data to observability backends. In a blog post in 2021, Microsoft outlined its roadmap for integrating OpenTelemetry into its broader Azure Monitor ecosystem. The immediate focus was on building direct exporters from OpenTelemetry-based applications to AMAI, as opposed to the de facto OpenTelemetry route of an OTLP exporter sending data to Azure Monitor via the OpenTelemetry Collector.
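
For contrast, the Collector-based route wires a generic OTLP exporter into the application and relies on a separately running OpenTelemetry Collector to forward the data to Azure Monitor. A minimal sketch of that alternative, assuming a Collector listening on the default OTLP/HTTP endpoint:

const { NodeTracerProvider } = require("@opentelemetry/sdk-trace-node");
const { BatchSpanProcessor } = require("@opentelemetry/sdk-trace-base");
const { OTLPTraceExporter } = require("@opentelemetry/exporter-trace-otlp-http");

const provider = new NodeTracerProvider();

// Spans go to a local OpenTelemetry Collector, which is configured separately
// to forward them to Azure Monitor
provider.addSpanProcessor(
  new BatchSpanProcessor(
    new OTLPTraceExporter({ url: "http://localhost:4318/v1/traces" })
  )
);
provider.register();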


An example of a direct exporter in a Node.js application with an existing OpenTelemetry trace pipeline would be:

const { AzureMonitorTraceExporter } = require("@azure/monitor-opentelemetry-exporter");
const { NodeTracerProvider } = require("@opentelemetry/sdk-trace-node");
const { BatchSpanProcessor } = require("@opentelemetry/sdk-trace-base");
const { Resource } = require("@opentelemetry/resources");
const { SemanticResourceAttributes } = require("@opentelemetry/semantic-conventions");


const provider = new NodeTracerProvider({
  resource: new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: "basic-service",
  }),
});
provider.register();

// Create an exporter instance
const exporter = new AzureMonitorTraceExporter({
  connectionString:
    process.env["APPLICATIONINSIGHTS_CONNECTION_STRING"] || ""
});

// Add the exporter to the provider
provider.addSpanProcessor(
  new BatchSpanProcessor(exporter, {
    scheduledDelayMillis: 15000, // delay between consecutive exports
    maxExportBatchSize: 1000     // maximum number of spans per export batch
  })
);
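
Once the provider is registered, spans created through the OpenTelemetry API flow through the batch processor to the Azure Monitor exporter. A minimal usage sketch (the tracer and span names below are illustrative):

const { trace } = require("@opentelemetry/api");

// Obtain a tracer from the global provider registered above
const tracer = trace.getTracer("basic-service-tracer");

const span = tracer.startSpan("process-order");
// ... application work happens here ...
span.end(); // the BatchSpanProcessor queues the span and exports it to AMAI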

With the release of the new updates to the Azure Monitor OpenTelemetry Exporter packages, exporting metrics to AMAI is now possible, as shown below:

const { MeterProvider, PeriodicExportingMetricReader } = require("@opentelemetry/sdk-metrics");
const { Resource } = require("@opentelemetry/resources");
const { AzureMonitorMetricExporter } = require("@azure/monitor-opentelemetry-exporter");

// Add the exporter into the MetricReader and register it with the MeterProvider
const provider = new MeterProvider();
const exporter = new AzureMonitorMetricExporter({
  connectionString:
    process.env["APPLICATIONINSIGHTS_CONNECTION_STRING"] || "",
});
const metricReaderOptions = {
  exporter: exporter,
};
const metricReader = new PeriodicExportingMetricReader(metricReaderOptions);
provider.addMetricReader(metricReader);
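
Once the reader is registered, measurements recorded through the provider's meters are collected and exported to AMAI on the reader's interval. A minimal usage sketch (the meter and instrument names below are illustrative):

// Obtain a meter from the provider configured above
const meter = provider.getMeter("basic-service-meter");

// Create a counter instrument and record a measurement with attributes
const counter = meter.createCounter("orders_processed");
counter.add(1, { region: "westeurope" });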

To manage the amount of telemetry sent to Application Insights, the packages now include a sampler that controls the percentage of traces sent. For the Node.js trace example from earlier, this looks as follows:

import { ApplicationInsightsSampler, AzureMonitorTraceExporter } from "@azure/monitor-opentelemetry-exporter";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { Resource } from "@opentelemetry/resources";
import { SemanticResourceAttributes } from "@opentelemetry/semantic-conventions";

// Sampler expects a sample rate of between 0 and 1 inclusive
// A rate of 0.75 means approximately 75% of traces are sent
const aiSampler = new ApplicationInsightsSampler(0.75);
const provider = new NodeTracerProvider({
  resource: new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: "basic-service",
  }),
  sampler: aiSampler
});

Finally, in the event of connection failures to AMAI, the direct exporters write their payloads to local storage and attempt redelivery at regular intervals over a 48-hour period. These settings can be configured when instantiating an exporter, as shown below:

const exporter = new AzureMonitorTraceExporter({
    connectionString:
        process.env["APPLICATIONINSIGHTS_CONNECTION_STRING"],
    storageDirectory: "C:\\SomeDirectory",     // desired location for cached telemetry
    disableOfflineStorage: false               // offline storage is enabled by default; set to true to disable it
});

