The Azure Monitor Ingestion client library is used to send custom logs to Azure Monitor using the Logs Ingestion API.
This library allows you to send data from virtually any source to supported built-in tables or to custom tables that you create in your Log Analytics workspace. You can even extend the schema of built-in tables with custom columns.
Getting started
Prerequisites
Install the package
Install the Azure Monitor Ingestion client library for JS with npm:
npm install @azure/monitor-ingestion
Authenticate the client
An authenticated client is required to ingest data. To authenticate, create an instance of a TokenCredential class (see @azure/identity for DefaultAzureCredential and other TokenCredential implementations) and pass it to the constructor of your client class.
To authenticate, the following example uses DefaultAzureCredential
from the @azure/identity package:
import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient } from "@azure/monitor-ingestion";
const logsIngestionEndpoint = "https://<my-endpoint>.azure.com";
const credential = new DefaultAzureCredential();
const logsIngestionClient = new LogsIngestionClient(logsIngestionEndpoint, credential);
Configure the client for an Azure sovereign cloud
By default, the client is configured to use the Azure public cloud. To use a sovereign cloud instead, provide the correct endpoint and audience value when instantiating the client. For example:
import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient } from "@azure/monitor-ingestion";
const logsIngestionEndpoint = "https://<my-endpoint>.azure.cn";
const credential = new DefaultAzureCredential();
const logsIngestionClient = new LogsIngestionClient(logsIngestionEndpoint, credential, {
audience: "https://api.loganalytics.azure.cn/.default",
});
Key concepts
Data collection endpoint
Data collection endpoints (DCEs) allow you to uniquely configure ingestion settings for Azure Monitor. This article provides an overview of data collection endpoints, including their contents and structure, and how you can create and work with them.
Data collection rule
Data collection rules (DCRs) define the data collected by Azure Monitor and specify how and where that data should be sent or stored. A REST API call must specify a DCR to use. A single DCE can support multiple DCRs, so you can specify a different DCR for different sources and target tables.
The DCR must understand the structure of the input data and the structure of the target table. If the two don't match, it can use a transformation to convert the source data to match the target table. You can also use the transformation to filter source data and perform any other calculations or conversions.
For more details, see Data collection rules in Azure Monitor. For information on how to retrieve a DCR ID, see this tutorial.
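Transformations themselves are defined in the DCR and run server-side, but the shape requirement is easy to illustrate client-side: the objects you upload must carry the column names the stream declares. The following is a minimal sketch, assuming a hypothetical source-event shape and the Time/Computer/AdditionalContext columns used in the samples below; if you cannot change the DCR, renaming fields in code before upload achieves the same match.

```typescript
// Hypothetical source record shape; field names are illustrative only.
interface SourceEvent {
  timestamp: string;
  host: string;
  details: string;
}

// Shape expected by the (hypothetical) custom table / DCR stream.
interface CustomTableRow {
  Time: string;
  Computer: string;
  AdditionalContext: string;
}

// Rename fields so the uploaded objects match the stream declaration.
function toCustomTableRow(event: SourceEvent): CustomTableRow {
  return {
    Time: event.timestamp,
    Computer: event.host,
    AdditionalContext: event.details,
  };
}
```

An array mapped through toCustomTableRow can then be passed directly to the upload call shown later in this document.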
Log Analytics workspace tables
Custom logs can send data to any custom table that you create and to certain built-in tables in your Log Analytics workspace. The target table must exist before you can send data to it. The following built-in tables are currently supported:
Examples
You can familiarize yourself with different APIs using Samples.
Upload custom logs
You can create a client and call the client's Upload method. Take note of the data ingestion limits.
import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient, isAggregateLogsUploadError } from "@azure/monitor-ingestion";
const logsIngestionEndpoint = "https://<my-endpoint>.azure.com";
const ruleId = "data_collection_rule_id";
const streamName = "data_stream_name";
const credential = new DefaultAzureCredential();
const logsIngestionClient = new LogsIngestionClient(logsIngestionEndpoint, credential);
const logs = [
{
Time: "2021-12-08T23:51:14.1104269Z",
Computer: "Computer1",
AdditionalContext: "context-2",
},
{
Time: "2021-12-08T23:51:14.1104269Z",
Computer: "Computer2",
AdditionalContext: "context",
},
];
try {
await logsIngestionClient.upload(ruleId, streamName, logs);
} catch (e) {
const aggregateErrors = isAggregateLogsUploadError(e) ? e.errors : [];
if (aggregateErrors.length > 0) {
console.log("Some logs have failed to complete ingestion");
for (const error of aggregateErrors) {
console.log(`Error - ${JSON.stringify(error.cause)}`);
console.log(`Log - ${JSON.stringify(error.failedLogs)}`);
}
} else {
console.log(`An error occurred: ${e}`);
}
}
Verify logs
You can verify that your data has been uploaded correctly by using the @azure/monitor-query library. Run the Upload custom logs sample first before verifying the logs.
import { DefaultAzureCredential } from "@azure/identity";
import { LogsQueryClient } from "@azure/monitor-query";
const monitorWorkspaceId = "workspace_id";
const tableName = "table_name";
const credential = new DefaultAzureCredential();
const logsQueryClient = new LogsQueryClient(credential);
const queriesBatch = [
{
workspaceId: monitorWorkspaceId,
query: tableName + " | count;",
timespan: { duration: "P1D" },
},
];
const result = await logsQueryClient.queryBatch(queriesBatch);
if (result[0].status === "Success") {
console.log("Table entry count: ", JSON.stringify(result[0].tables));
} else {
console.log(
`Some error encountered while retrieving the count. Status = ${result[0].status}`,
JSON.stringify(result[0]),
);
}
Upload logs in large batches
When you upload more than 1MB of logs in a single call to the upload method on LogsIngestionClient, the upload is split into several smaller batches, each no larger than 1MB. By default, these batches are uploaded in parallel, with a maximum of 5 batches uploaded concurrently. It may be desirable to decrease the maximum concurrency if memory usage is a concern. The maximum number of concurrent uploads can be controlled using the maxConcurrency option, as this example shows:
import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient, isAggregateLogsUploadError } from "@azure/monitor-ingestion";
const logsIngestionEndpoint = "https://<my-endpoint>.azure.com";
const ruleId = "data_collection_rule_id";
const streamName = "data_stream_name";
const credential = new DefaultAzureCredential();
const client = new LogsIngestionClient(logsIngestionEndpoint, credential);
// Constructing a large number of logs to ensure batching takes place
const logs = [];
for (let i = 0; i < 100000; ++i) {
logs.push({
Time: "2021-12-08T23:51:14.1104269Z",
Computer: "Computer1",
AdditionalContext: `context-${i}`,
});
}
try {
// Set the maximum concurrency to 1 to prevent concurrent requests entirely
await client.upload(ruleId, streamName, logs, { maxConcurrency: 1 });
} catch (e) {
let aggregateErrors = isAggregateLogsUploadError(e) ? e.errors : [];
if (aggregateErrors.length > 0) {
console.log("Some logs have failed to complete ingestion");
for (const error of aggregateErrors) {
console.log(`Error - ${JSON.stringify(error.cause)}`);
console.log(`Log - ${JSON.stringify(error.failedLogs)}`);
}
} else {
console.log(e);
}
}
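The 1MB batching described above is handled internally by the client. The following is an illustrative sketch of such splitting logic, not the library's actual implementation; it assumes batch size is measured as the JSON-serialized length of each log entry.

```typescript
// Illustrative only: split logs into batches whose serialized size stays
// under a byte limit. The real client applies its own sizing rules.
function splitIntoBatches<T>(logs: T[], maxBytes: number): T[][] {
  const batches: T[][] = [];
  let current: T[] = [];
  let currentBytes = 0;
  for (const log of logs) {
    const size = JSON.stringify(log).length;
    // Close the current batch if adding this log would exceed the limit.
    if (current.length > 0 && currentBytes + size > maxBytes) {
      batches.push(current);
      current = [];
      currentBytes = 0;
    }
    current.push(log);
    currentBytes += size;
  }
  if (current.length > 0) {
    batches.push(current);
  }
  return batches;
}
```

Each resulting batch could then be uploaded independently, which is why maxConcurrency matters: the batches are independent requests that can run in parallel.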
Retrieve logs
Logs that were uploaded using the Monitor Ingestion client library can be retrieved using the Monitor Query client library.
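As in the Verify logs sample above, retrieval means running a Kusto query against the workspace with LogsQueryClient.queryWorkspace from @azure/monitor-query. A minimal sketch of building such a query, assuming a hypothetical table name and the standard TimeGenerated column:

```typescript
// Build a Kusto query that counts rows ingested into a table within a
// lookback window. Table and column names here are illustrative.
function buildRecentCountQuery(tableName: string, lookback: string): string {
  return `${tableName} | where TimeGenerated > ago(${lookback}) | count`;
}
```

The resulting string can be passed as the query argument to logsQueryClient.queryWorkspace(workspaceId, query, { duration: "P1D" }).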
Troubleshooting
For details on diagnosing various failure scenarios, see our troubleshooting guide.
Next steps
To learn more about Azure Monitor, see the Azure Monitor service documentation. Please take a look at the samples directory for detailed examples on how to use this library.
Contributing
If you'd like to contribute to this library, please read the contributing guide to learn more about how to build and test the code.