This exporter sends logs to Azure Log Analytics via the Log Analytics Ingestion API. The output format depends on the log body type and the raw_log_field configuration:
- **Structured** (default, map body): Each key in the log body becomes a top-level JSON field, mapping directly to a column in your Log Analytics table. No `RawData` wrapper. Metadata fields (`TimeGenerated`, `SeverityText`, `SeverityNumber`, `TraceId`, `SpanId`) are included automatically.
- **Unstructured** (default, string body): String-body logs are wrapped in `{"RawData": "<string>", "TimeGenerated": "...", ...}` so ingestion works with tables that have a `RawData` column.
- **Raw Log Mode** (`raw_log_field` set): Extracts the specified field via an OTTL expression and sends `{"RawData": "<extracted value>"}`.
- Introduced: v1.75.0
- Supported pipelines: Logs
This exporter sends logs to Azure Log Analytics using the Log Analytics Ingestion API. Before using the exporter, you must configure a Data Collection Rule (DCR), a Data Collection Endpoint (DCE) if your DCR does not expose an ingestion endpoint, and a custom table within your Log Analytics workspace.
The required schema for the custom table depends on the log body type and the `raw_log_field` configuration option:

- **Structured JSON** (default, map body): If `raw_log_field` is not specified and the log body is a map, each key in the body is sent as a top-level JSON field. Your custom table columns should match the keys in your log data. For example, a log body `{"source_ip": "10.0.0.1", "action": "ALLOW"}` produces: `[{"source_ip": "10.0.0.1", "action": "ALLOW", "TimeGenerated": "2025-01-01T00:00:00Z", "SeverityText": "INFO", "SeverityNumber": 9}]`
- **Unstructured fallback** (default, string body): If `raw_log_field` is not specified and the log body is a plain string, the exporter wraps it in a `RawData` field: `[{"RawData": "plain text log message", "TimeGenerated": "2025-01-01T00:00:00Z", "SeverityText": "INFO", "SeverityNumber": 9}]`
- **Raw Log Mode**: If `raw_log_field` is specified, the exporter extracts the designated field via an OTTL expression and wraps it in `RawData`: `[{"RawData": "<extracted field value>"}]`

In all cases, `TimeGenerated` is included automatically (required by Azure).
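As a sketch, a Raw Log Mode configuration might look like the following. The `log.message` attribute name is hypothetical; substitute the OTTL path to whichever field you want to extract:

```yaml
exporters:
  azureloganalytics:
    endpoint: "<your-log-ingestion-endpoint>"
    client_id: "<your-client-id>"
    client_secret: "<your-client-secret>"
    tenant_id: "<your-tenant-id>"
    rule_id: "<your-dcr-id>"
    stream_name: "Custom-my_logs"
    # Hypothetical OTTL expression: extract a single attribute
    # and send only its value as the RawData field
    raw_log_field: attributes["log.message"]
```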
| Field | Type | Default | Required | Description |
|---|---|---|---|---|
| endpoint | string | | ✓ | The DCR logs ingestion endpoint URL, or a DCE logs ingestion endpoint URL if your DCR does not expose one (see Endpoint Configuration) |
| client_id | string | | ✓ | Azure client ID for authentication |
| client_secret | string | | ✓ | Azure client secret for authentication |
| tenant_id | string | | ✓ | Azure tenant ID for authentication |
| rule_id | string | | ✓ | Data Collection Rule (DCR) ID or immutableId |
| stream_name | string | | ✓ | The stream name as defined in your DCR. Must be prefixed with `Custom-` for custom tables (e.g., `Custom-MyTable_CL`) |
| raw_log_field | string | `""` | | OTTL expression for the log field to extract and send as `RawData`. When empty, structured JSON is sent for map bodies and `RawData` for string bodies |
| timeout | string | | | See doc for details |
| sending_queue | map | | | See doc for details |
| retry_on_failure | map | | | See doc for details |
```yaml
exporters:
  azureloganalytics:
    endpoint: "<your-log-ingestion-endpoint>"
    client_id: "<your-client-id>"
    client_secret: "<your-client-secret>"
    tenant_id: "<your-tenant-id>"
    raw_log_field: body
    rule_id: "<your-dcr-id>"
    stream_name: "<your-stream-name>"
```

This configuration sets `raw_log_field: body`, so each log body is extracted and sent as `RawData`.

```yaml
exporters:
  azureloganalytics:
    endpoint: "<your-log-ingestion-endpoint>"
    client_id: "<your-client-id>"
    client_secret: "<your-client-secret>"
    tenant_id: "<your-tenant-id>"
    rule_id: "<your-dcr-id>"
    stream_name: "<your-stream-name>"
```

This configuration shows the minimum required fields to export logs to Azure Log Analytics. All fields shown are required for the exporter to function properly.
```yaml
exporters:
  azureloganalytics:
    endpoint: "<your-log-ingestion-endpoint>"
    client_id: "<your-client-id>"
    client_secret: "<your-client-secret>"
    tenant_id: "<your-tenant-id>"
    rule_id: "<your-dcr-id>"
    stream_name: "<your-stream-name>"
    timeout: 30s
    sending_queue:
      queue_size: 1000
      enabled: true
    retry_on_failure:
      enabled: true
      initial_interval: 5s
      max_interval: 30s
      max_elapsed_time: 300s
```

Before configuring the exporter, you'll need to set up several components in the Azure portal:
1. Navigate to Azure Active Directory > App registrations
2. Click "New registration"
3. Give your application a name
4. Select supported account types (usually "Single tenant")
5. Click "Register"
6. After creation, note down the following:
   - Application (client) ID
   - Directory (tenant) ID
7. Under "Certificates & secrets":
   - Create a new client secret
   - Copy the secret value immediately (you won't be able to see it again)
1. Go to your Log Analytics workspace
2. Navigate to "Tables" under Settings
3. Click "New Custom Table"
4. Configure your table:
   - Give it a name (this will be the display name in the Azure portal). **Important**: The actual `stream_name` value used in the exporter configuration must be prefixed with `Custom-`. For example, if you name the table `my_logs` in the portal, the `stream_name` configuration value should be `Custom-my_logs`.
   - Select "JSON" as the data format
   - Provide an example schema based on your log format:
     - **Structured** (map body, `raw_log_field` not set): Your table columns should match the keys in your log body, plus metadata fields: `[{"source_ip": "10.0.0.1", "action": "ALLOW", "bytes": 1234, "TimeGenerated": "2025-01-01T00:00:00Z", "SeverityText": "INFO", "SeverityNumber": 9}]`
     - **Unstructured** (string body, `raw_log_field` not set): Your table needs a `RawData` column: `[{"RawData": "Sample log entry content", "TimeGenerated": "2025-01-01T00:00:00Z", "SeverityText": "INFO", "SeverityNumber": 9}]`
     - **Raw Log Mode** (`raw_log_field` set): Your table needs a `RawData` column: `[{"RawData": "Sample log entry content"}]`
5. Click "Create"
1. Navigate to Microsoft Sentinel
2. Go to Settings > Data Collection Rules
3. Click "Create"
4. Configure the DCR:
   - Select your subscription and resource group
   - Choose your Log Analytics workspace
   - Select the custom table you created
   - Set up any necessary transformations
5. After creation, note down:
   - The Rule ID (will be your `rule_id`)
   - The endpoint URL (see Endpoint Configuration below)
1. Go to your DCR
2. Navigate to "Access control (IAM)"
3. Add a role assignment:
   - Role: "Monitoring Metrics Publisher"
   - Assign access to: User, group, or service principal
   - Select your previously created Azure AD application (you may need to use the search functionality to find it)
4. Repeat the same for the Log Analytics workspace resource if needed.
Now you have all the required information to configure the exporter:
- `endpoint`: The logs ingestion endpoint URL (see below)
- `client_id`: The Application (client) ID
- `client_secret`: The secret value you created
- `tenant_id`: The Directory (tenant) ID
- `rule_id`: The DCR Rule ID
- `stream_name`: The stream name from your DCR (must be prefixed with `Custom-` for custom tables)
The `endpoint` field requires a logs ingestion endpoint URL. There are two ways to obtain this:

1. **From the DCR directly**: Open your DCR in the Azure portal, click "JSON View", and look for `properties.endpoints.logsIngestion`. If present, use this URL as the `endpoint` value. If the field is missing, try switching to a newer API version (e.g., `2023-03-11`) in the JSON view.
2. **From a Data Collection Endpoint (DCE)**: If your DCR does not expose a `logsIngestion` endpoint (common with older DCRs or certain configurations), you must create a separate Data Collection Endpoint (DCE) and use its logs ingestion endpoint URL instead. After creating the DCE, associate it with your DCR.
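Either way, the URL goes into the same `endpoint` field. The hostname below is a made-up placeholder following the typical DCE URL shape, not a value to copy:

```yaml
exporters:
  azureloganalytics:
    # Either the DCR's own logs ingestion URL (properties.endpoints.logsIngestion)
    # or a DCE logs ingestion URL; DCE hostnames typically look like
    # <dce-name>.<region>.ingest.monitor.azure.com
    endpoint: "https://my-dce-abc1.eastus-1.ingest.monitor.azure.com"
    # ...remaining required fields omitted for brevity
```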
For more information, see the Logs Ingestion API overview.
- The first export of logs may take anywhere from 5-15 minutes on a freshly created table.
- The `stream_name` must be prefixed with `Custom-` for custom log tables (e.g., `Custom-MyTable_CL`). Omitting this prefix will cause silent ingestion failures.
- Transient HTTP errors (429, 502, 503, 504) are automatically retried. Permanent errors (400, 401, 403, 500) are not retried. For more details on response codes, see the Logs Ingestion API overview.