Log Analytics Workspace
Exporting Logs from Azure Log Analytics to KubeSense
KubeSense supports ingesting logs from an Azure Log Analytics Workspace by exporting them to Event Hub or Blob Storage, which the KubeSense aggregator then consumes. This integration lets you bring the data already flowing into Log Analytics into KubeSense and analyze it alongside your other logs.
Prerequisites
Before you begin, ensure you have:
- Azure Log Analytics Workspace with logs
- Event Hub or Blob Storage for log export
- KubeSense aggregator deployed and accessible
- Appropriate Azure permissions to configure data export and access storage
Architecture
Log Analytics logs can be exported via two methods:
- Data Export Rule → Event Hub → KubeSense Aggregator (Real-time)
- Data Export Rule → Blob Storage → KubeSense Aggregator (Batch)
Method 1: Event Hub Export (Real-time)
Step 1: Create Event Hub with Kafka Protocol
Create an Event Hub with Kafka protocol enabled to receive exported logs:
```bash
# Create Event Hub namespace with Kafka enabled
az eventhubs namespace create \
  --resource-group kubesense-rg \
  --name kubesense-la-export \
  --location eastus \
  --sku Standard \
  --enable-kafka true

# Create Event Hub
az eventhubs eventhub create \
  --resource-group kubesense-rg \
  --namespace-name kubesense-la-export \
  --name log-analytics-export \
  --message-retention 7 \
  --partition-count 4
```
Step 2: Create Data Export Rule
Create a data export rule to export logs from Log Analytics to Event Hub:
```bash
# Get the Event Hub namespace resource ID
EVENT_HUB_NS_ID=$(az eventhubs namespace show \
  --resource-group kubesense-rg \
  --name kubesense-la-export \
  --query id -o tsv)

# Create the data export rule. The destination is the namespace resource ID;
# --event-hub-name routes all tables to one hub (otherwise each table is
# exported to an auto-created hub named am-<TableName>)
az monitor log-analytics workspace data-export create \
  --resource-group kubesense-rg \
  --workspace-name my-log-analytics \
  --name kubesense-export \
  --destination "$EVENT_HUB_NS_ID" \
  --event-hub-name log-analytics-export \
  --enable true \
  --tables "Syslog" "WindowsEvent" "ContainerLog"
```
Step 3: Configure KubeSense Aggregator
Configure the aggregator to consume from Event Hub using Kafka source:
```yaml
aggregator:
  customSources:
    enabled: true
    sources:
      log_analytics_export:
        type: kafka
        bootstrap_servers: "<NAMESPACE>.servicebus.windows.net:9093"
        topics:
          - log-analytics-export
        group_id: la-export-consumer
        auth:
          sasl:
            mechanism: PLAIN
            username: "$ConnectionString"
            password: "<EVENT_HUB_CONNECTION_STRING>"
        tls:
          enabled: true
          verify_certificate: true
          verify_hostname: true
```
Note: Ensure the Event Hub namespace has Kafka protocol enabled (--enable-kafka true) when creating it.
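Each Kafka message carries one or more exported table rows as JSON. Azure Monitor exports commonly wrap rows in a `{"records": [...]}` envelope, but the exact shape should be verified against your actual export; the sketch below unwraps defensively (the envelope format and the Syslog field names are assumptions, not confirmed KubeSense behavior):

```python
import json

def unwrap_export_message(body: bytes) -> list[dict]:
    """Unwrap an exported Event Hub message into individual log records.

    Handles a {"records": [...]} envelope, a bare JSON list, or a
    single JSON object.
    """
    payload = json.loads(body)
    if isinstance(payload, dict) and "records" in payload:
        return payload["records"]
    if isinstance(payload, list):
        return payload
    return [payload]

# Example envelope resembling an exported Syslog row (fields illustrative)
message = json.dumps({
    "records": [
        {"TimeGenerated": "2024-05-01T12:00:00Z",
         "Computer": "node-1",
         "SyslogMessage": "disk pressure detected"}
    ]
}).encode()

for record in unwrap_export_message(message):
    print(record["Computer"], record["SyslogMessage"])
```

If the aggregator's kafka source decodes JSON natively, this unwrapping may already happen for you; the sketch is mainly useful for debugging what actually lands on the hub.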
Method 2: Blob Storage Export (Batch)
Step 1: Create Storage Account and Container
```bash
# Create storage account
az storage account create \
  --resource-group kubesense-rg \
  --name kubesenselaexport \
  --location eastus \
  --sku Standard_LRS

# Create container
az storage container create \
  --account-name kubesenselaexport \
  --name log-analytics-export \
  --auth-mode login
```
Step 2: Create Data Export Rule
```bash
# Get storage account resource ID
STORAGE_ACCOUNT_ID=$(az storage account show \
  --resource-group kubesense-rg \
  --name kubesenselaexport \
  --query id -o tsv)

# Create data export rule (the destination is the storage account resource ID)
az monitor log-analytics workspace data-export create \
  --resource-group kubesense-rg \
  --workspace-name my-log-analytics \
  --name kubesense-blob-export \
  --destination "$STORAGE_ACCOUNT_ID" \
  --enable true \
  --tables "Syslog" "WindowsEvent" "ContainerLog"
```
Step 3: Configure KubeSense Aggregator for Blob Storage
For Blob Storage exports, use the Event Grid → Event Hub → Kafka approach. See Blob Storage Logs for detailed configuration.
Alternatively, you can use Event Grid → HTTP endpoint:
```yaml
aggregator:
  customSources:
    enabled: true
    sources:
      log_analytics_blob_export:
        type: http_server
        address: 0.0.0.0:30052
        decoding:
          codec: json
        framing:
          method: newline_delimited
```
Supported Log Tables
Common Log Analytics tables that can be exported:
- Syslog - Linux system logs
- WindowsEvent - Windows event logs
- ContainerLog - Container logs
- KubePodInventory - Kubernetes pod inventory
- KubeNodeInventory - Kubernetes node inventory
- KubeEvents - Kubernetes events
- Perf - Performance counters
- Heartbeat - Agent heartbeat
- Custom tables - Your custom log tables
Query-Based Export
You can also export query results instead of entire tables:
Using Log Analytics Query API
Create a scheduled query that exports results:
```bash
# Resolve the workspace customer ID (a GUID); the query command expects
# this, not the workspace name
LA_WORKSPACE_GUID=$(az monitor log-analytics workspace show \
  --resource-group kubesense-rg \
  --workspace-name my-log-analytics \
  --query customerId -o tsv)

# Run the query (note: the Syslog message column is SyslogMessage)
az monitor log-analytics query \
  --workspace "$LA_WORKSPACE_GUID" \
  --analytics-query "Syslog | where TimeGenerated > ago(1h) | project TimeGenerated, Computer, SyslogMessage" \
  --output table
```
Then configure a Logic App or Function App to run the query periodically and forward the results to KubeSense.
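The forwarding step can be sketched in Python. The Logs query API returns tables (columns plus rows), which an aggregator HTTP source expecting newline-delimited JSON needs flattened into one record per line. The conversion below is the core; fetching real rows (via `azure.monitor.query.LogsQueryClient.query_workspace`) and the aggregator URL are left as comments, and the endpoint shown is hypothetical:

```python
import json
from datetime import datetime, timezone

def rows_to_ndjson(columns: list[str], rows: list[list]) -> str:
    """Convert a Log Analytics query result (columns + rows) to
    newline-delimited JSON, one record per line."""
    lines = []
    for row in rows:
        record = dict(zip(columns, row))
        # Serialize datetimes that the query client returns as objects
        for key, value in record.items():
            if isinstance(value, datetime):
                record[key] = value.isoformat()
        lines.append(json.dumps(record))
    return "\n".join(lines)

columns = ["TimeGenerated", "Computer", "SyslogMessage"]
rows = [
    [datetime(2024, 5, 1, 12, 0, tzinfo=timezone.utc), "node-1", "service restarted"],
    [datetime(2024, 5, 1, 12, 5, tzinfo=timezone.utc), "node-2", "oom killed"],
]
payload = rows_to_ndjson(columns, rows)

# In a Function App, fetch real rows with LogsQueryClient.query_workspace(...)
# and POST the payload to the aggregator's http_server source, e.g.:
#   requests.post("http://<AGGREGATOR>:30052", data=payload)  # hypothetical URL
print(payload)
```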
Configuration Examples
Export Multiple Tables
```bash
# EVENT_HUB_NS_ID is the Event Hub namespace resource ID
EVENT_HUB_NS_ID=$(az eventhubs namespace show \
  --resource-group kubesense-rg \
  --name kubesense-la-export \
  --query id -o tsv)

az monitor log-analytics workspace data-export create \
  --resource-group kubesense-rg \
  --workspace-name my-log-analytics \
  --name kubesense-full-export \
  --destination "$EVENT_HUB_NS_ID" \
  --event-hub-name log-analytics-export \
  --enable true \
  --tables "Syslog" "WindowsEvent" "ContainerLog" "KubePodInventory" "KubeEvents" "Perf"
```
Filtered Export
Data export rules operate on whole tables; Log Analytics does not support applying a KQL filter at export time. To reduce exported volume:
- Export only the tables you need
- Drop unwanted records in the KubeSense aggregator after ingestion
- Use the query-based export pattern above (scheduled query plus Logic App or Function App), which can apply any KQL filter, for example Syslog | where SeverityLevel == "err"
Monitoring and Verification
- Check data export status: Verify data export rules are enabled and active
- Monitor Event Hub/Blob Storage: Verify logs are being exported
- Check aggregator connection: Verify aggregator is consuming logs
- Check KubeSense dashboard: Verify logs appear with Log Analytics metadata
- Review aggregator logs: Check for any ingestion errors
Troubleshooting
Logs Not Appearing
- Verify data export rule: Check that data export is enabled and active
- Check table names: Ensure table names are correct and exist
- Verify destination: Check Event Hub or Blob Storage is receiving data
- Check permissions: Verify data export has permissions to write to destination
- Review aggregator logs: Check for connection or parsing errors
Export Delays
- Check export latency: Log Analytics data export is near-real-time but not instant; expect minutes of delay between ingestion and export
- Monitor export metrics: Check data export metrics in Azure Monitor
- Verify table volume: High-volume tables may take longer to export
Missing Tables
- Verify table names: Ensure table names match exactly (case-sensitive)
- Check table existence: Verify tables have data
- Review export configuration: Check data export rule configuration
Best Practices
- Select specific tables: Export only needed tables to reduce volume and costs
- Use Event Hub for real-time: Stream logs continuously with low latency via the Kafka endpoint
- Use Blob Storage for batch: Better suited to batch processing and historical backfills
- Monitor costs: Track data export and storage costs
- Set up alerts: Configure alerts for data export failures
- Organize by export rule: Create separate export rules for different log types
- Use queries for filtering: Filter logs at export time to reduce volume
Cost Considerations
- Data export: Charged per GB exported from Log Analytics
- Event Hub: Charged per throughput unit plus per million ingress events
- Blob Storage: Charged per GB stored
- Query execution: Queries used for export may incur costs
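A back-of-envelope estimate helps compare the options before enabling export. The rates below are placeholders, not Azure's actual prices; substitute the figures from the Azure pricing page for your region:

```python
def monthly_export_cost(gb_per_day: float,
                        export_rate_per_gb: float,
                        destination_rate_per_gb: float) -> float:
    """Rough monthly cost: the Log Analytics export charge plus the
    destination's per-GB charge (rates are caller-supplied placeholders)."""
    monthly_gb = gb_per_day * 30
    return monthly_gb * (export_rate_per_gb + destination_rate_per_gb)

# Hypothetical rates (NOT real Azure prices): $0.10/GB export, $0.02/GB storage
cost = monthly_export_cost(gb_per_day=50, export_rate_per_gb=0.10,
                           destination_rate_per_gb=0.02)
print(f"${cost:.2f}/month")  # 50 GB/day * 30 days * $0.12/GB = $180.00/month
```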
Conclusion
Log Analytics integration enables you to leverage existing log queries and export results to KubeSense, providing a unified observability platform that combines Azure Log Analytics insights with KubeSense's advanced analytics capabilities.