Azure Blob Storage Integration
Overview
KubeSense supports Azure Blob Storage as a cold storage tier for long-term retention of historical observability data. This lets you move data to Azure after a specified duration while keeping your active storage optimized.
info: Azure Blob Storage integration provides cost-effective, scalable cold storage for traces and metrics in Azure environments.
Prerequisites
Before setting up Azure Blob Storage integration, ensure you have:
- KubeSense deployed in an Azure or Kubernetes environment
- An Azure Storage Account created
- Access to Azure storage account name and keys
- Access to modify the KubeSense Helm values
Step 1: Create Azure Storage Account
- Go to Azure Portal → Storage accounts → Create
- Enter a storage account name (e.g., `kubesensestorage`)
- Select your subscription and resource group
- Choose performance tier (Standard recommended for cold storage)
- Select replication type
- Create the storage account
note: Note down the storage account name and resource group for later use.
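The portal steps above can also be scripted with the Azure CLI. A sketch, assuming the placeholder account name `kubesensestorage` and a hypothetical resource group `kubesense-rg` in `eastus`; substitute your own names, region, and replication SKU:

```shell
# Create a resource group and storage account via Azure CLI
# (all names and the region are placeholders; adjust to your environment).
az group create --name kubesense-rg --location eastus

az storage account create \
  --name kubesensestorage \
  --resource-group kubesense-rg \
  --sku Standard_LRS \
  --kind StorageV2
```

`Standard_LRS` matches the Standard performance tier recommended above; pick a geo-redundant SKU such as `Standard_GRS` if you need cross-region durability.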
Step 2: Create Blob Container
- Navigate to your storage account in Azure Portal
- Go to Containers under Data storage
- Click + Container
- Enter a container name (e.g., `kubesense`)
- Set public access level to Private
- Click Create
warning: Keep the container name for use in the configuration. Using a private access level ensures your data is secure.
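The equivalent Azure CLI step, again with placeholder names (`--public-access off` corresponds to the Private access level recommended above):

```shell
# Create a private blob container via Azure CLI
# (account and container names are placeholders).
az storage container create \
  --name kubesense \
  --account-name kubesensestorage \
  --public-access off \
  --auth-mode login
```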
Step 3: Get Storage Account Keys
- In your storage account, go to Access keys under Security + networking
- Click Show next to `key1` or `key2` to reveal the key
- Copy the storage account name and one of the access keys
warning: Store these credentials securely. Never commit access keys to version control. Consider using Azure Key Vault for production environments.
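If you prefer the CLI, the keys can be retrieved without the portal. A sketch using the placeholder names from the earlier steps (requires an authenticated Azure CLI session with permissions on the account):

```shell
# Print the first access key (key1) for the storage account.
# Account and resource group names are placeholders.
az storage account keys list \
  --account-name kubesensestorage \
  --resource-group kubesense-rg \
  --query "[0].value" \
  --output tsv
```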
Step 4: Update KubeSense Configuration
Add the cold storage configuration to your KubeSense Helm values:
```yaml
coldStorageConfig:
  enabled: true
  endpoint: "https://kubesensestorage.blob.core.windows.net"
  # Required for Azure
  account_name: "kubesensestorage"
  account_key: "YOUR_STORAGE_ACCOUNT_KEY"
  container_name: "kubesense"
  cloudProvider: "azure"
```

Replace:
- `kubesensestorage` with your actual storage account name
- `YOUR_STORAGE_ACCOUNT_KEY` with your actual storage account key
- `kubesense` with your container name
- Update the endpoint URL with your storage account name if needed
note: The endpoint format is typically: https://<account_name>.blob.core.windows.net
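To double-check the endpoint value before putting it in your values file, you can derive it from the account name. This assumes the public Azure cloud; sovereign clouds use different domains:

```shell
# Build the Blob endpoint from the storage account name (public Azure cloud assumed).
ACCOUNT_NAME="kubesensestorage"
ENDPOINT="https://${ACCOUNT_NAME}.blob.core.windows.net"
echo "$ENDPOINT"
# → https://kubesensestorage.blob.core.windows.net
```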
Step 5: Upgrade KubeSense
Apply the configuration by upgrading your KubeSense Helm deployment:
```shell
helm upgrade kubesense ./kubesense-chart \
  -f values.yaml \
  --namespace kubesense
```

Configuration Parameters
Here's a detailed breakdown of the Azure cold storage configuration:
| Parameter | Type | Required | Description |
|---|---|---|---|
| `enabled` | boolean | Yes | Enable or disable cold storage |
| `endpoint` | string | Yes | Azure Blob Storage endpoint URL |
| `account_name` | string | Yes | Azure storage account name |
| `account_key` | string | Yes | Azure storage account access key |
| `container_name` | string | Yes | Blob container name |
| `cloudProvider` | string | Yes | Must be set to "azure" |
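As a quick sanity check before upgrading, a small script can flag any required parameter missing from `values.yaml`. This is a sketch that does plain string matching, not real YAML parsing, so it will not catch nesting or indentation mistakes:

```shell
# Flag required cold storage parameters missing from values.yaml.
# Simple grep, not a YAML parser: it only checks that each key appears somewhere.
for key in enabled endpoint account_name account_key container_name cloudProvider; do
  grep -q "${key}:" values.yaml || echo "missing: ${key}"
done
```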
Verifying Integration
After configuring Azure Blob Storage integration:
1. Check KubeSense logs to ensure the Azure connection is successful:

   ```shell
   kubectl logs -n kubesense sts/kubesense-datastore-shard-0 -f
   ```

2. Verify data in Azure Portal:
   - Navigate to your storage account in Azure Portal
   - Check the container for backup files
   - Monitor data being written over time
3. Test cold storage functionality:
   - Query historical data that should be in cold storage
   - Verify data is accessible through KubeSense UI
   - Check data retrieval performance
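To narrow the kubectl log output to cold-storage-related lines, a filter such as the following can help. The match patterns are guesses, not KubeSense's actual log wording; adjust them to what your version emits:

```shell
# Filter recent datastore logs for lines that look cold-storage related
# (the grep patterns are hypothetical).
kubectl logs -n kubesense sts/kubesense-datastore-shard-0 --tail=500 \
  | grep -iE "azure|cold|blob" || echo "no matching log lines found"
```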
Best Practices
Security
- Use Azure Key Vault to manage storage account keys in production
- Enable Azure Storage encryption at rest
- Use a Managed Identity when running on AKS
- Set up network rules to restrict access to the storage account
- Consider geo-redundant replication for disaster recovery requirements
Performance
- Choose the appropriate access tier: use Hot for frequently queried data, and Cool or Archive for long-term storage
- Keep the storage account in the same region as your cluster for lower latency
Troubleshooting
Issue: Authentication failures
Symptoms: 403 Forbidden or 401 Unauthorized errors
Solution:
- Verify the storage account name is correct
- Check that the account key matches what's configured
- Ensure the storage account exists and is accessible
Issue: Container not found errors
Symptoms: Container does not exist errors
Solution:
- Verify the container name matches the configuration
- Ensure the container was created successfully in Azure Portal
- Check that the storage account name is correct
Issue: Network connectivity issues
Symptoms: Cannot connect to Azure Blob Storage endpoint
Solution:
- Verify network rules allow access from your cluster
- Check firewall rules if using Azure Private Endpoints
- Ensure the endpoint URL is correctly formatted
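To rule out DNS and firewall problems, probe the endpoint from inside the cluster. A sketch using a throwaway curl pod (the pod name and image are illustrative, and the account name is a placeholder); getting any HTTP status code back means DNS, TCP, and TLS all work from the cluster:

```shell
# Probe the Blob endpoint from inside the cluster.
# Any HTTP status code in the output means connectivity is fine;
# a timeout or resolution failure points at network rules or DNS.
kubectl run azure-net-test -n kubesense --rm -i --restart=Never \
  --image=curlimages/curl -- \
  curl -s -o /dev/null -w "%{http_code}\n" \
  https://kubesensestorage.blob.core.windows.net
```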
Issue: Cold storage not working
Symptoms: Data not being written to Azure
Solution:
- Check Datastore logs for errors
- Verify the configuration parameters are correct
- Ensure sufficient permissions on the storage account
- Check if the container has proper access settings
Advanced Configuration
Using Private Endpoints
For enhanced security, you can use Azure Private Endpoints:
- Create a Private Endpoint for your storage account
- Configure DNS resolution for the private endpoint
- Update the endpoint URL in your configuration if needed
- Ensure network policies allow traffic through the private endpoint
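After setting up the private endpoint, it is worth confirming that the storage hostname resolves to the private IP from inside the cluster (the account name is a placeholder):

```shell
# Resolve the Blob hostname from a pod; with a working private endpoint this
# should return a private IP (e.g. 10.x.x.x), not a public Azure IP.
kubectl run dns-test -n kubesense --rm -i --restart=Never \
  --image=busybox -- \
  nslookup kubesensestorage.blob.core.windows.net
```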
Configuring Lifecycle Management
Set up Azure Blob lifecycle policies to automatically move data to Cool or Archive tiers:
- Go to your storage account → Lifecycle management
- Add a policy rule
- Configure tier transitions based on access patterns
- Apply the policy to optimize costs
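The lifecycle policy can also be applied from the CLI. A sketch that tiers block blobs to Cool 30 days after their last modification; the rule name, day count, and resource names are placeholders, and the policy schema should be checked against the current Azure documentation:

```shell
# Write a lifecycle policy that tiers block blobs to Cool after 30 days
# without modification, then apply it (names are placeholders).
cat > policy.json <<'EOF'
{
  "rules": [
    {
      "enabled": true,
      "name": "move-to-cool-after-30d",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": { "tierToCool": { "daysAfterModificationGreaterThan": 30 } }
        },
        "filters": { "blobTypes": [ "blockBlob" ] }
      }
    }
  ]
}
EOF

az storage account management-policy create \
  --account-name kubesensestorage \
  --resource-group kubesense-rg \
  --policy @policy.json
```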
Conclusion
Azure Blob Storage integration provides a reliable, cost-effective solution for storing historical observability data in Azure environments. By properly configuring authentication, monitoring, and lifecycle policies, you can optimize both performance and costs for long-term data storage.