Cloud Logging via Pub/Sub

Ingesting GCP Cloud Logging with KubeSense

KubeSense supports ingesting logs from Google Cloud Logging by setting up a Pub/Sub subscription. This integration enables you to centralize your GCP logs alongside your Kubernetes and application observability data.

Note: Push delivery requires a public ingress endpoint to be enabled for the KubeSense aggregator. The pull-subscription alternative described later does not.

Prerequisites

Before you begin, ensure you have:

  1. GCP project with Cloud Logging enabled
  2. Pub/Sub API enabled in your GCP project
  3. KubeSense aggregator deployed with public ingress enabled
  4. The KubeSense aggregator Pub/Sub endpoint URL and API token
  5. Appropriate GCP IAM permissions to create Pub/Sub topics, subscriptions, and log sinks

Architecture

Cloud Logging logs flow through the following path:

GCP Services → Cloud Logging → Log Sink → Pub/Sub Topic → Pub/Sub Subscription → KubeSense Aggregator

Step 1: Create Pub/Sub Topic

Create a Pub/Sub topic to receive logs from Cloud Logging:

Using gcloud CLI

gcloud pubsub topics create kubesense-cloud-logs \
  --project=YOUR_PROJECT_ID

Using GCP Console

  1. Go to Pub/Sub Topics
  2. Click Create Topic
  3. Enter topic name: kubesense-cloud-logs
  4. Click Create

Step 2: Create Log Sink

Create a log sink to export Cloud Logging logs to Pub/Sub:

Using gcloud CLI

gcloud logging sinks create kubesense-sink \
  pubsub.googleapis.com/projects/YOUR_PROJECT_ID/topics/kubesense-cloud-logs \
  --log-filter='resource.type="gce_instance" OR resource.type="gke_cluster" OR resource.type="cloud_function"' \
  --project=YOUR_PROJECT_ID

Using GCP Console

  1. Go to Cloud Logging
  2. Click Logs Router in the left menu
  3. Click Create Sink
  4. Configure:
    • Sink name: kubesense-sink
    • Sink destination: Select Cloud Pub/Sub topic
    • Select Cloud Pub/Sub topic: Choose kubesense-cloud-logs
    • Choose logs to include in sink: Configure filters as needed
  5. Click Create Sink

Log Filter Examples

Filter logs by resource type:

# GKE cluster logs
resource.type="gke_cluster"

# GCE instance logs
resource.type="gce_instance"

# Cloud Functions logs
resource.type="cloud_function"

# All compute resources
resource.type=~"gce_|gke_"

# Specific log severity
severity>=ERROR
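Before wiring a filter into a sink, you can preview which entries it matches with `gcloud logging read`; the filter below is just an example:

```shell
# Preview the most recent entries a candidate filter would export
gcloud logging read 'severity>=ERROR' \
  --limit=5 \
  --project=YOUR_PROJECT_ID
```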

Step 3: Grant Pub/Sub Permissions

Creating the log sink automatically provisions a writer identity (a Google-managed service account). Grant it permission to publish to the Pub/Sub topic:

# Get the sink's writer identity (typically service-PROJECT_NUMBER@gcp-sa-logging.iam.gserviceaccount.com)
gcloud logging sinks describe kubesense-sink \
  --project=YOUR_PROJECT_ID \
  --format="value(writerIdentity)"

# Grant Pub/Sub Publisher role
gcloud pubsub topics add-iam-policy-binding kubesense-cloud-logs \
  --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
  --role="roles/pubsub.publisher" \
  --project=YOUR_PROJECT_ID
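The two commands above can be combined so the writer identity is captured and bound directly. Note that `writerIdentity` is returned already carrying the `serviceAccount:` prefix, so it can be passed to `--member` as-is:

```shell
# Capture the sink's writer identity (already prefixed with "serviceAccount:")
WRITER=$(gcloud logging sinks describe kubesense-sink \
  --project=YOUR_PROJECT_ID \
  --format="value(writerIdentity)")

# Grant it the Pub/Sub Publisher role on the topic
gcloud pubsub topics add-iam-policy-binding kubesense-cloud-logs \
  --member="$WRITER" \
  --role="roles/pubsub.publisher" \
  --project=YOUR_PROJECT_ID
```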

Step 4: Create Pub/Sub Subscription

Create a subscription that pushes messages to the KubeSense aggregator:

Using gcloud CLI

gcloud pubsub subscriptions create kubesense-subscription \
  --topic=kubesense-cloud-logs \
  --push-endpoint=https://<KUBESENSE_AGGREGATOR_HOST>:<PORT>/pubsub \
  --push-auth-service-account=SERVICE_ACCOUNT_EMAIL \
  --project=YOUR_PROJECT_ID

Note: Pub/Sub authenticates push deliveries with OIDC tokens minted for the service account given in --push-auth-service-account; gcloud has no flag for attaching a static bearer token. If the aggregator expects the KubeSense API token instead, supply it the way your aggregator configuration specifies (for example, embedded in the push endpoint URL).

Using GCP Console

  1. Go to Pub/Sub Subscriptions
  2. Click Create Subscription
  3. Configure:
    • Subscription ID: kubesense-subscription
    • Topic: Select kubesense-cloud-logs
    • Delivery type: Select Push
    • Endpoint URL: https://<KUBESENSE_AGGREGATOR_HOST>:<PORT>/pubsub
    • Authentication: Check Enable authentication and select a service account (Pub/Sub signs each push request with an OIDC token for that account; the console does not support custom Authorization headers)
  4. Click Create
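Pub/Sub push delivery wraps each exported log entry in a JSON envelope, with the entry itself base64-encoded in `message.data`. A minimal sketch of unwrapping such an envelope, with made-up sample IDs and payload:

```shell
# Sample push envelope as delivered to the aggregator's /pubsub endpoint
# (messageId/subscription/payload values are illustrative only)
cat > /tmp/envelope.json <<'EOF'
{"message": {"data": "eyJ0ZXh0UGF5bG9hZCI6ICJoZWxsbyJ9", "messageId": "123", "publishTime": "2024-01-01T00:00:00Z"}, "subscription": "projects/p/subscriptions/kubesense-subscription"}
EOF

# The log entry is base64-encoded JSON inside message.data
python3 -c 'import json, base64; env = json.load(open("/tmp/envelope.json")); print(base64.b64decode(env["message"]["data"]).decode())'
# → {"textPayload": "hello"}
```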

Step 5: Configure KubeSense Aggregator

Configure the aggregator to accept logs from Pub/Sub:

aggregator:
  customSources:
    enabled: true
    sources:
      pubsub_logs:
        type: gcp_pubsub
        project: YOUR_PROJECT_ID
        subscription: projects/YOUR_PROJECT_ID/subscriptions/kubesense-subscription
        credentials_path: /etc/kubesense/gcs-key.json

Update Helm Values

If deploying via Helm:

global:
  cluster_name: "gcp-cluster"

aggregator:
  customSources:
    enabled: true
    sources:
      pubsub_logs:
        type: gcp_pubsub
        project: YOUR_PROJECT_ID
        subscription: projects/YOUR_PROJECT_ID/subscriptions/kubesense-subscription
        credentials_path: /etc/kubesense/gcs-key.json
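These values can then be rolled out with a standard Helm upgrade; the release name, chart reference, and namespace below are placeholders — substitute your own:

```shell
# Apply the updated values to an existing KubeSense release
# (release/chart/namespace names are assumptions; adjust to your install)
helm upgrade kubesense kubesense/kubesense \
  --namespace kubesense \
  --values values.yaml
```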

Alternative: Pull Subscription

Instead of a push subscription, you can use a pull subscription, in which the aggregator polls Pub/Sub for messages:

Configure Pull Subscription

gcloud pubsub subscriptions create kubesense-pull-subscription \
  --topic=kubesense-cloud-logs \
  --project=YOUR_PROJECT_ID
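With the pull subscription in place, you can publish a test message and pull it back to confirm the topic-to-subscription path before pointing the aggregator at it:

```shell
# Publish a throwaway message to the topic
gcloud pubsub topics publish kubesense-cloud-logs \
  --message='{"textPayload":"kubesense-test"}' \
  --project=YOUR_PROJECT_ID

# Pull (and acknowledge) it from the subscription
gcloud pubsub subscriptions pull kubesense-pull-subscription \
  --auto-ack \
  --limit=1 \
  --project=YOUR_PROJECT_ID
```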

Configure Aggregator for Pull

aggregator:
  customSources:
    enabled: true
    sources:
      pubsub_pull_logs:
        type: gcp_pubsub
        project: YOUR_PROJECT_ID
        subscription: projects/YOUR_PROJECT_ID/subscriptions/kubesense-pull-subscription
        credentials_path: /etc/kubesense/gcs-key.json

Log Enrichment

The KubeSense aggregator automatically enriches GCP Cloud Logging entries with:

  • Resource metadata: Resource type, labels, location
  • GCP project information: Project ID, project number
  • Log metadata: Severity, timestamp, log name
  • Source information: Service name, method name (for API logs)

Monitoring and Verification

  1. Check log sink: Verify logs are being exported to Pub/Sub
  2. Monitor Pub/Sub metrics: Check message count and delivery metrics
  3. Verify subscription: Ensure subscription is active and delivering messages
  4. Check KubeSense dashboard: Verify logs appear in KubeSense with GCP metadata
  5. Review aggregator logs: Check for any ingestion errors

Troubleshooting

Logs Not Appearing

  1. Verify log sink: Check that the sink is active and exporting logs
  2. Check Pub/Sub topic: Verify messages are being published to the topic
  3. Verify subscription: Ensure subscription is active and configured correctly
  4. Check IAM permissions: Verify service account has Pub/Sub publisher role
  5. Review network connectivity: Ensure GCP can reach KubeSense aggregator endpoint
  6. Check aggregator logs: Review aggregator logs for connection or parsing errors
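The checks above map to a few concrete commands; the namespace and deployment names in the `kubectl` line are assumptions — adjust them to your install:

```shell
# Is the sink active, and where is it writing?
gcloud logging sinks describe kubesense-sink --project=YOUR_PROJECT_ID

# Is the subscription attached and configured as expected?
gcloud pubsub subscriptions describe kubesense-subscription \
  --project=YOUR_PROJECT_ID

# Any ingestion errors on the aggregator side?
kubectl logs -n kubesense deploy/kubesense-aggregator --tail=100
```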

Authentication Issues

  1. Verify API token: Ensure the API token is valid and has correct permissions
  2. Check authentication header: Verify header name and value are correct
  3. Review aggregator auth config: Ensure authentication is properly configured

Performance Issues

  1. Adjust batch size: Configure Pub/Sub batch settings
  2. Monitor message backlog: Check for message accumulation in subscription
  3. Scale aggregator: Increase aggregator resources if needed
  4. Use pull mode: Consider pull mode for better control

Best Practices

  • Use specific log filters: Filter logs at the sink level to reduce volume and costs
  • Organize by topic: Create separate topics for different log types or environments
  • Monitor Pub/Sub quotas: Be aware of Pub/Sub quotas and limits
  • Set up alerts: Configure alerts for subscription delivery failures
  • Use structured logging: Ensure applications use structured logging for better parsing
  • Tag resources: Use GCP resource labels for better log organization
  • Monitor costs: Track Pub/Sub message and storage costs

Cost Considerations

  • Pub/Sub throughput: Charged by the volume of message data published and delivered
  • Pub/Sub storage: Charged for retained message data (for example, unacknowledged messages held within the retention window)
  • Cloud Logging export: No additional charge for log exports
  • Data transfer: Consider data transfer costs between GCP and KubeSense

Advanced Configuration

Multiple Log Sinks

Create separate sinks for different log types:

# GKE logs
gcloud logging sinks create kubesense-gke-sink \
  pubsub.googleapis.com/projects/YOUR_PROJECT_ID/topics/kubesense-gke-logs \
  --log-filter='resource.type="gke_cluster"' \
  --project=YOUR_PROJECT_ID

# Cloud Functions logs
gcloud logging sinks create kubesense-functions-sink \
  pubsub.googleapis.com/projects/YOUR_PROJECT_ID/topics/kubesense-functions-logs \
  --log-filter='resource.type="cloud_function"' \
  --project=YOUR_PROJECT_ID

Custom Log Processing

For custom log processing (for example, parsing or relabeling), define the Pub/Sub source as usual and attach transforms, which are configured separately:

aggregator:
  customSources:
    enabled: true
    sources:
      pubsub_logs:
        type: gcp_pubsub
        project: YOUR_PROJECT_ID
        subscription: projects/YOUR_PROJECT_ID/subscriptions/kubesense-subscription
        credentials_path: /etc/kubesense/gcs-key.json

Conclusion

Cloud Logging via Pub/Sub integration provides real-time log streaming from GCP services, enabling comprehensive observability across your entire GCP infrastructure alongside your Kubernetes and application data.