# KubeSense Integrations

KubeSense supports multiple notification channels for delivering alerts to your team. Configure one or more channels and associate them with alert rules for flexible routing.

## Creating a Notification Channel

Navigate to Alerts in the sidebar and open the notification channel settings. Click to add a new channel.

The **Create Notification Channel** form has the following fields:

| Field | Description |
| --- | --- |
| Name | A unique name for the channel |
| Matching Labels | Optional key-value matchers for label-based routing (e.g., `severity = critical`) |
| Type | The channel type — Slack, Microsoft Teams, Email, PagerDuty, or Webhook |

After selecting a type, provide the required configuration fields (see below), then save the channel.


## Supported Channels

### Slack

Send alert notifications to a Slack channel via an incoming webhook.

| Field | Required | Description |
| --- | --- | --- |
| Webhook URL | Yes | Slack incoming webhook URL |
| Channel | No | Override the default channel |

Slack messages include alert name, status (firing/resolved), severity, value, threshold, description, labels, and a link to open the alert in KubeSense. When multiple alerts are batched, the title shows firing and resolved counts (e.g., [FIRING:6, RESOLVED:1] Logs Availability).
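
As a rough illustration, a sender for this channel type might build and post a payload like the one below. This is a minimal sketch: the field names follow the notification-content list above, but the exact JSON layout KubeSense emits, and the `alert` dict shape, are assumptions.

```python
import json
from urllib.request import Request, urlopen

def build_slack_payload(alert):
    """Build a Slack incoming-webhook payload for a single alert.

    Keys of `alert` (name, status, severity, value, threshold, link)
    mirror the documented notification content; the JSON layout here
    is illustrative, not KubeSense's actual wire format.
    """
    status = alert["status"].upper()  # "FIRING" or "RESOLVED"
    return {
        "text": f"[{status}] {alert['name']}",
        "blocks": [
            {
                "type": "section",
                "text": {
                    "type": "mrkdwn",
                    "text": (
                        f"*{alert['name']}* is *{alert['status']}*\n"
                        f"Severity: {alert['severity']} | "
                        f"Value: {alert['value']} (threshold: {alert['threshold']})\n"
                        f"<{alert['link']}|Open in KubeSense>"
                    ),
                },
            }
        ],
    }

def send_to_slack(webhook_url, alert):
    """POST the payload to a Slack incoming webhook URL."""
    req = Request(
        webhook_url,
        data=json.dumps(build_slack_payload(alert)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:  # Slack replies with the body "ok" on success
        return resp.read().decode()
```
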

### Microsoft Teams

Send alert notifications to a Microsoft Teams channel via a webhook connector.

| Field | Required | Description |
| --- | --- | --- |
| Webhook URL | Yes | Teams incoming webhook URL (from a Teams Workflow or Power Automate connector) |

Teams messages use Adaptive Card formatting with the same information as Slack — alert name, status, severity, value, threshold, labels, and timestamps.
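
For reference, an Adaptive Card message for a Teams workflow webhook is an envelope wrapping a card, roughly as sketched below. The card layout (one title block plus a fact set) is an assumption; KubeSense's actual card may arrange these fields differently.

```python
def build_teams_card(alert):
    """Wrap an Adaptive Card in the message envelope Teams webhooks expect.

    The `alert` dict shape and card layout are illustrative assumptions;
    only the envelope structure (message + adaptive-card attachment) is
    the standard Teams format.
    """
    facts = [
        {"title": title, "value": str(value)}
        for title, value in [
            ("Status", alert["status"]),
            ("Severity", alert["severity"]),
            ("Value", alert["value"]),
            ("Threshold", alert["threshold"]),
        ]
    ]
    card = {
        "type": "AdaptiveCard",
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.4",
        "body": [
            {
                "type": "TextBlock",
                "size": "Large",
                "weight": "Bolder",
                "text": f"[{alert['status'].upper()}] {alert['name']}",
            },
            {"type": "FactSet", "facts": facts},
        ],
    }
    return {
        "type": "message",
        "attachments": [
            {
                "contentType": "application/vnd.microsoft.card.adaptive",
                "content": card,
            }
        ],
    }
```
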

### Email

Deliver alert notifications via SMTP email to one or more recipients.

| Field | Required | Description |
| --- | --- | --- |
| SMTP Host | Yes | SMTP server hostname (e.g., smtp.gmail.com, smtp.office365.com) |
| SMTP Port | Yes | SMTP port (typically 587 for TLS, 465 for SSL) |
| Username | Yes | SMTP authentication username |
| Password | Yes | SMTP authentication password or app-specific password |
| From Email | Yes | Sender email address |
| To Emails | Yes | List of recipient email addresses |
| Use TLS | No | Enable TLS encryption (recommended) |
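
The fields above map directly onto a standard SMTP send. The sketch below shows the idea with Python's standard library; the `cfg` dict keys mirror the table but are an assumed shape, and the message body format is illustrative.

```python
import smtplib
from email.message import EmailMessage

def build_alert_message(cfg, alert):
    """Render an alert as a plain-text email.

    `cfg` keys (from_email, to_emails) and `alert` keys mirror the
    tables in this document; both shapes are assumptions for this sketch.
    """
    msg = EmailMessage()
    msg["Subject"] = f"[{alert['status'].upper()}] {alert['name']}"
    msg["From"] = cfg["from_email"]
    msg["To"] = ", ".join(cfg["to_emails"])
    msg.set_content(
        f"Severity: {alert['severity']}\n"
        f"Value: {alert['value']} (threshold: {alert['threshold']})\n"
        f"Description: {alert['description']}\n"
    )
    return msg

def send_alert_email(cfg, alert):
    """Deliver the message over SMTP, upgrading to TLS when configured."""
    with smtplib.SMTP(cfg["smtp_host"], cfg["smtp_port"]) as smtp:
        if cfg.get("use_tls", True):
            smtp.starttls()  # STARTTLS on port 587; use smtplib.SMTP_SSL for port 465
        smtp.login(cfg["username"], cfg["password"])
        smtp.send_message(build_alert_message(cfg, alert))
```
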

### PagerDuty

Create incidents in PagerDuty when alerts fire, and auto-resolve when alerts recover.

| Field | Required | Description |
| --- | --- | --- |
| Integration Key | Yes | PagerDuty service integration key (Events API v2) |

Alert severity is mapped to PagerDuty severity levels. Resolved alerts automatically resolve the corresponding PagerDuty incident.
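
The trigger/resolve lifecycle maps naturally onto Events API v2 `event_action` values, sketched below. The severity mapping and the choice of the alert name as the `dedup_key` are assumptions; KubeSense's actual mapping and dedup strategy may differ.

```python
# Assumed mapping from KubeSense severities to PagerDuty Events API v2 levels.
PD_SEVERITY = {"critical": "critical", "warning": "warning", "info": "info"}

def build_pagerduty_event(integration_key, alert):
    """Build an Events API v2 event.

    A firing alert triggers an incident; a resolved alert resolves it by
    reusing the same dedup_key. Using the alert name as the dedup key is
    an illustrative assumption.
    """
    action = "trigger" if alert["status"] == "firing" else "resolve"
    event = {
        "routing_key": integration_key,
        "event_action": action,
        "dedup_key": alert["name"],
    }
    if action == "trigger":
        event["payload"] = {
            "summary": f"{alert['name']}: {alert['value']} "
                       f"(threshold {alert['threshold']})",
            "source": "kubesense",
            "severity": PD_SEVERITY.get(alert["severity"], "error"),
            "custom_details": alert.get("labels", {}),
        }
    return event
```
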

### Webhook

Send alert payloads as HTTP requests to any endpoint — useful for custom integrations, ChatOps bots, ticketing systems, or internal tooling.

| Field | Required | Description |
| --- | --- | --- |
| URL | Yes | Target HTTP endpoint |
| Method | No | HTTP method (defaults to POST) |
| Headers | No | Custom HTTP headers (e.g., `Authorization: Bearer <token>`) |
| Timeout | No | Request timeout in seconds |

The payload contains the full alert context including name, status, severity, value, threshold, labels, and annotations.
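
A receiver-agnostic sender for this channel could look like the sketch below. The payload key names mirror the notification-content list but are an assumed wire format, not KubeSense's documented schema.

```python
import json
from urllib.request import Request, urlopen

def build_webhook_payload(alert):
    """Assemble the full alert context for a custom endpoint.

    Key names follow the notification-content table in this document;
    the exact wire format is an assumption.
    """
    return {
        "name": alert["name"],
        "status": alert["status"],
        "severity": alert["severity"],
        "value": alert["value"],
        "threshold": alert["threshold"],
        "labels": alert.get("labels", {}),
        "annotations": alert.get("annotations", {}),
    }

def send_webhook(url, alert, method="POST", headers=None, timeout=10):
    """Send the alert payload with configurable method, headers, and timeout."""
    req = Request(
        url,
        data=json.dumps(build_webhook_payload(alert)).encode(),
        method=method,
        headers={"Content-Type": "application/json", **(headers or {})},
    )
    with urlopen(req, timeout=timeout) as resp:
        return resp.status
```
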


## Alert Routing

There are two ways to route alerts to notification channels:

### Direct Assignment

When creating an alert rule, select a specific notification channel from the Notification Channel dropdown in the Alert Routing section. All alerts from that rule are sent to the selected channel.

### Label-Based Routing (Notification Channel Policy)

Configure Matching Labels on a notification channel to enable automatic routing. When an alert fires, its labels are compared against the matching labels of all channels. If a channel's matchers match the alert's labels, that channel receives the notification.

This enables patterns like:

- Channel with `severity = critical` receives all critical alerts
- Channel with `team = platform` receives all alerts labeled for the platform team
- Channel with no matchers acts as a catch-all for unmatched alerts

Multiple channels can match a single alert, allowing you to send the same alert to both Slack and PagerDuty simultaneously.
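
The routing rules described above can be sketched as a small matching function. The subset semantics (every matcher must equal the corresponding alert label) and the catch-all applying only when nothing else matched are assumptions read off this section, not a published algorithm.

```python
def matches(channel_matchers, alert_labels):
    """A channel matches when every one of its matchers equals the
    corresponding alert label (assumed subset semantics)."""
    return all(alert_labels.get(k) == v for k, v in channel_matchers.items())

def route(alert_labels, channels):
    """Return the names of channels that should receive the alert.

    `channels` is a list of (name, matchers) pairs. Channels with
    matchers are checked first; channels with no matchers act as a
    catch-all only when no matcher-bearing channel matched, per the
    description above. Multiple channels can match the same alert.
    """
    matched = [name for name, m in channels if m and matches(m, alert_labels)]
    if matched:
        return matched
    return [name for name, m in channels if not m]
```
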


## Testing Channels

After creating a notification channel, use the Test action to send a test notification and verify connectivity before associating it with alert rules.


## Notification Content

All notification channels receive the same core alert information:

| Field | Description |
| --- | --- |
| Alert Name | Name of the alert rule |
| Status | `firing` or `resolved` |
| Severity | `critical`, `warning`, or `info` |
| Value | The evaluated metric value that triggered the alert |
| Threshold | The configured threshold and operator (e.g., `greater_than 100`) |
| Description | Alert description text |
| Labels | All labels attached to the alert (including group-by values for multi-series alerts) |
| Started | Timestamp when the alert started firing |
| Link | Direct link to view the alert in KubeSense |

For batched notifications (multiple alerts grouped together), the title includes firing and resolved counts — for example, [FIRING:6, RESOLVED:1] Logs Availability.
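
The batched title format can be reproduced with a small formatter, sketched below. That a count of zero is omitted from the title is an assumption inferred from the example, not stated behaviour.

```python
def batched_title(alerts, rule_name):
    """Format a batch notification title, e.g.
    "[FIRING:6, RESOLVED:1] Logs Availability".

    Omitting zero counts is an assumption inferred from the example.
    """
    firing = sum(1 for a in alerts if a["status"] == "firing")
    resolved = sum(1 for a in alerts if a["status"] == "resolved")
    parts = []
    if firing:
        parts.append(f"FIRING:{firing}")
    if resolved:
        parts.append(f"RESOLVED:{resolved}")
    return f"[{', '.join(parts)}] {rule_name}"
```
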