
PagerDuty

Port's PagerDuty integration allows you to model PagerDuty resources in your software catalog and ingest data into them.

Overview

This integration allows you to:

  • Map and organize your desired PagerDuty resources and their metadata in Port (see supported resources below).
  • Watch for PagerDuty object changes (create/update/delete) in real-time, and automatically apply the changes to your entities in Port.

Supported Resources

The resources that can be ingested from PagerDuty into Port are listed below. In your mapping configuration, you can reference any field that appears in the API responses linked below.

Setup

Choose one of the following installation methods:

With this installation option, the integration is hosted by Port and ingests data into Port on a customizable resync interval.

Beta feature

The Hosted by Port option is currently in beta, and is still undergoing final testing before its official release.

Should you encounter any bugs or functionality issues, please let us know so we can rectify them as soon as possible.
Your help is greatly appreciated! ⭐

Live event support

Currently, live events are not supported for integrations hosted by Port.
Resyncs are performed periodically (every 1 hour by default, configurable after installation), or can be triggered manually from Port's UI.

Therefore, real-time events (including GitOps) will not be ingested into Port immediately.
Support for live events is in progress and will be added in the near future.

Installation

To install the integration, follow these steps:

  1. Go to the Data sources page of your portal.

  2. Click on the + Data source button in the top-right corner.

  3. Click on the relevant integration in the list.

  4. Under Select your installation method, choose Hosted by Port.

  5. Configure the integration settings and application settings as you wish (see below for details).

Application settings

Every integration hosted by Port has the following customizable application settings, which are configurable after installation:

  • Resync interval: The frequency at which Port will ingest data from the integration. There are various options available, ranging from every 1 hour to once a day.

  • Send raw data examples: A boolean toggle (enabled by default). If enabled, raw data examples will be sent from the integration to Port. These examples are used when testing your mapping configuration; they allow you to run your jq expressions against real data and see the results.

Integration settings

Every integration has its own tool-specific settings, under the Integration settings section.
Each of these settings has an ⓘ icon next to it, which you can hover over to see a description of the setting.

Port secrets

Some integration settings require sensitive pieces of data, such as tokens.
For these settings, Port secrets will be used, ensuring that your sensitive data is encrypted and secure.

When changing such a setting, you will be prompted to choose an existing secret or create a new one:



Port source IP addresses

When using this installation method, Port will make outbound calls to your 3rd-party applications from static IP addresses.
You may need to add these addresses to your allowlist, in order to allow Port to interact with the integrated service:

54.73.167.226  
63.33.143.237
54.76.185.219

Configuration

Port integrations use a YAML mapping block to ingest data from the third-party API into Port.

The mapping uses the JQ JSON processor to select, modify, concatenate, transform, and perform other operations on existing fields and values returned by the integration API.
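For illustration, here is a minimal, hypothetical sketch of how such a jq expression behaves outside of Port. The sample object is trimmed down from the Service response data shown later on this page; the email addresses and URL are placeholders:

#!/bin/bash
# Minimal sketch: preview how a jq expression from a mapping resolves against a
# trimmed-down PagerDuty service object (field names follow the "Service response
# data" sample further down this page; values are placeholders).
service='{
  "id": "PGAAJBE",
  "name": "My Test Service",
  "status": "active",
  "html_url": "https://example.pagerduty.com/service-directory/PGAAJBE",
  "__oncall_user": [
    {"escalation_level": 2, "user": {"email": "second@example.com"}},
    {"escalation_level": 1, "user": {"email": "first@example.com"}}
  ]
}'

# The same expression used for the "oncall" property in the mapping examples below:
echo "${service}" | jq -r '.__oncall_user | sort_by(.escalation_level) | .[0].user.email'
# => first@example.com

# Build the identifier, title and properties the way a mapping would:
echo "${service}" | jq '{identifier: .id, title: .name, properties: {status: .status, url: .html_url}}'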

Capabilities

Ingesting service analytics

To enrich your PagerDuty service entities with analytics data, follow the steps below:

  1. Update the service blueprint to include analytics properties. You can add any property that is returned from the PagerDuty aggregated service analytics API.

    Updated service blueprint
    {
      "identifier": "pagerdutyService",
      "description": "This blueprint represents a PagerDuty service in our software catalog",
      "title": "PagerDuty Service",
      "icon": "pagerduty",
      "schema": {
        "properties": {
          "status": {
            "title": "Status",
            "type": "string"
          },
          "url": {
            "title": "URL",
            "type": "string",
            "format": "url"
          },
          "oncall": {
            "title": "On Call",
            "type": "array",
            "items": {
              "type": "string",
              "format": "user"
            }
          },
          "meanSecondsToResolve": {
            "title": "Mean Seconds to Resolve",
            "type": "number"
          },
          "meanSecondsToFirstAck": {
            "title": "Mean Seconds to First Acknowledge",
            "type": "number"
          },
          "meanSecondsToEngage": {
            "title": "Mean Seconds to Engage",
            "type": "number"
          },
          "totalIncidentCount": {
            "title": "Total Incident Count",
            "type": "number"
          },
          "totalIncidentsAcknowledged": {
            "title": "Total Incidents Acknowledged",
            "type": "number"
          },
          "totalIncidentsAutoResolved": {
            "title": "Total Incidents Auto Resolved",
            "type": "number"
          },
          "totalIncidentsManualEscalated": {
            "title": "Total Incidents Manually Escalated",
            "type": "number"
          }
        },
        "required": []
      },
      "mirrorProperties": {},
      "calculationProperties": {},
      "relations": {}
    }
  2. Add the serviceAnalytics property to the integration's selector key. When set to true, the integration will fetch data from the PagerDuty aggregated service analytics API and ingest it into Port. By default, this property is set to true.

    By default, the integration aggregates analytics over a period of 3 months. Use the analyticsMonthsPeriod filter to override this date range; accepted values are positive integers between 1 and 12. In the example below, analytics are aggregated over the past 6 months.

    resources:
      - kind: services
        selector:
          query: "true"
          serviceAnalytics: "true"
          analyticsMonthsPeriod: 6
        port:
          entity:
            mappings:
              identifier: .id
              title: .name
              blueprint: '"pagerdutyService"'
              properties:
                status: .status
                url: .html_url
                oncall: .__oncall_user | sort_by(.escalation_level) | .[0].user.email
                secondaryOncall: .__oncall_user | sort_by(.escalation_level) | .[1].user.email
  3. Establish a mapping between the analytics properties and the service analytics data response. By convention, the aggregated result of the PagerDuty service analytics API is saved to the __analytics key and merged with the response of the service API. As a result, you can access specific metrics, such as the mean seconds to resolve, by referencing __analytics.mean_seconds_to_resolve.

    resources:
      - kind: services
        selector:
          query: "true"
          serviceAnalytics: "true"
          analyticsMonthsPeriod: 6
        port:
          entity:
            mappings:
              identifier: .id
              title: .name
              blueprint: '"pagerdutyService"'
              properties:
                status: .status
                url: .html_url
                oncall: .__oncall_user | sort_by(.escalation_level) | .[0].user.email
                secondaryOncall: .__oncall_user | sort_by(.escalation_level) | .[1].user.email
                meanSecondsToResolve: .__analytics.mean_seconds_to_resolve
  4. Below is the complete integration configuration for enriching the service blueprint with analytics data.

    Service analytics integration configuration
    resources:
      - kind: services
        selector:
          query: "true"
          serviceAnalytics: "true"
          analyticsMonthsPeriod: 6
        port:
          entity:
            mappings:
              identifier: .id
              title: .name
              blueprint: '"pagerdutyService"'
              properties:
                status: .status
                url: .html_url
                oncall: .__oncall_user | sort_by(.escalation_level) | .[0].user.email
                secondaryOncall: .__oncall_user | sort_by(.escalation_level) | .[1].user.email
                meanSecondsToResolve: .__analytics.mean_seconds_to_resolve
                meanSecondsToFirstAck: .__analytics.mean_seconds_to_first_ack
                meanSecondsToEngage: .__analytics.mean_seconds_to_engage
                totalIncidentCount: .__analytics.total_incident_count
                totalIncidentsAcknowledged: .__analytics.total_incidents_acknowledged
                totalIncidentsAutoResolved: .__analytics.total_incidents_auto_resolved
                totalIncidentsManualEscalated: .__analytics.total_incidents_manual_escalated

Ingesting incident analytics

To enrich your PagerDuty incident entities with analytics data, follow the steps below:

  1. Update the incident blueprint to include an analytics property.

    Updated incident blueprint
    {
      "identifier": "pagerdutyIncident",
      "description": "This blueprint represents a PagerDuty incident in our software catalog",
      "title": "PagerDuty Incident",
      "icon": "pagerduty",
      "schema": {
        "properties": {
          "status": {
            "type": "string",
            "title": "Incident Status",
            "enum": [
              "triggered",
              "annotated",
              "acknowledged",
              "reassigned",
              "escalated",
              "reopened",
              "resolved"
            ],
            "enumColors": {
              "triggered": "red",
              "annotated": "blue",
              "acknowledged": "yellow",
              "reassigned": "blue",
              "escalated": "yellow",
              "reopened": "red",
              "resolved": "green"
            }
          },
          "url": {
            "type": "string",
            "format": "url",
            "title": "Incident URL"
          },
          "urgency": {
            "title": "Incident Urgency",
            "type": "string",
            "enum": [
              "high",
              "low"
            ],
            "enumColors": {
              "high": "red",
              "low": "green"
            }
          },
          "priority": {
            "type": "string",
            "title": "Priority",
            "enum": [
              "P1",
              "P2",
              "P3",
              "P4",
              "P5"
            ],
            "enumColors": {
              "P1": "red",
              "P2": "yellow",
              "P3": "blue",
              "P4": "lightGray",
              "P5": "darkGray"
            }
          },
          "description": {
            "type": "string",
            "title": "Description"
          },
          "assignees": {
            "title": "Assignees",
            "type": "array",
            "items": {
              "type": "string",
              "format": "user"
            }
          },
          "escalation_policy": {
            "type": "string",
            "title": "Escalation Policy"
          },
          "created_at": {
            "title": "Created At",
            "type": "string",
            "format": "date-time"
          },
          "updated_at": {
            "title": "Updated At",
            "type": "string",
            "format": "date-time"
          },
          "analytics": {
            "title": "Analytics",
            "type": "object"
          }
        },
        "required": []
      },
      "mirrorProperties": {},
      "calculationProperties": {},
      "relations": {
        "pagerdutyService": {
          "title": "PagerDuty Service",
          "target": "pagerdutyService",
          "required": false,
          "many": true
        }
      }
    }
  2. Add the incidentAnalytics property to the integration's selector key. When set to true, the integration will fetch data from the PagerDuty Analytics API and ingest it into Port. By default, this property is set to false.

    resources:
      - kind: incidents
        selector:
          query: "true"
          incidentAnalytics: "true"
        port:
          entity:
            mappings:
              identifier: .id | tostring
              title: .title
              blueprint: '"pagerdutyIncident"'
              properties:
                status: .status
                url: .self
  3. Establish a mapping between the analytics blueprint property and the analytics data response.

    resources:
      - kind: incidents
        selector:
          query: 'true'
          incidentAnalytics: 'true'
          include: ['assignees']
        port:
          entity:
            mappings:
              identifier: .id | tostring
              title: .title
              blueprint: '"pagerdutyIncident"'
              properties:
                status: .status
                url: .self
                urgency: .urgency
                assignees: .assignments | map(.assignee.email)
                escalation_policy: .escalation_policy.summary
                created_at: .created_at
                updated_at: .updated_at
                priority: if .priority != null then .priority.summary else null end
                description: .description
                analytics: .__analytics
              relations:
                pagerdutyService: .service.id
  4. Below is the complete integration configuration for enriching the incident blueprint with analytics data.

    Incident analytics integration configuration
    resources:
      - kind: incidents
        selector:
          query: 'true'
          incidentAnalytics: 'true'
          include: ['assignees']
        port:
          entity:
            mappings:
              identifier: .id | tostring
              title: .title
              blueprint: '"pagerdutyIncident"'
              properties:
                status: .status
                url: .self
                urgency: .urgency
                assignees: .assignments | map(.assignee.email)
                escalation_policy: .escalation_policy.summary
                created_at: .created_at
                updated_at: .updated_at
                priority: if .priority != null then .priority.summary else null end
                description: .description
                analytics: .__analytics
              relations:
                pagerdutyService: .service.id

Examples

To view and test the integration's mapping against examples of the third-party API responses, use the jq playground in your data sources page. Find the integration in the list of data sources and click on it to open the playground.
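If you prefer to test expressions outside the UI, a rough equivalent is to fetch a response from the PagerDuty API yourself and run the mapping's jq expressions against it locally. The snippet below is an illustrative sketch, not part of the integration; <YOUR_PAGERDUTY_API_TOKEN> is a placeholder for your own PagerDuty API token:

#!/bin/bash
# Optional local alternative to the jq playground: fetch a real service from the
# PagerDuty API and try the mapping expressions against it.
PD_TOKEN='<YOUR_PAGERDUTY_API_TOKEN>'

# Fetch the first service returned by the API and save it for experimentation.
curl --silent \
  --header "Authorization: Token token=${PD_TOKEN}" \
  --header "Accept: application/vnd.pagerduty+json;version=2" \
  'https://api.pagerduty.com/services?limit=1' | jq '.services[0]' > service.json

# Try the same expressions used in the mapping configuration above.
jq -r '.id' service.json          # identifier
jq -r '.name' service.json        # title
jq -r '.status' service.json      # properties.status
jq -r '.html_url' service.json    # properties.url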

Additional examples of blueprints and the relevant integration configurations can be found on the PagerDuty examples page.

Let's Test It

This section includes sample response data from PagerDuty. In addition, it includes the entity created from the resync event based on the Ocean configuration provided in the previous section.

Payload

Here is an example of the payload structure from PagerDuty:

Schedule response data
{
"id": "PWAXLIH",
"type": "schedule",
"summary": "Port Test Service - Weekly Rotation",
"self": "https://api.pagerduty.com/schedules/PWAXLIH",
"html_url": "https://getport-io.pagerduty.com/schedules/PWAXLIH",
"name": "Port Test Service - Weekly Rotation",
"time_zone": "Asia/Jerusalem",
"description": "This is the weekly on call schedule for Port Test Service associated with your first escalation policy.",
"users": [
{
"id": "PJCRRLH",
"type": "user_reference",
"summary": "Adam",
"self": "https://api.pagerduty.com/users/PJCRRLH",
"html_url": "https://getport-io.pagerduty.com/users/PJCRRLH"
},
{
"id": "P4K4DLP",
"type": "user_reference",
"summary": "Alice",
"self": "https://api.pagerduty.com/users/P4K4DLP",
"html_url": "https://getport-io.pagerduty.com/users/P4K4DLP"
},
{
"id": "HDW63E2",
"type": "user_reference",
"summary": "Doe",
"self": "https://api.pagerduty.com/users/HDW63E2",
"html_url": "https://getport-io.pagerduty.com/users/HDW63E2"
},
{
"id": "PRGAUI4",
"type": "user_reference",
"summary": "Pages",
"self": null,
"html_url": "https://getport-io.pagerduty.com/users/PRGAUI4",
"deleted_at": "2023-10-17T18:58:07+03:00"
},
{
"id": "PYIEKLY",
"type": "user_reference",
"summary": "Demo",
"self": "https://api.pagerduty.com/users/PYIEKLY",
"html_url": "https://getport-io.pagerduty.com/users/PYIEKLY"
}
],
"escalation_policies": [
{
"id": "P7LVMYP",
"type": "escalation_policy_reference",
"summary": "Test Escalation Policy",
"self": "https://api.pagerduty.com/escalation_policies/P7LVMYP",
"html_url": "https://getport-io.pagerduty.com/escalation_policies/P7LVMYP"
}
],
"teams": []
}
Oncall response data
{
"escalation_policy":{
"id":"P7LVMYP",
"type":"escalation_policy_reference",
"summary":"Test Escalation Policy",
"self":"https://api.pagerduty.com/escalation_policies/P7LVMYP",
"html_url":"https://getport-io.pagerduty.com/escalation_policies/P7LVMYP"
},
"escalation_level":1,
"schedule":{
"id":"PWAXLIH",
"type":"schedule_reference",
"summary":"Port Test Service - Weekly Rotation",
"self":"https://api.pagerduty.com/schedules/PWAXLIH",
"html_url":"https://getport-io.pagerduty.com/schedules/PWAXLIH"
},
"user":{
"name":"John Doe",
"email":"johndoe@domain.io",
"time_zone":"Asia/Jerusalem",
"color":"red",
"avatar_url":"https://secure.gravatar.com/avatar/149cf38119ee25af9b8b3a68d06f39e3.png?d=mm&r=PG",
"billed":true,
"role":"user",
"description":null,
"invitation_sent":false,
"job_title":null,
"teams":[

],
"contact_methods":[
{
"id":"PK3SHEX",
"type":"email_contact_method_reference",
"summary":"Default",
"self":"https://api.pagerduty.com/users/HDW63E2/contact_methods/PK3SHEX",
"html_url":null
},
{
"id":"PO3TNV8",
"type":"phone_contact_method_reference",
"summary":"Other",
"self":"https://api.pagerduty.com/users/HDW63E2/contact_methods/PO3TNV8",
"html_url":null
},
{
"id":"P7U59FI",
"type":"sms_contact_method_reference",
"summary":"Other",
"self":"https://api.pagerduty.com/users/HDW63E2/contact_methods/P7U59FI",
"html_url":null
}
],
"notification_rules":[
{
"id":"PMTOCX1",
"type":"assignment_notification_rule_reference",
"summary":"0 minutes: channel PK3SHEX",
"self":"https://api.pagerduty.com/users/HDW63E2/notification_rules/PMTOCX1",
"html_url":null
},
{
"id":"P3HAND3",
"type":"assignment_notification_rule_reference",
"summary":"0 minutes: channel P7U59FI",
"self":"https://api.pagerduty.com/users/HDW63E2/notification_rules/P3HAND3",
"html_url":null
}
],
"id":"HDW63E2",
"type":"user",
"summary":"John Doe",
"self":"https://api.pagerduty.com/users/HDW63E2",
"html_url":"https://getport-io.pagerduty.com/users/HDW63E2"
},
"start":"2024-02-25T00:00:00Z",
"end":"2024-04-14T11:10:48Z"
}
Service response data
{
"id": "PGAAJBE",
"name": "My Test Service",
"description": "For testing",
"created_at": "2023-08-03T16:53:48+03:00",
"updated_at": "2023-08-03T16:53:48+03:00",
"status": "active",
"teams": [],
"alert_creation": "create_alerts_and_incidents",
"addons": [],
"scheduled_actions": [],
"support_hours": "None",
"last_incident_timestamp": "None",
"escalation_policy": {
"id": "P7LVMYP",
"type": "escalation_policy_reference",
"summary": "Test Escalation Policy",
"self": "https://api.pagerduty.com/escalation_policies/P7LVMYP",
"html_url": "https://getport-io.pagerduty.com/escalation_policies/P7LVMYP"
},
"incident_urgency_rule": {
"type": "constant",
"urgency": "high"
},
"acknowledgement_timeout": "None",
"auto_resolve_timeout": "None",
"integrations": [],
"type": "service",
"summary": "My Test Service",
"self": "https://api.pagerduty.com/services/PGAAJBE",
"html_url": "https://getport-io.pagerduty.com/service-directory/PGAAJBE",
"__oncall_user": [
{
"escalation_policy": {
"id": "P7LVMYP",
"type": "escalation_policy_reference",
"summary": "Test Escalation Policy",
"self": "https://api.pagerduty.com/escalation_policies/P7LVMYP",
"html_url": "https://getport-io.pagerduty.com/escalation_policies/P7LVMYP"
},
"escalation_level": 1,
"schedule": {
"id": "PWAXLIH",
"type": "schedule_reference",
"summary": "Port Test Service - Weekly Rotation",
"self": "https://api.pagerduty.com/schedules/PWAXLIH",
"html_url": "https://getport-io.pagerduty.com/schedules/PWAXLIH"
},
"user": {
"name": "demo",
"email": "devops-port@pager-demo.com",
"time_zone": "Asia/Jerusalem",
"color": "teal",
"avatar_url": "https://secure.gravatar.com/avatar/5cc831a4e778f54460efc4cd20d13acd.png?d=mm&r=PG",
"billed": true,
"role": "admin",
"description": "None",
"invitation_sent": true,
"job_title": "None",
"teams": [],
"contact_methods": [
{
"id": "POKPUFD",
"type": "email_contact_method_reference",
"summary": "Default",
"self": "https://api.pagerduty.com/users/PYIEKLY/contact_methods/POKPUFD",
"html_url": "None"
}
],
"notification_rules": [
{
"id": "P9NWEKF",
"type": "assignment_notification_rule_reference",
"summary": "0 minutes: channel POKPUFD",
"self": "https://api.pagerduty.com/users/PYIEKLY/notification_rules/P9NWEKF",
"html_url": "None"
},
{
"id": "PPJHFA5",
"type": "assignment_notification_rule_reference",
"summary": "0 minutes: channel POKPUFD",
"self": "https://api.pagerduty.com/users/PYIEKLY/notification_rules/PPJHFA5",
"html_url": "None"
}
],
"id": "PYIEKLY",
"type": "user",
"summary": "demo",
"self": "https://api.pagerduty.com/users/PYIEKLY",
"html_url": "https://getport-io.pagerduty.com/users/PYIEKLY"
},
"start": "2023-10-17T15:57:50Z",
"end": "2024-02-13T22:16:48Z"
}
]
}
Incident response data
{
"incident_number": 2,
"title": "Example Incident",
"description": "Example Incident",
"created_at": "2023-05-15T13:59:45Z",
"updated_at": "2023-05-15T13:59:45Z",
"status": "triggered",
"incident_key": "89809d37f4344d36a90c0a192c20c617",
"service": {
"id": "PWJAGSD",
"type": "service_reference",
"summary": "Port Test Service",
"self": "https://api.pagerduty.com/services/PWJAGSD",
"html_url": "https://getport-io.pagerduty.com/service-directory/PWJAGSD"
},
"assignments": [
{
"at": "2023-05-15T13:59:45Z",
"assignee": {
"id": "PJCRRLH",
"type": "user_reference",
"summary": "Username",
"self": "https://api.pagerduty.com/users/PJCRRLH",
"html_url": "https://getport-io.pagerduty.com/users/PJCRRLH"
}
}
],
"assigned_via": "escalation_policy",
"last_status_change_at": "2023-05-15T13:59:45Z",
"resolved_at": null,
"first_trigger_log_entry": {
"id": "R5S5T07QR1SZRQFYB7SXEO2EKZ",
"type": "trigger_log_entry_reference",
"summary": "Triggered through the website.",
"self": "https://api.pagerduty.com/log_entries/R5S5T07QR1SZRQFYB7SXEO2EKZ",
"html_url": "https://getport-io.pagerduty.com/incidents/Q1P3AHC3KLGVAS/log_entries/R5S5T07QR1SZRQFYB7SXEO2EKZ"
},
"alert_counts": {
"all": 0,
"triggered": 0,
"resolved": 0
},
"is_mergeable": true,
"escalation_policy": {
"id": "P7LVMYP",
"type": "escalation_policy_reference",
"summary": "Test Escalation Policy",
"self": "https://api.pagerduty.com/escalation_policies/P7LVMYP",
"html_url": "https://getport-io.pagerduty.com/escalation_policies/P7LVMYP"
},
"teams": [],
"pending_actions": [],
"acknowledgements": [],
"basic_alert_grouping": null,
"alert_grouping": null,
"last_status_change_by": {
"id": "PWJAGSD",
"type": "service_reference",
"summary": "Port Test Service",
"self": "https://api.pagerduty.com/services/PWJAGSD",
"html_url": "https://getport-io.pagerduty.com/service-directory/PWJAGSD"
},
"urgency": "high",
"id": "Q1P3AHC3KLGVAS",
"type": "incident",
"summary": "[#2] Example Incident",
"self": "https://api.pagerduty.com/incidents/Q1P3AHC3KLGVAS",
"html_url": "https://getport-io.pagerduty.com/incidents/Q1P3AHC3KLGVAS"
}

Mapping Result

The combination of the sample payload and the Ocean configuration generates the following Port entity:

Schedule entity in Port
{
"identifier": "PWAXLIH",
"title": "Port Test Service - Weekly Rotation",
"icon": null,
"blueprint": "pagerdutySchedule",
"team": [],
"properties": {
"url": "https://getport-io.pagerduty.com/schedules/PWAXLIH",
"timezone": "Asia/Jerusalem",
"description": "Asia/Jerusalem",
"users": ["adam@getport-io.com", "alice@getport-io.com", "doe@getport-io.com", "demo@getport-io.com", "pages@getport-io.com"]
},
"relations": {},
"createdAt": "2023-12-01T13:18:02.215Z",
"createdBy": "hBx3VFZjqgLPEoQLp7POx5XaoB0cgsxW",
"updatedAt": "2023-12-01T13:18:02.215Z",
"updatedBy": "hBx3VFZjqgLPEoQLp7POx5XaoB0cgsxW"
}

Relevant Guides

For relevant guides and examples, see the guides section.

Alternative installation via webhook

While the Ocean integration described above is the recommended installation method, you may prefer to use a webhook to ingest data from PagerDuty. If so, use the following instructions:

Note that when using the webhook installation method, data will be ingested into Port only when the webhook is triggered.

Webhook installation (click to expand)

In this example you are going to create a webhook integration between PagerDuty and Port, which will ingest PagerDuty services and their related incidents into Port. This integration involves setting up a webhook to receive notifications from PagerDuty whenever an incident is created or updated, allowing Port to ingest and process the incident entities accordingly.

Import PagerDuty services and incidents

Port configuration

Create the following blueprint definitions:

PagerDuty service blueprint
{
  "identifier": "pagerdutyService",
  "description": "This blueprint represents a PagerDuty service in our software catalog",
  "title": "PagerDuty Service",
  "icon": "pagerduty",
  "schema": {
    "properties": {
      "status": {
        "title": "Status",
        "type": "string",
        "enum": [
          "active",
          "warning",
          "critical",
          "maintenance",
          "disabled"
        ],
        "enumColors": {
          "active": "green",
          "warning": "yellow",
          "critical": "red",
          "maintenance": "lightGray",
          "disabled": "darkGray"
        }
      },
      "url": {
        "title": "URL",
        "type": "string",
        "format": "url"
      },
      "oncall": {
        "title": "On Call",
        "type": "string",
        "format": "user"
      },
      "escalationLevels": {
        "title": "Escalation Levels",
        "type": "number"
      },
      "meanSecondsToResolve": {
        "title": "Mean Seconds to Resolve",
        "type": "number"
      },
      "meanSecondsToFirstAck": {
        "title": "Mean Seconds to First Acknowledge",
        "type": "number"
      },
      "meanSecondsToEngage": {
        "title": "Mean Seconds to Engage",
        "type": "number"
      }
    },
    "required": []
  },
  "mirrorProperties": {},
  "calculationProperties": {},
  "relations": {}
}
PagerDuty incident blueprint
{
  "identifier": "pagerdutyIncident",
  "description": "This blueprint represents a PagerDuty incident in our software catalog",
  "title": "PagerDuty Incident",
  "icon": "pagerduty",
  "schema": {
    "properties": {
      "status": {
        "type": "string",
        "title": "Incident Status",
        "enum": [
          "triggered",
          "annotated",
          "acknowledged",
          "reassigned",
          "escalated",
          "reopened",
          "resolved"
        ],
        "enumColors": {
          "triggered": "red",
          "annotated": "blue",
          "acknowledged": "yellow",
          "reassigned": "blue",
          "escalated": "yellow",
          "reopened": "red",
          "resolved": "green"
        }
      },
      "url": {
        "type": "string",
        "format": "url",
        "title": "Incident URL"
      },
      "urgency": {
        "title": "Incident Urgency",
        "type": "string",
        "enum": [
          "high",
          "low"
        ],
        "enumColors": {
          "high": "red",
          "low": "green"
        }
      },
      "priority": {
        "type": "string",
        "title": "Priority",
        "enum": [
          "P1",
          "P2",
          "P3",
          "P4",
          "P5"
        ],
        "enumColors": {
          "P1": "red",
          "P2": "yellow",
          "P3": "blue",
          "P4": "lightGray",
          "P5": "darkGray"
        }
      },
      "description": {
        "type": "string",
        "title": "Description"
      },
      "assignees": {
        "title": "Assignees",
        "type": "array",
        "items": {
          "type": "string",
          "format": "user"
        }
      },
      "escalation_policy": {
        "type": "string",
        "title": "Escalation Policy"
      },
      "created_at": {
        "title": "Created At",
        "type": "string",
        "format": "date-time"
      },
      "updated_at": {
        "title": "Updated At",
        "type": "string",
        "format": "date-time"
      }
    },
    "required": []
  },
  "mirrorProperties": {},
  "calculationProperties": {},
  "relations": {
    "pagerdutyService": {
      "title": "PagerDuty Service",
      "target": "pagerdutyService",
      "required": false,
      "many": true
    }
  }
}

Create the following webhook configuration using Port UI

PagerDuty webhook configuration
  1. Basic details tab - fill the following details:

    1. Title : PagerDuty Mapper;
    2. Identifier : pagerduty_mapper;
    3. Description : A webhook configuration to map PagerDuty services and their related incidents to Port;
    4. Icon : Pagerduty;
  2. Integration configuration tab - fill the following JQ mapping:

    [
      {
        "blueprint": "microservice",
        "filter": ".body.event.event_type | startswith(\"service\")",
        "entity": {
          "identifier": ".body.event.data.id",
          "title": ".body.event.data.summary",
          "properties": {
            "status": ".body.event.data.status",
            "url": ".body.event.data.html_url",
            "oncall": ".body.event.data.__oncall_user[] | select(.escalation_level == 1) | .user.email",
            "escalationLevels": ".body.event.data.__oncall_user | map(.escalation_level) | unique | length",
            "meanSecondsToResolve": ".body.event.data.__analytics.mean_seconds_to_resolve",
            "meanSecondsToFirstAck": ".body.event.data.__analytics.mean_seconds_to_first_ack",
            "meanSecondsToEngage": ".body.event.data.__analytics.mean_seconds_to_engage"
          }
        }
      },
      {
        "blueprint": "pagerdutyIncident",
        "filter": ".body.event.event_type | startswith(\"incident\")",
        "entity": {
          "identifier": ".body.event.data.id",
          "title": ".body.event.data.title",
          "properties": {
            "status": ".body.event.data.status",
            "url": ".body.event.data.html_url",
            "urgency": ".body.event.data.urgency",
            "assignees": ".body.event.data.assignments | map(.assignee.email)",
            "escalation_policy": ".body.event.data.escalation_policy.summary",
            "created_at": ".body.event.data.created_at",
            "updated_at": ".body.event.data.updated_at",
            "priority": ".body.event.data | if .priority != null then .priority.summary else null end",
            "description": ".body.event.data.description"
          },
          "relations": {
            "microservice": ".body.event.data.service.id"
          }
        }
      }
    ]
  3. Scroll down to Advanced settings and input the following details:

    1. secret: WEBHOOK_SECRET;
    2. Signature Header Name : X-Pagerduty-Signature;
    3. Signature Algorithm : Select sha256 from dropdown option;
    4. Signature Prefix : v1=
    5. Click Save at the bottom of the page.

    Remember to update the WEBHOOK_SECRET with the real secret you receive after subscribing to the webhook in PagerDuty.
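These settings correspond to PagerDuty's v3 webhook signature scheme: each delivery is signed with an HMAC-SHA256 of the raw request body using the webhook secret, and the result is sent in the signature header with a v1= prefix. Port uses the settings above to verify incoming deliveries for you; the sketch below only illustrates the scheme for manual debugging, assuming you have saved a raw request body to a local payload.json file and copied the header value from a delivery:

#!/bin/bash
# Sketch: manually verify a PagerDuty v3 webhook delivery against the secret
# configured above. Assumes the raw request body was saved to payload.json and the
# value of the signature header (e.g. "v1=abc123...") is known.
WEBHOOK_SECRET='<YOUR_WEBHOOK_SECRET>'
RECEIVED_SIGNATURE='v1=<signature-from-the-X-Pagerduty-Signature-header>'

# HMAC-SHA256 of the raw body, hex-encoded, prefixed with "v1=" -- matching the
# Signature Algorithm and Signature Prefix configured in Port above.
EXPECTED="v1=$(openssl dgst -sha256 -hmac "${WEBHOOK_SECRET}" < payload.json | awk '{print $NF}')"

if [ "${EXPECTED}" = "${RECEIVED_SIGNATURE}" ]; then
  echo "Signature is valid"
else
  echo "Signature mismatch"
fi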

Create a webhook in PagerDuty

  1. Go to PagerDuty and select the account you want to configure the webhook for.
  2. Navigate to Integrations in the navigation bar and click on Generic Webhooks (v3).
  3. Click New Webhook and provide the following information:
    1. Webhook URL - enter the value of the url key you received after creating the webhook configuration.
    2. Scope Type - select Service to receive webhook events for a specific service, or Account to receive events for all services in your account.
    3. Description - provide an optional description for your webhook.
    4. Event Subscription - choose the event types you would like to subscribe to.
    5. Custom Header - enter any optional HTTP header to be added to your webhook payload.
  4. Click Add webhook to create your webhook.
  5. Alternatively, you can use the curl method to create the webhook. Copy the code below and run it in your terminal:
  curl --request POST \
    --url 'https://api.pagerduty.com/webhook_subscriptions' \
    --header 'Accept: application/vnd.pagerduty+json;version=2' \
    --header 'Authorization: Token token=<YOUR_PAGERDUTY_API_TOKEN>' \
    --header 'Content-Type: application/json' \
    --data '{
      "webhook_subscription": {
        "delivery_method": {
          "type": "http_delivery_method",
          "url": "https://ingest.getport.io/<YOUR_PORT_WEBHOOK_KEY>",
          "custom_headers": [
            {
              "name": "your-header-name",
              "value": "your-header-value"
            }
          ]
        },
        "description": "Sends PagerDuty v3 webhook events to Port.",
        "events": [
          "service.created",
          "service.updated",
          "incident.triggered",
          "incident.responder.added",
          "incident.acknowledged",
          "incident.annotated",
          "incident.delegated",
          "incident.escalated",
          "incident.priority_updated",
          "incident.reassigned",
          "incident.reopened",
          "incident.resolved",
          "incident.responder.replied",
          "incident.status_update_published",
          "incident.unacknowledged"
        ],
        "filter": {
          "type": "account_reference"
        },
        "type": "webhook_subscription"
      }
    }'
tip

To view the different event types available in PagerDuty webhooks, refer to the PagerDuty webhooks documentation.
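Optionally, you can confirm that the subscription was created by listing your webhook subscriptions with the same API and token used in the curl command above; for example:

# Optional: list existing webhook subscriptions to confirm the new one exists.
curl --silent \
  --header 'Accept: application/vnd.pagerduty+json;version=2' \
  --header 'Authorization: Token token=<YOUR_PAGERDUTY_API_TOKEN>' \
  'https://api.pagerduty.com/webhook_subscriptions' \
  | jq '.webhook_subscriptions[] | {id, description, events: (.events | length), url: .delivery_method.url}'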

Done! Any change that happens to your services or incidents in PagerDuty will now trigger a webhook event to the webhook URL provided by Port. Port will parse the events according to the mapping and update the catalog entities accordingly.

Let's Test It

This section includes a sample webhook event sent from PagerDuty when an incident is created or updated. In addition, it includes the entity created from the event based on the webhook configuration provided in the previous section.

Payload

Here is an example of the payload structure sent to the webhook URL when a PagerDuty incident is created:

Webhook event payload
{
"event": {
"id": "01DVUHO6P4XQDFJ9AHOADT3UQ4",
"event_type": "incident.triggered",
"resource_type": "incident",
"occurred_at": "2023-06-12T11:56:08.355Z",
"agent": {
"html_url": "https://your_account.pagerduty.com/users/PJCRRLH",
"id": "PJCRRLH",
"self": "https://api.pagerduty.com/users/PJCRRLH",
"summary": "username",
"type": "user_reference"
},
"client": "None",
"data": {
"id": "Q01J2OS7YBWLNY",
"type": "incident",
"self": "https://api.pagerduty.com/incidents/Q01J2OS7YBWLNY",
"html_url": "https://your_account.pagerduty.com/incidents/Q01J2OS7YBWLNY",
"number": 7,
"status": "triggered",
"incident_key": "acda20953f7446248f90260db65144f8",
"created_at": "2023-06-12T11:56:08Z",
"title": "Test PagerDuty Incident",
"service": {
"html_url": "https://your_account.pagerduty.com/services/PWJAGSD",
"id": "PWJAGSD",
"self": "https://api.pagerduty.com/services/PWJAGSD",
"summary": "Port Internal Service",
"type": "service_reference"
},
"assignees": [
{
"html_url": "https://your_account.pagerduty.com/users/PRGAUI4",
"id": "PRGAUI4",
"self": "https://api.pagerduty.com/users/PRGAUI4",
"summary": "username",
"type": "user_reference"
}
],
"escalation_policy": {
"html_url": "https://your_account.pagerduty.com/escalation_policies/P7LVMYP",
"id": "P7LVMYP",
"self": "https://api.pagerduty.com/escalation_policies/P7LVMYP",
"summary": "Test Escalation Policy",
"type": "escalation_policy_reference"
},
"teams": [],
"priority": "None",
"urgency": "high",
"conference_bridge": "None",
"resolve_reason": "None"
}
}
}

Mapping Result

The combination of the sample payload and the webhook configuration generates the following Port entity:

{
"identifier": "Q01J2OS7YBWLNY",
"title": "Test PagerDuty Incident",
"blueprint": "pagerdutyIncident",
"team": [],
"properties": {
"status": "triggered",
"url": "https://your_account.pagerduty.com/incidents/Q01J2OS7YBWLNY",
"details": "Test PagerDuty Incident",
"urgency": "high",
"responder": "Username",
"escalation_policy": "Test Escalation Policy"
},
"relations": {
"microservice": "PWJAGSD"
}
}

Import PagerDuty historical data

In this example you are going to use the provided Bash script to fetch data from the PagerDuty API and ingest it into Port.

The script extracts services and incidents from PagerDuty, and sends them to Port as microservice and incident entities respectively.

Port configuration

This example utilizes the same blueprint definition from the previous section, along with a new webhook configuration:

Create the following webhook configuration using Port UI

PagerDuty webhook configuration for historical data
  1. Basic details tab - fill the following details:

    1. Title : PagerDuty History Mapper;
    2. Identifier : pagerduty_history_mapper;
    3. Description : A webhook configuration to map historical PagerDuty services and their related incidents to Port;
    4. Icon : Pagerduty;
  2. Integration configuration tab - fill the following JQ mapping:

    [
      {
        "blueprint": "microservice",
        "filter": ".body.event.event_type | startswith(\"service\")",
        "entity": {
          "identifier": ".body.event.data.identifier",
          "title": ".body.event.data.title",
          "properties": {
            "status": ".body.event.data.properties.status",
            "url": ".body.event.data.properties.html_url",
            "oncall": ".body.event.data.properties.__oncall_user[] | select(.escalation_level == 1) | .user.email",
            "escalationLevels": ".body.event.data.properties.__oncall_user | map(.escalation_level) | unique | length",
            "meanSecondsToResolve": ".body.event.data.properties.__analytics.mean_seconds_to_resolve",
            "meanSecondsToFirstAck": ".body.event.data.properties.__analytics.mean_seconds_to_first_ack",
            "meanSecondsToEngage": ".body.event.data.properties.__analytics.mean_seconds_to_engage"
          }
        }
      },
      {
        "blueprint": "pagerdutyIncident",
        "filter": ".body.event.event_type | startswith(\"incident\")",
        "entity": {
          "identifier": ".body.event.data.identifier",
          "title": ".body.event.data.title",
          "properties": {
            "status": ".body.event.data.properties.status",
            "url": ".body.event.data.properties.url",
            "details": ".body.event.data.properties.details",
            "priority": ".body.event.data.properties.priority",
            "urgency": ".body.event.data.properties.urgency",
            "responder": ".body.event.data.properties.responder",
            "escalation_policy": ".body.event.data.properties.escalation_policy"
          },
          "relations": {
            "microservice": ".body.event.data.relations.microservice"
          }
        }
      }
    ]
  3. Scroll down to Advanced settings and input the following details:

    1. secret: WEBHOOK_SECRET;
    2. Signature Header Name : X-Pagerduty-Signature;
    3. Signature Algorithm : Select sha256 from dropdown option;
    4. Signature Prefix : v1=
    5. Click Save at the bottom of the page.

Remember to update the WEBHOOK_SECRET with the real secret you receive after subscribing to the webhook in PagerDuty.

PagerDuty Bash script for historical data
#!/bin/bash

set -e

# PagerDuty API endpoints and token
PD_API_SERVICES='https://api.pagerduty.com/services'
PD_API_INCIDENTS='https://api.pagerduty.com/incidents'
PD_TOKEN='API_TOKEN'

# Port webhook URL
PORT_URL='https://ingest.getport.io/WEBHOOK_SECRET'

# Fetch services from PagerDuty
services=$(curl --silent --header "Authorization: Token token=${PD_TOKEN}" --header "Accept: application/vnd.pagerduty+json;version=2" --request GET "${PD_API_SERVICES}")

# Extract services array
services_list=$(echo "${services}" | jq '.services')

# Iterate over services and push to Port
for i in $(seq 0 $(($(echo "${services_list}" | jq '. | length') - 1)))
do
  service=$(echo "${services_list}" | jq ".[$i]")

  # Prepare payload for Port
  payload=$(jq -n \
    --arg id "$(echo "${service}" | jq -r '.id')" \
    --arg name "$(echo "${service}" | jq -r '.name')" \
    --arg status "$(echo "${service}" | jq -r '.status')" \
    '{event: {event_type: "service", data: {identifier: $id, title: $name, properties: {status: $status}}}}')

  # Push to Port
  curl --silent --header 'Content-Type: application/json' --request POST --data "${payload}" "${PORT_URL}"

  # Output payload to output.json file
  echo "${payload}" >> output.json
done

# Fetch incidents from PagerDuty
incidents=$(curl --silent --header "Authorization: Token token=${PD_TOKEN}" --header "Accept: application/vnd.pagerduty+json;version=2" --request GET "${PD_API_INCIDENTS}")

# Extract incidents array
incidents_list=$(echo "${incidents}" | jq '.incidents')

# Iterate over incidents and push to Port
for i in $(seq 0 $(($(echo "${incidents_list}" | jq '. | length') - 1)))
do
  incident=$(echo "${incidents_list}" | jq ".[$i]")

  # Prepare payload for Port
  payload=$(jq -n \
    --arg id "$(echo "${incident}" | jq -r '.id')" \
    --arg title "$(echo "${incident}" | jq -r '.title')" \
    --arg status "$(echo "${incident}" | jq -r '.status')" \
    --arg url "$(echo "${incident}" | jq -r '.html_url')" \
    --arg description "$(echo "${incident}" | jq -r '.description')" \
    --arg urgency "$(echo "${incident}" | jq -r '.urgency')" \
    --arg responder "$(echo "${incident}" | jq -r '.last_status_change_by.summary')" \
    --arg escalation_policy "$(echo "${incident}" | jq -r '.escalation_policy.summary')" \
    --arg service "$(echo "${incident}" | jq -r '.service.id')" \
    '{event: {event_type: "incident", data: {identifier: $id, title: $title, properties: {status: $status, url: $url, details: $description, urgency: $urgency, responder: $responder, escalation_policy: $escalation_policy}, relations: {microservice: $service}}}}')

  # Push to Port
  curl --header 'Content-Type: application/json' --request POST --data "${payload}" "${PORT_URL}"

  # Output payload to output.json file
  echo "${payload}" >> output.json
done
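
Note that the script above only processes the first page of results returned by each PagerDuty list endpoint. PagerDuty's list APIs paginate with limit and offset query parameters and report whether further pages exist via a more field. The sketch below shows one way the incidents loop could be extended to walk all pages; the per-incident logic is elided here and the variables mirror those in the script above:

#!/bin/bash
# Sketch: paginate through all incidents instead of only the first page.
# Same placeholder values as in the script above.
PD_API_INCIDENTS='https://api.pagerduty.com/incidents'
PD_TOKEN='API_TOKEN'

offset=0
limit=100
while : ; do
  page=$(curl --silent \
    --header "Authorization: Token token=${PD_TOKEN}" \
    --header "Accept: application/vnd.pagerduty+json;version=2" \
    --request GET "${PD_API_INCIDENTS}?limit=${limit}&offset=${offset}")

  # Process each incident on this page (replace with the per-incident logic above).
  echo "${page}" | jq -c '.incidents[]' | while read -r incident; do
    echo "${incident}" | jq -r '.id'
  done

  # Stop when the API reports there are no more pages.
  [ "$(echo "${page}" | jq -r '.more')" = "true" ] || break
  offset=$((offset + limit))
done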


How to Run the Script

This script requires two configuration values:

  1. PD_TOKEN: your PagerDuty API token;
  2. PORT_URL: your Port webhook URL.

Then trigger the script by running:

bash pagerduty_to_port.sh

This script fetches services and incidents from PagerDuty and sends them to Port.

tip

The script writes the JSON payload for each service and incident to a file named output.json. This can be useful for debugging if you encounter any issues.
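Since each payload is appended as a single JSON object per line, you can summarize what was sent with jq, for example:

# Count the payloads written by the script, grouped by event type
# (output.json contains one JSON object per line).
jq -s 'group_by(.event.event_type) | map({(.[0].event.event_type): length}) | add' output.json

# Pretty-print the first payload for inspection.
jq -s '.[0]' output.json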

Done! You can now import historical data from PagerDuty into Port. Port will parse the events according to the mapping and update the catalog entities accordingly.