Azure DevOps Integration Setup

Updated 2 months ago by Copado Solutions

The instructions provided in this article are for the new unlocked package. If you are working with the old Change Management Integrations unmanaged package, follow the instructions provided in this other article.

Introduction

Copado’s integration with Azure DevOps allows you to work on your user stories by easily creating a project and importing your work items into the project in Copado. To start leveraging Copado and Azure DevOps, complete the setup steps described in this article.

Setup 

The first thing you need to do is log in to our Success Community and install the package. You can install the package in a sandbox or in a production/developer org, and choose to install it for admins only (the recommended option), for all users, or for specific profiles.

Configuring the Named Credential

Once the package has been successfully installed, you need to configure a named credential. According to Salesforce, a named credential specifies the URL of a callout endpoint and its required authentication parameters. To configure a named credential, follow the steps below:

  1. Go to Setup > Named Credentials and click on New Named Credential.
  2. Give your named credential a name, e.g. Azure DevOps Integration.
  3. Enter a URL. The accepted Azure DevOps URL is https://dev.azure.com/[OrganizationName]/. Taking Copado Academy as the organization name, the URL would look like this: https://dev.azure.com/CopadoAcademy/.
    If you configured the integration before VSTS was rebranded as Azure DevOps, the old URL endpoint https://[CompanyName].visualstudio.com/ is still valid.
  4. In the Identity Type field, select Named Principal. This means that one user will provide the authentication for all callouts from Salesforce to the external application.
  5. Select Password Authentication as the Authentication Protocol.
  6. Enter your username and the Azure DevOps API token. If you don’t have an API token yet, follow these steps to create one:
    1. Sign in to your organization in Azure DevOps using https://dev.azure.com/[CompanyName].
    2. Go to User Settings > Security > Personal access tokens.
    3. Click on New Token.
    4. Give your token a name and set an expiration date.
  7. Back in Salesforce, set the Generate Authentication Header checkbox to True.
  8. Click on Save.
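
Although not required by the setup, you can quickly check that the named credential authenticates correctly by running a short script in the Developer Console (Anonymous Apex). The snippet below is only a sketch: it assumes the named credential API name is Azure_DevOps_Integration (adjust it to the API name you chose), and it calls the standard Azure DevOps REST endpoint that lists the projects in your organization (the organization name is already part of the named credential URL).

// Minimal sketch to test the named credential from Anonymous Apex.
// Azure_DevOps_Integration is an assumed API name; replace it with yours.
HttpRequest req = new HttpRequest();
// _apis/projects is the standard Azure DevOps REST call that lists the projects in the organization
req.setEndpoint('callout:Azure_DevOps_Integration/_apis/projects?api-version=6.0');
req.setMethod('GET');
HttpResponse res = new Http().send(req);
// A 200 status code and a JSON body listing your projects means the credential works
System.debug(res.getStatusCode() + ' ' + res.getBody());
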
Creating the Copado Integration Setting Record

Now that you have the named credential, you need to create a Copado Integration Setting record to link the named credential with the project where you would like to import your work items.

  1. Navigate to the Copado Integration Setting tab and click on New.
  2. Give the setting a name, e.g. Copado <> Azure DevOps.
  3. In the External System field, select Visual Studio Team Services.
  4. In the Named Credential field, enter the exact name (API name) of the named credential you created in the previous step.
  5. Click on Save.

Updating the Project Layout

To be able to make the necessary changes to the project, you need to add a few elements to the project layout.

  1. Go to Setup > Object Manager > Project.
  2. Click on Page Layouts and then on Project Layout.
  3. Add a new section named Azure DevOps Integration with the following fields to the layout:
    1. Workspace Id
    2. Project External Id
    3. Copado Integration Setting
    4. Enable Logs
    5. Team Info
    6. Enable Community Users
In the Team Info field you can specify the teams whose sprints should be fetched. If you don't specify anything, sprints from all teams will be fetched.
  4. Add the following buttons to the layout:
    1. Sync External User Stories
    2. Sync User Stories with Sprints
  5. Add the following related lists:
    1. Field Mappings. Click on the wrench icon in the related list to open the related list properties and include the following fields as the related list’s columns:
      1. Field Mapping Name
      2. Salesforce Field Name
      3. Third Party Field Name
      4. Exclude from Salesforce update
      5. Exclude from Third Party update
      6. Created By
      7. Last Modified By
    2. Callout Logs
    3. Record Type Mappings, and include the following fields:
      1. Salesforce Record Type Name
      2. Third Party Record Type Name
Make sure the relevant profiles or permission sets have the right field-level and object-level security settings. 

Updating the User Story Object

Once you have updated the Project layout, it is time to update the Status field picklist to reflect the picklist options available for this field within Azure DevOps. This ensures a seamless bidirectional update of the Status field.

To update it, follow these steps:

  1. Go to Setup > Object Manager > User Story.
  2. Click on Fields and Relationships and find the Status field.
  3. Edit the picklist values to reflect the available values for the Status field in Azure DevOps.
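
As an illustrative example, if your Azure DevOps project uses the default Agile process, the User Story work item states are New, Active, Resolved, Closed and Removed, so the Status picklist would need to include those same values. Check your own process template, as the available states can differ.
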
Setting Up Your Project

It is now time to set up your project or, if you already have one, to update it.

  1. Navigate to the project where you want to import your Azure DevOps work items. If you have not created a project yet, follow the instructions in the article Project Overview and Setup to create a new one.
  2. In the Copado Integration Setting lookup, find the setting you have previously created.
  3. If you want to include any active community users for user field mapping in the sync, select the Enable Community Users checkbox.
    1. This allows you to include or exclude community users in the Azure DevOps sync with Copado. The default value is unchecked, so if you want to include any active community users for user field mapping, such as Assignee to Developer, check this field to enable them.
  4. Populate the Project Id from Azure DevOps. This is the name of the project as shown under Project details in Azure DevOps. If the project name has spaces, replace them with %20 (e.g. Copado Academy becomes Copado%20Academy).
  5. Enter the Workspace Id from Azure DevOps. To get this Id, follow the steps below:
    1. Navigate to your Azure DevOps project > Boards > Queries, and click on New query.
    2. Build a query that suits your needs.
    3. Click on Run Query to test it and then on Save Query if you are happy with the results.
    4. Give your query a name and select a folder to store it. Then, click on OK.
    5. Finally, copy the Id included in the URL bar.
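
As an illustrative example (the exact URL pattern can vary between Azure DevOps versions), the URL of a saved query typically looks like https://dev.azure.com/CopadoAcademy/MyProject/_queries/query/1a2b3c4d-0000-0000-0000-000000000000, where the GUID at the end is the Id to copy into the Workspace Id field. The organization, project name and GUID shown here are placeholders.
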
Configuring the Field Mapping

Configure the Field Mapping object by inserting the default mapping file, which can be downloaded using the following link: Azure DevOps Field Mappings. The insert can be done with any of the Salesforce data upload tools, such as Data Loader or Metadata Inspector.

Before importing the file, you need to enter the corresponding Project__c record Id in the CSV file.

The Azure DevOps integration can only map the fields included in the default mapping. Adding extra fields to that mapping is not supported by Copado and might not work successfully.

While custom field mappings are not currently supported by Copado, they can be created manually from the Field Mapping related list or by uploading a CSV file. To do so, you will need the Salesforce API name of the field you are mapping and the reference name of the field in Azure DevOps. The reference names can be found by updating the following link to reflect your Azure DevOps instance: https://dev.azure.com/{organization}/{project}/_apis/wit/fields?api-version=6.0

The results can be reformatted in a more palatable manner using a JSON formatter like: https://jsonformatter.curiousconcept.com/#
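
If you prefer to retrieve that list from within Salesforce instead of the browser, the same endpoint can be called through the named credential. The snippet below is only a sketch: Azure_DevOps_Integration and MyProject are assumed placeholder names for the named credential and the project, and the loop prints each field’s display name next to its reference name.

// Sketch: list Azure DevOps field reference names through the named credential.
// Azure_DevOps_Integration and MyProject are placeholders; adjust them to your org and project.
HttpRequest req = new HttpRequest();
req.setEndpoint('callout:Azure_DevOps_Integration/MyProject/_apis/wit/fields?api-version=6.0');
req.setMethod('GET');
HttpResponse res = new Http().send(req);
// Azure DevOps list responses wrap the records in a "value" array
Map<String, Object> body = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
for (Object item : (List<Object>) body.get('value')) {
    Map<String, Object> fieldInfo = (Map<String, Object>) item;
    System.debug(String.valueOf(fieldInfo.get('name')) + ' -> ' + String.valueOf(fieldInfo.get('referenceName')));
}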

Configuring the Record Type Mapping

Mapping Work Item Types with Record Types in the Field Mapping object is supported in the Azure DevOps integration. 

You need to manually configure the Record Type Mapping records for the following record types:

Record Type Mapping records for Azure DevOps

To do that, follow the steps below: 

  1. Navigate to the Projects tab.
  2. Select the project you want to map.
  3. On the Record Type Mapping related list, click on New:
New Record Type Mapping
  4. Fill in the data as required and click on Save.
    1. Example for the User Story record type:
      1. Salesforce Record Type Name: User Story.
      2. Third Party Record Type Name: User Story.
      3. Click on Save.
Example of a Record Type Mapping record for User Story

Repeat the process for all the other record types. 

Optional Steps

Schedule User Stories Retrieve

You can schedule an Apex job to retrieve the user stories from Azure DevOps based on your preferred frequency.

The class ScheduleUserStoryFetch has been created to perform a bulk import of User Story records from the external provider to Salesforce. Depending on the configuration of its cron expression, it will carry out the bulk operation periodically. It will retrieve all the mapped fields and will update the Salesforce fields with the external data.

Below is a sample script showing how to schedule the fetch process:

// Parameters to use in the scheduled job
String myProjectRecordId = 'your_project_id'; // Salesforce Id of the Project record
Boolean withSprint = true;                    // set to false to skip sprint information

// Schedule the project sync every day at 12:00
copadoccmint.ScheduleUserStoryFetch scheduledClass = new copadoccmint.ScheduleUserStoryFetch(myProjectRecordId, withSprint);
String scheduleJobId = System.schedule('UserStoryFetch - DailyJob', '0 0 12 1/1 * ? *', scheduledClass);
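
The cron expression '0 0 12 1/1 * ? *' runs the job every day at 12:00. If you later need to stop the job, for example to change its frequency, the Id returned by System.schedule() can be used to abort it; this is standard Apex rather than anything Copado-specific:

// Cancel the scheduled fetch job using the Id returned by System.schedule()
System.abortJob(scheduleJobId);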

A process builder flow called SendUpdatedValues2TP has been created to update changes in user stories on the external provider. It is executed every time a change in a user story is detected and sends the modified fields to the external object fields. You can change the criteria of this process builder to adapt it to your needs/process.

Troubleshooting

If you experience any parsing issues in the integration, replace line 186 with line 188 in the VSTSIntegration class inside the fetchRecords method.

Map<String, Object> results = (Map<String, Object>) JSON.deserializeUntyped(responseText);
//use below line if you experience parsing issues on above
//Map<String, Object> results = (Map<String, Object>) JSON.deserializeUntyped(CopadoCCMUtilities.escapeInvalidChars(responseText));

