Automating API Management CI/CD with APIM DevOps Resource Kit

Hey everyone! It’s been a while, right? The only thing I can say in my defense is that the last five months have been crazy with travel, presentations and lots of work, so I couldn’t find the time to put together a good blog post. But enough excuses already… Remember the “lots of work” part? It involved working with different clients, and a common theme among many of them was the implementation of a centralized API layer to expose the backend services they were creating. That meant I had to find a good way to automate the publishing of the APIs created in API Management across the different environments. Previously, I relied on the fantastic work that Mattias Logdberg did with the API Management ARM Template Creator (which is also worth a look). But now that Microsoft has official guidance for that extraction, I decided to give it a go.

API Management DevOps Resource Kit

This guidance from Microsoft comes in the form of a DevOps Resource Kit, an open source project that provides guidance and components to support the creation and/or extraction of APIs from an API Management instance deployed to Azure. You can find the resource kit on its official GitHub repository. Since the page is fairly well documented and explains how Microsoft sees the lifecycle of APIs within API Management being managed through a DevOps pipeline, I will not spend much time on it. But I think it is still worth sharing the process visually, so here is a copy of the lifecycle diagram presented in the repository, for your convenience:

API Management life cycle from the APIM DevOps Resource Kit.

As per the image above, the whole guidance is based on a set of ARM templates that can be used to sync one or more APIs between APIM instances, allowing you to:

  • Generate a set of base templates, and publish them to a working instance of APIM.
  • Create and/or update existing APIs in the working instance.
  • Export the templates from the working instance, and push them to the master repository.
  • Publish from that master repository into Azure.

The “magic sauce” in the Resource Kit is a .NET Core solution, which contains two main applications – the creator and the extractor.

  • The creator allows you to create a new ARM template representing an API Management definition, based on an OpenAPI specification and, optionally, a set of policies. So if you prefer working with code automation, you can deploy your API definition to APIM without having to use the Azure Portal at all. That unlocks quite interesting scenarios, where the team developing the API doesn’t require contributor access to the APIM instance and is still able to contribute APIs.
  • The extractor allows you to connect to an APIM instance and extract one or all of the APIs defined in that instance. This is perfect for teams that are using the GUI, or that started without any DevOps requirements or guidance and now need to automate the publication.
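To give an idea of the creator workflow: it is driven by a YAML configuration file that, as far as I recall, is passed via a --configFile argument. The sketch below is illustrative only – the instance name, API name and file paths are all made up, and you should check the repository documentation for the authoritative schema:

```yaml
# Illustrative creator configuration – every name and path below is hypothetical.
version: 0.0.1
apimServiceName: my-dev-apim              # target APIM instance (made up)
apis:
  - name: orders-api                      # hypothetical API name
    openApiSpec: ./specs/orders-openapi.yaml
    policy: ./policies/orders-apiPolicy.xml
    suffix: orders                        # URL suffix for the API
    products: starter                     # product(s) to associate the API with
outputLocation: ./generated-templates     # where the ARM templates get written
```

You would then run the create command from the solution root, the same way the extract command is run below.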

In my case, I used the extractor to export all the APIs in a greenfield implementation. It was as simple as following the instructions for the tool, and required only three commands:

az login
az account set --subscription <subscription_id>
dotnet run extract --sourceApimName <name_of_the_source_APIM_instance> --destinationApimName <name_of_the_destination_APIM_instance> --resourceGroup <name_of_resource_group> --fileFolder <path_to_folder>

The first two commands, which require the Azure CLI, set up the environment on the correct Azure subscription.

The last command, which runs the actual extract function, must be executed from the root folder of the .NET Core solution and requires .NET Core 2.1. The documentation doesn’t say what level of permission you need in the resource group and APIM instance, so assume contributor as a safe option.

If you want to generate the files for a single API, replace the last command above with the following:

dotnet run extract --sourceApimName <name_of_the_source_APIM_instance> --destinationApimName <name_of_the_destination_APIM_instance> --resourceGroup <name_of_resource_group> --fileFolder <path_to_folder> --apiName <api_name>

where the <api_name> value is the name of the API in APIM:

Where to find the api_name settings in the portal

The extractor component has other, more advanced options to deal with linked templates and policy XML files. You can find all the details of the extractor, as well as the creator, here.

Generated Files

The extract and create commands both generate the following files:

  • <apim instance name>-apis-template.json – contains all the APIs in the APIM instance, including the API definitions, operations, and the connections between APIs and products.
  • <apim instance name>-apiVersionSets.json – contains the linkage between the APIs and their defined versions.
  • <apim instance name>-authorizationServers.json – contains all the Authorization Servers defined in the instance.
  • <apim instance name>-backends.json – contains all the backend services defined in the instance.
  • <apim instance name>-loggers.json – contains the configured logging settings of the APIs.
  • <apim instance name>-namedValues.json – contains all stored named values. One thing to notice is that all secrets are extracted encrypted, so no secrets will be exposed in the ARM template.
  • <apim instance name>-products.json – contains all the products configured in the APIM instance.

When extracting a single API, you will find that the name of the API template changes slightly, to <apim instance name>-<api name>-api-template.json.

One important thing to notice is that, apart from the API file, all the other files will have the same names in both cases, but will only include information related to the API being extracted. The one exception is the namedValues file, which contains all the named values for the whole instance.

Automating the process

Once you have all the files extracted, automating the process is relatively simple using Azure DevOps. In my case, what I did was:

  • Add a parameter file for each template (when you export the files, only one parameter is generated – the APIM instance name – but you will probably want to parameterize other items, like backend services, etc.).
  • Add the files to an Azure DevOps repository.
  • Create a new build pipeline file.
  • Configure the pipeline to be automatically executed when a new commit is pushed into the repository.
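As a minimal sketch, a parameter file for the apis template could look like the one below – ApimServiceName is the one parameter the extractor generates, and the value shown is a made-up instance name:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "ApimServiceName": {
      "value": "my-test-apim"
    }
  }
}
```

Any extra parameters you add to the templates (backend URLs, policy base URLs, etc.) would get an entry here, with a different parameter file per environment.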

The Azure Pipeline is quite simple to implement. Here is how I’ve created it:
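As a sketch, a YAML pipeline using the built-in Azure Resource Group deployment task could look like the following – the parameter file names are my own convention, and the angle-bracket placeholders match the list that follows:

```yaml
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  # Deploy the shared templates first (products, backends, named values, ...)
  - task: AzureResourceGroupDeployment@2
    displayName: 'Deploy products template'
    inputs:
      azureSubscription: '<service connection>'
      action: 'Create Or Update Resource Group'
      resourceGroupName: '<apim resourcegroup>'
      location: '<apim region>'
      templateLocation: 'Linked artifact'
      csmFile: '<apiminstance>-products.json'
      csmParametersFile: '<apiminstance>-products.parameters.json'

  # Then deploy the APIs template, which references the resources above
  - task: AzureResourceGroupDeployment@2
    displayName: 'Deploy APIs template'
    inputs:
      azureSubscription: '<service connection>'
      action: 'Create Or Update Resource Group'
      resourceGroupName: '<apim resourcegroup>'
      location: '<apim region>'
      templateLocation: 'Linked artifact'
      csmFile: '<apiminstance>-apis-template.json'
      csmParametersFile: '<apiminstance>-apis-template.parameters.json'
```

You would add a step like the second one for each of the other generated files, in an order that respects the dependencies between them.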

For this to work in your case, you will need to update the following items:

  • <service connection> – the service connection you create in Azure DevOps, associating a service principal with one or more resource groups. You can find more information about how to create the service connection here.
  • <apiminstance> – the name of the APIM instance from which you extracted the files.
  • <apim region> – the location (Azure region) where the destination APIM instance is deployed.
  • <apim resourcegroup> – the resource group where the destination APIM instance is deployed.


The new API Management DevOps Resource Kit speeds up the process of extracting APIs from your development environment into new environments. The process is quite straightforward and doesn’t require much extra work to automate, apart from parameterizing items like backend services and policies.

Automating it is as simple as adding a quite straightforward build pipeline to the repository and attaching it to a build/release pipeline.

I am toying with the idea of using variable groups instead of ARM template parameters. This is something I still need to research, so stay tuned. If it works, parameterization might become a bit simpler.


15 thoughts on “Automating API Management CI/CD with APIM DevOps Resource Kit”

  1. —Please disregard my previous comment. I forgot to remove the real names from my code

    Hi, great article, but I could use some help or more details. I have 3 environments in Azure. In the DEV environment I have an API Management instance that I have now been able to extract, getting all these generated JSON files that you mention. But I am a bit in doubt about how to proceed from here – the Automating The Process section is a bit short for me 🙂

    I want to be able to take those extracted APIs and deploy them to my TEST env, but I can see that all the APIs in the backends.json file are the ones from the DEV env and all URLs in them are hardcoded:

    "resources": [
      {
        "properties": {
          "description": "MyLogicApp-LA",
          "resourceId": "",
          "url": "",
          "protocol": "http"
        },
        "name": "[concat(parameters('ApimServiceName'), '/MyLogicApp-LA')]",
        "type": "Microsoft.ApiManagement/service/backends",
        "apiVersion": "2019-01-01"
      }
    ]

    I am not sure what the process is from here. I want to use the LogicApps that are already deployed to my TEST env and use this open source tool for this, but I could use some more guidance.

    Am I of the wrong impression, when I think that this extraction only gave me these json files that I can store in my repo and ONLY deploy back to my DEV env?

    1. Hi Oliver,

      The extraction is the first step. Once you have it extracted, you should parameterize the bits that should change between environments. At this stage, the tool does not provide parameterization during the extraction process yet, although it is something that I think is in the pipeline, since it is one of the main tasks when we are creating ARM templates.

      You can find a beginner’s guide to parameters here and the official ARM template guidance here.

      If you noticed, at the end of the post I added the following paragraph: “I am toying with the idea of using variable groups instead of ARM template parameters. This is something that I need to do some research, so stay tuned. If that works parameterization might become a bit simpler.” I am still in two minds about using this or ARM template parameters, because using tokens (which are basically replacing the values I want to change with specially coded values, like {{api-a-backend}}, then using a batch process that replaces each token with the respective value for that environment) would tie the ARM template to Azure DevOps only (or at least require some pre-parsing using PowerShell or something similar), while using ARM template parameters, although more work the first time, would give me more flexibility, as there are PowerShell commands and support in the Azure Portal to import those templates (with the associated parameter file). I will write a follow-up post in the next week or so about those options.

  2. Good day, I followed the blog (thanks).
    I am close, but “no cigar”, so to speak.
    The issue is that I am not sure how to get the value for “destinationApimName”.
    I am using “test”…
    API Management Template

    Connecting to REDACTED API Management Service on REDACTED Resource Group …
    Executing full extraction …
    Extracting global service policy from service
    Error occured: Response status code does not indicate success: 404 (Not Found).
    Response status code does not indicate success: 404 (Not Found).
    PS C:\Tools\azure-api-management-devops-resource-kit-master\src\APIM_ARMTemplate\apimtemplate>

    1. Hi John,

      destinationApimName seems to be a placeholder for the destination parameter value, so in theory any value is valid. I’ve executed mine using temp and it worked without a glitch. What I suggest is that you double-check that you have enough permissions on the resource group/API Management instance to execute the command.
      I don’t know yet what the minimum required permissions are, so I would suggest having contributor at the resource group level. Also make sure you have the latest version of the code.

      BTW, sorry for the late reply; I was replacing computers, so it took me a while to get everything back in place.

      I hope this helps.

  3. @wsilveiranz, excellent explanation. I’m trying to implement CI/CD but without success. I’m facing an issue extracting templates from an APIM instance; dotnet run extract “complains”: Unrecognized command or argument ‘c:/….’ (the file path) on --fileFolder. I still don’t see where I’m failing.

    1. Hey Edmar.

      Looking at your error, I can think of one of two things: either your path is using the wrong “slash” (it looks like you are using / (forward slash) instead of \ (backslash) in your path), or – the other thing that always tricks me – you have a space in your path, in which case you should wrap it in quotes (e.g. c:\temp\my folder should be “c:\temp\my folder”).

      I hope this helps, Wagner.

      1. Thank you @wsilveiranz, I could fix it. As I mentioned on the issue (on the project’s GitHub), I realized that I had passed the wrong parameter (I was typing --resourceGroupName instead of --resourceGroup), but I believe the point is still valid, since the command should return an error about the --resourceGroup parameter, not about the file path. 🙂 Now I’m facing another issue validating the templates in Azure DevOps pipelines, at the moment of reading the templates from Azure Storage (blob):
        [error]Multiple error occurred: BadRequest,BadRequest,BadRequest,…


        ##[error]InvalidContentLink: Unable to download deployment content from ‘’. The tracking Id is ‘blabla-blabla-blabla-49f4b900b5b6’. Please see for usage details.

        I’m not sure what is wrong, because everything looks OK. The only point I’m suspicious about is the extension of the file, which here is showing as *.jsonlinkedTemplate. Any ideas?

        1. Hi Edmar,

          Sorry for the late reply. How are you setting up the validation task? It feels like this error is related to how the task is set up.

          Cheers, Wagner.

  4. Hi, interesting read – we’ve managed to implement a similar process without making use of this tool; extracting APIs and importing isn’t an issue, until specific policies tied to operations, APIs and products differ between environments. I was wondering if anyone had any insight into this problem and potential solutions for it. Currently we define operations per environment and pull environment-specific operations in during import. While this solution is working, it adds a significant amount of overhead and introduces a risk of potential misconfiguration. Any ideas or assistance would be greatly appreciated.

    1. Hi Vinni,

      What do you mean by “operations per environment”? Does your API have different operation names, or are some operations not valid in specific environments?
      Could you give some examples? I’m trying to understand the problem to see if I can think of something that might be helpful.

      Cheers, Wagner.

  5. This is a great post. Thank you!

    Is there a way to programmatically save and re-apply policies for APIM? It doesn’t look like these are part of an ARM template (as they’re XML files instead). I’m looking to develop APIs and policies in Dev and then deploy them to production.

    1. Hi John,

      As far as I know, there is no way to deploy just the policies. The policies are part of the API ARM definition, although you can export them separately into their own XML files for easier management. When you do that, you have something like this in your template:

      "properties": {
        "value": "[concat(parameters('PolicyXMLBaseUrl'), '/claimcheck-sender-apiPolicy.xml')]",
        "format": "rawxml-link"
      },
      "name": "[concat(parameters('ApimServiceName'), '/claimcheck-sender/policy')]",
      "type": "Microsoft.ApiManagement/service/apis/policies",
      "apiVersion": "2019-01-01",
      "dependsOn": [
        "[resourceId('Microsoft.ApiManagement/service/apis', parameters('ApimServiceName'), 'claimcheck-sender')]"
      ]

      for each policy you created on the API (e.g. one for the API policy itself, one for each operation policy, one for each product policy, etc.).

      If you haven’t tried it yet, you should take a look at the VS Code extension for APIM. It has some new functionality, including the ability to connect directly to an APIM instance and develop the policy locally. It also has the extractor implemented as part of the extension, so you don’t have to programmatically execute the code or maintain a copy of it anymore.

      Take a look at my MS Build announcements, where I talk about the extension for a bit (this is a post on my list to create soon); there is also a link to a session from Matt Farmer and Miao Jiang, where Miao shows the extension in action.

  6. Hi Wagner,

    Excellent article! When I try to extract a single API, it still tries to export all named values and all backends.
    Is there a way to extract only the required ones?

  7. Hi,
    I have used the command in the Azure CLI:
    dotnet run extract --sourceApimName my-feature-apim --destinationApimName company-stable-apim --resourceGroup my-feature-rg --fileFolder c:\temp\apim-extract
    but it throws the error below. Can this extractor command work in VS?

    Couldn’t find a project to run. Ensure a project exists in F:\AIS POC\AIS Echo API, or pass the path to the project using --project.

    1. Hi Sunaya, sorry for the late reply; I’ve been away from the blog for a while. I would suggest you take a look at the new API Management extension for VS Code, which has the API extractor embedded. You can find a link to a presentation I did for the Brisbane Azure User Group a while ago about this subject.
