Automating API Management CI/CD with APIM DevOps Resource Kit

Hey everyone! It’s been a while, right? The only thing I have to say in my defense is that the last 5 months have been crazy with travel, presentations and lots of work, so I couldn’t get myself to create a good blog post. But enough excuses already… Remember the “lots of work” part? That involved working with different clients, and a common theme among many of them was the implementation of a centralized API layer to expose the backend services they were creating. That meant I had to find a good way to automate the publishing of the APIs created in API Management across the different environments. Previously, I’ve relied on the fantastic work that Mattias Logdberg did with the API Management ARM Template Creator (which is also worth a look). But now that Microsoft has some official guidance for that extraction, I decided to have a go with it.

API Management DevOps Resource Kit

This guidance from Microsoft comes in the form of a DevOps Resource Kit, an open source project that provides guidance and components to support the creation and/or extraction of APIs from an API Management instance deployed to Azure. You can find the resource kit on its official GitHub repository. Since the page is fairly well documented and explains how Microsoft sees the lifecycle of APIs within API Management being managed through a DevOps pipeline, I will not spend much time on it. But I think it is still worth sharing the process visually, so I will present here a copy of the life cycle diagram from the repository, for your convenience:

API Management life cycle from the APIM DevOps Resource Kit.

As the image above shows, the whole guidance is based on a set of ARM templates that can be used to sync one or more APIs between APIM instances, allowing you to:

  • Generate a set of base templates, and publish them to a working instance of APIM.
  • Create and/or update existing APIs in the working instance.
  • Export the templates from the working instance, and push them to the master repository.
  • Publish from that master repository into Azure.

The “magic sauce” of the Resource Kit is a .NET Core solution, which contains two main applications – the creator and the extractor.

  • The creator allows you to create a new ARM template representing an API Management definition, based on an OpenAPI specification and optionally a set of policies. So if you prefer working with code automation, you can create your API definition in APIM without having to use the Azure Portal at all. That unlocks quite interesting scenarios, where the team developing the API doesn’t require contributor access to the APIM instance, and is still able to contribute APIs.
  • The extractor allows you to connect to an APIM instance and extract one or all APIs defined in that instance. This is perfect for teams that are using the GUI, or that started without any DevOps requirements or guidance and now need to automate the publication.

In my case, I’ve used the extractor to export all the APIs in a greenfield implementation. It was as simple as following the instructions of the tool, and required only three commands:

The first two commands, which require the Azure CLI, will set up the environment on the correct Azure subscription.
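These are the standard Azure CLI sign-in and subscription selection commands; the subscription ID below is a placeholder for your own.

```shell
# Sign in to Azure (opens a browser for interactive authentication)
az login

# Point the CLI at the subscription that hosts the APIM instance
az account set --subscription "<subscription-id>"
```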

The last command, which is the actual extract function, must be executed from the root folder of the .NET Core solution and requires .NET Core 2.1. The documentation doesn’t say what level of permission is required in the resource group and APIM instance, so assume contributor as a safe option.
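Run from the extractor’s folder in the resource kit solution, the full extraction looks roughly like this (argument names are taken from the resource kit’s documentation; the instance, resource group and output folder values are placeholders):

```shell
# Extract every API from the source APIM instance into ARM templates
dotnet run extract \
    --sourceApimName <apim_instance_name> \
    --destinationApimName <apim_instance_name> \
    --resourceGroup <resource_group_name> \
    --fileFolder ./extracted-templates
```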

If you want to generate the files for a single api, you will replace the last command above with the following:
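If I read the extractor’s options correctly, the single-API variant only adds the --apiName argument, something like:

```shell
# Extract a single API by its APIM name
dotnet run extract \
    --sourceApimName <apim_instance_name> \
    --destinationApimName <apim_instance_name> \
    --resourceGroup <resource_group_name> \
    --fileFolder ./extracted-templates \
    --apiName <api_name>
```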

where the <api_name> value is the name of the API in APIM:

Where to find the api_name settings in the portal

The extractor component has other, more advanced commands to deal with linked templates and policy XML files. You can find all the details of the extractor, as well as the creator, here.

Generated Files

The extract and create commands both generate the following files:

  • <apim instance name>-apis-template.json – contains all the APIs in the APIM instance, including each API’s operations and the links between APIs and products.
  • <apim instance name>-apiVersionSets.json – contains the linkage between the APIs and their defined versions.
  • <apim instance name>-authorizationServers.json – contains all the authorization servers defined in the instance.
  • <apim instance name>-backends.json – contains all the backend services defined.
  • <apim instance name>-loggers.json – contains the configured logging settings of the APIs.
  • <apim instance name>-namedValues.json – contains all stored named values. One thing to notice is that all secrets are extracted encrypted, so no secrets will be exposed in the ARM template.
  • <apim instance name>-products.json – contains all the products configured in the APIM instance.

When extracting a single API, you will find that the name will change slightly to <apim instance name>-<api name>-api-template.json.

One important thing to notice is that, apart from the API file, all the other files will have the same name in both cases, but will only include information related to the API being extracted. The one exception is the namedValues file, which contains all the named values for the whole instance.

Automating the process

Once you get all the files extracted, automating this process is relatively simple, using Azure DevOps. In my case, what I did was:

  • Add a parameter file for each template (when you export, only one parameter is generated – the APIM instance name – but you will probably want to parameterize other items, like backend services).
  • Add the files to an Azure DevOps repository
  • Create a new build pipeline file.
  • Configure the pipeline to be automatically executed when a new commit is pushed into the repository.
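As an illustration of the first step, a minimal parameter file for one of the templates might look like the sketch below. The ApimServiceName parameter is the one generated by the extractor; the value shown is a hypothetical example.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "ApimServiceName": {
      "value": "my-test-apim"
    }
  }
}
```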

The Azure Pipeline is quite simple to implement. Here is how I’ve created it:
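Here is a minimal sketch of such a pipeline, using the built-in ARM template deployment task. The file names and placeholder values are assumptions based on the extracted files described above, and in practice you would repeat the task (or loop) once per template file:

```yaml
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  # Deploy one of the extracted templates; repeat per template file
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      deploymentScope: 'Resource Group'
      azureResourceManagerConnection: '<service connection>'
      resourceGroupName: '<apim resourcegroup>'
      location: '<apim region>'
      templateLocation: 'Linked artifact'
      csmFile: '<apiminstance>-apis-template.json'
      csmParametersFile: '<apiminstance>-apis-template.parameters.json'
      deploymentMode: 'Incremental'
```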

For that to work in your case, you will need to update the following items:

  • <service connection> – this will be the service connection you create in Azure DevOps, associating a service principal to one or more resource groups. You can find more information about how to create the service connection here.
  • <apiminstance> – the name of the apim instance from where you extracted the files.
  • <apim region> – the location (azure region) where the destination APIM instance is deployed.
  • <apim resourcegroup> – the resource group where the destination APIM instance is deployed.

Summary

The new API Management DevOps Resource Kit speeds up the process of extracting APIs from your development environment into new environments. The process is quite straightforward and doesn’t require much extra work to get it automated, apart from parameterizing items like backend services and policies.

Automating it is as simple as adding a build pipeline definition to the repository and attaching it to a build/release pipeline.

I am toying with the idea of using variable groups instead of ARM template parameters. This is something I still need to research, so stay tuned. If that works, parameterization might become a bit simpler.



2 thoughts on “Automating API Management CI/CD with APIM DevOps Resource Kit”

  1. —Please disregard my previous comment. I forgot to remove the real names from my code

    Hi, great article but I could use some help or more details. I have 3 environments in Azure. In the DEV env I have an API Management instance that I have now been able to extract, getting all these generated JSON files that you mention. But I am a bit in doubt about how to proceed from here. The Automating The Process section is a bit short for me 🙂

    I want to be able to take those extracted APIs and deploy them to my TEST env, but I can see that all the APIs in the backends.json file are the ones from the DEV env and all URLs in them are hardcoded:

    "resources": [
      {
        "properties": {
          "description": "MyLogicApp-LA",
          "resourceId": "https://management.azure.com/subscriptions/sd5b086-87e4-4eb7-b6f6-1a0900db6bec/resourceGroups/myrg/providers/Microsoft.Logic/workflows/MyLogicApp-LA",
          "url": "https://dev-119.westeurope.logic.azure.com/workflows/383d73a65bbc4aeb82bc9b98bba37097",
          "protocol": "http"
        },
        "name": "[concat(parameters('ApimServiceName'), '/MyLogicApp-LA')]",
        "type": "Microsoft.ApiManagement/service/backends",
        "apiVersion": "2019-01-01"
      },

    I am not sure what the process is from here. I want to use the LogicApps that are already deployed to my TEST env and use this open source tool for this, but I could use some more guidance.

    Am I under the wrong impression, when I think that this extraction only gave me these JSON files that I can store in my repo and ONLY deploy back to my DEV env?

    1. Hi Oliver,

      The extraction is the first step. Once you have it extracted, you should parameterize the bits that change between environments. At this stage, the tool does not provide parameterization during the extraction process yet, although it is something that I think is in the pipeline, since it is one of the main tasks when creating ARM templates.

      You can find a beginner’s guide to parameters here and the official ARM template guidance here.

      If you noticed, at the end of the post I added the following paragraph: “I am toying with the idea of using variable groups instead of ARM template parameters. This is something I still need to research, so stay tuned. If that works, parameterization might become a bit simpler.” I am still in two minds about using this or ARM template parameters. Using tokens (which basically means replacing the values I want to change with specially coded values, like {{api-a-backend}}, and then using a batch process that replaces each token with the respective value for that environment) would tie the ARM template to Azure DevOps only (or at least require some pre-parsing using PowerShell or something similar). Using ARM template parameters, although more work the first time, would give me more flexibility, as there are PowerShell commands and support in the Azure Portal to import those templates (with the associated parameter file). I will write a follow-up post in the next week or so about those options.
