BizTalk Server 2016 and SQL Server 2016 SP2

BizTalk Server 2016 Cumulative Update 5 was released last week. One of the items in the CU list was this KB – adding support in BizTalk Server 2016 for SQL Server 2016 SP2.

Why is this important? Because this service pack simplifies the deployment of SQL components when setting up BizTalk Server 2016 highly available environments. Continue reading “BizTalk Server 2016 and SQL Server 2016 SP2”

Automating API Management Backup and Restore with Logic Apps

I’ve been working during the last week or so on setting up a DR strategy for a solution based on API Management, Azure Functions and Service Bus. Most of the deployment to the secondary site is handled by VSTS, but one of the main issues with the proposed strategy was that the APIM instance used is the Standard tier, which doesn’t allow multi-region deployments. So, to guarantee that all APIM configuration, including users, API policies and subscriptions, is carried over, I had to leverage the backup/restore functionality available in APIM, exposed through the Management API.

The API calls for backup and restore are quite straightforward, but they use an authorization token that must be requested before the API call can be executed. So, to automate the process of generating the token and executing the backup or restore API calls, I decided to use Logic Apps. Continue reading “Automating API Management Backup and Restore with Logic Apps”
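As a rough sketch of what the Logic App automates, the two HTTP calls can be outlined as below. This is not the post’s actual implementation: the storage account, container and backup names are placeholders, and the `api-version` value is an assumption – check the current Management API documentation before using it.

```python
def token_request_body(client_id: str, client_secret: str) -> dict:
    # Azure AD client-credentials grant; the returned token authorizes
    # calls against the Azure Management API.
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://management.azure.com/",
    }


def backup_request(subscription_id: str, resource_group: str, service_name: str):
    # POSTing this body to the URL below triggers an APIM backup into
    # the given blob storage container.
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.ApiManagement"
        f"/service/{service_name}/backup"
        "?api-version=2018-01-01"  # assumed version; adjust as needed
    )
    body = {
        "storageAccount": "mystorageaccount",  # placeholder values
        "accessKey": "<storage-access-key>",
        "containerName": "apim-backups",
        "backupName": "apim-backup-001",
    }
    return url, body
```

The restore call has the same shape, with `/restore` in place of `/backup`; in the Logic App each of these becomes an HTTP action, with the token from the first call placed in the `Authorization` header of the second.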

Accessing Event Hubs with Confluent Kafka Library

A while ago, I was involved in a project that needed to push messages to a Kafka topic. I found that while the .NET code required for that implementation was relatively straightforward – thanks to Confluent’s .NET client for Kafka – configuring the server was a nightmare. The IT team at the client site was supposed to get the Kafka cluster sorted and dragged the issue out for a month or so. And I understood why when I tried to set up a cluster myself – configuring a Kafka cluster is not a walk in the park. If only we had a managed, one-click solution to implement an event streaming solution based on the Kafka protocol… 😀

When Microsoft announced Event Hubs support for the Kafka protocol last month, I thought that a great way to prove that this was really interoperable was to take part of the original code I wrote and see if I could connect to Event Hubs without any significant changes. And I was pleasantly surprised! The only changes required were some additions to the producer/consumer configuration. This post shows how I managed to get this working, and covers one of the main gotchas I found along the way. Continue reading “Accessing Event Hubs with Confluent Kafka Library”