
a9s Messaging Migration

This document describes how to migrate data between two a9s Messaging service instances. The migration steps below have been tested for the following cases:

  • from 3.7.x to 3.10.x
  • from 3.8.x to 3.10.x
  • from 3.10.x to 3.12.x

Migration Strategy

In this scenario, we will use an SSH tunnel to get the RabbitMQ server definitions from an a9s Messaging service instance and then restore them on another service instance using a similar process. This method migrates only the queue and exchange structures, not the messages contained in them.

Migration Steps

Create New a9s Messaging Service Instance

Create a new and empty a9s Messaging service instance that will be the target for the data migration from an existing a9s Messaging service instance.
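
If you create the instance with the cf CLI, a minimal sketch looks like the following; the service offering and plan names here are assumptions, so check cf marketplace for the names available on your platform:

$ cf create-service a9s-messaging <PLAN_NAME> <NEW_SERVICE_NAME>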

Notes

  • Do not insert any data into the new instance before the migration and verification are done.
  • Create the new service instance with capabilities equal to or greater than those of the existing instance.

Reverse Tunnel

The goal is to give the application developer access to the RabbitMQ service instance. For this, it is necessary to create an SSH tunnel to the service instance.

In the end, we want to achieve the scenario below for each of the supported migration paths (3.7.x to 3.10.x, 3.8.x to 3.10.x, and 3.10.x to 3.12.x):


          /----> * RabbitMQ (old version) *               /----> * RabbitMQ (new version) *
         /                                               /
* CF Application * (via SSH Tunnel)             * CF Application * (via SSH Tunnel)            [Infrastructure network]
--------/-----------------------------------------------/--------------------------------------------------------------
       /                                               /                                        [Developer network]
      /                                               /
* Developer Machine *                          * Developer Machine *

Create Tunnel

It is possible to access any of the a9s Data Services locally. That means you can connect with a local client to the service for any purpose such as debugging. Cloud Foundry (CF) provides a smart way to create SSH forward tunnels via a pushed application. For more information about this feature see the Accessing Apps with SSH section of the CF documentation.

First of all, you must have an application bound to the service instance. To learn how to do this, see Bind an application to a service instance.

NOTE: cf ssh support must be enabled in the platform. If you are not sure, run the following command:

$ cf ssh-enabled <APP_NAME>
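
If SSH is disabled for the application and you have the required permissions, it can usually be enabled as sketched below; note that the application must be restarted for the change to take effect:

$ cf enable-ssh <APP_NAME>
$ cf restart <APP_NAME>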

Follow the section Create a Tunnel to The Service to create the reverse tunnel.

Create RabbitMQ Administrator Credentials

You will need a user with administrative privileges to perform the tasks required for the migration. On both instances, create a RabbitMQ user with the administrator role using the following command:

cf create-service-key my-messaging-service my-key -c '{"roles": ["administrator", "management"]}'
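
You can then read the generated credentials from the service key; the exact fields depend on your platform, but they typically include the username, password, and hostnames of the instance:

$ cf service-key my-messaging-service my-key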

Use these credentials to run all the commands against the RabbitMQ instances described in the next sections.

Export Definitions From Old Instance

To export and import RabbitMQ definitions files on a9s Messaging services, you can use two methods: downloading the file directly through the HTTP API, or using the Backup Manager to create a backup that contains the service instance definitions and extracting the definitions file from it to import on the new instance.

To get the definitions file through the HTTP API, make sure the SSH tunnel to the old instance is set up as described in the previous section, and run the following command from your local environment:

cf ssh <APP_NAME> -L 127.0.0.1:15672:<OLD_SERVICE_HOSTNAME>:15672

Then you can use curl to download the definitions file, passing the administrator credentials created in the previous step:

$ curl -o definitions.defs -u <SERVICE_ADMIN_USERNAME>:<SERVICE_ADMIN_PASSWORD> -X GET http://localhost:15672/api/definitions
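
As a quick sanity check, you can confirm that the downloaded file is valid JSON and see how many queues and exchanges it contains; this sketch assumes jq is installed on your machine:

$ jq '{queues: (.queues | length), exchanges: (.exchanges | length)}' definitions.defs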

If for some reason you cannot access the HTTP API, you can use the Backup Manager to get the definitions file. Access the service dashboard of the service instance you want to migrate from; you can find the dashboard URL with this command:

$ cf service <SERVICE_NAME> | grep dashboard

Make sure you set an encryption password for the backups using the service instance dashboard. Then create a backup using the dashboard and download it to your local machine.

Use the password you set before to decrypt the backup and write its contents to a file:

cat <BACKUP_FILE> | openssl enc -aes256 -md md5 -d -pass 'pass:<BACKUP_PASSWORD>' | gunzip -c > definitions.defs

This definitions file does NOT contain any message data, only queue and exchange definitions. The message data migration will be handled in a later step.

Import Definitions to New Instance

To import the queue definitions to the new instance, set up the SSH tunnel to the new instance:

cf ssh <APP_NAME> -L 127.0.0.1:15672:<NEW_SERVICE_HOSTNAME>:15672

Then you can use curl to import the definitions to the new instance like this:

$ curl --header 'Content-Type: application/json' -u <SERVICE_ADMIN_USERNAME>:<SERVICE_ADMIN_PASSWORD> -X POST -T definitions.defs http://localhost:15672/api/definitions
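
To verify the import, you can list the queues that now exist on the new instance and compare them against the old one; this sketch assumes the tunnel to the new instance is still open and that jq is installed:

$ curl -s -u <SERVICE_ADMIN_USERNAME>:<SERVICE_ADMIN_PASSWORD> http://localhost:15672/api/queues | jq '.[].name'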

With this step done, we will have both instances set up with the same structure, so we can proceed to the application migration.

Migrate Producer Applications

You can now switch your producers to use the new service instance. How to do this depends on your application setup; reconfigure your load balancer or your producer applications accordingly.
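
For a producer that is a CF application bound to the old instance, one possible way to switch it over is to rebind it to the new instance and restage it so that it picks up the new credentials; the names below are placeholders:

$ cf unbind-service <PRODUCER_APP_NAME> <OLD_SERVICE_NAME>
$ cf bind-service <PRODUCER_APP_NAME> <NEW_SERVICE_NAME>
$ cf restage <PRODUCER_APP_NAME>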

Migrate Consumer Applications

Once the queues in the old instance are almost empty, you can stop the consumers. If message ordering is important to you, wait a bit longer before stopping them so that they finish draining the queues on the old instance. When the queues are empty, reconfigure the consumers as you did for the producers and restart them. At this point, everything is migrated to the new instance.
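
To check whether the queues on the old instance are empty, you can query its management API through the SSH tunnel; this sketch assumes the tunnel to the old instance is still open and that jq is installed:

$ curl -s -u <SERVICE_ADMIN_USERNAME>:<SERVICE_ADMIN_PASSWORD> http://localhost:15672/api/queues | jq '.[] | {name, messages}'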

Decommission Old Instance

The last step is to stop the old service instance. Your migration is now complete.
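
If decommissioning the old instance means removing it from the platform, a typical sequence, assuming no applications are still bound to it and using the placeholder names from this document, is to delete the migration service key and then the instance itself:

$ cf delete-service-key my-messaging-service my-key
$ cf delete-service <OLD_SERVICE_NAME>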