The Great Azure DevOps Migration - Part 5: Prepare

Timothy McGrath

Posted on August 7, 2019

We've validated that our data is ready for import. Now, we need to prepare the data to be imported!
This is a short step, so let's enjoy the ease of this one.

If you missed the earlier posts, start here.

I highly recommend Microsoft's Azure DevOps Services Migration Guide.

Prepare Command

In the same way that we used the Migrator validate command earlier, we need to run a Migrator prepare command. This re-runs the validation but also creates a .json file that will be used for the actual import process.

So, open PowerShell in the directory that contains the Migrator.exe file (in the DataMigrationTool download) and execute the command below:

Migrator prepare /collection:[COLLECTION_ADDRESS] /tenantdomainname:[AZURE_TENANT_NAME] /region:CUS

I recommend using the localhost address for your collection to verify that you are pointed at the right server. The tenant domain name is the Azure Active Directory tenant that your newly imported data will connect to. The region must come from a narrow list of supported Azure regions, so make sure you choose one of them. View the full list here.
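For example, with a hypothetical collection URL and Azure AD tenant (substitute your own values), the call looks like this:

Migrator prepare /collection:http://localhost:8080/tfs/DefaultCollection /tenantdomainname:contoso.onmicrosoft.com /region:CUS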

Execute the command and you will see results similar to the validation run earlier.

If all goes well, you will find the new import.json file in the Logs folder of the DataMigrationTool. Inside Logs, open the newest folder, and open the import.json file in a text editor.
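If you're still in the same PowerShell session, a quick snippet (assuming Notepad as your editor) jumps straight to the newest log folder:

# Assumes the current directory is the DataMigrationTool folder.
$newest = Get-ChildItem .\Logs -Directory | Sort-Object LastWriteTime -Descending | Select-Object -First 1
notepad (Join-Path $newest.FullName "import.json")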

There are a bunch of fields in this file, but we only care about the ones at the very top. Update the following fields:

- Target.Name - the name of the organization that will be created in Azure DevOps.
- Properties.ImportType - DryRun for this initial test.

The two source fields will be updated in the next post.
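For reference, here's a rough sketch of the top of import.json; the exact layout can vary by Data Migration Tool version, and "myneworg" is a placeholder value:

{
  "Source": {
    "Location": "",
    "Files": {
      "Dacpac": ""
    }
  },
  "Target": {
    "Name": "myneworg"
  },
  "Properties": {
    "ImportType": "DryRun"
  }
}

Source.Location and Source.Files.Dacpac stay empty for now; we'll fill those in once the data is uploaded to storage.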

Azure Storage

Next, you need to set up an Azure Storage container. This is the location you will move the file containing all of your TFS data to before importing it into Azure DevOps Services.

In Azure, you just need to create a new standard storage account with a blob container. The account has to be created in the same data center region you set in the import.json file, so make sure you pay attention to that!

I simply created a standard storage account in Central US. Easy.
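If you'd rather script the storage setup, here's a minimal sketch using the Az PowerShell module; the resource group, account, and container names are placeholders, and the storage account name must be globally unique:

# Requires the Az module and an authenticated session (Connect-AzAccount).
New-AzResourceGroup -Name "tfs-migration-rg" -Location "centralus"

# The account must be in the same region you set in import.json (CUS = Central US).
$account = New-AzStorageAccount -ResourceGroupName "tfs-migration-rg" -Name "tfsmigrationstore" -Location "centralus" -SkuName "Standard_LRS"

# Blob container that will hold the exported collection data.
New-AzStorageContainer -Name "tfs-import" -Context $account.Context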

What's Next?

We're so close! Our data is now prepared for import!

In the next step, we'll push the data to the Storage container and begin the import process!
