Running Dynamics 365 Configuration Data Mover jobs in VSTS builds

In today's post I will show how to use my Dynamics 365 Configuration Data Mover utility for synchronizing configuration data between CRM orgs as part of a Visual Studio Team Services build.

  1. Download the latest version of the Configuration Data Mover utility's CLI tool from my repository on GitHub here: https://github.com/lucasalexander/AlexanderDevelopment.ConfigDataMover/releases.
    Any version from 2.2 onward should work, and make sure you get the "AlexanderDevelopment.ConfigDataMover.Cli.zip" file.
  2. Extract everything from the zip archive and add the entire directory to your source code repository. I have historically used a "tools" directory that contains various utilities, so I keep the Data Mover CLI tool in its own subdirectory there.
  3. Add a directory to your source code repository to hold configuration data mover job files and JSON data extracts. I call mine "data."
  4. Set up a data mover job using the GUI tool. You can create a job that syncs data between two CRM orgs or a job that reads from a file source. I prefer file sources for deployments because they remove the dependency on the development org and allow the configuration data itself to be kept under source control. You can also either save your connection parameters in the job file or pass them to the Data Mover at run time.
  5. After you create and test your data mover job, save a copy of the XML job file to your source control "data" directory.
  6. (This step is only required for file sources when you are saving the connection parameters in the job file.) Open the XML file in a text editor and find the <ConnectionDetails> node toward the bottom. Make sure its "source" attribute is set to the path of the JSON data file relative to the location of your AlexanderDevelopment.ConfigDataMover.Cli.exe assembly. Because of how my directory structure is set up in source control, a JSON file named "alm-test.json" has "..\..\data\alm-test.json" as its relative path (see the job file sketch after this list). Here's a screenshot of my directory structure for reference:
    [Screenshot: directory structure]
  7. Commit your changes to your VSTS instance. I am using Git, but this should work for TFS, too.
  8. Create a new build and add a batch script step.
  9. Configure the batch script step to run the AlexanderDevelopment.ConfigDataMover.Cli.exe assembly you pushed to source control in step #7.
    [Screenshot: setting the .exe path]
  10. Set the arguments to include the relative path to your job configuration file. For this example, it is -c ..\..\data\alm-test-import.xml (see the command-line sketch after this list).
    [Screenshot: setting the arguments]
  11. (This step is only required if you are not saving the connection parameters in the job file.) Add source and target arguments as well: the source argument is -s, and the target argument is -t. The values look just like the source and target attributes in a job configuration file in which the connection parameters have been saved.
  12. Set the working folder value to the directory that contains the AlexanderDevelopment.ConfigDataMover.Cli.exe assembly. You can see that setting in the screenshot above.
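
To make step 6 concrete, here is a minimal sketch of the relevant fragment of a job file. Only the <ConnectionDetails> element and its "source" attribute come from the steps above; the target placeholder and the rest of the file's layout depend on how the GUI tool saved your job, so treat this strictly as an illustration rather than a verbatim example.

```xml
<!-- Illustrative job file fragment only; the rest of the file's layout depends -->
<!-- on how the GUI tool saved your job. For a file source, "source" must be a  -->
<!-- path to the JSON data file relative to the directory that contains         -->
<!-- AlexanderDevelopment.ConfigDataMover.Cli.exe. With the CLI in its own      -->
<!-- subdirectory under "tools" and the data in "data", that is two levels up.  -->
<ConnectionDetails source="..\..\data\alm-test.json"
                   target="[your target CRM connection details]" />
```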
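
Likewise, the batch script step from steps 9 through 11 boils down to a single command line. The sketch below assumes the working folder is the CLI directory (step 12); the -c, -s, and -t switches are the ones described above, and the bracketed connection values are placeholders for whatever you would otherwise save as the source and target attributes in the job file.

```bat
REM Variant 1: connection parameters are saved in the job file, so only -c is needed.
AlexanderDevelopment.ConfigDataMover.Cli.exe -c ..\..\data\alm-test-import.xml

REM Variant 2: connection parameters supplied at run time via -s (source) and -t (target).
REM The bracketed values are placeholders - use the same format you would save in the job file.
AlexanderDevelopment.ConfigDataMover.Cli.exe -c ..\..\data\alm-test-import.xml -s "[source connection]" -t "[target connection]"
```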

At this point you should be able to save and queue the build. Assuming everything is set up properly, you should get a nice result like this:
[Screenshot: successful build]

If you do get an error, it's probably something simple like the paths not being set correctly, and a quick review of the output logs should point to the solution.

That's all there is to it. Are you using automated builds in VSTS for your Dynamics 365 projects? Do you see value in managing configuration data like this? Share your thoughts in the comments!
