<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Dynamics 365 - Alexander Development]]></title><description><![CDATA[Dynamics 365 - Alexander Development]]></description><link>https://alexanderdevelopment.net/</link><image><url>https://alexanderdevelopment.net/favicon.png</url><title>Dynamics 365 - Alexander Development</title><link>https://alexanderdevelopment.net/</link></image><generator>Ghost 1.20</generator><lastBuildDate>Mon, 24 Aug 2020 19:52:59 GMT</lastBuildDate><atom:link href="https://alexanderdevelopment.net/tag/dynamics-365/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Building a custom Dynamics 365 data interface with OpenFaaS]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Over the past several months, I've been doing a lot of work with <a href="https://github.com/openfaas/faas">OpenFaaS</a> in my spare time, and in today's post I will show how you can use it to easily build and deploy a custom web service interface for data in a Dynamics 365 Customer Engagement online tenant.</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/07/05/building-a-custom-dynamics-365-data-interface-with-openfaas/</link><guid isPermaLink="false">5b3a415c97f5e30001931b7f</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[OpenFaaS]]></category><category><![CDATA[serverless]]></category><category><![CDATA[C#]]></category><category><![CDATA[integration]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Thu, 05 Jul 2018 17:28:47 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/07/openfaas-d365-header.png" medium="image"/><content:encoded><![CDATA[<div 
class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/07/openfaas-d365-header.png" alt="Building a custom Dynamics 365 data interface with OpenFaaS"><p>Over the past several months, I've been doing a lot of work with <a href="https://github.com/openfaas/faas">OpenFaaS</a> in my spare time, and in today's post I will show how you can use it to easily build and deploy a custom web service interface for data in a Dynamics 365 Customer Engagement online tenant.</p>
<h4 id="openfaas">OpenFaaS</h4>
<p>If you're not familiar with OpenFaaS, it's basically a serverless functions platform like <a href="https://azure.microsoft.com/en-us/services/functions/">Azure Functions</a> or <a href="https://aws.amazon.com/lambda/">AWS Lambda</a>, but you run it on Kubernetes or Docker Swarm on your own servers or in the cloud. What I particularly like about OpenFaaS compared to the various commercial serverless platforms is that in addition to offering more control over how/where it's deployed, OpenFaaS supports a wider variety of languages for writing serverless functions.</p>
<blockquote>
<p>OpenFaaS (Functions as a Service) is a framework for building serverless functions with Docker and Kubernetes which has first class support for metrics. Any process can be packaged as a function enabling you to consume a range of web events without repetitive boiler-plate coding.</p>
</blockquote>
<p>To follow along with the samples in this post, you'll need access to a cluster with OpenFaaS deployed, so if you don't already have one, now would be an excellent time to look at the OpenFaaS <a href="http://docs.openfaas.com/deployment/">deployment docs</a> or maybe even work through the <a href="https://github.com/openfaas/workshop">hands-on workshop</a>. I've also previously written about how to securely deploy OpenFaaS on a free <a href="https://alexanderdevelopment.net/post/2018/02/25/installing-and-securing-openfaas-on-a-google-cloud-virtual-machine/">Google Cloud VM with Docker Swarm</a> or on an <a href="https://alexanderdevelopment.net/post/2018/05/31/installing-and-securing-openfaas-on-an-aks/">Azure Kubernetes Service cluster</a>.</p>
<h4 id="preparingtobuildtheinterfacefunction">Preparing to build the interface function</h4>
<p>As soon as you have OpenFaaS running, it's time to look at the actual custom interface function.</p>
<p>My demo C# function does the following:</p>
<ol>
<li>Parse a JSON object sent in the client request for an access key and optional query filter</li>
<li>Validate the client-supplied access key to authorize or reject the request</li>
<li>Retrieve a Dynamics 365 OAuth access token using my <a href="https://alexanderdevelopment.net/post/2018/05/19/an-azure-ad-oauth2-helper-microservice/">Azure AD OAuth 2 helper microservice</a></li>
<li>Execute a query for contacts against the Dynamics 365 Web API</li>
<li>Return the Web API query results to the client in an array as part of a JSON object</li>
</ol>
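<p>Put together, the client request to the function is a small JSON object. Here's a sketch of the request body with example key and filter values:</p>

```json
{
  "AccessKey": "MYACCESSKEY",
  "Filter": "startswith(fullname,'y')"
}
```

<p>A successful response comes back as a JSON object with a <code>contacts</code> array of <code>id</code>/<code>fullname</code> pairs, while errors are returned as an object with a single <code>error</code> property.</p>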
<p>Because the OpenFaaS function uses my OAuth helper microservice instead of requesting an OAuth access token directly from Azure Active Directory, you need to deploy that microservice to your cluster before moving forward.</p>
<p>If you're using Kubernetes, you can create the deployment and corresponding service using the following YAML. You'll need to set the RESOURCE environment variable to the URL of your Dynamics 365 CE organization, but you can leave the CLIENTID and TOKEN_ENDPOINT values alone. <em>(While I used to think you needed to register a separate client application for every Dynamics 365 org to use OAuth authentication, I recently learned via a Twitter conversation that there is a <a href="https://twitter.com/bguidinger/status/1001796185798119424">&quot;universal&quot; CRM client id</a> you can use instead.)</em></p>
<pre><code>apiVersion: apps/v1beta1
kind: Deployment
metadata:
  name: azuread-oauth2-helper
spec:
  replicas: 1
  template:
    metadata:
      labels:
        app: azuread-oauth2-helper
    spec:
      containers:
      - name: azuread-oauth2-helper
        image: lucasalexander/azuread-oauth2-helper
        ports:
        - containerPort: 5000
        env:
        - name: RESOURCE
          value: &quot;https://XXXXXXXX.crm.dynamics.com&quot;
        - name: CLIENTID
          value: &quot;2ad88395-b77d-4561-9441-d0e40824f9bc&quot;
        - name: TOKEN_ENDPOINT
          value: &quot;https://login.microsoftonline.com/common/oauth2/token&quot;
---
apiVersion: v1
kind: Service
metadata:
  name: azuread-oauth2-helper
spec:
  ports:
  - port: 5000
  selector:
    app: azuread-oauth2-helper
</code></pre>
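<p>Assuming you've saved the manifest above to a file (the filename below is just an example), deploying it is a standard <code>kubectl apply</code>:</p>

```shell
# create the deployment and service from the manifest
kubectl apply -f azuread-oauth2-helper.yaml

# confirm the pod is running and the service was created
kubectl get pods -l app=azuread-oauth2-helper
kubectl get svc azuread-oauth2-helper
```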
<p>Once you've deployed the microservice, you can expose it with a Kubernetes ingress defined as follows. In this case my microservice is accessible on the same host as OpenFaaS (akskube.alexanderdevelopment.net), and it is secured with the same Let's Encrypt certificate. You'll want to update the configuration with the appropriate values for your specific situation.</p>
<pre><code>apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: azuread-oauth2-helper-ingress
  annotations:
    kubernetes.io/tls-acme: &quot;true&quot;
    certmanager.k8s.io/issuer: letsencrypt-production
    nginx.ingress.kubernetes.io/rewrite-target: /
spec:
  tls:
  - hosts:
    - akskube.alexanderdevelopment.net
    secretName: faas-letsencrypt-production
  rules:
  - host: akskube.alexanderdevelopment.net
    http:
      paths:
      - path: /oauthhelper
        backend:
          serviceName: azuread-oauth2-helper
          servicePort: 5000
</code></pre>
<p>After the OAuth helper microservice is deployed, you should validate that you can get a token returned for a valid username/password combination. Here's what that looks like in Postman.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/07/microservice-validation-1.png#img-thumbnail" alt="Building a custom Dynamics 365 data interface with OpenFaaS"></p>
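<p>If you prefer the command line to Postman, you can make the same check with curl. This is a sketch: the host and path match my ingress and stack configuration, and the credentials are placeholders. The response should include an <code>accesstoken</code> property:</p>

```shell
# request a token from the OAuth helper microservice
curl -s -X POST https://akskube.alexanderdevelopment.net/oauthhelper/requesttoken \
  -H "Content-Type: application/json" \
  -d '{"username":"someone@XXXXXX.onmicrosoft.com","password":"XXXXXX"}'
```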
<h4 id="buildingtheinterfacefunction">Building the interface function</h4>
<p>If you've made it to this point, building and deploying the function is easy!</p>
<p>First, the function reads its configuration from environment variables that are set when the function is deployed. If you were actually using this function in production, it would be better to store sensitive values like the access key and the Dynamics 365 password as <a href="https://github.com/openfaas/faas/blob/master/guide/secure_secret_management.md">secrets</a>, but I've used environment variables here to keep this overview as simple as possible.</p>
<pre><code>//get configuration from env variables        
var username = Environment.GetEnvironmentVariable(&quot;USERNAME&quot;);
var userpassword = Environment.GetEnvironmentVariable(&quot;USERPASS&quot;);
var tokenendpoint = Environment.GetEnvironmentVariable(&quot;TOKENENDPOINT&quot;);
var accesskey = Environment.GetEnvironmentVariable(&quot;ACCESSKEY&quot;);
var crmwebapi = Environment.GetEnvironmentVariable(&quot;CRMAPI&quot;);
</code></pre>
<p>After the function gets its configuration data, it deserializes the client request using Json.Net to extract a client-supplied access key and an optional query filter. The client-supplied key is validated against the stored key value, and if they don't match, an error response is returned.</p>
<pre><code>var queryrequest = JsonConvert.DeserializeObject&lt;QueryRequest&gt;(input);

if(accesskey!=queryrequest.AccessKey)
{
    JObject outputobject = new JObject();
    outputobject.Add(&quot;error&quot;, &quot;Invalid access key&quot;);
    Console.WriteLine(outputobject.ToString());
    return;
}
</code></pre>
<p>After the access key is validated, the function then makes a request to the authentication helper microservice to get an access token.</p>
<pre><code>var token = GetToken(username, userpassword, tokenendpoint);

...
...
...

string GetToken(string username, string userpassword, string tokenendpoint){
    try
    {
        JObject tokencredentials = new JObject();
        tokencredentials.Add(&quot;username&quot;, username);
        tokencredentials.Add(&quot;password&quot;,userpassword);
        var reqcontent = new StringContent(tokencredentials.ToString(), Encoding.UTF8, &quot;application/json&quot;);
        var result = _client.PostAsync(tokenendpoint, reqcontent).Result;
        var tokenobj = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(
            result.Content.ReadAsStringAsync().Result);
        var token = tokenobj[&quot;accesstoken&quot;];
        return token.ToString();
    }
    catch(Exception ex)
    {
        return string.Format(&quot;Error: {0}&quot;, ex.Message);
    }
}
</code></pre>
<p>Once the token is returned from the microservice, the function executes the Web API query. The query is just a hardcoded OData query in the form of <code>/contacts?$select=fullname,contactid</code> plus any filter supplied by the client. The function expects the filter to be provided in OData syntax that Dynamics 365 supports, such as <code>startswith(fullname,'y')</code>.</p>
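<p>For example, with that filter supplied by the client, the full request the function issues looks like this (the org name is a placeholder):</p>

```
GET https://XXXXXXXX.api.crm.dynamics.com/api/data/v9.0/contacts?$select=fullname,contactid&$filter=startswith(fullname,'y')
```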
<pre><code>var crmreq = new HttpRequestMessage(HttpMethod.Get, crmwebapi + crmwebapiquery);
crmreq.Headers.Add(&quot;Authorization&quot;, &quot;Bearer &quot; + token);
crmreq.Headers.Add(&quot;OData-MaxVersion&quot;, &quot;4.0&quot;);
crmreq.Headers.Add(&quot;OData-Version&quot;, &quot;4.0&quot;);
crmreq.Headers.Add(&quot;Prefer&quot;, &quot;odata.maxpagesize=500&quot;);
crmreq.Headers.Add(&quot;Prefer&quot;, &quot;odata.include-annotations=OData.Community.Display.V1.FormattedValue&quot;);
crmreq.Content = new StringContent(string.Empty, Encoding.UTF8, &quot;application/json&quot;);
var crmres = _client.SendAsync(crmreq).Result;

var crmresponse = crmres.Content.ReadAsStringAsync().Result;

var crmresponseobj = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(crmresponse);
</code></pre>
<p>Finally, the results are returned to the client as an array inside a JSON object.</p>
<pre><code>JArray outputarray = new JArray();
foreach(var row in crmresponseobj[&quot;value&quot;].Children())
{
    JObject record = new JObject();
    record.Add(&quot;id&quot;, row[&quot;contactid&quot;]);
    record.Add(&quot;fullname&quot;, row[&quot;fullname&quot;]);
    outputarray.Add(record);
}
JObject outputobject = new JObject();
outputobject.Add(&quot;contacts&quot;, outputarray);
Console.WriteLine(outputobject.ToString());
</code></pre>
<p>Here's the complete function.</p>
<pre><code>using System;
using System.Text;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System.Collections.Generic;

namespace Function
{
    public class FunctionHandler
    {
        private static HttpClient _client = new HttpClient();

        public void Handle(string input) {
            //get configuration from env variables        
            var username = Environment.GetEnvironmentVariable(&quot;USERNAME&quot;);
            var userpassword = Environment.GetEnvironmentVariable(&quot;USERPASS&quot;);
            var tokenendpoint = Environment.GetEnvironmentVariable(&quot;TOKENENDPOINT&quot;);
            var accesskey = Environment.GetEnvironmentVariable(&quot;ACCESSKEY&quot;);
            var crmwebapi = Environment.GetEnvironmentVariable(&quot;CRMAPI&quot;);
            
            //deserialize the client request
            var queryrequest = JsonConvert.DeserializeObject&lt;QueryRequest&gt;(input);
            
            //validate the client access key
            if(accesskey!=queryrequest.AccessKey)
            {
                JObject outputobject = new JObject();
                outputobject.Add(&quot;error&quot;, &quot;Invalid access key&quot;);
                Console.WriteLine(outputobject.ToString());
                return;
            }

            //get the oauth token
            var token = GetToken(username, userpassword, tokenendpoint);
            
            if(!token.ToUpper().StartsWith(&quot;ERROR:&quot;))
            {
                //set the base odata query
                var crmwebapiquery = &quot;/contacts?$select=fullname,contactid&quot;;
                
                //add a filter if the client included one in the request
                if(!string.IsNullOrEmpty(queryrequest.Filter))
                    crmwebapiquery+=&quot;&amp;$filter=&quot;+queryrequest.Filter;
                try
                {
                    //make the request to d365
                    var crmreq = new HttpRequestMessage(HttpMethod.Get, crmwebapi + crmwebapiquery);
                    crmreq.Headers.Add(&quot;Authorization&quot;, &quot;Bearer &quot; + token);
                    crmreq.Headers.Add(&quot;OData-MaxVersion&quot;, &quot;4.0&quot;);
                    crmreq.Headers.Add(&quot;OData-Version&quot;, &quot;4.0&quot;);
                    crmreq.Headers.Add(&quot;Prefer&quot;, &quot;odata.maxpagesize=500&quot;);
                    crmreq.Headers.Add(&quot;Prefer&quot;, &quot;odata.include-annotations=OData.Community.Display.V1.FormattedValue&quot;);
                    crmreq.Content = new StringContent(string.Empty, Encoding.UTF8, &quot;application/json&quot;);
                    var crmres = _client.SendAsync(crmreq).Result;
                    
                    //handle the d365 response
                    var crmresponse = crmres.Content.ReadAsStringAsync().Result;

                    var crmresponseobj = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(crmresponse);
                    
                    try
                    {
                        //build the function response
                        JArray outputarray = new JArray();
                        foreach(var row in crmresponseobj[&quot;value&quot;].Children())
                        {
                            JObject record = new JObject();
                            record.Add(&quot;id&quot;, row[&quot;contactid&quot;]);
                            record.Add(&quot;fullname&quot;, row[&quot;fullname&quot;]);
                            outputarray.Add(record);
                        }
                        JObject outputobject = new JObject();
                        outputobject.Add(&quot;contacts&quot;, outputarray);
                        
                        //return the response to the client
                        Console.WriteLine(outputobject.ToString());
                    }
                    catch(Exception ex)
                    {
                        JObject outputobject = new JObject();
                        outputobject.Add(&quot;error&quot;, string.Format(&quot;Could not parse query response: {0}&quot;, ex.Message));
                        Console.WriteLine(outputobject.ToString());
                    }
                }
                catch(Exception ex)
                {
                    JObject outputobject = new JObject();
                    outputobject.Add(&quot;error&quot;, string.Format(&quot;Could not query data: {0}&quot;, ex.Message));
                    Console.WriteLine(outputobject.ToString());
                }
            }
            else
            {
                JObject outputobject = new JObject();
                outputobject.Add(&quot;error&quot;, &quot;Could not get token&quot;);
                Console.WriteLine(outputobject.ToString());
            }
        }

        string GetToken(string username, string userpassword, string tokenendpoint){
            try
            {
                JObject tokencredentials = new JObject();
                tokencredentials.Add(&quot;username&quot;, username);
                tokencredentials.Add(&quot;password&quot;,userpassword);
                var reqcontent = new StringContent(tokencredentials.ToString(), Encoding.UTF8, &quot;application/json&quot;);
                var result = _client.PostAsync(tokenendpoint, reqcontent).Result;
                var tokenobj = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(
                    result.Content.ReadAsStringAsync().Result);
                var token = tokenobj[&quot;accesstoken&quot;];
                return token.ToString();
            }
            catch(Exception ex)
            {
                return string.Format(&quot;Error: {0}&quot;, ex.Message);
            }
        }
    }

    public class QueryRequest
    {
        public string AccessKey {get;set;}
        public string Filter{get;set;}
    }
}
</code></pre>
<p>Because the function relies on Json.Net, you need to add a reference to it in your .csproj file before you build the function.</p>
<pre><code>&lt;Project Sdk=&quot;Microsoft.NET.Sdk&quot;&gt;
  &lt;PropertyGroup&gt;
    &lt;TargetFramework&gt;netstandard2.0&lt;/TargetFramework&gt;
  &lt;/PropertyGroup&gt;
  &lt;PropertyGroup&gt;
    &lt;GenerateAssemblyInfo&gt;false&lt;/GenerateAssemblyInfo&gt;
  &lt;/PropertyGroup&gt;
  &lt;ItemGroup&gt;
    &lt;PackageReference Include=&quot;newtonsoft.json&quot; Version=&quot;11.0.2&quot; /&gt;
  &lt;/ItemGroup&gt;
&lt;/Project&gt;
</code></pre>
<p>Here is my function definition YAML file with environment variables included. You will need to update them with your own values, and you will also need to change the image name if you're building your own function instead of deploying mine from Docker Hub.</p>
<pre><code>provider:
  name: faas
  gateway: http://localhost:8080

functions:
  demo-crm-function:
    lang: csharp
    handler: ./demo-crm-function
    image: lucasalexander/faas-demo-crm-function
    environment:
      USERNAME: XXXXXX@XXXXXX.onmicrosoft.com
      USERPASS: XXXXXX
      TOKENENDPOINT: https://akskube.alexanderdevelopment.net/oauthhelper/requesttoken
      CRMAPI: https://lucastest20.api.crm.dynamics.com/api/data/v9.0
      ACCESSKEY: MYACCESSKEY
</code></pre>
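<p>With the stack file saved (I'll call it <code>demo-crm-function.yml</code> here), building and deploying takes a few faas-cli commands. If you're deploying my prebuilt image from Docker Hub, you can skip the build and push steps:</p>

```shell
# build the function image from the handler folder
faas-cli build -f demo-crm-function.yml

# push the image to a registry your cluster can pull from
faas-cli push -f demo-crm-function.yml

# deploy the function to the gateway defined in the stack file
faas-cli deploy -f demo-crm-function.yml
```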
<p>Once the function is deployed, you can execute it through the OpenFaaS admin UI or with any other tool that makes HTTP requests, such as curl or Postman. Here's what an unfiltered query in Postman looks like for a Dynamics 365 org with sample data installed.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/07/unfiltered-query.png#img-thumbnail" alt="Building a custom Dynamics 365 data interface with OpenFaaS"></p>
<p>And here's a query with a filter included.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/07/filtered-query.png#img-thumbnail" alt="Building a custom Dynamics 365 data interface with OpenFaaS"></p>
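<p>The equivalent request with curl goes through the OpenFaaS gateway's <code>/function/&lt;name&gt;</code> route. This is a sketch using my gateway host and example key/filter values:</p>

```shell
# invoke the function through the OpenFaaS gateway
curl -s https://akskube.alexanderdevelopment.net/function/demo-crm-function \
  -d "{\"AccessKey\":\"MYACCESSKEY\",\"Filter\":\"startswith(fullname,'y')\"}"
```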
<h4 id="wrappingup">Wrapping up</h4>
<p>Once I got OpenFaaS running, writing and deploying the actual function only took about an hour. Obviously writing a more complex data interface to support real-world requirements would take longer, but using a serverless functions platform like OpenFaaS is definitely a significant accelerator for custom Dynamics 365 integration development.</p>
<p>What do you think about this approach? Are you using serverless functions with your Dynamics 365 projects? What do you think about OpenFaaS vs Azure Functions or AWS Lambda? Let us know in the comments!</p>
</div>]]></content:encoded></item><item><title><![CDATA[Using Dynamics 365 virtual entities to show data from an external organization]]></title><description><![CDATA[<div class="kg-card-markdown"><p>I was recently asked to be a guest on the third-anniversary episode of the <a href="https://crm.audio/">CRM Audio podcast</a>. While I was there George Doubinski challenged me to create a plugin in one Dynamics 365 organization to retrieve records from another Dynamics 365 organization so they could be displayed as virtual entities.</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/05/28/using-dynamics-365-virtual-entities-to-show-data-from-an-external-organization/</link><guid isPermaLink="false">5b05bc3c97f5e30001931b67</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[C#]]></category><category><![CDATA[integration]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Mon, 28 May 2018 12:55:09 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/05/PluginRegistration_2018-05-23_13-48-15.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/05/PluginRegistration_2018-05-23_13-48-15.png" alt="Using Dynamics 365 virtual entities to show data from an external organization"><p>I was recently asked to be a guest on the third-anniversary episode of the <a href="https://crm.audio/">CRM Audio podcast</a>. While I was there George Doubinski challenged me to create a plugin in one Dynamics 365 organization to retrieve records from another Dynamics 365 organization so they could be displayed as virtual entities. I was promised adulation on <a href="https://crmtipoftheday.com/">Dynamics CRM Tip of the Day</a> and fame beyond my wildest dreams, so naturally I accepted.</p>
<p><img src="https://alexanderdevelopment.net/content/images/2018/05/tumblr_inline_n4m5yj9nMP1qa7k0a.gif" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>To address the challenge, I wrote a simple Dynamics 365 plugin that calls the Web API in a different Dynamics 365 organization to retrieve records and return them to a virtual entity data provider. From there, configuration of the Dynamics 365 virtual entity is simple. Let's take a look at how I did it.</p>
<h4 id="theplugin">The plugin</h4>
<p>First, you need to create a plugin to retrieve the data from the &quot;external&quot; Dynamics 365 org. Because this code connects directly to the Web API, you'll need to get an access token from Azure AD before you can make the request to Dynamics 365. Just like I showed in my <a href="https://alexanderdevelopment.net/post/2016/11/29/scheduling-dynamics-365-workflows-with-azure-functions-and-csharp/">&quot;Scheduling Dynamics 365 workflows with Azure Functions and C#&quot;</a> post back in 2016, my sample code does not use <a href="https://github.com/AzureAD/azure-activedirectory-library-for-dotnet">ADAL</a> to get the access token, but rather issues a request directly to the Azure AD OAuth 2 token endpoint.</p>
<p>Here's the code for the plugin. There are some configuration values you'll need to set for your Dynamics 365 organization and whatever query you want to run. Hardcoding any of these values in a plugin is not a best practice, but I've done it this way so it's easier to see how things work.</p>
<pre><code>using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Data.Exceptions;
using Microsoft.Xrm.Sdk.Extensions;
using Microsoft.Xrm.Sdk.Query;
using System;
using System.IO;
using System.Net;
using System.Threading.Tasks;
using Newtonsoft.Json;

namespace VirtualEntityProvider
{
    public class RetrieveOtherOrgData : IPlugin
    {
        //set these values for your D365 instance, user credentials and Azure AD clientid/token endpoint
        string crmorg = &quot;https://XXXXX.crm.dynamics.com&quot;;
        string clientid = &quot;XXXXXXXXX&quot;;
        string username = &quot;lucasalexander@XXXXXX.onmicrosoft.com&quot;;
        string userpassword = &quot;XXXXXXXXXXXX&quot;;
        string tokenendpoint = &quot;https://login.microsoftonline.com/XXXXXXXXXXX/oauth2/token&quot;;

        //relative path to web api endpoint
        string crmwebapi = &quot;/api/data/v8.2&quot;;

        //web api query to execute - in this case all accounts that start with &quot;F&quot;
        string crmwebapipath = &quot;/accounts?$select=name,accountid&amp;$filter=startswith(name,'F')&quot;;

        public void Execute(IServiceProvider serviceProvider)
        {
            //basic plugin set-up stuff
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IOrganizationServiceFactory servicefactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = servicefactory.CreateOrganizationService(context.UserId);
            ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            try
            {
                //instantiate a new entity collection to hold the records we'll return later
                EntityCollection results = new EntityCollection();

                //build the authorization request for Azure AD
                var reqstring = &quot;client_id=&quot; + clientid;
                reqstring += &quot;&amp;resource=&quot; + Uri.EscapeUriString(crmorg);
                reqstring += &quot;&amp;username=&quot; + Uri.EscapeUriString(username);
                reqstring += &quot;&amp;password=&quot; + Uri.EscapeUriString(userpassword);
                reqstring += &quot;&amp;grant_type=password&quot;;

                //make the Azure AD authentication request
                WebRequest req = WebRequest.Create(tokenendpoint);
                req.ContentType = &quot;application/x-www-form-urlencoded&quot;;
                req.Method = &quot;POST&quot;;
                byte[] bytes = System.Text.Encoding.ASCII.GetBytes(reqstring);
                req.ContentLength = bytes.Length;
                System.IO.Stream os = req.GetRequestStream();
                os.Write(bytes, 0, bytes.Length);
                os.Close();

                HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
                StreamReader tokenreader = new StreamReader(resp.GetResponseStream());
                string responseBody = tokenreader.ReadToEnd();
                tokenreader.Close();

                //deserialize the Azure AD token response and get the access token to supply with the web api query
                var tokenresponse = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(responseBody);
                var token = tokenresponse[&quot;access_token&quot;];

                //make the web api query
                WebRequest crmreq = WebRequest.Create(crmorg+crmwebapi+crmwebapipath);
                crmreq.Headers = new WebHeaderCollection();

                //use the access token from earlier as the authorization header bearer value
                crmreq.Headers.Add(&quot;Authorization&quot;, &quot;Bearer &quot; + token);
                crmreq.Headers.Add(&quot;OData-MaxVersion&quot;, &quot;4.0&quot;);
                crmreq.Headers.Add(&quot;OData-Version&quot;, &quot;4.0&quot;);
                crmreq.Headers.Add(&quot;Prefer&quot;, &quot;odata.maxpagesize=500&quot;);
                crmreq.Headers.Add(&quot;Prefer&quot;, &quot;odata.include-annotations=OData.Community.Display.V1.FormattedValue&quot;);
                crmreq.ContentType = &quot;application/json; charset=utf-8&quot;;
                crmreq.Method = &quot;GET&quot;;

                HttpWebResponse crmresp = (HttpWebResponse)crmreq.GetResponse();
                StreamReader crmreader = new StreamReader(crmresp.GetResponseStream());
                string crmresponseBody = crmreader.ReadToEnd();
                crmreader.Close();

                //deserialize the response
                var crmresponseobj = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(crmresponseBody);

                //loop through the response values
                foreach (var row in crmresponseobj[&quot;value&quot;].Children())
                {
                    //create a new virtual entity of type lpa_otheraccount
                    Entity verow = new Entity(&quot;lpa_otheraccount&quot;);
                    //verow[&quot;lpa_otheraccountid&quot;] = Guid.NewGuid();
                    //verow[&quot;lpa_name&quot;] = ((Newtonsoft.Json.Linq.JValue)row[&quot;name&quot;]).Value.ToString();
                    verow[&quot;lpa_otheraccountid&quot;] = (Guid)row[&quot;accountid&quot;];
                    verow[&quot;lpa_name&quot;] = (string)row[&quot;name&quot;];

                    //add it to the collection
                    results.Entities.Add(verow);
                }

                //return the results
                context.OutputParameters[&quot;BusinessEntityCollection&quot;] = results;
            }
            catch (Exception e)
            {
                tracingService.Trace($&quot;{e.Message} {e.StackTrace}&quot;);
                if (e.InnerException != null)
                    tracingService.Trace($&quot;{e.InnerException.Message} {e.InnerException.StackTrace}&quot;);

                throw new InvalidPluginExecutionException(e.Message);
            }
        }
    }
}
</code></pre>
<p>Because the plugin uses JSON.Net, you'll need to use ILMerge to bundle the Newtonsoft.Json.dll assembly with your compiled plugin before you deploy it to Dynamics 365.</p>
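<p>A typical ILMerge invocation looks something like this. The output name and key file are examples; plugin assemblies must be signed, so pass your own strong-name key:</p>

```shell
ILMerge.exe /keyfile:MyKey.snk /targetplatform:v4 /out:VirtualEntityProvider.Merged.dll VirtualEntityProvider.dll Newtonsoft.Json.dll
```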
<h4 id="settingupthevirtualentity">Setting up the virtual entity</h4>
<p>After you've deployed the plugin using the plugin registration tool, register a new data provider. When the data provider registration window opens, first create a new data source entity.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/05/2018-05-23_14-28-33.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>Complete the details for the data source and save it.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/05/2018-05-23_13-53-40.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>Complete the rest of the details for the data provider and save it. <img src="https://alexanderdevelopment.net/content/images/2018/05/2018-05-23_13-53-24.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>You should now see a new data provider and data source. <img src="https://alexanderdevelopment.net/content/images/2018/05/PluginRegistration_2018-05-23_13-53-57.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>Open the Dynamics 365 web UI, and go to settings-&gt;administration-&gt;virtual entity data sources. <img src="https://alexanderdevelopment.net/content/images/2018/05/chrome_2018-05-23_13-54-52.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>Click the &quot;new&quot; button to create a new virtual entity data source. <img src="https://alexanderdevelopment.net/content/images/2018/05/chrome_2018-05-23_13-55-21.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>In the window that pops up, select the data provider you created earlier. <img src="https://alexanderdevelopment.net/content/images/2018/05/chrome_2018-05-23_13-55-34.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>Give your new virtual entity data source a name and save it. <img src="https://alexanderdevelopment.net/content/images/2018/05/chrome_2018-05-23_13-55-50.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>Open your solution and create a new entity. <img src="https://alexanderdevelopment.net/content/images/2018/05/chrome_2018-05-23_13-56-22.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>Configure your entity as a virtual entity that uses the virtual entity data source you created previously. <img src="https://alexanderdevelopment.net/content/images/2018/05/chrome_2018-05-23_13-58-31.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>Once you save and publish the virtual entity, you can open an advanced find view that will retrieve data from your other Dynamics 365 organization and display it. <img src="https://alexanderdevelopment.net/content/images/2018/05/chrome_2018-05-23_14-07-38.png#img-thumbnail" alt="Using Dynamics 365 virtual entities to show data from an external organization"></p>
<p>If you export this data to Excel and unhide the id column, you will see that the GUIDs match the records in the external system.</p>
<p>And that's all there is to it. Happy entity virtualizing!</p>
</div>]]></content:encoded></item><item><title><![CDATA[An Azure AD OAuth 2 helper microservice]]></title><description><![CDATA[<div class="kg-card-markdown"><p>One of the biggest trends in systems architecture these days is the use of &quot;serverless&quot; functions like Azure Functions, Amazon Lambda and OpenFaas. Because these functions are stateless, if you want to use a purely serverless approach to work with resources secured using Azure Active Directory like Dynamics</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/05/19/an-azure-ad-oauth2-helper-microservice/</link><guid isPermaLink="false">5aff468b97f5e30001931b5d</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[Python]]></category><category><![CDATA[serverless]]></category><category><![CDATA[Docker]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Sat, 19 May 2018 16:45:38 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/05/Postman_2018-05-18_22-58-04-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/05/Postman_2018-05-18_22-58-04-1.png" alt="An Azure AD OAuth 2 helper microservice"><p>One of the biggest trends in systems architecture these days is the use of &quot;serverless&quot; functions like Azure Functions, Amazon Lambda and OpenFaas. Because these functions are stateless, if you want to use a purely serverless approach to work with resources secured using Azure Active Directory like Dynamics 365 online, a new token will have to be requested every time a function executes. This is inefficient, and it requires the function to fully understand OAuth 2 authentication, which could be handled better elsewhere.</p>
<p>To address this problem, I've written a microservice in Python that can be used to request OAuth 2 tokens from Azure Active Directory, and it also handles refreshing them as needed. I've containerized it as a Docker image so you can easily run it without needing to build anything.</p>
<h4 id="howitworks">How it works</h4>
<p>When a request containing a username and password arrives for the first time, the microservice retrieves an OAuth2 access token from Azure AD and returns it to the requester. The microservice also caches an object that contains the access token, refresh token, username, password and expiration time.</p>
<p>When subsequent requests arrive, the microservice checks its cache for an existing token that matches the username and password. If it finds one, it checks if the token has expired or needs to be refreshed.</p>
<p>If the existing token has expired, a new one is requested. If the existing token has not expired, but it will expire within a specified period of time (10 minutes is the default value), the microservice will execute a refresh request to Azure AD, cache the updated token and return it to the requester. If there's an unexpired existing token that doesn't need to be refreshed, the cached access token will be returned to the requester.</p>
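The cache-or-refresh decision above can be sketched in plain Python. This is an illustrative reduction of the logic, not the actual microservice code; the token-request helpers are stand-ins for the real Azure AD calls:

```python
import time

# illustrative refresh threshold, matching the documented 600-second default
REFRESH_THRESHOLD = 600

# cache keyed by (username, password), as described above
_token_cache = {}

def get_token(username, password, now=None):
    """Return a (status, access_token) pair; status is 'new', 'refreshed' or 'cached'."""
    now = time.time() if now is None else now
    key = (username, password)
    entry = _token_cache.get(key)
    if entry is None or now >= entry['expires_at']:
        # no token yet, or the cached one has expired - request a brand new one
        entry = _request_new_token(username, password, now)
        _token_cache[key] = entry
        return 'new', entry['accesstoken']
    if entry['expires_at'] - now < REFRESH_THRESHOLD:
        # still valid but close to expiry - refresh and re-cache it
        entry = _refresh_token(entry, now)
        _token_cache[key] = entry
        return 'refreshed', entry['accesstoken']
    # valid and not near expiry - serve straight from the cache
    return 'cached', entry['accesstoken']

# stand-ins for the real Azure AD token and refresh requests,
# so the caching logic can run anywhere
def _request_new_token(username, password, now):
    return {'accesstoken': 'new-token', 'expires_at': now + 3600}

def _refresh_token(entry, now):
    return {'accesstoken': 'refreshed-token', 'expires_at': now + 3600}
```

Keying the cache on the username/password pair is what lets a single running instance serve tokens for multiple accounts.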
<p>Here's what a raw token request to and response from the microservice looks like in Postman:<br>
<img src="https://alexanderdevelopment.net/content/images/2018/05/Postman_2018-05-18_22-58-04.png#img-thumbnail" alt="An Azure AD OAuth 2 helper microservice"></p>
<p>Back in 2016 I shared <a href="https://alexanderdevelopment.net/post/2016/11/27/dynamics-365-and-python-integration-using-the-web-api/">some sample Python code</a> that showed how to authenticate to Azure AD and query the Dynamics 365 (then called Dynamics CRM) Web API. Here is an updated version of that sample code that uses this new microservice to acquire access tokens:</p>
<pre><code>import requests
import json

#set these values to retrieve the oauth token
username = 'lucasalexander@xxxxxx.onmicrosoft.com'
userpassword = 'xxxxxx'
tokenendpoint = 'http://localhost:5000/requesttoken'

#set these values to query your crm data
crmwebapi = 'https://xxxxxx.api.crm.dynamics.com/api/data/v8.2'
crmwebapiquery = '/contacts?$select=fullname,contactid'

#build the authorization request
tokenpost = {
    'username':username,
    'password':userpassword
}

#make the token request
print('requesting token . . .')
tokenres = requests.post(tokenendpoint, json=tokenpost)
print('token response received. . .')

accesstoken = ''

#extract the access token
try:
    print('parsing token response . . .')
    print(tokenres)
    accesstoken = tokenres.json()['accesstoken']

except KeyError:
    print('Could not get access token')

if(accesstoken!=''):
    crmrequestheaders = {
        'Authorization': 'Bearer ' + accesstoken,
        'OData-MaxVersion': '4.0',
        'OData-Version': '4.0',
        'Accept': 'application/json',
        'Content-Type': 'application/json; charset=utf-8',
        #a dict can only hold one 'Prefer' key, so both preferences go in a single header value
        'Prefer': 'odata.maxpagesize=500,odata.include-annotations=&quot;OData.Community.Display.V1.FormattedValue&quot;'
    }

    print('making crm request . . .')
    crmres = requests.get(crmwebapi+crmwebapiquery, headers=crmrequestheaders)
    print('crm response received . . .')
    try:
        print('parsing crm response . . .')
        crmresults = crmres.json()
        for x in crmresults['value']:
            print(x['fullname'] + ' - ' + x['contactid'])
    except KeyError:
        print('Could not parse CRM results')
</code></pre>
<p>Here's the output when I run the sample against my demo environment:<br>
<img src="https://alexanderdevelopment.net/content/images/2018/05/powershell_2018-05-19_11-34-16.png" alt="An Azure AD OAuth 2 helper microservice"></p>
<h4 id="runningthemicroservice">Running the microservice</h4>
<p>Pull the image from <a href="https://hub.docker.com">Docker Hub</a>: <code>docker pull lucasalexander/azuread-oauth2-helper:latest</code></p>
<p><em>Required environment variables</em></p>
<ol>
<li>RESOURCE - The URL of the service that is going to be accessed</li>
<li>CLIENTID - The Azure AD application client ID</li>
<li>TOKEN_ENDPOINT - The OAuth2 token endpoint from the Azure AD application</li>
</ol>
<p>Run the image with the following command (replacing the environment variables with your own).</p>
<p><code>docker run -d -p 5000:5000 -e RESOURCE=https://XXXXXX.crm.dynamics.com -e CLIENTID=XXXXXX -e TOKEN_ENDPOINT=https://login.microsoftonline.com/XXXXXX/oauth2/token --name oauthhelper lucasalexander/azuread-oauth2-helper:latest</code></p>
<p>You can also optionally supply an additional &quot;REFRESH_THRESHOLD&quot; environment variable that sets how close (in seconds) a token can get to its expiration time before it is refreshed. The default value is 600 seconds.</p>
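Assuming the container reads its configuration the usual way, the threshold override can be modeled like this (an illustrative sketch, not the actual service code):

```python
import os

def get_refresh_threshold(environ=os.environ):
    # fall back to the documented default of 600 seconds
    # when REFRESH_THRESHOLD is not supplied
    return int(environ.get('REFRESH_THRESHOLD', '600'))
```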
<h4 id="anoteonsecurity">A note on security</h4>
<p>Because the microservice is caching usernames, passwords and access tokens in memory, this approach is vulnerable to heap inspection attacks, so you'll want to make sure your environment is appropriately locked down. Also, you'll want to ensure that any communication between the code that requests tokens and the microservice is encrypted.</p>
<h4 id="wrappingup">Wrapping up</h4>
<p>Although I wrote this with Dynamics 365 in mind, it should work for any resource that is secured by Azure AD. If you'd like to take a closer look at the code, it's available on GitHub <a href="https://github.com/lucasalexander/azuread-oauth2-helper">here</a>.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Setting values in a Dynamics 365 CE quick create form from the main form]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Earlier this week I was asked to populate a field in a Dynamics 365 Customer Engagement quick create form with a value from a field on the main form. Unfortunately, the main form would not be saved at the time the quick create form was opened, so the value couldn't</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/03/17/setting-values-in-a-dynamics-365-ce-quick-create-form-from-the-main-form/</link><guid isPermaLink="false">5aad852d44999a000186ddb9</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[JavaScript]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Sat, 17 Mar 2018 21:39:19 GMT</pubDate><content:encoded><![CDATA[<div class="kg-card-markdown"><p>Earlier this week I was asked to populate a field in a Dynamics 365 Customer Engagement quick create form with a value from a field on the main form. Unfortunately, the main form would not be saved at the time the quick create form was opened, so the value couldn't be read from the database.</p>
<p>While there is no good way to access the opening form from the quick create form using the Xrm.Page object model, there is a way to pass the value from the main form using the regular JavaScript browser object model. Because the quick create form and the main form are both children of the same topmost browser window, the main form can create a property in the top window that the quick create form can access when it loads.</p>
<p>Here's sample code that runs in the main form to set the topmost window's property value:</p>
<pre><code>var setValsForQuickCreate = function(){
  window.top.attributename = Xrm.Page.getAttribute(&quot;new_attributename&quot;).getValue();
}
</code></pre>
<p>And here's the corresponding sample code to run when the quick create form loads:</p>
<pre><code>var setValFromMainForm = function(){
  Xrm.Page.getAttribute(&quot;new_attributename&quot;).setValue(window.top.attributename);
}
</code></pre>
<p>This is a relatively simple example that assumes the value to set in the quick create form is the exact same value from the opening form, but there's no reason you can't do transformations if necessary. Additionally, there's no error checking here, so you'll probably want to at least add null checking/handling in the quick create form's script.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Updated solution for scheduling recurring Dynamics 365 workflows]]></title><description><![CDATA[<div class="kg-card-markdown"><p>I've released an updated version of my recurring workflow scheduler for Dynamics 365 Customer Engagement. This solution targets Dynamics 365 version 9, so it should work in all current Dynamics 365 online organizations. You can download version 1.3 of my solution from here: <a href="https://github.com/lucasalexander/AlexanderDevelopment.ProcessRunner/releases/tag/v1.3">https://github.com/lucasalexander/AlexanderDevelopment.ProcessRunner/</a></p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/03/12/updated-solution-for-scheduling-recurring-dynamics-crm-workflows-2/</link><guid isPermaLink="false">5aa6908f44999a000186ddb1</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[utilities]]></category><category><![CDATA[FetchXML]]></category><category><![CDATA[process automation]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Mon, 12 Mar 2018 15:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/03/chrome_2018-03-12_09-40-49.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/03/chrome_2018-03-12_09-40-49.png" alt="Updated solution for scheduling recurring Dynamics 365 workflows"><p>I've released an updated version of my recurring workflow scheduler for Dynamics 365 Customer Engagement. This solution targets Dynamics 365 version 9, so it should work in all current Dynamics 365 online organizations. You can download version 1.3 of my solution from here: <a href="https://github.com/lucasalexander/AlexanderDevelopment.ProcessRunner/releases/tag/v1.3">https://github.com/lucasalexander/AlexanderDevelopment.ProcessRunner/releases/tag/v1.3</a>.</p>
<p>For more information on the use of this tool, take a look at the original blog posts:</p>
<ul>
<li><a href="https://alexanderdevelopment.net/post/2016/09/19/updated-solution-for-scheduling-recurring-dynamics-crm-workflows/">https://alexanderdevelopment.net/post/2016/09/19/updated-solution-for-scheduling-recurring-dynamics-crm-workflows/</a></li>
<li><a href="https://alexanderdevelopment.net/post/2013/05/18/scheduling-recurring-dynamics-crm-workflows-with-fetchxml/">https://alexanderdevelopment.net/post/2013/05/18/scheduling-recurring-dynamics-crm-workflows-with-fetchxml/</a></li>
</ul>
</div>]]></content:encoded></item><item><title><![CDATA[Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Recently I was asked to set up a process to automatically disable or re-enable Dynamics 365 Customer Engagement users depending on some external data. This ended up being ridiculously easy to do with SSIS and KingswaySoft's <a href="http://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-365">Dynamics 365 Integration Toolkit</a>. Let me show you how it works.</p>
<p>In Dynamics 365</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/02/08/disable-enable-dynamics-365-ce-users-with-ssis-kingswaysoft/</link><guid isPermaLink="false">5a7c568fc86c8900016cf372</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[KingswaySoft]]></category><category><![CDATA[SSIS]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Thu, 08 Feb 2018 19:01:01 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/02/02_query_users-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/02/02_query_users-1.png" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"><p>Recently I was asked to set up a process to automatically disable or re-enable Dynamics 365 Customer Engagement users depending on some external data. This ended up being ridiculously easy to do with SSIS and KingswaySoft's <a href="http://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-365">Dynamics 365 Integration Toolkit</a>. Let me show you how it works.</p>
<p>In Dynamics 365 CE, you can disable or enable a user record just by setting the value of its &quot;isdisabled&quot; attribute to true or false, so both my disable user data flow and re-enable user data flow do roughly the same thing.</p>
<ol>
<li>Get a list of Dynamics 365 user records to update.</li>
<li>Add a derived column to hold the value to use for updating isdisabled on the user records.</li>
<li>Update the user records.</li>
</ol>
<h4 id="thedisableuserspackage">The disable users package</h4>
<p>Here's a screenshot of a sample disable users data flow.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/01_disable_user_flow.png#img-thumbnail" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"></p>
<p>Let's take a closer look at each step.</p>
<ol>
<li>Query users using FetchXML. <img src="https://alexanderdevelopment.net/content/images/2018/02/02_query_users.png#img-thumbnail" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"></li>
<li>Add a derived column named &quot;isdisabled&quot; and set its value to 1. <img src="https://alexanderdevelopment.net/content/images/2018/02/03_isdisabled_column.png#img-thumbnail" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"></li>
<li>Update the users. <img src="https://alexanderdevelopment.net/content/images/2018/02/04_update_users.png#img-thumbnail" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"></li>
</ol>
<p>Enabling the users works exactly the same way, except the value of the &quot;isdisabled&quot; column should be 0 instead of 1, so I won't show the screenshots for that package.</p>
<h4 id="disableuserdemo">Disable user demo</h4>
<p>In my Dynamics 365 online instance, I have an active user named &quot;Angus Alexander&quot; who I want to disable. <img src="https://alexanderdevelopment.net/content/images/2018/02/05_enabled_users.png#img-thumbnail" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"></p>
<p>When I run the disable users package with the query from above (<code>&lt;condition attribute=&quot;firstname&quot; operator=&quot;eq&quot; value=&quot;angus&quot; /&gt;</code>) in Visual Studio, I see success on every step. <img src="https://alexanderdevelopment.net/content/images/2018/02/06_package_run.png#img-thumbnail" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"></p>
<p>I check back in Dynamics 365 to see Angus Alexander is no longer an enabled user. <img src="https://alexanderdevelopment.net/content/images/2018/02/07_enabled_users.png#img-thumbnail" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"></p>
<p>Instead Angus shows up as a disabled user. <img src="https://alexanderdevelopment.net/content/images/2018/02/08_disabled_users.png#img-thumbnail" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"></p>
<p>Now when Angus tries to access Dynamics 365, he sees that his account has been disabled. <img src="https://alexanderdevelopment.net/content/images/2018/02/09_disabled_message.png#img-thumbnail" alt="Disable and enable Dynamics 365 CE users with SSIS & KingswaySoft"></p>
</div>]]></content:encoded></item><item><title><![CDATA[Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4]]></title><description><![CDATA[<div class="kg-card-markdown"><p>This is the final post in my series about building a service relay for Dynamics 365 CE with RabbitMQ and Python. In my previous <a href="https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/">post</a> in this series, I showed the Python code to make the service relay work. In today's post, I will show how you can use <a href="https://azure.microsoft.com/en-us/services/functions/">Azure</a></p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/02/07/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-4/</link><guid isPermaLink="false">5a788a53c86c8900016cf367</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[Python]]></category><category><![CDATA[RabbitMQ]]></category><category><![CDATA[Azure]]></category><category><![CDATA[C#]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Thu, 08 Feb 2018 04:00:42 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-2.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-2.png" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4"><p>This is the final post in my series about building a service relay for Dynamics 365 CE with RabbitMQ and Python. 
In my previous <a href="https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/">post</a> in this series, I showed the Python code to make the service relay work. In today's post, I will show how you can use <a href="https://azure.microsoft.com/en-us/services/functions/">Azure Functions</a> to make a consumer service proxy using C# so client applications don't have to access your RabbitMQ broker directly, and I will also share some general thoughts on security and scalability for this service relay architecture.</p>
<p>Although this simple service relay allows external consumers to get data from Dynamics 365 CE without needing to connect to it directly, the examples I've shown so far require that consumers be able to connect to a RabbitMQ broker. This may be problematic for a variety of reasons, so you would probably want external consumers to connect to a web service proxy that writes requests to and reads responses from the RabbitMQ broker.</p>
<h4 id="buildingaserviceproxyfunction">Building a service proxy function</h4>
<p>You can build an Azure Functions service proxy with Python, but I don't recommend it for three reasons:</p>
<ol>
<li>Azure Functions Python support is still considered experimental.</li>
<li>Python scripts that use external libraries can run <a href="https://github.com/Azure/azure-functions-host/issues/1626">exceedingly slowly</a>.</li>
<li>Getting the environment set up is a bit of a hassle.</li>
</ol>
<p>On the other hand, building a service proxy function with C# was much easier, and it performed much better than a comparable Python function (~0.5 seconds for C# compared to 5+ seconds for Python).</p>
<p>Here are the steps I took to build my C# service proxy function:</p>
<ol>
<li>Create a C# HTTP trigger function.</li>
<li>Create and upload a project.json file with a dependency on the RabbitMQ client (see below).</li>
<li>Take the &quot;RpcClient&quot; class from the <a href="https://www.rabbitmq.com/tutorials/tutorial-six-dotnet.html">RabbitMQ .Net RPC tutorial</a> and call it from within my function.</li>
</ol>
<p>Here's my project.json file:</p>
<pre><code>{
  &quot;frameworks&quot;: {
    &quot;net46&quot;: {
      &quot;dependencies&quot;: {
        &quot;RabbitMQ.Client&quot;: &quot;5.0.1&quot;
      }
    }
  }
}
</code></pre>
<p>And here's my run.csx file:</p>
<pre><code>using System.Net;
using System;
using System.Collections.Concurrent;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

public static async Task&lt;HttpResponseMessage&gt; Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info(&quot;Processing request&quot;);

    // parse query parameter
    string query = req.GetQueryNameValuePairs()
        .FirstOrDefault(q =&gt; string.Compare(q.Key, &quot;query&quot;, true) == 0)
        .Value;

    // Get request body
    dynamic data = await req.Content.ReadAsAsync&lt;object&gt;();

    // Set name to query string or body data
    query = query ?? data?.query;

    var rpcClient = new RpcClient();
    
    log.Info(string.Format(&quot; [.] query start time {0}&quot;, DateTime.Now.ToString(&quot;MM/dd/yyyy hh:mm:ss.fff tt&quot;)));
    var response = rpcClient.Call(query);

    log.Info(string.Format(&quot; [.] query end time {0}&quot;, DateTime.Now.ToString(&quot;MM/dd/yyyy hh:mm:ss.fff tt&quot;)));
    rpcClient.Close();

    return req.CreateResponse(HttpStatusCode.OK, response);
}

public class RpcClient
{
    private readonly IConnection connection;
    private readonly IModel channel;
    private readonly string replyQueueName;
    private readonly EventingBasicConsumer consumer;
    private readonly BlockingCollection&lt;string&gt; respQueue = new BlockingCollection&lt;string&gt;();
    private readonly IBasicProperties props;

    public RpcClient()
    {
        var factory = new ConnectionFactory() { HostName = &quot;RABBITHOST&quot;, UserName=&quot;RABBITUSER&quot;, Password=&quot;RABBITUSERPASS&quot;  };

        connection = factory.CreateConnection();
        channel = connection.CreateModel();
        replyQueueName = channel.QueueDeclare().QueueName;
        consumer = new EventingBasicConsumer(channel);

        props = channel.CreateBasicProperties();
        var correlationId = Guid.NewGuid().ToString();
        props.CorrelationId = correlationId;
        props.ReplyTo = replyQueueName;

        consumer.Received += (model, ea) =&gt;
        {
            var body = ea.Body;
            var response = Encoding.UTF8.GetString(body);
            if (ea.BasicProperties.CorrelationId == correlationId)
            {
                respQueue.Add(response);
            }
        };
    }

    public string Call(string message)
    {
        var messageBytes = Encoding.UTF8.GetBytes(message);
        channel.BasicPublish(
            exchange: &quot;&quot;,
            routingKey: &quot;rpc_queue&quot;,
            basicProperties: props,
            body: messageBytes);

        channel.BasicConsume(
            consumer: consumer,
            queue: replyQueueName,
            autoAck: true);

        return respQueue.Take();
    }

    public void Close()
    {
        connection.Close();
    }
}
</code></pre>
<p>Here's a screenshot showing me calling the C# function with Postman.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/Postman_2018-02-05_22-02-52.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4"></p>
<p>Because I did actually build a Python function, I will go ahead and share how I did it if you're interested. Here are the steps I took:</p>
<ol>
<li>Create a Python HTTP trigger function.</li>
<li>Install Python 3.6 via site extensions (see steps 2.1-2.4 <a href="https://stackoverflow.com/a/47213859">here</a>).</li>
<li>Install the necessary libraries using pip via <a href="https://david-obrien.net/2016/07/azure-functions-kudu/">KUDU</a>.</li>
</ol>
<p>Here's the Python function code:</p>
<pre><code>import os
import sys
import json
import pika
import uuid
import datetime

class CrmRpcClient(object):
    def __init__(self):
        #RabbitMQ connection details
        self.rabbituser = 'RABBITUSERNAME'
        self.rabbitpass = 'RABBITUSERPASS'
        self.rabbithost = 'RABBITHOST' 
        self.rabbitport = 5672
        self.rabbitqueue = 'rpc_queue'
        rabbitcredentials = pika.PlainCredentials(self.rabbituser, self.rabbitpass)
        rabbitparameters = pika.ConnectionParameters(host=self.rabbithost,
                                    port=self.rabbitport,
                                    virtual_host='/',
                                    credentials=rabbitcredentials)

        self.rabbitconn = pika.BlockingConnection(rabbitparameters)

        self.channel = self.rabbitconn.channel()

        #create an anonymous exclusive callback queue
        result = self.channel.queue_declare(exclusive=True)
        self.callback_queue = result.method.queue

        self.channel.basic_consume(self.on_response, no_ack=True,
                                   queue=self.callback_queue)

    #callback method for when a response is received - note the check for correlation id
    def on_response(self, ch, method, props, body):
        if self.corr_id == props.correlation_id:
            self.response = body

    #method to make the initial request
    def call(self, n):
        self.response = None
        #generate a new correlation id
        self.corr_id = str(uuid.uuid4())

        #publish the message to the rpc_queue - note the reply_to property is set to the callback queue from above
        self.channel.basic_publish(exchange='',
                                   routing_key=self.rabbitqueue,
                                   properties=pika.BasicProperties(
                                         reply_to = self.callback_queue,
                                         correlation_id = self.corr_id,
                                         ),
                                   body=n)
        while self.response is None:
            self.rabbitconn.process_data_events()
        return self.response

#instantiate an rpc client
crm_rpc = CrmRpcClient()

#read the query from the request body
postreqdata = json.loads(open(os.environ['req']).read())
query = postreqdata['query']

print(&quot; [.] query start time %r&quot; % str(datetime.datetime.now()))
queryresponse = crm_rpc.call(query)
print(&quot; [.] query end time %r&quot; % str(datetime.datetime.now()))
response = open(os.environ['res'], 'w')
response.write(queryresponse.decode())
response.close()
</code></pre>
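The correlation-id comparison in on_response above is what keeps a client from accepting a reply meant for a different request on the shared callback queue. Stripped of the messaging plumbing, the matching logic reduces to this illustrative sketch:

```python
import uuid

class ReplyMatcher:
    """Accept only the reply whose correlation id matches the pending request."""

    def __init__(self):
        # a fresh correlation id is generated for every request, as in the code above
        self.corr_id = str(uuid.uuid4())
        self.response = None

    def on_response(self, correlation_id, body):
        # replies carrying a different correlation id are simply ignored
        if correlation_id == self.corr_id:
            self.response = body
```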
<p>Here's a screenshot showing me calling the Python function with Postman.<img src="https://alexanderdevelopment.net/content/images/2018/02/Postman_2018-02-05_22-10-20.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4"></p>
<p>Note the difference in time between the two functions - 5.62 seconds for Python and 0.46 seconds for C#!</p>
<h4 id="securityandscalability">Security and scalability</h4>
<p>If you decide to use this approach in production, I'd suggest you carefully consider both security and scalability. Obviously the overall solution will only be as secure as your RabbitMQ broker and communications between the broker and its clients, so you'll want to look at best practices for access control and securing the communications with TLS. Here are some links for further reading on those subjects:</p>
<ul>
<li>TLS - <a href="https://www.rabbitmq.com/ssl.html">https://www.rabbitmq.com/ssl.html</a></li>
<li>Access control - <a href="https://www.rabbitmq.com/access-control.html">https://www.rabbitmq.com/access-control.html</a></li>
</ul>
<p>As for scalability, the approach I've shown creates a separate response queue for each consumer, which can have problems scaling, especially if you are using a RabbitMQ cluster. You may want to look at the <a href="https://www.rabbitmq.com/direct-reply-to.html">&quot;direct reply-to&quot;</a> approach instead. For an interesting real-world overview of using direct reply-to, take a look at this <a href="https://facundoolano.wordpress.com/2016/06/26/real-world-rpc-with-rabbitmq-and-node-js/">blog post</a>.</p>
<h4 id="wrappingup">Wrapping up</h4>
<p>I hope you've enjoyed this series and that it has given you some ideas about how to implement service relays in your Dynamics 365 CE projects. As I worked through the examples, I certainly learned a few new things, especially when I created my Python service proxy in Azure Functions.</p>
<p>Here are links to all the previous posts in this series.</p>
<ol>
<li><a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">Part 1</a> - Series introduction</li>
<li><a href="https://alexanderdevelopment.net/post/2018/02/01/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-2/">Part 2</a> - Solution prerequisites</li>
<li><a href="https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/">Part 3</a> - Python code for the consumer and listener processes</li>
</ol>
<p>What do you think about this approach? Is it something you think you'd use in production? Let us know in the comments!</p>
</div>]]></content:encoded></item><item><title><![CDATA[Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3]]></title><description><![CDATA[<div class="kg-card-markdown"><p>In my last <a href="https://alexanderdevelopment.net/post/2018/02/01/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-2/">post</a> in this series, I walked through the prerequisites for building a simple service relay for Dynamics 365 CE with RabbitMQ and Python. In today's post I will show the Python code to make the service relay work.</p>
<p>As I described in the <a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">first post</a> in this</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/</link><guid isPermaLink="false">5a6cab4cc86c8900016cf352</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[Python]]></category><category><![CDATA[RabbitMQ]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Mon, 05 Feb 2018 17:57:29 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-1.png" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"><p>In my last <a href="https://alexanderdevelopment.net/post/2018/02/01/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-2/">post</a> in this series, I walked through the prerequisites for building a simple service relay for Dynamics 365 CE with RabbitMQ and Python. In today's post I will show the Python code to make the service relay work.</p>
<p>As I described in the <a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">first post</a> in this series, this approach relies on a consumer process and a queue listener process that can both access a RabbitMQ message broker.</p>
<blockquote>
<p>A consumer writes a request to a cloud-hosted RabbitMQ request queue (either directly or through a proxy service) and starts waiting for a response. On the other end, a Python script monitors the request queue for inbound requests. When it sees a new one, it executes the appropriate request through the Dynamics 365 Web API and writes the response back to a client-specific RabbitMQ response queue. The consumer then picks up the response from the queue.</p>
</blockquote>
<p>This solution is based on the remote procedure call (RPC) approach shown <a href="https://www.rabbitmq.com/tutorials/tutorial-six-python.html">here</a>. The main difference is that I have added logic to the queue monitoring script to query the Dynamics 365 Web API based on the inbound request from the consumer.</p>
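<p>Before diving into the pika code, here's a broker-free illustration of the same handshake - a simplified sketch of my own that uses Python 3's in-process queues in place of RabbitMQ. Each request carries a <code>reply_to</code> queue and a <code>correlation_id</code>, and the client only accepts a response whose correlation id matches its request:</p>

```python
import queue
import uuid

# In-process sketch of the RPC pattern: each request carries a reply_to
# queue and a correlation_id; the "server" echoes the correlation_id so
# the client can match the response to its own request.
request_queue = queue.Queue()

def server_step():
    # stand-in for the queue listener: one request in, one response out
    body, reply_to, corr_id = request_queue.get()
    result = body.upper()  # stand-in for the Dynamics 365 Web API call
    reply_to.put((corr_id, result))

def rpc_call(body):
    reply_to = queue.Queue()  # per-call callback queue
    corr_id = str(uuid.uuid4())
    request_queue.put((body, reply_to, corr_id))
    server_step()  # in the real solution this runs in a separate process
    got_id, result = reply_to.get()
    if got_id != corr_id:
        raise RuntimeError('response does not match request')
    return result

print(rpc_call('getcontacts'))  # prints GETCONTACTS
```

<p>The pika samples that follow have exactly this shape; the only extra work is serializing the messages and talking to the broker.</p>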
<h4 id="consumersample">Consumer sample</h4>
<p>The consumer does the following:</p>
<ol>
<li>Read the text of the request to write to the queue from a command-line argument.</li>
<li>Establish a connection to the RabbitMQ broker.</li>
<li>Create a new anonymous, exclusive callback queue.</li>
<li>Write a request message to a queue called &quot;rpc_queue.&quot; This message will include the callback queue as its &quot;reply_to&quot; property.</li>
<li>Monitor the callback queue for a response.</li>
</ol>
<p>There's no validation in this sample, so if you run it without a command-line argument, it will just throw an error and exit.</p>
<pre><code>import sys
import pika
import uuid
import datetime

class CrmRpcClient(object):
    def __init__(self):
        #RabbitMQ connection details
        self.rabbituser = 'crmuser'
        self.rabbitpass = 'crmpass'
        self.rabbithost = '127.0.0.1' 
        self.rabbitport = 5672
        self.rabbitqueue = 'rpc_queue'
        rabbitcredentials = pika.PlainCredentials(self.rabbituser, self.rabbitpass)
        rabbitparameters = pika.ConnectionParameters(host=self.rabbithost,
                                    port=self.rabbitport,
                                    virtual_host='/',
                                    credentials=rabbitcredentials)

        self.rabbitconn = pika.BlockingConnection(rabbitparameters)

        self.channel = self.rabbitconn.channel()

        #create an anonymous exclusive callback queue
        result = self.channel.queue_declare(exclusive=True)
        self.callback_queue = result.method.queue

        self.channel.basic_consume(self.on_response, no_ack=True,
                                   queue=self.callback_queue)

    #callback method for when a response is received - note the check for correlation id
    def on_response(self, ch, method, props, body):
        if self.corr_id == props.correlation_id:
            self.response = body

    #method to make the initial request
    def call(self, n):
        self.response = None
        #generate a new correlation id
        self.corr_id = str(uuid.uuid4())

        #publish the message to the rpc_queue - note the reply_to property is set to the callback queue from above
        self.channel.basic_publish(exchange='',
                                   routing_key=self.rabbitqueue,
                                   properties=pika.BasicProperties(
                                         reply_to = self.callback_queue,
                                         correlation_id = self.corr_id,
                                         ),
                                   body=n)
        while self.response is None:
            self.rabbitconn.process_data_events()
        return self.response

#instantiate an rpc client
crm_rpc = CrmRpcClient()

#read the request from the command line
request = sys.argv[1]

#make the request and get the response
print(&quot; [x] Requesting crm data(&quot;+request+&quot;)&quot;)
print(&quot; [.] Start time %s&quot; % str(datetime.datetime.now()))
response = crm_rpc.call(request)

#convert the response message body from the queue to a string 
decoderesponse = response.decode()

#print the output
print(&quot; [.] Received response: %s&quot; % decoderesponse)
print(&quot; [.] End time %s&quot; % str(datetime.datetime.now()))
</code></pre>
<h4 id="queuelistenersample">Queue listener sample</h4>
<p>The queue listener does the following:</p>
<ol>
<li>Establish a connection to the RabbitMQ broker</li>
<li>Monitor the &quot;rpc_queue&quot; queue.</li>
<li>When a new message from the &quot;rpc_queue&quot; queue is delivered, decode the message body as a string, and determine what Web API query to execute. Note: This sample can return a list of contacts or accounts from Dynamics 365 CE based on the request the consumer sends (&quot;getcontacts&quot; or &quot;getaccounts&quot;). If any other request is received, the listener will return an error message to the consumer callback queue.</li>
<li>Execute the appropriate query against the Dynamics 365 Web API and write the response to the callback queue the client established originally.</li>
</ol>
<pre><code>import pika
import requests
from requests_ntlm import HttpNtlmAuth
import json

#NTLM credentials to access on-prem Dynamics 365 Web API
username = 'DOMAIN\\USERNAME'
userpassword = 'PASSWORD'

#full path to Web API
crmwebapi = 'http://33.0.0.16/lucastest02/api/data/v8.1'

#RabbitMQ connection details
rabbituser = 'crmuser'
rabbitpass = 'crmpass'
rabbithost = '127.0.0.1' 
rabbitport = 5672

#method to execute a Web API query based on the client request
def processquery(query):
    #set the Web API request headers - multiple Prefer values must be combined
    #into one comma-separated header, because duplicate dict keys silently
    #overwrite each other
    crmrequestheaders = {
        'OData-MaxVersion': '4.0',
        'OData-Version': '4.0',
        'Accept': 'application/json',
        'Content-Type': 'application/json; charset=utf-8',
        'Prefer': 'odata.maxpagesize=500, odata.include-annotations=&quot;OData.Community.Display.V1.FormattedValue&quot;'
    }

    #determine which Web API query to execute
    if query == 'getcontacts':
        crmwebapiquery = '/contacts?$select=fullname,contactid'
    elif query == 'getaccounts':
        crmwebapiquery = '/accounts?$select=name,accountid'
    else:
        #only handle 'getcontacts' or 'getaccounts' requests
        return 'Operation not supported'

    #execute the query
    crmres = requests.get(crmwebapi+crmwebapiquery, headers=crmrequestheaders,auth=HttpNtlmAuth(username,userpassword))
    
    #get the results json
    crmjson = crmres.json()

    #return the json
    return crmjson

#method to handle new inbound requests
def on_request(ch, method, props, body):
    #convert the message body from the queue to a string
    decodebody = body.decode('utf-8')

    #print the request
    print(&quot; [.] Received request: '%s'&quot; % decodebody)

    #process the request query
    response = processquery(decodebody)

    #publish the response back to the 'reply_to' queue from the request message and set the correlation id
    #serialize with json.dumps so the body is valid JSON - str() on a dict would produce a Python repr instead
    ch.basic_publish(exchange='',
                     routing_key=props.reply_to,
                     properties=pika.BasicProperties(correlation_id = \
                                                         props.correlation_id),
                     body=json.dumps(response).encode(encoding=&quot;utf-8&quot;, errors=&quot;strict&quot;))
    ch.basic_ack(delivery_tag = method.delivery_tag)

print(&quot; [x] Awaiting RPC requests&quot;)

#connect to RabbitMQ broker
rabbitcredentials = pika.PlainCredentials(rabbituser, rabbitpass)
rabbitparameters = pika.ConnectionParameters(host=rabbithost,
                               port=rabbitport,
                               virtual_host='/',
                               credentials=rabbitcredentials)
rabbitconn = pika.BlockingConnection(rabbitparameters)
channel = rabbitconn.channel()

#declare the 'rpc_queue' queue
channel.queue_declare(queue='rpc_queue')

#set qos settings for the channel
channel.basic_qos(prefetch_count=1)

#assign the 'on_request' method as a callback for when new messages delivered from the 'rpc_queue' queue
channel.basic_consume(on_request, queue='rpc_queue')

#start listening for requests
channel.start_consuming()
</code></pre>
<h4 id="tryingitout">Trying it out</h4>
<p>As I mentioned in my last post, I initially wrote my code to use a RabbitMQ broker running on my local PC, so that's why the connections in the samples show 127.0.0.1 as the host. For a demo, I've spun up a copy of RabbitMQ in a Docker container in the cloud and updated my connection parameters accordingly, but I am still running my queue listener and consumer processes on my local PC.</p>
<p>When the listener first starts, it displays a simple status message.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/1_start_listener.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<p>Then I execute a &quot;getcontacts&quot; request from the consumer in a separate window.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/2_get_contacts.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<p>From the timestamps before and after the request, you can see the round-trip time is less than .2 seconds, which includes two round trips between my local PC and the cloud-based RabbitMQ broker <em>plus</em> the actual query processing time in my local Dynamics 365 CE org.</p>
<p>Then I execute a &quot;getaccounts&quot; request.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/4_get_accounts.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<p>This request was also fulfilled in less than .2 seconds.</p>
<p>Finally I execute an invalid request to show what the error response looks like.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/6_get_leads.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<p>You'll note the total time from request to response is only about .05 seconds less than the total time for the valid queries. That indicates most of the time used in these samples is being spent on the round trips between my local PC and the RabbitMQ broker, which is not surprising.</p>
<p>Meanwhile, the queue listener wrote a simple status update for every request it received. If I were using this in production, I would use more sophisticated logging.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/7_listener_output.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<h4 id="wrappingup">Wrapping up</h4>
<p>That's it for now. In my next (and final) post in this series, I will show how you can use <a href="https://azure.microsoft.com/en-us/services/functions/">Azure Functions</a> to make a consumer service proxy so consuming applications don't have to access your RabbitMQ broker directly, and I will also discuss some general thoughts on security and scalability for the service.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 2]]></title><description><![CDATA[<div class="kg-card-markdown"><p>In my <a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">last post</a> in this series, I outlined an approach for building a simple service relay with <a href="https://www.rabbitmq.com/">RabbitMQ</a> and <a href="https://www.python.org/">Python</a> to easily expose an on-premises Dynamics 365 Customer Engagement organization to external consumers. In this post I will walk through the prerequisites for building this out. I'm assuming you</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/02/01/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-2/</link><guid isPermaLink="false">5a6ca8fec86c8900016cf351</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[Python]]></category><category><![CDATA[RabbitMQ]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Fri, 02 Feb 2018 03:24:51 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay.png" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 2"><p>In my <a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">last post</a> in this series, I outlined an approach for building a simple service relay with <a href="https://www.rabbitmq.com/">RabbitMQ</a> and <a 
href="https://www.python.org/">Python</a> to easily expose an on-premises Dynamics 365 Customer Engagement organization to external consumers. In this post I will walk through the prerequisites for building this out. I'm assuming you have access to a Dynamics 365 CE organization, so I'm going to skip the setup for that and focus on just RabbitMQ and Python today.</p>
<h4 id="settinguprabbitmq">Setting up RabbitMQ</h4>
<p>Back in 2015, when I first blogged about RabbitMQ and Dynamics 365, I wrote a <a href="https://alexanderdevelopment.net/post/2015/01/14/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-2/">detailed post</a> showing how to install and configure RabbitMQ. Since then I have discovered the joys of <a href="https://www.docker.com/">Docker</a>, which makes the process significantly easier. If you have access to Docker, I highly recommend using it. Once you have Docker running, you can use one of the <a href="https://hub.docker.com/_/rabbitmq/">official RabbitMQ images</a>. For this project, I initially used the rabbitmq:3-management image in <a href="https://www.docker.com/docker-windows">Docker for Windows</a> running on my local PC. After I got the basic functionality working, I then moved to an instance of Docker running in the cloud on a <a href="https://www.digitalocean.com" target="_blank">Digital Ocean</a> VPS.</p>
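<p>For example, with Docker installed, spinning up the management image is a one-liner (5672 is the AMQP port and 15672 serves the management UI; the container name here is just an example):</p>

```shell
docker run -d --name crm-rabbit \
    -p 5672:5672 -p 15672:15672 \
    rabbitmq:3-management
```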
<p>If you don't want to use Docker, you can use a full RabbitMQ install like I showed <a href="https://alexanderdevelopment.net/post/2015/01/14/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-2/">previously</a>. The main thing to remember is that no matter how you set up your RabbitMQ server, if it is not accessible from the public internet, you will not be able to use it as a service relay between an on-premises Dynamics 365 org and external consumers.</p>
<h4 id="settinguppython">Setting up Python</h4>
<p>I'm assuming if you've gotten this far, you have a functional Python development environment (if not, give <a href="https://code.visualstudio.com/docs/languages/python">Visual Studio Code</a> a try), and the code I have written works in Python versions 2.7 or 3.x. In order to connect to both RabbitMQ and Dynamics 365, you will need a few additional packages. To connect to RabbitMQ, <a href="https://pika.readthedocs.io/en/0.11.2/">Pika</a> is the RabbitMQ team's recommended Python client, and you can get it using <a href="https://pypi.python.org/pypi/pip">pip</a>.</p>
<p>To communicate with Dynamics 365, you'll need to use the Web API, but authentication will be handled differently depending on whether you connect to an on-premises org or an online / IFD org. For online or IFD orgs, you can either use <a href="https://jlattimer.blogspot.com/2015/11/crm-web-api-using-python.html">ADAL</a> or this <a href="http://alexanderdevelopment.net/post/2016/11/27/dynamics-365-and-python-integration-using-the-web-api/">alternate approach</a> I described back in 2016. If you have an on-premises org, you can authenticate using the requests_ntlm package like I showed <a href="https://alexanderdevelopment.net/post/2018/01/15/connecting-to-an-on-premise-dynamics-365-org-from-python/">here</a>. As with the Pika client, all the packages you need to connect to Dynamics 365 are also available via pip.</p>
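<p>Assuming pip is available, everything this series needs can be installed in one command (requests is pulled in as a dependency of requests_ntlm, but I've listed it explicitly for clarity):</p>

```shell
pip install pika requests requests_ntlm
```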
<h4 id="wrappingup">Wrapping up</h4>
<p>That's it for today. In my next post in this series I will show the Python code you need to make this service relay work.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 1]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Integrating with external systems is a common requirement in Dynamics 365 Customer Engagement projects, but when the project involves an on-premises instance of Dynamics 365, routing requests from external systems through your firewall can present an additional challenge. Over the course of the next few posts, I will show you</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/</link><guid isPermaLink="false">5a636975e2df920001a88f8e</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[Python]]></category><category><![CDATA[RabbitMQ]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Wed, 31 Jan 2018 01:01:10 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/01/simple-service-relay-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/01/simple-service-relay-1.png" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 1"><p>Integrating with external systems is a common requirement in Dynamics 365 Customer Engagement projects, but when the project involves an on-premises instance of Dynamics 365, routing requests from external systems through your firewall can present an additional challenge. Over the course of the next few posts, I will show you can easily build a simple service relay with <a href="https://www.rabbitmq.com/">RabbitMQ</a> and <a href="https://www.python.org/">Python</a> to handle inbound requests from external data interface consumers.</p>
<p>Here's how my approach works. A consumer writes a request to a cloud-hosted RabbitMQ request queue (either directly or through a proxy service) and starts waiting for a response. On the other end, a Python script monitors the request queue for inbound requests. When it sees a new one, it executes the appropriate request through the Dynamics 365 Web API and writes the response back to a client-specific RabbitMQ response queue. The consumer then picks up the response from the queue. This way the consumer doesn't need to know anything other than how to write the initial request, and no extra inbound firewall ports need to be opened.</p>
<p>This diagram shows an overview of the process. <img src="https://alexanderdevelopment.net/content/images/2018/01/simple-service-relay.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 1"></p>
<p>Although my original goal was to accelerate the deployment of data interfaces for on-premises Dynamics 365 CE instances, a simple service relay like this could also be useful for IFD or Dynamics 365 online deployments if you don't want to allow direct access to your organization. Because the queue monitoring process is single-threaded, it's an easy way to throttle requests, but you can run multiple instances of the queue monitor script if you want to increase the number of concurrent requests the relay can process.</p>
<h4 id="whyusethisapproach">Why use this approach?</h4>
<p>There are lots of message brokers and service bus offerings (Azure Service Bus, IBM MQ, Amazon SQS, etc.) you could use to build a service relay. In fact there's even an Azure offering called <a href="https://docs.microsoft.com/en-us/azure/service-bus-relay/relay-what-is-it">Azure Relay</a> that aims to solve exactly the same problem that my approach does, but not just for Dynamics 365, so &quot;why use this?&quot; is a great question.</p>
<p>First, I think RabbitMQ is just a great tool, and I previously wrote a <a href="https://alexanderdevelopment.net/post/2015/01/27/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-5/">five-part series</a> about using RabbitMQ with Dynamics 365 (back when it was still called Dynamics CRM). Second, using RabbitMQ instead of a cloud-specific service bus offering gives you maximum flexibility in where you host your request and response queues and how you choose to scale. For example, my RabbitMQ broker runs in a <a href="https://www.docker.com">Docker</a> container on a <a href="https://www.digitalocean.com/" target="_blank">Digital Ocean</a> VPS. If I ever decide to move off of Digital Ocean, I can easily switch to any IaaS or VPS provider. I can also configure a RabbitMQ cluster to achieve significantly faster throughput.</p>
<p>As for why I'm using Python instead of C#, which is probably more familiar to most Dynamics 365 developers, Python also makes this approach more flexible. Using Python means I'm not tied to the Dynamics 365 SDK client libraries or a Windows host for running my queue monitoring process, and I can easily package my monitoring process in a Docker image. <em>(Although I highly recommend Python, there are RabbitMQ clients for <a href="https://www.nuget.org/packages/RabbitMQ.Client">.Net</a>, and you can also find RabbitMQ tutorials for other languages including Java, Ruby and JavaScript <a href="https://www.rabbitmq.com/getstarted.html">here</a>.)</em></p>
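<p>As a sketch of that last point, packaging the queue monitor could be as simple as a Dockerfile along these lines (hypothetical; <code>listener.py</code> stands in for whatever you name the monitoring script):</p>

```dockerfile
# hypothetical image for the queue listener process
FROM python:3-slim
RUN pip install pika requests requests_ntlm
COPY listener.py /app/listener.py
CMD ["python", "/app/listener.py"]
```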
<h4 id="wrappingup">Wrapping up</h4>
<p>That's it for now. In my next post in this series I will walk through the prerequisites for building the simple service relay.</p>
<p>How have you handled inbound data interfaces for on-premises Dynamics 365 CE organizations? Let us know in the comments!</p>
</div>]]></content:encoded></item><item><title><![CDATA[Dynamics 365 Configuration Data Mover v2.4]]></title><description><![CDATA[<div class="kg-card-markdown"><p>I've released an <a href="https://github.com/lucasalexander/AlexanderDevelopment.ConfigDataMover/releases/tag/v2.4.6587.18905">updated version</a> of my popular Dynamics 365 Configuration Data Mover utility that was built with .Net 4.7 to address the new requirement to use TLS 1.2 (or better) for connections to Dynamics 365 online instances as described in this entry on the Microsoft Dynamics 365</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/01/16/dynamics-365-configuration-data-mover-v2-4/</link><guid isPermaLink="false">5a5e85dae2df920001a88f85</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[Configuration Data Mover]]></category><category><![CDATA[integration]]></category><category><![CDATA[utilities]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Tue, 16 Jan 2018 23:12:00 GMT</pubDate><content:encoded><![CDATA[<div class="kg-card-markdown"><p>I've released an <a href="https://github.com/lucasalexander/AlexanderDevelopment.ConfigDataMover/releases/tag/v2.4.6587.18905">updated version</a> of my popular Dynamics 365 Configuration Data Mover utility that was built with .Net 4.7 to address the new requirement to use TLS 1.2 (or better) for connections to Dynamics 365 online instances as described in this entry on the Microsoft Dynamics 365 team blog: <a href="https://blogs.msdn.microsoft.com/crm/2017/09/28/updates-coming-to-dynamics-365-customer-engagement-connection-security">https://blogs.msdn.microsoft.com/crm/2017/09/28/updates-coming-to-dynamics-365-customer-engagement-connection-security</a>.</p>
<p>This upgrade is fully compatible with existing job files.</p>
<h4 id="gettingthedynamics365configurationdatamover">Getting the Dynamics 365 Configuration Data Mover</h4>
<p>The source code is available in my GitHub repository <a href="https://github.com/lucasalexander/AlexanderDevelopment.ConfigDataMover">here</a>.</p>
<p>A compiled version can be downloaded <a href="https://github.com/lucasalexander/AlexanderDevelopment.ConfigDataMover/releases/tag/v2.4.6587.18905">here</a>.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Accessing an on-premises Dynamics 365 organization from Python]]></title><description><![CDATA[<div class="kg-card-markdown"><p>I've previously <a href="https://alexanderdevelopment.net/post/2016/11/27/dynamics-365-and-python-integration-using-the-web-api/">showed</a> how to access online and IFD instances of Dynamics 365 Customer Engagement from Python code. Because that sample code authenticated to the Web API using OAuth, it won't work with on-premises instances. Recently I've been doing some work with Python and an on-premises Dynamics 365 organization, so</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/01/15/connecting-to-an-on-premise-dynamics-365-org-from-python/</link><guid isPermaLink="false">5a5939bae2df920001a88f77</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[Python]]></category><category><![CDATA[programming]]></category><category><![CDATA[integration]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Mon, 15 Jan 2018 14:58:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/01/Code_2018-01-12_17-00-50.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/01/Code_2018-01-12_17-00-50.png" alt="Accessing an on-premises Dynamics 365 organization from Python"><p>I've previously <a href="https://alexanderdevelopment.net/post/2016/11/27/dynamics-365-and-python-integration-using-the-web-api/">showed</a> how to access online and IFD instances of Dynamics 365 Customer Engagement from Python code. Because that sample code authenticated to the Web API using OAuth, it won't work with on-premises instances. 
Recently I've been doing some work with Python and an on-premises Dynamics 365 organization, so I thought I'd share a sample that shows how to authenticate to the Web API using NTLM.</p>
<pre><code>import requests
from requests_ntlm import HttpNtlmAuth
import json

username = 'companyx\\administrator'
userpassword = 'PASSWORD GOES HERE'

#set these values to query your crm data
crmwebapi = 'http://33.0.0.16/lucastest02/api/data/v8.1'
crmwebapiquery = '/contacts?$select=fullname,contactid'

#combine multiple Prefer values into one comma-separated header -
#duplicate dict keys would silently overwrite each other
crmrequestheaders = {
    'OData-MaxVersion': '4.0',
    'OData-Version': '4.0',
    'Accept': 'application/json',
    'Content-Type': 'application/json; charset=utf-8',
    'Prefer': 'odata.maxpagesize=500, odata.include-annotations=&quot;OData.Community.Display.V1.FormattedValue&quot;'
}

print('making crm request . . .')
crmres = requests.get(crmwebapi+crmwebapiquery, headers=crmrequestheaders,auth=HttpNtlmAuth(username,userpassword))
print('crm response received . . .')
try:
    print('parsing crm response . . .')
    crmresults = crmres.json()
    for x in crmresults['value']:
        print (x['fullname'] + ' - ' + x['contactid'])
except KeyError:
    print('Could not parse CRM results')
</code></pre>
<p>As you can see, this code doesn't retrieve an OAuth token before calling the Dynamics 365 Web API, but rather it uses the <a href="https://github.com/requests/requests-ntlm">requests-ntlm</a> package to authenticate directly to the Web API using a username and password. Other than that small change, everything else works the same as in my previous examples.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Using proxy connections with the Dynamics 365 Configuration Data Mover]]></title><description><![CDATA[<div class="kg-card-markdown"><p>I was recently asked to add a feature to my Dynamics 365 Configuration Data Mover to enable connections through a proxy server. Because the tool is a .Net application, proxy server connections can be configured directly in the AlexanderDevelopmentConfigDataMover.exe.config file. For example, if you want to use the</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/01/08/using-proxy-connections-with-the-dynamics-365-configuration-data-mover/</link><guid isPermaLink="false">5a5837246636a30001b978f1</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[Configuration Data Mover]]></category><category><![CDATA[integration]]></category><category><![CDATA[utilities]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Mon, 08 Jan 2018 15:16:49 GMT</pubDate><content:encoded><![CDATA[<div class="kg-card-markdown"><p>I was recently asked to add a feature to my Dynamics 365 Configuration Data Mover to enable connections through a proxy server. Because the tool is a .Net application, proxy server connections can be configured directly in the AlexanderDevelopmentConfigDataMover.exe.config file. For example, if you want to use the default Internet Explorer proxy settings, just add the following values inside the &lt;configuration&gt; element:</p>
<pre><code>&lt;system.net&gt;
  &lt;defaultProxy enabled=&quot;true&quot;&gt;
    &lt;proxy usesystemdefault=&quot;true&quot;/&gt;
  &lt;/defaultProxy&gt;
&lt;/system.net&gt;
</code></pre>
<p>For more information on proxy settings, take a look at this Microsoft overview on .Net <a href="https://docs.microsoft.com/en-us/dotnet/framework/network-programming/proxy-configuration">proxy configuration</a>.</p>
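<p>If you need to route traffic through a specific proxy server rather than the system default, the same configuration section accepts an explicit address. The host and port below are placeholders:</p>

```xml
<system.net>
  <defaultProxy enabled="true">
    <!-- proxyaddress is a placeholder; bypassonlocal skips the proxy
         for local addresses -->
    <proxy proxyaddress="http://proxy.example.com:8080" bypassonlocal="true"/>
  </defaultProxy>
</system.net>
```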
</div>]]></content:encoded></item><item><title><![CDATA[A Dynamics 365 local message listener for web client notifications - part 3]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Several months ago I discussed an <a href="https://alexanderdevelopment.net/post/2017/07/19/a-dynamics-365-local-message-listener-for-web-client-notifications-part-1/">approach</a> for passing notifications from local applications to the Dynamics 365 web client through a message listener process that runs on an end user's PC and shared some <a href="https://alexanderdevelopment.net/post/2017/07/21/a-dynamics-365-local-message-listener-for-web-client-notifications-part-2/">sample code</a> for how to implement it.</p>
<p>Recently I used this approach to establish communication between</p></div>]]></description><link>https://alexanderdevelopment.net/post/2017/12/28/a-dynamics-365-local-message-listener-for-web-client-notifications-part-3/</link><guid isPermaLink="false">5a5837246636a30001b978ec</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[utilities]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Thu, 28 Dec 2017 17:53:33 GMT</pubDate><content:encoded><![CDATA[<div class="kg-card-markdown"><p>Several months ago I discussed an <a href="https://alexanderdevelopment.net/post/2017/07/19/a-dynamics-365-local-message-listener-for-web-client-notifications-part-1/">approach</a> for passing notifications from local applications to the Dynamics 365 web client through a message listener process that runs on an end user's PC and shared some <a href="https://alexanderdevelopment.net/post/2017/07/21/a-dynamics-365-local-message-listener-for-web-client-notifications-part-2/">sample code</a> for how to implement it.</p>
<p>Recently I used this approach to establish communication between Dynamics 365 web resources and a fingerprint reader attached to a local PC. The actual coding was fairly simple, but I ran into a problem I had not encountered when I built my original proof-of-concept message listener: this time I was accessing Dynamics 365 via HTTPS instead of HTTP, as I had been in my on-prem development sandbox. Because mixed content can be a security risk, modern browsers typically block pages served over HTTPS from loading scripts over HTTP, which is how my listener process was running.</p>
<p>In order to make my listener process accessible via HTTPS, I did the following:</p>
<ol>
<li>Generated a self-signed SSL certificate to represent a root certificate authority.</li>
<li>Generated an SSL certificate signed by my root CA to encrypt communication with my message listener process.</li>
<li>Added the root SSL certificate from step #1 as a trusted root certificate on the local PC.</li>
<li>Added the SSL certificate from step #2 to the personal certificate store on the local PC.</li>
<li>Bound the SSL certificate from step #2 to the port my listener process uses.</li>
</ol>
<h4 id="generatingthecertificates">Generating the certificates</h4>
<p>This <a href="http://blog.davidchristiansen.com/2016/09/howto-create-self-signed-certificates-with-powershell/">blog post</a> details the steps I followed to generate self-signed certificates with PowerShell's <a href="https://docs.microsoft.com/en-us/powershell/module/pkiclient/new-selfsignedcertificate?view=win10-ps">New-SelfSignedCertificate</a> cmdlet. In the past I have used OpenSSL to generate self-signed certificates, and while OpenSSL would definitely work here, I found the PowerShell approach much easier for my initial development purposes.</p>
<p><em>(A note on security - As I look at making this solution fully ready for production, I need to do some further investigation into the security implications of using self-signed certificates. Because I am only using them to enable communication with a process running on the local PC, I think the risks are minimal as long as the root certificate is kept secure.)</em></p>
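<p>The certificate generation roughly follows this shape. This is a sketch only, adapted from the linked blog post; the subject names, file names, password and validity period are all placeholders you should replace:</p>

```powershell
# 1. Create a self-signed root CA certificate (placeholder subject name).
$rootCa = New-SelfSignedCertificate -Type Custom -KeyUsage CertSign `
    -Subject "CN=Local Dev Root CA" -CertStoreLocation "Cert:\CurrentUser\My" `
    -NotAfter (Get-Date).AddYears(5)

# 2. Create the listener certificate, signed by the root CA.
$appCert = New-SelfSignedCertificate -DnsName "localhost" -Signer $rootCa `
    -CertStoreLocation "Cert:\CurrentUser\My"

# 3. Export the root CA public key (.cer) and the application
#    certificate with its private key (.pfx).
$pwd = ConvertTo-SecureString -String "placeholder-password" -Force -AsPlainText
Export-Certificate -Cert $rootCa -FilePath .\rootca.cer
Export-PfxCertificate -Cert $appCert -FilePath .\listener.pfx -Password $pwd
```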
<h4 id="installingthecertificates">Installing the certificates</h4>
<p>Once you have generated your two SSL certificates, you need to install them on the local PC. If you followed the directions from the blog post above to generate your certificates, you should have a public certificate for your root CA as a .cer file and a private certificate to use for the application as a .pfx file.</p>
<p>To install the root certificate as a trusted root certificate authority, do the following:</p>
<ol>
<li>Double-click the public certificate (.cer file) for the root CA you generated. The certificate properties window will open. Click &quot;install certificate.&quot; <img src="https://alexanderdevelopment.net/content/images/2017/12/import-cer-01.png#img-thumbnail" alt="Install root CA - step 1"></li>
<li>The certificate import wizard will open. Select &quot;local machine&quot; and click next. <img src="https://alexanderdevelopment.net/content/images/2017/12/import-cer-02.png" alt="Install root CA - step 2"></li>
<li>You may be presented with a confirmation dialog asking if you &quot;want to allow this app to make changes to your device.&quot; If so, click yes.</li>
<li>On the next screen, select &quot;place all certificates in the following store&quot; and click browse.</li>
<li>Select &quot;trusted root certification authorities.&quot;<img src="https://alexanderdevelopment.net/content/images/2017/12/import-cer-03.png#img-thumbnail" alt="Install root CA - step 5"></li>
<li>Verify the certificate store input shows &quot;trusted root certification authorities&quot; and click next.<img src="https://alexanderdevelopment.net/content/images/2017/12/import-cer-04.png#img-thumbnail" alt="Install root CA - step 6"></li>
<li>Click finish.<img src="https://alexanderdevelopment.net/content/images/2017/12/import-cer-05.png#img-thumbnail" alt="Install root CA - step 7"></li>
<li>You will receive a success message. Click OK to close it.</li>
<li>Close the certificate properties window.</li>
</ol>
<p>To install the SSL certificate that will encrypt communication with the message listener, do the following:</p>
<ol>
<li>Double-click the private certificate (.pfx file) for the application certificate you generated. The certificate import wizard will open. Select &quot;local machine&quot; and click next.<img src="https://alexanderdevelopment.net/content/images/2017/12/import-pfx-01.png#img-thumbnail" alt="Install application certificate - step 1"></li>
<li>You may be presented with a confirmation dialog asking if you &quot;want to allow this app to make changes to your device.&quot; If so, click yes.</li>
<li>You will see the name of your .pfx file in the file name input box. Click next. <img src="https://alexanderdevelopment.net/content/images/2017/12/import-pfx-02.png#img-thumbnail" alt="Install application certificate - step 3"></li>
<li>Enter the password for the .pfx file and click next.<img src="https://alexanderdevelopment.net/content/images/2017/12/import-pfx-03.png#img-thumbnail" alt="Install application certificate - step 4"></li>
<li>On the next screen, select &quot;place all certificates in the following store&quot; and click browse.</li>
<li>Select &quot;personal.&quot;<img src="https://alexanderdevelopment.net/content/images/2017/12/import-pfx-04.png#img-thumbnail" alt="Install application certificate - step 6"></li>
<li>Verify the certificate store input shows &quot;personal&quot; and click next.<img src="https://alexanderdevelopment.net/content/images/2017/12/import-pfx-05.png#img-thumbnail" alt="Install application certificate - step 7"></li>
<li>Click finish.<img src="https://alexanderdevelopment.net/content/images/2017/12/import-pfx-06.png#img-thumbnail" alt="Install application certificate - step 8"></li>
</ol>
<h4 id="bindingtheapplicationsslcertificatetothemessagelistenerport">Binding the application SSL certificate to the message listener port</h4>
<p>Once you have installed the certificates on the local PC, you can bind the application SSL certificate to the port on which the message listener is listening. I am using Windows 10, but the directions outlined in this <a href="https://docs.microsoft.com/en-us/dotnet/framework/wcf/feature-details/how-to-configure-a-port-with-an-ssl-certificate">Microsoft document</a> for Windows Vista worked for me. Presumably they will also work for Windows 7 and 8.</p>
<p>If you don't feel like reading the entire document, basically you can use the <a href="https://technet.microsoft.com/en-us/library/bb490939.aspx?f=255&amp;MSPPError=-2147217396">netsh</a> command to bind an SSL certificate to a specific port for a specific application, so that all requests on that port are encrypted. Here's how to configure it:</p>
<ol>
<li>Get the thumbprint from the SSL certificate you installed in the &quot;personal&quot; store following the instructions here - <a href="https://docs.microsoft.com/en-us/dotnet/framework/wcf/feature-details/how-to-retrieve-the-thumbprint-of-a-certificate">https://docs.microsoft.com/en-us/dotnet/framework/wcf/feature-details/how-to-retrieve-the-thumbprint-of-a-certificate</a>.</li>
<li>Get the application id from your message listener application. If your message listener is a .Net application, you would use the Guid value from the <code>[assembly: Guid(...)]</code> attribute in your application's AssemblyInfo file.</li>
<li>Open a command prompt as an administrator and execute the following command: <code>netsh http add sslcert ipport=0.0.0.0:9345 certhash=CC5F7BF58FD666EEC844C1B949E9661267A8A310 appid={b31bee72-c2ac-411e-959b-adbd25bba2cf}</code> You will need to substitute your specific values for &quot;ipport,&quot; &quot;certhash&quot; and &quot;appid.&quot;</li>
<li>Verify your configuration works by starting your message listener and then making an HTTPS request.</li>
</ol>
<p>Assuming everything works, you should now be able to access your local message listener process from your Dynamics 365 web resources. If you want to use the message listener on other PCs, you don't have to regenerate new certificates. You can just install the same certificates and run the same netsh command on every PC where the listener application will run.</p>
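<p>From a Dynamics 365 web resource, calling the listener then looks something like the sketch below. The port and route are examples matching the netsh command above, not fixed values, and <code>sendNotification</code> is an illustrative helper, not part of the original listener code. The key point is that the URL must be <code>https://</code>, or the browser will block the call as mixed content:</p>

```javascript
// Port and route are placeholders; match whatever your listener binds to.
const LISTENER_PORT = 9345;

function listenerUrl(route) {
  // Must be https:// -- an http:// URL would be blocked on a page
  // served over HTTPS.
  return "https://localhost:" + LISTENER_PORT + route;
}

function sendNotification(route, payload) {
  // fetch is available in the modern browsers the D365 web client supports.
  return fetch(listenerUrl(route), {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload)
  });
}
```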
<p>What do you think about this approach? Are there any real-world scenarios where it would be useful for you?</p>
</div>]]></content:encoded></item><item><title><![CDATA[Creating many-to-many associations with the Dynamics 365 Configuration Data Mover]]></title><description><![CDATA[<div class="kg-card-markdown"><p>I've released an updated version of my popular Dynamics 365 Configuration Data Mover utility that includes the ability to create many-to-many associations in the target system. This upgrade is fully compatible with existing job files.</p>
<p>To create a many-to-many job step in the GUI, select the new &quot;many to</p></div>]]></description><link>https://alexanderdevelopment.net/post/2017/11/28/creating-many-to-many-associations-with-the-dynamics-365-configuration-data-mover/</link><guid isPermaLink="false">5a5837246636a30001b978e6</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[Configuration Data Mover]]></category><category><![CDATA[integration]]></category><category><![CDATA[utilities]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Tue, 28 Nov 2017 20:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2017/11/AlexanderDevelopment-ConfigDataMover_2017-11-28_10-28-59.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2017/11/AlexanderDevelopment-ConfigDataMover_2017-11-28_10-28-59.png" alt="Creating many-to-many associations with the Dynamics 365 Configuration Data Mover"><p>I've released an updated version of my popular Dynamics 365 Configuration Data Mover utility that includes the ability to create many-to-many associations in the target system. This upgrade is fully compatible with existing job files.</p>
<p>To create a many-to-many job step in the GUI, select the new &quot;many to many&quot; step type and input a FetchXML query against the relationship (intersect) entity (the &quot;relationship entity name&quot; shown on the many-to-many relationship form) that includes the GUID fields for each entity. Relationship entities cannot be queried in the advanced find builder, so you must write the FetchXML manually or use a separate query builder.</p>
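<p>As an illustration, a query against the out-of-the-box <code>systemuserroles</code> intersect entity (user/security-role associations) would look like this; substitute your own relationship entity name and GUID attribute names:</p>

```xml
<fetch>
  <entity name="systemuserroles">
    <!-- The two GUID fields that identify each side of the association -->
    <attribute name="systemuserid" />
    <attribute name="roleid" />
  </entity>
</fetch>
```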
<p><img src="https://alexanderdevelopment.net/content/images/2017/11/AlexanderDevelopment-ConfigDataMover_2017-11-28_10-28-59-1.png#img-thumbnail" alt="Creating many-to-many associations with the Dynamics 365 Configuration Data Mover"></p>
<p>One thing to keep in mind is that a many-to-many job step will create many-to-many record associations in the target system, but it will not delete any existing N:N associations that have been removed in the source.</p>
<h4 id="gettingthedynamics365configurationdatamover">Getting the Dynamics 365 Configuration Data Mover</h4>
<p>The source code is available in my GitHub repository <a href="https://github.com/lucasalexander/AlexanderDevelopment.ConfigDataMover">here</a>.</p>
<p>A compiled version can be downloaded <a href="https://github.com/lucasalexander/AlexanderDevelopment.ConfigDataMover/releases">here</a>.</p>
</div>]]></content:encoded></item></channel></rss>