<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Azure - Alexander Development]]></title><description><![CDATA[Azure - Alexander Development]]></description><link>https://alexanderdevelopment.net/</link><image><url>https://alexanderdevelopment.net/favicon.png</url><title>Azure - Alexander Development</title><link>https://alexanderdevelopment.net/</link></image><generator>Ghost 1.20</generator><lastBuildDate>Thu, 23 Apr 2026 06:14:06 GMT</lastBuildDate><atom:link href="https://alexanderdevelopment.net/tag/azure/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Installing and securing OpenFaaS on an AKS cluster]]></title><description><![CDATA[<div class="kg-card-markdown"><p>A few months back, I wrote a <a href="https://alexanderdevelopment.net/post/2018/02/25/installing-and-securing-openfaas-on-a-google-cloud-virtual-machine/">guide</a> for installing and locking down <a href="https://www.openfaas.com/">OpenFaaS</a> in a Docker Swarm running on <a href="https://cloud.google.com/">Google Cloud Platform</a> virtual machines. Today I want to share a step-by-step guide that shows how to install OpenFaaS on a new <a href="https://azure.microsoft.com/en-us/services/container-service/kubernetes/">Azure Kubernetes Service</a> (AKS) cluster using an <a href="https://github.com/kubernetes/ingress-nginx">Nginx</a></p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/05/31/installing-and-securing-openfaas-on-an-aks/</link><guid isPermaLink="false">5b0e9d8797f5e30001931b70</guid><category><![CDATA[OpenFaaS]]></category><category><![CDATA[Docker]]></category><category><![CDATA[Kubernetes]]></category><category><![CDATA[Azure]]></category><category><![CDATA[serverless]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Thu, 31 May 2018 14:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/05/powershell_2018-05-30_16-29-47.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/05/powershell_2018-05-30_16-29-47.png" alt="Installing and securing OpenFaaS on an AKS cluster"><p>A few months back, I wrote a <a href="https://alexanderdevelopment.net/post/2018/02/25/installing-and-securing-openfaas-on-a-google-cloud-virtual-machine/">guide</a> for installing and locking down <a href="https://www.openfaas.com/">OpenFaaS</a> in a Docker Swarm running on <a href="https://cloud.google.com/">Google Cloud Platform</a> virtual machines. Today I want to share a step-by-step guide that shows how to install OpenFaaS on a new <a href="https://azure.microsoft.com/en-us/services/container-service/kubernetes/">Azure Kubernetes Service</a> (AKS) cluster using an <a href="https://github.com/kubernetes/ingress-nginx">Nginx</a> ingress controller to lock it down with basic authentication and free <a href="https://letsencrypt.org/">Let's Encrypt</a> TLS certificates.</p>
<h4 id="beforewebegin">Before we begin</h4>
<p>If you just want to do a quick deployment of OpenFaaS on AKS, there's a guide in the official AKS documentation <a href="https://docs.microsoft.com/en-us/azure/aks/openfaas">here</a>; however, it does not show how to implement TLS encryption or authentication.</p>
<p>All the Azure configuration I'll show today is done via the command line, so if you don't already have the Azure CLI installed on your local system, install it from <a href="https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest">here</a>. You could also do the configuration through the Azure portal, but it's much faster with the CLI.</p>
<p>You will also need Git command-line tools installed so you can pull down the latest version of OpenFaaS from its repository.</p>
<p>Finally, in order to secure your OpenFaaS installation with TLS, you will need a domain and access to your DNS provider so you can point a hostname to your cluster's public IP address.</p>
<p>Ready? Let's get started.</p>
<h4 id="basicazureconfiguration">Basic Azure configuration</h4>
<ol>
<li>From the command line, log in to Azure using <code>az login</code>. Follow the prompts to complete your authentication.</li>
<li>If you don't have an existing resource group you want to use for your AKS cluster, create a new one with <code>az group create -l REGIONNAME -n RESOURCEGROUP</code>. Replace REGIONNAME and RESOURCEGROUP with appropriate values, but make sure you use a region where AKS is <a href="https://docs.microsoft.com/en-us/azure/aks/container-service-quotas">currently available</a>.</li>
<li>Create a new AKS cluster with <code>az aks create -g RESOURCEGROUP -n CLUSTERNAME --generate-ssh-keys</code>. The RESOURCEGROUP value is the same as before, and CLUSTERNAME is whatever you want it to be called. Note that the default virtual machine size for your cluster is Standard_DS1_v2. You can change this by setting the <code>--node-vm-size</code> parameter; I am personally using burstable Standard_B2s VMs for my AKS cluster.</li>
<li>Once the AKS cluster creation completes, use this command to get the credentials you need to manage the cluster with the Kubernetes CLI: <code>az aks get-credentials --resource-group RESOURCEGROUP --name CLUSTERNAME</code>.</li>
<li>Install the Kubernetes CLI (kubectl) with <code>az aks install-cli</code>.</li>
<li>Get the name of the node resource group that was created for your AKS cluster with this command: <code>az resource show --resource-group RESOURCEGROUP --name CLUSTERNAME --resource-type Microsoft.ContainerService/managedClusters --query properties.nodeResourceGroup -o tsv</code>. You should get output that looks like MC_resourcegroup_clustername_regionname. You will use this return value in the next step.</li>
<li>Create a public IP address in the node resource group with this command: <code>az network public-ip create --resource-group MC_RESOURCEGROUP --name IPADDRESSNAME --allocation-method static</code>. You will get a JSON response that contains a &quot;publicIp&quot; object. Copy its &quot;ipAddress&quot; value and save it for later.<br>
<em>Note: you might be tempted to create a DNS name label for this IP address so you can avoid using a custom domain name, but *.cloudapp.azure.com host names don't work with Let's Encrypt.</em></li>
<li>Go to your DNS provider and register a new A record for a hostname that points to the external IP you reserved in the previous step (mine is akskube.alexanderdevelopment.net). This will be the hostname you use to access OpenFaaS. You may also need to create a new CAA record to explicitly allow Let's Encrypt to issue certificates for your domain.</li>
</ol>
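<p>For reference, here's the whole Azure setup sequence condensed into one script. The resource group, cluster, region, and IP address names are placeholders to replace with your own values, and the <code>--node-vm-size</code> setting is just the one I happen to use:</p>
<pre><code># log in and create a resource group
az login
az group create -l REGIONNAME -n RESOURCEGROUP

# create the cluster and fetch kubectl credentials
az aks create -g RESOURCEGROUP -n CLUSTERNAME --generate-ssh-keys --node-vm-size Standard_B2s
az aks get-credentials --resource-group RESOURCEGROUP --name CLUSTERNAME
az aks install-cli

# look up the node resource group, then create the static IP inside it
az resource show --resource-group RESOURCEGROUP --name CLUSTERNAME --resource-type Microsoft.ContainerService/managedClusters --query properties.nodeResourceGroup -o tsv
az network public-ip create --resource-group MC_RESOURCEGROUP --name IPADDRESSNAME --allocation-method static
</code></pre>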
<h4 id="basicclusterconfiguration">Basic cluster configuration</h4>
<p>Once the basic Azure configuration work is done, it's time to configure the AKS cluster.</p>
<ol>
<li>If you don't already have the Helm client installed on your local system, install it by following the directions <a href="https://github.com/kubernetes/helm/blob/master/docs/install.md">here</a>. I am using a Windows dev workstation, so I installed <a href="https://chocolatey.org/">Chocolatey</a> and then installed the Helm client with <code>choco install kubernetes-helm</code>.</li>
<li>Install Helm components on your AKS cluster with <code>helm init --upgrade --service-account default</code>.</li>
<li>Install the Nginx ingress controller on your AKS cluster with <code>helm install stable/nginx-ingress --namespace kube-system --set rbac.create=false --set rbac.createRole=false --set rbac.createClusterRole=false --set controller.service.loadBalancerIP=STATICIPADDRESS</code>. Replace STATICIPADDRESS with the public IP address you created previously.</li>
<li>Install <a href="https://github.com/jetstack/cert-manager">cert-manager</a> to request and manage your TLS certificates: <code>helm install --name cert-manager --namespace kube-system stable/cert-manager --set rbac.create=false</code>.</li>
</ol>
<h4 id="installingopenfaas">Installing OpenFaas</h4>
<p>It's relatively easy to install OpenFaaS on your AKS cluster using Helm, and a detailed readme is available <a href="https://github.com/openfaas/faas-netes/blob/master/chart/openfaas/README.md">here</a>. Basically, you need to download the <a href="https://github.com/openfaas/faas-netes">faas-netes</a> Git repository to your local system, create a couple of namespaces on the AKS cluster and use the Helm chart in the repo you downloaded. Here's how I set it up on my AKS cluster.</p>
<pre><code>kubectl create ns openfaas
kubectl create ns openfaas-fn

git clone https://github.com/openfaas/faas-netes
cd faas-netes

helm install --namespace openfaas -n openfaas --set functionNamespace=openfaas-fn --set ingress.enabled=true --set rbac=false chart/openfaas/
</code></pre>
<p>Once OpenFaaS is installed, you need to create ingress resources to make it available externally.</p>
<h4 id="creatingtheingressresources">Creating the ingress resources</h4>
<p>Before creating your ingress resources, you need to create certificate issuer resources to get TLS certificates. Here's the YAML for a Let's Encrypt staging issuer:</p>
<pre><code>apiVersion: certmanager.k8s.io/v1alpha1
kind: Issuer
metadata:
  name: letsencrypt-staging
spec:
  acme:
    # The ACME server URL
    server: https://acme-staging-v02.api.letsencrypt.org/directory
    # Email address used for ACME registration
    email: EMAILADDRESS
    # Name of a secret used to store the ACME account private key
    privateKeySecretRef:
      name: letsencrypt-staging
    # Enable the HTTP-01 challenge provider
    http01: {}
</code></pre>
<p>Copy it, replace EMAILADDRESS with your email address and save it as faas-staging-issuer.yml. Then run <code>kubectl apply -f faas-staging-issuer.yml -n openfaas</code>.</p>
<p>Here's a corresponding production issuer:</p>
<pre><code>apiVersion: certmanager.k8s.io/v1alpha1
kind: Issuer
metadata:
  name: letsencrypt-production
spec:
  acme:
    # The ACME server URL
    server: https://acme-v02.api.letsencrypt.org/directory
    # Email address used for ACME registration
    email: EMAILADDRESS
    # Name of a secret used to store the ACME account private key
    privateKeySecretRef:
      name: letsencrypt-production
    # Enable the HTTP-01 challenge provider
    http01: {}
</code></pre>
<p>Copy it, replace EMAILADDRESS with your email address and save it as faas-production-issuer.yml. Then run <code>kubectl apply -f faas-production-issuer.yml -n openfaas</code>.</p>
<p>Next you need to create a password file to implement basic authentication. If you are working on a system with apache2-utils installed, you can just use the htpasswd command. Otherwise, you can use a tool like <a href="http://aspirine.org/htpasswd_en.html">http://aspirine.org/htpasswd_en.html</a> to generate your htpasswd content. Once you have your htpasswd content generated, save it in a file named &quot;auth&quot; and run the following command: <code>kubectl create secret generic basic-auth --from-file=auth -n openfaas</code>.</p>
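<p>For example, on a system with apache2-utils, generating the file and creating the secret looks like this (the &quot;admin&quot; username is just an example):</p>
<pre><code># prompts for a password and writes the hash to a file named &quot;auth&quot;
htpasswd -c auth admin

# store the file as a secret in the openfaas namespace
kubectl create secret generic basic-auth --from-file=auth -n openfaas
</code></pre>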
<p>Now you can use the following YAML to create an ingress resource that exposes your OpenFaaS instance:</p>
<pre><code>apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: faas-ingress
  annotations:
    nginx.ingress.kubernetes.io/auth-realm: &quot;Authentication Required&quot;
    nginx.ingress.kubernetes.io/auth-secret: basic-auth
    nginx.ingress.kubernetes.io/auth-type: basic
    kubernetes.io/tls-acme: &quot;true&quot;
    certmanager.k8s.io/issuer: letsencrypt-staging
    nginx.ingress.kubernetes.io/rewrite-target: /
spec:
  tls:
  - hosts:
    - HOSTNAME
    secretName: faas-letsencrypt-staging
  rules:
  - host: HOSTNAME
    http:
      paths:
      - path: /faas-admin
        backend:
          serviceName: gateway
          servicePort: 8080
</code></pre>
<p>Copy it, replace both instances of HOSTNAME with the hostname you created earlier and save it as faas-ingress.yml. Deploy it to your cluster with this command: <code>kubectl apply -f faas-ingress.yml -n openfaas</code>.</p>
<p>As the ingress starts up, it will request a staging certificate from Let's Encrypt, and then it will start listening for requests. It may take a few minutes, so now might be a good time to take a short break. Once everything is complete, you will be able to access your OpenFaaS UI from <a href="https://HOSTNAME/faas-admin/ui/">https://HOSTNAME/faas-admin/ui/</a>, and once you deploy functions, they will be available at <a href="https://HOSTNAME/faas-admin/functions/FUNCTIONNAME">https://HOSTNAME/faas-admin/functions/FUNCTIONNAME</a>. You should get a browser warning about the certificate because it's using a Let's Encrypt staging certificate, but that's OK for now. You should also be prompted for basic authentication credentials, which will be the username and password you created earlier.</p>
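<p>If you'd rather watch the certificate request progress than just wait, you can inspect the certificate resource that cert-manager creates. A couple of commands that should help (assuming the certificate resource takes its name from the ingress secret, which is cert-manager's usual behavior):</p>
<pre><code>kubectl get certificate -n openfaas
kubectl describe certificate faas-letsencrypt-staging -n openfaas
</code></pre>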
<p>If everything looks good, you can switch over to using a production TLS certificate. Take the faas-ingress YAML and replace the &quot;letsencrypt-staging&quot; in the secretName and certmanager.k8s.io issuer values with &quot;letsencrypt-production&quot; instead. Save it and deploy the update with <code>kubectl apply -f faas-ingress.yml -n openfaas</code>. Like before, the ingress will take a few minutes to restart and request a production TLS certificate from Let's Encrypt. Once that's done, you can access your OpenFaaS UI via the same URL, but now you should not get a warning about an invalid certificate.</p>
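<p>Only two lines in faas-ingress.yml need to change for the switch:</p>
<pre><code>certmanager.k8s.io/issuer: letsencrypt-production  # was letsencrypt-staging
secretName: faas-letsencrypt-production            # was faas-letsencrypt-staging
</code></pre>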
<p>At this point you have a locked-down OpenFaaS installation, but you might not want to use basic authentication to restrict access to your OpenFaaS functions. If that's the case, you can create another ingress resource that exposes them outside the &quot;/faas-admin&quot; path. Here's the YAML for that resource:</p>
<pre><code>apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: faas-function-ingress
  annotations:
    kubernetes.io/tls-acme: &quot;true&quot;
    certmanager.k8s.io/issuer: letsencrypt-production
    nginx.ingress.kubernetes.io/rewrite-target: /functions
spec:
  tls:
  - hosts:
    - HOSTNAME
    secretName: faas-letsencrypt-production
  rules:
  - host: HOSTNAME
    http:
      paths:
      - path: /functions
        backend:
          serviceName: gateway
          servicePort: 8080
</code></pre>
<p>Copy it, replace both instances of HOSTNAME with your DNS hostname from earlier and save it as faas-function-ingress.yml. Deploy it to your cluster with this command: <code>kubectl apply -f faas-function-ingress.yml -n openfaas</code>.</p>
<p>Once the ingress starts up and applies the TLS certificate, you will be able to access your functions at <a href="https://HOSTNAME/functions/FUNCTIONNAME">https://HOSTNAME/functions/FUNCTIONNAME</a> without authenticating.</p>
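<p>As a quick sanity check, something like the following should work once the ingress is ready. The function name, payload, and credentials are placeholders, and <code>/system/functions</code> is the OpenFaaS gateway's function-listing endpoint:</p>
<pre><code># anonymous function invocation through the new ingress
curl -X POST -d &quot;test input&quot; https://HOSTNAME/functions/FUNCTIONNAME

# the admin path still requires the basic authentication credentials
curl -u USERNAME:PASSWORD https://HOSTNAME/faas-admin/system/functions
</code></pre>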
<h4 id="wrappingup">Wrapping up</h4>
<p>A few closing thoughts:</p>
<ol>
<li>I am still extremely new to AKS and Kubernetes, and I've tried to simplify this guide as much as possible for other newbies. In figuring out how to set this up, I relied heavily on the official <a href="https://docs.microsoft.com/en-us/azure/aks/">AKS docs</a>, and I encourage you to take a look at them if you want to dig in deeper.</li>
<li>This configuration does not expose the OpenFaaS Prometheus monitoring service. If you want to set that up, you will need to create a different DNS entry (either an A record or CNAME record) and create another ingress resource in the openfaas namespace for that host name that points to the &quot;prometheus&quot; service on service port 9090. There's a sketch of what that ingress might look like after this list.</li>
<li>The Nginx ingress controller configuration I showed here is extremely simple. If you want to use a more sophisticated configuration to enable advanced features like rate limiting, for example, take a look at <a href="https://github.com/kubernetes/charts/tree/master/stable/nginx-ingress#configuration">https://github.com/kubernetes/charts/tree/master/stable/nginx-ingress#configuration</a> and <a href="https://github.com/kubernetes/ingress-nginx/blob/master/docs/user-guide/nginx-configuration/configmap.md">https://github.com/kubernetes/ingress-nginx/blob/master/docs/user-guide/nginx-configuration/configmap.md</a>.</li>
</ol>
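<p>Here's a rough, untested sketch of that Prometheus ingress, assuming a PROMETHEUSHOSTNAME DNS record that points at the same public IP. It reuses the basic-auth secret from earlier so the metrics aren't exposed anonymously, and the resource and secret names are hypothetical:</p>
<pre><code>apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: faas-prometheus-ingress
  annotations:
    nginx.ingress.kubernetes.io/auth-realm: &quot;Authentication Required&quot;
    nginx.ingress.kubernetes.io/auth-secret: basic-auth
    nginx.ingress.kubernetes.io/auth-type: basic
    kubernetes.io/tls-acme: &quot;true&quot;
    certmanager.k8s.io/issuer: letsencrypt-production
spec:
  tls:
  - hosts:
    - PROMETHEUSHOSTNAME
    secretName: prometheus-letsencrypt-production
  rules:
  - host: PROMETHEUSHOSTNAME
    http:
      paths:
      - path: /
        backend:
          serviceName: prometheus
          servicePort: 9090
</code></pre>
<p>As with the other ingress resources, you would deploy it with <code>kubectl apply -f FILENAME -n openfaas</code>.</p>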
</div>]]></content:encoded></item><item><title><![CDATA[Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4]]></title><description><![CDATA[<div class="kg-card-markdown"><p>This is the final post in my series about building a service relay for Dynamics 365 CE with RabbitMQ and Python. In my previous <a href="https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/">post</a> in this series, I showed the Python code to make the service relay work. In today's post, I will show how you can use <a href="https://azure.microsoft.com/en-us/services/functions/">Azure</a></p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/02/07/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-4/</link><guid isPermaLink="false">5a788a53c86c8900016cf367</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[Python]]></category><category><![CDATA[RabbitMQ]]></category><category><![CDATA[Azure]]></category><category><![CDATA[C#]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Thu, 08 Feb 2018 04:00:42 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-2.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-2.png" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4"><p>This is the final post in my series about building a service relay for Dynamics 365 CE with RabbitMQ and Python. In my previous <a href="https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/">post</a> in this series, I showed the Python code to make the service relay work. In today's post, I will show how you can use <a href="https://azure.microsoft.com/en-us/services/functions/">Azure Functions</a> to make a consumer service proxy using C# so client applications don't have to access your RabbitMQ broker directly, and I will also share some general thoughts on security and scalability for this service relay architecture.</p>
<p>Although this simple service relay allows external consumers to get data from Dynamics 365 CE without needing to connect directly, the examples I've shown so far require that they can connect to a RabbitMQ broker. This may be problematic for a variety of reasons, so you would probably want external consumers to connect to a web service proxy that would write requests to and read responses from the RabbitMQ broker.</p>
<h4 id="buildingaserviceproxyfunction">Building a service proxy function</h4>
<p>You can build an Azure Functions service proxy with Python, but I don't recommend it for three reasons:</p>
<ol>
<li>Azure Functions Python support is still considered experimental.</li>
<li>Python scripts that use external libraries can run <a href="https://github.com/Azure/azure-functions-host/issues/1626">exceedingly slowly</a>.</li>
<li>Getting the environment set up is a bit of a hassle.</li>
</ol>
<p>On the other hand, building a service proxy function with C# was much easier, and it performed much better than a comparable Python function (~0.5 seconds for C# compared to 5+ seconds for Python).</p>
<p>Here are the steps I took to build my C# service proxy function:</p>
<ol>
<li>Create a C# HTTP trigger function.</li>
<li>Create and upload a project.json file with a dependency on the RabbitMQ client (see below).</li>
<li>Take the &quot;RpcClient&quot; class from the <a href="https://www.rabbitmq.com/tutorials/tutorial-six-dotnet.html">RabbitMQ .Net RPC tutorial</a> and call it from within my function.</li>
</ol>
<p>Here's my project.json file:</p>
<pre><code>{
  &quot;frameworks&quot;: {
    &quot;net46&quot;:{
      &quot;dependencies&quot;: {
        &quot;RabbitMQ.Client&quot;: &quot;5.0.1&quot;
      }
    }
   }
}
</code></pre>
<p>And here's my run.csx file:</p>
<pre><code>using System.Net;
using System;
using System.Collections.Concurrent;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

public static async Task&lt;HttpResponseMessage&gt; Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info(&quot;Processing request&quot;);

    // parse query parameter
    string query = req.GetQueryNameValuePairs()
        .FirstOrDefault(q =&gt; string.Compare(q.Key, &quot;query&quot;, true) == 0)
        .Value;

    // Get request body
    dynamic data = await req.Content.ReadAsAsync&lt;object&gt;();

    // Set name to query string or body data
    query = query ?? data?.query;

    var rpcClient = new RpcClient();
    
    log.Info(string.Format(&quot; [.] query start time {0}&quot;, DateTime.Now.ToString(&quot;MM/dd/yyyy hh:mm:ss.fff tt&quot;)));
    var response = rpcClient.Call(query);

    log.Info(string.Format(&quot; [.] query end time {0}&quot;, DateTime.Now.ToString(&quot;MM/dd/yyyy hh:mm:ss.fff tt&quot;)));
    rpcClient.Close();

    return req.CreateResponse(HttpStatusCode.OK, response);
}

public class RpcClient
{
    private readonly IConnection connection;
    private readonly IModel channel;
    private readonly string replyQueueName;
    private readonly EventingBasicConsumer consumer;
    private readonly BlockingCollection&lt;string&gt; respQueue = new BlockingCollection&lt;string&gt;();
    private readonly IBasicProperties props;

    public RpcClient()
    {
        var factory = new ConnectionFactory() { HostName = &quot;RABBITHOST&quot;, UserName=&quot;RABBITUSER&quot;, Password=&quot;RABBITUSERPASS&quot;  };

        connection = factory.CreateConnection();
        channel = connection.CreateModel();
        replyQueueName = channel.QueueDeclare().QueueName;
        consumer = new EventingBasicConsumer(channel);

        props = channel.CreateBasicProperties();
        var correlationId = Guid.NewGuid().ToString();
        props.CorrelationId = correlationId;
        props.ReplyTo = replyQueueName;

        consumer.Received += (model, ea) =&gt;
        {
            var body = ea.Body;
            var response = Encoding.UTF8.GetString(body);
            if (ea.BasicProperties.CorrelationId == correlationId)
            {
                respQueue.Add(response);
            }
        };
    }

    public string Call(string message)
    {
        var messageBytes = Encoding.UTF8.GetBytes(message);
        channel.BasicPublish(
            exchange: &quot;&quot;,
            routingKey: &quot;rpc_queue&quot;,
            basicProperties: props,
            body: messageBytes);

        channel.BasicConsume(
            consumer: consumer,
            queue: replyQueueName,
            autoAck: true);

        return respQueue.Take();
    }

    public void Close()
    {
        connection.Close();
    }
}
</code></pre>
<p>Here's a screenshot showing me calling the C# function with Postman.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/Postman_2018-02-05_22-02-52.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4"></p>
<p>Because I did actually build a Python function, I will go ahead and share how I did it if you're interested. Here are the steps I took:</p>
<ol>
<li>Create a Python HTTP trigger function.</li>
<li>Install Python 3.6 via site extensions (see steps 2.1-2.4 <a href="https://stackoverflow.com/a/47213859">here</a>).</li>
<li>Install the necessary libraries using pip via <a href="https://david-obrien.net/2016/07/azure-functions-kudu/">KUDU</a>.</li>
</ol>
<p>Here's the Python function code:</p>
<pre><code>import os
import sys
import json
import pika
import uuid
import datetime

class CrmRpcClient(object):
    def __init__(self):
        #RabbitMQ connection details
        self.rabbituser = 'RABBITUSERNAME'
        self.rabbitpass = 'RABBITUSERPASS'
        self.rabbithost = 'RABBITHOST' 
        self.rabbitport = 5672
        self.rabbitqueue = 'rpc_queue'
        rabbitcredentials = pika.PlainCredentials(self.rabbituser, self.rabbitpass)
        rabbitparameters = pika.ConnectionParameters(host=self.rabbithost,
                                    port=self.rabbitport,
                                    virtual_host='/',
                                    credentials=rabbitcredentials)

        self.rabbitconn = pika.BlockingConnection(rabbitparameters)

        self.channel = self.rabbitconn.channel()

        #create an anonymous exclusive callback queue
        result = self.channel.queue_declare(exclusive=True)
        self.callback_queue = result.method.queue

        self.channel.basic_consume(self.on_response, no_ack=True,
                                   queue=self.callback_queue)

    #callback method for when a response is received - note the check for correlation id
    def on_response(self, ch, method, props, body):
        if self.corr_id == props.correlation_id:
            self.response = body

    #method to make the initial request
    def call(self, n):
        self.response = None
        #generate a new correlation id
        self.corr_id = str(uuid.uuid4())

        #publish the message to the rpc_queue - note the reply_to property is set to the callback queue from above
        self.channel.basic_publish(exchange='',
                                   routing_key=self.rabbitqueue,
                                   properties=pika.BasicProperties(
                                         reply_to = self.callback_queue,
                                         correlation_id = self.corr_id,
                                         ),
                                   body=n)
        while self.response is None:
            self.rabbitconn.process_data_events()
        return self.response

#read the query from the request body
postreqdata = json.loads(open(os.environ['req']).read())
query = postreqdata['query']

#instantiate an rpc client
crm_rpc = CrmRpcClient()

print(&quot; [.] query start time %r&quot; % str(datetime.datetime.now()))
queryresponse = crm_rpc.call(query)
print(&quot; [.] query end time %r&quot; % str(datetime.datetime.now()))
response = open(os.environ['res'], 'w')
response.write(queryresponse.decode())
response.close()
</code></pre>
<p>Here's a screenshot showing me calling the Python function with Postman.<img src="https://alexanderdevelopment.net/content/images/2018/02/Postman_2018-02-05_22-10-20.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4"></p>
<p>Note the difference in time between the two functions - 5.62 seconds for Python and 0.46 seconds for C#!</p>
<h4 id="securityandscalability">Security and scalability</h4>
<p>If you decide to use this approach in production, I'd suggest you carefully consider both security and scalability. Obviously the overall solution will only be as secure as your RabbitMQ broker and communications between the broker and its clients, so you'll want to look at best practices for access control and securing the communications with TLS. Here are some links for further reading on those subjects:</p>
<ul>
<li>TLS - <a href="https://www.rabbitmq.com/ssl.html">https://www.rabbitmq.com/ssl.html</a></li>
<li>Access control - <a href="https://www.rabbitmq.com/access-control.html">https://www.rabbitmq.com/access-control.html</a></li>
</ul>
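<p>On the client side, enabling TLS with the RabbitMQ .NET client is mostly a matter of configuring the connection factory. Here's a minimal, untested sketch for the C# proxy above, assuming the broker presents a certificate for RABBITHOST and listens on the default TLS port 5671:</p>
<pre><code>var factory = new ConnectionFactory()
{
    HostName = &quot;RABBITHOST&quot;,
    UserName = &quot;RABBITUSER&quot;,
    Password = &quot;RABBITUSERPASS&quot;,
    Port = 5671 // default AMQPS port
};

// enable TLS - ServerName must match the broker certificate's CN/SAN
factory.Ssl.Enabled = true;
factory.Ssl.ServerName = &quot;RABBITHOST&quot;;

connection = factory.CreateConnection();
</code></pre>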
<p>As for scalability, the approach I've shown creates a separate response queue for each consumer, but it can have problems scaling, especially if you are using a RabbitMQ cluster. You may want to look at the <a href="https://www.rabbitmq.com/direct-reply-to.html">&quot;direct reply-to&quot;</a> approach instead. For an interesting real-world overview of using direct reply-to, take a look at this <a href="https://facundoolano.wordpress.com/2016/06/26/real-world-rpc-with-rabbitmq-and-node-js/">blog post</a>.</p>
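<p>To give a flavor of what direct reply-to would change in the RpcClient class above, here's a rough, untested sketch, where <code>message</code> is the serialized query just as in the Call method earlier. The key differences are that no server-named reply queue is declared, and the consumer must be registered on the <code>amq.rabbitmq.reply-to</code> pseudo-queue before publishing:</p>
<pre><code>var channel = connection.CreateModel();
var respQueue = new BlockingCollection&lt;string&gt;();

var consumer = new EventingBasicConsumer(channel);
consumer.Received += (model, ea) =&gt;
    respQueue.Add(Encoding.UTF8.GetString(ea.Body));

// consuming from the pseudo-queue must start before the publish
channel.BasicConsume(consumer: consumer, queue: &quot;amq.rabbitmq.reply-to&quot;, autoAck: true);

var props = channel.CreateBasicProperties();
props.ReplyTo = &quot;amq.rabbitmq.reply-to&quot;; // no reply queue is ever declared

channel.BasicPublish(
    exchange: &quot;&quot;,
    routingKey: &quot;rpc_queue&quot;,
    basicProperties: props,
    body: Encoding.UTF8.GetBytes(message));

var response = respQueue.Take();
</code></pre>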
<h4 id="wrappingup">Wrapping up</h4>
<p>I hope you've enjoyed this series and that it has given you some ideas about how to implement service relays in your Dynamics 365 CE projects. As I worked through the examples, I certainly learned a few new things, especially when I created my Python service proxy in Azure Functions.</p>
<p>Here are links to all the previous posts in this series.</p>
<ol>
<li><a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">Part 1</a> - Series introduction</li>
<li><a href="https://alexanderdevelopment.net/post/2018/02/01/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-2/">Part 2</a> - Solution prerequisites</li>
<li><a href="https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/">Part 3</a> - Python code for the consumer and listener processes</li>
</ol>
<p>What do you think about this approach? Is it something you think you'd use in production? Let us know in the comments!</p>
</div>]]></content:encoded></item><item><title><![CDATA[Running Dynamics 365 Configuration Data Mover jobs in Azure Functions]]></title><description><![CDATA[<div class="kg-card-markdown"><p>My <a href="https://alexanderdevelopment.net/tag/configuration-data-mover/">Dynamics 365 Configuration Data Mover</a> utility allows you to run synchronization jobs from an interactive GUI tool or the command line, but the actual data synchronization logic is contained in a separate AlexanderDevelopment.ConfigDataMover.Lib.dll file that can be included in other applications. In today's post I will</p></div>]]></description><link>https://alexanderdevelopment.net/post/2017/08/09/running-dynamics-365-configuration-data-mover-jobs-in-azure-functions/</link><guid isPermaLink="false">5a5837246636a30001b978d4</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[utilities]]></category><category><![CDATA[Configuration Data Mover]]></category><category><![CDATA[Azure]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Wed, 09 Aug 2017 19:13:58 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-44-48-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-44-48-1.png" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"><p>My <a href="https://alexanderdevelopment.net/tag/configuration-data-mover/">Dynamics 365 Configuration Data Mover</a> utility allows you to run synchronization jobs from an interactive GUI tool or the command line, but the actual data synchronization logic is contained in a separate AlexanderDevelopment.ConfigDataMover.Lib.dll file that can be included in other applications. In today's post I will show how you can set up an Azure Function to execute a Configuration Data Mover job file to sync data between two Dynamics 365 organizations.</p>
<h4 id="settingupyourazurefunction">Setting up your Azure Function</h4>
<p>First create a new Azure Function. I created an HTTP trigger function with the default &quot;function&quot; authorization level so that I could post parameters to it, but you could modify the code sample later in this post to use it with a different type of trigger. <img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-37-17.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>Next open your function's <a href="https://blogs.msdn.microsoft.com/benjaminperkins/2017/04/13/how-to-add-assembly-references-to-an-azure-function-app/">Kudu console</a> so you can create a &quot;bin&quot; directory to upload the AlexanderDevelopment.ConfigDataMover.Lib.dll file. <img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-38-26.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>Navigate to the correct directory and create a &quot;bin&quot; directory inside it. <img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-38-56.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>After that, go back to the main function editor interface to see the newly created &quot;bin&quot; directory in the &quot;view files&quot; area. <img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-39-17.png" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>Look in the local directory where you have installed the Configuration Data Mover and find the &quot;AlexanderDevelopment.ConfigDataMover.Lib.dll&quot; file. Upload it to the &quot;bin&quot; directory you just created. <img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-39-35.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>Navigate back to the main function directory and create a project.json file to pull in dependencies via NuGet. <img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-40-10.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>Here is the content of the file so you can copy and paste:</p>
<pre><code>{
  &quot;frameworks&quot;: {
    &quot;net46&quot;:{
      &quot;dependencies&quot;: {
        &quot;Microsoft.CrmSdk.CoreAssemblies&quot;: &quot;8.2.0&quot;,
        &quot;Microsoft.CrmSdk.XrmTooling.CoreAssembly&quot;: &quot;8.2.0&quot;,
        &quot;log4net&quot;: &quot;2.0.8&quot;
      }
    }
  }
}
</code></pre>
<p>Finally, you need to update the function code in the run.csx file. <img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_13-10-11.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>Here's the code you can use:</p>
<pre><code>#r &quot;Newtonsoft.Json&quot;
#r &quot;AlexanderDevelopment.ConfigDataMover.Lib.dll&quot;
using System.Net;
using System.Collections.Generic;
using AlexanderDevelopment.ConfigDataMover.Lib;
using System.Xml;

static string _sourceString = null;
static string _targetString = null;
static bool _mapBaseBu = false;
static bool _mapBaseCurrency = false;
static List&lt;GuidMapping&gt; _guidMappings = new List&lt;GuidMapping&gt;();
static List&lt;JobStep&gt; _jobSteps = new List&lt;JobStep&gt;();

public static async Task&lt;HttpResponseMessage&gt; Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info(&quot;C# HTTP trigger function processed a request.&quot;);

    // parse query parameters
    string jobdata = req.GetQueryNameValuePairs()
        .FirstOrDefault(q =&gt; string.Compare(q.Key, &quot;jobdata&quot;, true) == 0)
        .Value;
    string sourceparam = req.GetQueryNameValuePairs()
        .FirstOrDefault(q =&gt; string.Compare(q.Key, &quot;source&quot;, true) == 0)
        .Value;
    string targetparam = req.GetQueryNameValuePairs()
        .FirstOrDefault(q =&gt; string.Compare(q.Key, &quot;target&quot;, true) == 0)
        .Value;

    if(jobdata == null)
    {
        return req.CreateResponse(HttpStatusCode.BadRequest, &quot;Please post jobdata&quot;);
    }
    else
    {
        ParseConfig(jobdata);
		
		//use source and target values if provided in the POST
		if(!string.IsNullOrEmpty(targetparam))
		{
			_targetString = targetparam;
		}
		if(!string.IsNullOrEmpty(sourceparam))
		{
			_sourceString = sourceparam;
		}
		
		//do some basic validations
		if (string.IsNullOrEmpty(_sourceString))
		{
			return req.CreateResponse(HttpStatusCode.BadRequest,&quot;no source connection specified - exiting&quot;);
		}
		if (string.IsNullOrEmpty(_targetString))
		{
			return req.CreateResponse(HttpStatusCode.BadRequest,&quot;no target connection specified - exiting&quot;);
		}
		if (!(_jobSteps.Count &gt; 0))
		{
			return req.CreateResponse(HttpStatusCode.BadRequest,&quot;no steps in job - exiting&quot;);
		}

		Importer importer = new Importer();
		importer.GuidMappings = _guidMappings;
		importer.JobSteps = _jobSteps;
		importer.SourceString = _sourceString; 
		importer.TargetString = _targetString; 
		importer.MapBaseBu = _mapBaseBu;
		importer.MapBaseCurrency = _mapBaseCurrency;
		importer.Process();

		int errorCount = importer.ErrorCount;

		importer = null;
		
		//show a message to the user
		if (errorCount == 0)
		{
			return req.CreateResponse(HttpStatusCode.OK,&quot;Job finished with no errors.&quot;);
		}
		else
		{
			return req.CreateResponse(HttpStatusCode.BadRequest,&quot;Job finished with errors.&quot;);
		}
    }
}

public static void ParseConfig(string jobdata)
{
    XmlDocument xml = new XmlDocument();
	xml.LoadXml(jobdata);
	_jobSteps.Clear();
	_guidMappings.Clear();

	XmlNodeList stepList = xml.GetElementsByTagName(&quot;Step&quot;);
	foreach (XmlNode xn in stepList)
	{
		JobStep step = new JobStep();
		step.StepName = xn.SelectSingleNode(&quot;Name&quot;).InnerText;
		step.StepFetch = xn.SelectSingleNode(&quot;Fetch&quot;).InnerText;
		step.UpdateOnly = false;
		if(xn.Attributes[&quot;updateOnly&quot;]!=null)
			step.UpdateOnly = Convert.ToBoolean(xn.Attributes[&quot;updateOnly&quot;].Value);

		step.CreateOnly = false;
		if (xn.Attributes[&quot;createOnly&quot;] != null)
			step.CreateOnly = Convert.ToBoolean(xn.Attributes[&quot;createOnly&quot;].Value);

		_jobSteps.Add(step);
	}

	XmlNodeList configData = xml.GetElementsByTagName(&quot;JobConfig&quot;);
	_mapBaseBu = Convert.ToBoolean(configData[0].Attributes[&quot;mapBuGuid&quot;].Value);
	_mapBaseCurrency = Convert.ToBoolean(configData[0].Attributes[&quot;mapCurrencyGuid&quot;].Value);

	XmlNodeList mappingList = xml.GetElementsByTagName(&quot;GuidMapping&quot;);
	foreach (XmlNode xn in mappingList)
	{
		Guid sourceGuid = new Guid(xn.Attributes[&quot;source&quot;].Value);
		Guid targetGuid = new Guid(xn.Attributes[&quot;target&quot;].Value);
		_guidMappings.Add(new GuidMapping { sourceId = sourceGuid, targetId = targetGuid });
	}
	XmlNodeList connectionNodes = xml.GetElementsByTagName(&quot;ConnectionDetails&quot;);
	if (connectionNodes.Count &gt; 0)
	{
		_sourceString = connectionNodes[0].Attributes[&quot;source&quot;].Value;
		_targetString = connectionNodes[0].Attributes[&quot;target&quot;].Value;
	}
}
</code></pre>
<h4 id="executingyourazurefunction">Executing your Azure Function</h4>
<p>To execute the function, you'll need to copy the function key so that you can send it as a POST parameter.<br>
<img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-43-21.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>Once you have that, you can build a POST request. Here's what it looks like in Postman:<br>
<img src="https://alexanderdevelopment.net/content/images/2017/08/Postman_2017-08-09_12-44-19-redacted.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>The only required parameters are &quot;code&quot; and &quot;jobdata.&quot; &quot;Code&quot; is the function key you copied earlier. &quot;Jobdata&quot; is the content of a Configuration Data Mover job XML file. If you do not have the source/target connection details saved in your &quot;jobdata&quot; XML, you will need to also include &quot;source&quot; and &quot;target&quot; connection string parameters in your request. If you do have connection details saved in the &quot;jobdata&quot; XML and you also supply &quot;source&quot; and &quot;target&quot; connection string parameters, they will be used instead of what is in the &quot;jobdata&quot; XML.</p>
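<p>For reference, here's a minimal, hypothetical &quot;jobdata&quot; document shaped the way the ParseConfig method above expects. The root element name and placeholder values are mine, and FETCHXML stands in for an XML-escaped FetchXML query:</p>
<pre><code>&lt;ConfigDataJob&gt;
  &lt;JobConfig mapBuGuid=&quot;false&quot; mapCurrencyGuid=&quot;false&quot; /&gt;
  &lt;ConnectionDetails source=&quot;SOURCECONNECTIONSTRING&quot; target=&quot;TARGETCONNECTIONSTRING&quot; /&gt;
  &lt;GuidMapping source=&quot;00000000-0000-0000-0000-000000000001&quot; target=&quot;00000000-0000-0000-0000-000000000002&quot; /&gt;
  &lt;Step updateOnly=&quot;false&quot; createOnly=&quot;false&quot;&gt;
    &lt;Name&gt;STEPNAME&lt;/Name&gt;
    &lt;Fetch&gt;FETCHXML&lt;/Fetch&gt;
  &lt;/Step&gt;
&lt;/ConfigDataJob&gt;
</code></pre>
<p>Note that ParseConfig reads the JobConfig attributes unconditionally, so that element needs to be present even if both mappings are disabled.</p>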
<p>Assuming everything goes as expected, you should get a &quot;job finished with no errors&quot; response like this:<br>
<img src="https://alexanderdevelopment.net/content/images/2017/08/Postman_2017-08-09_12-45-11.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<p>If you still have the function editor interface open, you'll see new log entries like this:<br>
<img src="https://alexanderdevelopment.net/content/images/2017/08/chrome_2017-08-09_12-44-48.png#img-thumbnail" alt="Running Dynamics 365 Configuration Data Mover jobs in Azure Functions"></p>
<h4 id="caveats">Caveats</h4>
<ol>
<li>I originally built the Configuration Data Mover to use Apache log4net for detailed logging. The AlexanderDevelopment.ConfigDataMover.Lib library here is still trying to use log4net, but those log entries aren't going anywhere. I'll probably look into modifying the logging approach in the library so that it works better in situations where the library is used like this. The upshot is that what I've outlined here will give you only a basic success/failure message.</li>
<li>Currently the AlexanderDevelopment.ConfigDataMover.Lib library can either connect to a live CRM instance or read configuration data from a JSON file that it can open. I have a future enhancement in mind that would allow for configuration data to be passed to the library to make it more flexible.</li>
</ol>
</div>]]></content:encoded></item><item><title><![CDATA[Executing Dynamics 365 workflows from Microsoft Flow]]></title><description><![CDATA[<div class="kg-card-markdown"><p>The only Dynamics 365 actions that Microsoft Flow offers right now are &quot;create a new record&quot; and &quot;list records,&quot; but with just a bit of additional effort it's possible to access all the capabilities of the Web API. Today I will show how to create a</p></div>]]></description><link>https://alexanderdevelopment.net/post/2016/12/10/executing-dynamics-365-workflows-from-microsoft-flow/</link><guid isPermaLink="false">5a5837246636a30001b9787e</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[Azure]]></category><category><![CDATA[integration]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Sat, 10 Dec 2016 19:34:51 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2016/12/01-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2016/12/01-1.png" alt="Executing Dynamics 365 workflows from Microsoft Flow"><p>The only Dynamics 365 actions that Microsoft Flow offers right now are &quot;create a new record&quot; and &quot;list records,&quot; but with just a bit of additional effort it's possible to access all the capabilities of the Web API. Today I will show how to create a Microsoft Flow that queries a set of accounts and executes a workflow for each one.</p>
<p>My Flow comprises five separate actions, which you can see in the screenshot below:<br>
<img src="https://alexanderdevelopment.net/content/images/2016/12/01.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<ol>
<li>The request trigger - In this case I am triggering my Flow by making an empty HTTP POST request, but you could use any supported trigger.</li>
<li>A &quot;getauthtoken&quot; HTTP request - Flow knows how to query a list of CRM records without requiring any additional authentication, but in order to work with the Web API directly, it's necessary to get an OAuth2 token from Active Directory like I described in my recent <a href="https://alexanderdevelopment.net/post/2016/11/23/dynamics-365-and-node-js-integration-using-the-web-api/">&quot;Dynamics 365 and Node.js integration using the Web API&quot;</a> post.</li>
<li>A Dynamics 365 list records action - This action queries Dynamics 365 for the accounts on which the workflow will be executed.</li>
<li>An &quot;apply to each&quot; loop - This loops through the returned accounts from the previous step and executes the workflow for each one.</li>
<li>A response step - This action sends back a success message regardless of what happens in the previous steps. Ideally I would handle errors appropriately here.</li>
</ol>
<p>Let's take a closer look at each action.</p>
<h4 id="triggeringtheflow">Triggering the Flow</h4>
<p>As I mentioned above, my Flow is triggered from an empty POST request, but you could use a different trigger.<br>
<img src="https://alexanderdevelopment.net/content/images/2016/12/02.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<h4 id="gettingtheoauthtoken">Getting the OAuth token</h4>
<p>In order to get an OAuth token, my Flow uses an HTTP request action to post the following parameters to the AD token endpoint:</p>
<ol>
<li>client_id - registered client id from AD</li>
<li>resource - root path to your Dynamics 365 org (<a href="https://XXXXX.crm.dynamics.com">https://XXXXX.crm.dynamics.com</a>, for example)</li>
<li>username - Dynamics 365 username</li>
<li>password - Dynamics 365 user password</li>
<li>grant_type = &quot;password&quot;</li>
</ol>
<p>The AD token endpoint expects data to be sent as <code>application/x-www-form-urlencoded</code> content, so you have to set the request header and manually build the POST string. The POST string should look like this:</p>
<pre><code>client_id=REGISTERED_CLIENT_ID&amp;resource=https://CRMORG.crm.dynamics.com&amp;username=USERNAME&amp;password=PASSWORD&amp;grant_type=password
</code></pre>
<p>Interestingly enough, you generally do not have to URI-encode your parameters to escape special characters, though I suspect it would be necessary if your password contains &quot;&amp;&quot; or &quot;=&quot; characters.</p>
<p>Here's what my configured action looks like:</p>
<p><img src="https://alexanderdevelopment.net/content/images/2016/12/03.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<p><em>(It would make more sense store all these string configuration settings as variables using Flow &quot;compose&quot; actions, but I chose not to do that for this post to make the overall sequence easier to follow.)</em></p>
<h4 id="retrievingrecords">Retrieving records</h4>
<p>The next step in my Flow queries Dynamics 365 for accounts using the native &quot;list records&quot; functionality. This action does not require the OAuth token retrieved in the previous step.</p>
<p><img src="https://alexanderdevelopment.net/content/images/2016/12/04.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<h4 id="loopingthroughretrievedrecords">Looping through retrieved records</h4>
<p>Next my Flow uses an &quot;apply to each&quot; step to loop through the retrieved accounts and then execute the workflow.<br>
<img src="https://alexanderdevelopment.net/content/images/2016/12/05.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<p>Calling the workflow requires POSTing a JSON object containing the accountid value to the Dynamics 365 Web API endpoint (note the workflow id is included in the endpoint here), and the Web API expects the OAuth token from earlier to be included in the request headers. Flow does not automatically know how to parse the JSON response from the OAuth token retrieval request, so it won't show up in the dynamic content menu, but you can access it using this format <code>@{body('HTTP REQUEST ACTION NAME').access_token}</code>.</p>
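<p>Concretely, each loop iteration sends a request shaped like this, where WORKFLOWID and ACCOUNTID are placeholders for the workflow's id and the current account's accountid value:</p>
<pre><code>POST https://CRMORG.api.crm.dynamics.com/api/data/v8.2/workflows(WORKFLOWID)/Microsoft.Dynamics.CRM.ExecuteWorkflow

{
  &quot;EntityId&quot;: &quot;ACCOUNTID&quot;
}
</code></pre>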
<p>Here's my request headers object:</p>
<pre><code>{
&quot;Authorization&quot;: &quot;Bearer @{body('getauthtoken').access_token}&quot;,
&quot;OData-MaxVersion&quot;: &quot;4.0&quot;,
&quot;OData-Version&quot;: &quot;4.0&quot;,
&quot;Accept&quot;: &quot;application/json&quot;,
&quot;Content-Type&quot;: &quot;application/json; charset=utf-8&quot;
}
</code></pre>
<p>Once you save the Flow, the UI will replace the <code>@{body('HTTP REQUEST ACTION NAME').access_token}</code> with something that looks like a dynamic content placeholder as you can see in the screenshot below.<br>
<img src="https://alexanderdevelopment.net/content/images/2016/12/06.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<h4 id="sendingaresponse">Sending a response</h4>
<p>Finally my Flow uses a response action to send a message back to the client. As I mentioned above, I should do some error handling here, but for now I am just sending back &quot;success&quot; no matter what happens earlier in the Flow.<br>
<img src="https://alexanderdevelopment.net/content/images/2016/12/07.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<p>When I call this Flow via Postman, I get this output:<br>
<img src="https://alexanderdevelopment.net/content/images/2016/12/08.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<p>Checking the Flow run log shows me this visual with all green checks.<br>
<img src="https://alexanderdevelopment.net/content/images/2016/12/09.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<p>And I can expand the individual actions to see what happened in each step. Here you can see the loop of execute workflow calls.<br>
<img src="https://alexanderdevelopment.net/content/images/2016/12/10.png#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<h4 id="wrappingup">Wrapping up</h4>
<p>Now that I've shown how you <em>could</em> call a Dynamics 365 workflow from Microsoft Flow, it's worth asking whether you actually <em>should</em> call a Dynamics 365 workflow from Microsoft Flow.<br>
<img src="https://alexanderdevelopment.net/content/images/2016/12/malcom-could-should.jpg#img-thumbnail" alt="Executing Dynamics 365 workflows from Microsoft Flow"></p>
<p>Given the state of Microsoft Flow's native Dynamics 365 support today, I would say no. At this point Flow's visual designer makes it a lot harder to build this functionality than it is if you use Azure Functions as I discussed in these earlier posts:</p>
<ol>
<li><a href="https://alexanderdevelopment.net/post/2016/11/25/scheduling-dynamics-365-workflows-with-azure-functions/">Scheduling Dynamics 365 workflows with Azure Functions and Node.js</a></li>
<li><a href="https://alexanderdevelopment.net/post/2016/11/29/scheduling-dynamics-365-workflows-with-azure-functions-and-python/">Scheduling Dynamics 365 workflows with Azure Functions and Python</a></li>
<li><a href="https://alexanderdevelopment.net/post/2016/11/30/scheduling-dynamics-365-workflows-with-azure-functions-and-csharp/">Scheduling Dynamics 365 workflows with Azure Functions and C#</a></li>
</ol>
<p>For more considerations on when to use Flow or something else, here is an interesting post from Microsoft that compares and contrasts its various &quot;serverless&quot; computing offerings: <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-compare-logic-apps-ms-flow-webjobs">https://docs.microsoft.com/en-us/azure/azure-functions/functions-compare-logic-apps-ms-flow-webjobs</a>.</p>
<p>Although I think Flow isn't a great tool for this particular use case, I can envision some good uses in the Dynamics 365 space, and I will certainly be paying attention to see how it improves in the future. Thanks for reading!</p>
</div>]]></content:encoded></item><item><title><![CDATA[Scheduling Dynamics 365 workflows with Azure Functions and C#]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Over the past few days, I've shared two approaches for scheduling Dynamics 365 workflows using Azure Functions and the Dynamics 365 Web API. One uses <a href="https://alexanderdevelopment.net/post/2016/11/25/scheduling-dynamics-365-workflows-with-azure-functions/">Node.js</a>, and the other uses <a href="https://alexanderdevelopment.net/post/2016/11/29/scheduling-dynamics-365-workflows-with-azure-functions-and-python/">Python</a>. Because most Dynamics CRM developers are probably more familiar with C# than Node.js or Python, I also</p></div>]]></description><link>https://alexanderdevelopment.net/post/2016/11/29/scheduling-dynamics-365-workflows-with-azure-functions-and-csharp/</link><guid isPermaLink="false">5a5837246636a30001b97878</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[Azure]]></category><category><![CDATA[demonstrations]]></category><category><![CDATA[C#]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Wed, 30 Nov 2016 02:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2016/11/chrome_2016-11-29_13-07-59.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2016/11/chrome_2016-11-29_13-07-59.png" alt="Scheduling Dynamics 365 workflows with Azure Functions and C#"><p>Over the past few days, I've shared two approaches for scheduling Dynamics 365 workflows using Azure Functions and the Dynamics 365 Web API. One uses <a href="https://alexanderdevelopment.net/post/2016/11/25/scheduling-dynamics-365-workflows-with-azure-functions/">Node.js</a>, and the other uses <a href="https://alexanderdevelopment.net/post/2016/11/29/scheduling-dynamics-365-workflows-with-azure-functions-and-python/">Python</a>. Because most Dynamics CRM developers are probably more familiar with C# than Node.js or Python, I also created an equivalent C# version. Just like with my previous examples, this version calls the Web API directly instead of using any SDK assemblies.</p>
<p>Here's my code. It does the following:</p>
<ol>
<li>Request an OAuth token using a username and password.</li>
<li>Query the Dynamics 365 Web API for accounts with names that start with the letter &quot;F.&quot;</li>
<li>Execute a workflow for each record that was retrieved in the previous step. The workflow that I am executing is the same workflow I used in my <a href="https://alexanderdevelopment.net/post/2016/11/25/scheduling-dynamics-365-workflows-with-azure-functions/">Node.js example</a> to create a note on an account.</li>
</ol>
<pre><code>#r &quot;Newtonsoft.Json&quot;

using System;
using System.Net;
using System.IO;
using Newtonsoft.Json;

//set these values to retrieve the oauth token
static string crmorg = &quot;https://CRMORG.crm.dynamics.com&quot;;
static string clientid = &quot;00000000-0000-0000-0000-000000000000&quot;;
static string username = &quot;xxxxxx@xxxxxxxx&quot;;
static string userpassword = &quot;xxxxxxxx&quot;;
static string tokenendpoint = &quot;https://login.microsoftonline.com/00000000-0000-0000-0000-000000000000/oauth2/token&quot;;

//set these values to query your crm data
static string crmwebapihost = &quot;https://CRMORG.api.crm.dynamics.com/api/data/v8.2&quot;;
static string crmwebapipath = &quot;/accounts?$select=name,accountid&amp;$filter=startswith(name,'F')&quot;;

static string workflowid = &quot;DC8519EC-F3CE-4BC9-BB79-DF2AD70217A1&quot;;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
	//build the authorization request
	var reqstring = &quot;client_id=&quot; + clientid;
	reqstring += &quot;&amp;resource=&quot; + Uri.EscapeUriString(crmorg);
	reqstring += &quot;&amp;username=&quot; + Uri.EscapeUriString(username);
	reqstring += &quot;&amp;password=&quot; + Uri.EscapeUriString(userpassword);
	reqstring += &quot;&amp;grant_type=password&quot;;

	WebRequest req = WebRequest.Create(tokenendpoint);
	req.ContentType = &quot;application/x-www-form-urlencoded&quot;;
	req.Method = &quot;POST&quot;;
	byte[] bytes = System.Text.Encoding.ASCII.GetBytes(reqstring);
	req.ContentLength = bytes.Length;
	System.IO.Stream os = req.GetRequestStream();
	os.Write(bytes, 0, bytes.Length);
	os.Close();

	HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
	StreamReader tokenreader = new StreamReader(resp.GetResponseStream());
	string responseBody = tokenreader.ReadToEnd();
	tokenreader.Close();
	var tokenresponse = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(responseBody);
	var token = tokenresponse[&quot;access_token&quot;];
	log.Info(&quot;got token&quot;);

	WebRequest crmreq = WebRequest.Create(crmwebapihost + crmwebapipath);
	crmreq.Headers = new WebHeaderCollection();
	crmreq.Headers.Add(&quot;Authorization&quot;, &quot;Bearer &quot; + token);
	crmreq.Headers.Add(&quot;OData-MaxVersion&quot;, &quot;4.0&quot;);
	crmreq.Headers.Add(&quot;OData-Version&quot;, &quot;4.0&quot;);
	crmreq.Headers.Add(&quot;Prefer&quot;, &quot;odata.maxpagesize=500&quot;);
	crmreq.Headers.Add(&quot;Prefer&quot;, &quot;odata.include-annotations=OData.Community.Display.V1.FormattedValue&quot;);
	crmreq.ContentType = &quot;application/json; charset=utf-8&quot;;
	crmreq.Method = &quot;GET&quot;;

	HttpWebResponse crmresp = (HttpWebResponse)crmreq.GetResponse();
	StreamReader crmreader = new StreamReader(crmresp.GetResponseStream());
	string crmresponseBody = crmreader.ReadToEnd();
	crmreader.Close();
	var crmresponseobj = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(crmresponseBody);
	foreach(var row in crmresponseobj[&quot;value&quot;].Children())
	{
		log.Info(row[&quot;name&quot;].ToString());
		runWorkflow(token.ToString(), new Guid(row[&quot;accountid&quot;].ToString()), log);

	}

	log.Info(&quot;all workflows started&quot;);
}

static void runWorkflow(string token, Guid entityid, TraceWriter log)
{
	var crmwebapiworkflowpath = &quot;/workflows(&quot; + workflowid + &quot;)/Microsoft.Dynamics.CRM.ExecuteWorkflow&quot;;
	WebRequest req = WebRequest.Create(crmwebapihost + crmwebapiworkflowpath);

	log.Info(&quot;  calling workflow for &quot; + entityid);

	string reqobject = &quot;{ \&quot;EntityId\&quot;: \&quot;&quot; + entityid + &quot;\&quot;}&quot;;
    
	req.Headers.Add(&quot;Authorization&quot;, &quot;Bearer &quot; + token);
	req.Headers.Add(&quot;OData-MaxVersion&quot;, &quot;4.0&quot;);
	req.Headers.Add(&quot;OData-Version&quot;, &quot;4.0&quot;);
	req.Headers.Add(&quot;Prefer&quot;, &quot;odata.maxpagesize=500&quot;);
	req.Headers.Add(&quot;Prefer&quot;, &quot;odata.include-annotations=OData.Community.Display.V1.FormattedValue&quot;);
	req.ContentType = &quot;application/json; charset=utf-8&quot;;
	req.Method = &quot;POST&quot;;
	
	byte[] bytes = System.Text.Encoding.ASCII.GetBytes(reqobject);
	req.ContentLength = bytes.Length;
	System.IO.Stream os = req.GetRequestStream();
	os.Write(bytes, 0, bytes.Length);
	os.Close();

	HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
	StreamReader reader = new StreamReader(resp.GetResponseStream());
	string responseBody = reader.ReadToEnd();
	reader.Close();
	
	var responseobj = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(responseBody);
	if(resp.StatusCode == HttpStatusCode.OK)
	{
		log.Info(&quot;    success &quot; + entityid.ToString());
	}
	else
	{
		log.Info(&quot;    error &quot; + entityid.ToString());
	}
}
</code></pre>
<p>To set this up in your Azure tenant, set up a Functions App and a new C# timer trigger function like I described in the <a href="https://alexanderdevelopment.net/post/2016/11/25/scheduling-dynamics-365-workflows-with-azure-functions/">Node.js example</a>. Copy the C# code from above and paste it into the function editor window. Set any specifics relative to your Dynamics 365 organization and click save. That's all there is to it.</p>
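<p>One caveat: the <code>odata.maxpagesize=500</code> preference means the query above only retrieves the first 500 matching accounts. When more records match, the Web API includes an <code>@odata.nextLink</code> value in the response that points to the next page of results. Here's a rough sketch of a helper that follows those links. It mirrors the style of the code above, but I haven't wired it into the function, so treat it as a starting point (the <code>getAllPages</code> name is just for illustration):</p>
<pre><code>//sketch: page through results by following @odata.nextLink
static Newtonsoft.Json.Linq.JArray getAllPages(string token, string firstrequesturl, TraceWriter log)
{
	var results = new Newtonsoft.Json.Linq.JArray();
	string url = firstrequesturl;
	while (!string.IsNullOrEmpty(url))
	{
		WebRequest req = WebRequest.Create(url);
		req.Headers = new WebHeaderCollection();
		req.Headers.Add(&quot;Authorization&quot;, &quot;Bearer &quot; + token);
		req.Headers.Add(&quot;OData-MaxVersion&quot;, &quot;4.0&quot;);
		req.Headers.Add(&quot;OData-Version&quot;, &quot;4.0&quot;);
		req.Headers.Add(&quot;Prefer&quot;, &quot;odata.maxpagesize=500&quot;);
		req.ContentType = &quot;application/json; charset=utf-8&quot;;
		req.Method = &quot;GET&quot;;

		HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
		StreamReader reader = new StreamReader(resp.GetResponseStream());
		string responseBody = reader.ReadToEnd();
		reader.Close();

		var page = JsonConvert.DeserializeObject&lt;Newtonsoft.Json.Linq.JObject&gt;(responseBody);
		foreach (var row in page[&quot;value&quot;].Children())
		{
			results.Add(row);
		}

		//@odata.nextLink only appears in the response when more pages remain
		url = (string)page[&quot;@odata.nextLink&quot;];
	}
	log.Info(&quot;retrieved &quot; + results.Count + &quot; records&quot;);
	return results;
}
</code></pre>
<p>You could then loop over the returned array and call <code>runWorkflow</code> for each record, just like the <code>Run</code> method does above.</p>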
<p><em>If you're wondering about that <code>#r &quot;Newtonsoft.Json&quot;</code> at the top of the C# code, take a look <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-csharp#referencing-external-assemblies">here</a>.</em></p>
</div>]]></content:encoded></item><item><title><![CDATA[Scheduling Dynamics 365 workflows with Azure Functions and Python]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Last week I shared a <a href="https://alexanderdevelopment.net/post/2016/11/25/scheduling-dynamics-365-workflows-with-azure-functions/">solution</a> for Scheduling Dynamics 365 workflows with Azure Functions and Node.js. In this post, I will show how to achieve equivalent functionality using Python. The actual Python code is simpler than my Node.js example, but the Azure Functions configuration is much more complicated.</p></div>]]></description><link>https://alexanderdevelopment.net/post/2016/11/29/scheduling-dynamics-365-workflows-with-azure-functions-and-python/</link><guid isPermaLink="false">5a5837246636a30001b97872</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[Python]]></category><category><![CDATA[Azure]]></category><category><![CDATA[demonstrations]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Tue, 29 Nov 2016 12:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2016/11/06-2.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2016/11/06-2.png" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"><p>Last week I shared a <a href="https://alexanderdevelopment.net/post/2016/11/25/scheduling-dynamics-365-workflows-with-azure-functions/">solution</a> for Scheduling Dynamics 365 workflows with Azure Functions and Node.js. In this post, I will show how to achieve equivalent functionality using Python. The actual Python code is simpler than my Node.js example, but the Azure Functions configuration is much more complicated.</p>
<p>First, here's the Python script I am using. It does the following:</p>
<ol>
<li>Request an OAuth token using a username and password.</li>
<li>Query the Dynamics 365 Web API for accounts with names that start with the letter &quot;F.&quot;</li>
<li>Execute a workflow for each record that was retrieved in the previous step. The workflow that I am executing is the same workflow I used in my Node.js example to create a note on an account.</li>
</ol>
<pre><code>import os
import sys

#add the function's site-packages directory to the module search path
#so the requests library uploaded in the steps below can be imported
cwd = os.getcwd()
sitepackage = os.path.join(cwd, 'site-packages')
sys.path.append(sitepackage)

import requests
import json

#set these values to retrieve the oauth token
crmorg = 'https://CRMORG.crm.dynamics.com' #base url for crm org  
clientid = '00000000-0000-0000-0000-000000000000' #application client id  
username = 'xxxxxx@xxxxxxxx' #username  
userpassword = 'xxxxxxxx' #password  
tokenendpoint = 'https://login.microsoftonline.com/00000000-0000-0000-0000-000000000000/oauth2/token' #oauth token endpoint

#set these values to query your crm data
crmwebapi = 'https://CRMORG.api.crm.dynamics.com/api/data/v8.2' #crm web api base url
crmwebapiquery = &quot;/accounts?$select=name,accountid&amp;$filter=startswith(name,'f')&quot; #web api query

workflowid = 'DC8519EC-F3CE-4BC9-BB79-DF2AD70217A1' #guid for the workflow you want to execute

def start():
    #build the authorization request
    tokenpost = {
        'client_id':clientid,
        'resource':crmorg,
        'username':username,
        'password':userpassword,
        'grant_type':'password'
    }

    #make the token request
    print('requesting token . . .')
    tokenres = requests.post(tokenendpoint, data=tokenpost)
    print('token response received. . .')

    accesstoken = ''

    #extract the access token
    try:
        print('parsing token response . . .')
        accesstoken = tokenres.json()['access_token']
        #print('accesstoken is - ' + accesstoken)

    except(KeyError):
        print('Could not get access token')

    if(accesstoken!=''):
        crmrequestheaders = {
            'Authorization': 'Bearer ' + accesstoken,
            'OData-MaxVersion': '4.0',
            'OData-Version': '4.0',
            'Accept': 'application/json',
            'Content-Type': 'application/json; charset=utf-8',
            #a python dict cannot hold duplicate keys, so both preferences go in one header value
            'Prefer': 'odata.maxpagesize=500,odata.include-annotations=OData.Community.Display.V1.FormattedValue'
        }

        print('making crm account request . . .')
        crmres = requests.get(crmwebapi+crmwebapiquery, headers=crmrequestheaders)
        print('crm account response received . . .')
        try:
            print('parsing crm account response . . .')
            crmresults = crmres.json()
            for x in crmresults['value']:
                print (x['name'] + ' - ' + x['accountid'])
                runWorkflow(accesstoken, x['accountid'])
        except KeyError:
            print('Could not parse CRM account results')

        
def runWorkflow(token, entityid):
    crmwebapiworkflowpath = &quot;/workflows(&quot;+workflowid+&quot;)/Microsoft.Dynamics.CRM.ExecuteWorkflow&quot;

    #set the web api request headers
    requestheaders = { 
        'Authorization': 'Bearer ' + token,
        'OData-MaxVersion': '4.0',
        'OData-Version': '4.0',
        'Accept': 'application/json',
        'Content-Type': 'application/json; charset=utf-8'
    }
    
    reqobj = {'EntityId': entityid}
    
    print('  calling workflow for ' + entityid)
	
    crmres = requests.post(crmwebapi+crmwebapiworkflowpath, headers=requestheaders, data=json.dumps(reqobj))
    if(crmres.status_code == requests.codes.ok):
        print('    success ' + entityid)
    else:
        print('    error ' + entityid)

start()

</code></pre>
<p>To set it up as an Azure Function, after you have either created a new Function app or opened an existing Function app, do the following:</p>
<ol>
<li>
<p>Create a new function. Select &quot;create your own custom function&quot; at the bottom. <img src="https://alexanderdevelopment.net/content/images/2016/11/01-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Set language to &quot;Python&quot; and scenario to &quot;Experimental.&quot; Currently there is no timer trigger template for Python, but you can work around that by selecting the &quot;QueueTrigger-Python&quot; template in this step and manually changing the trigger later. <img src="https://alexanderdevelopment.net/content/images/2016/11/02-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Give your function a name and click create. Ignore the queue name and storage account connection values. <img src="https://alexanderdevelopment.net/content/images/2016/11/03-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Once the function is created, select the &quot;integrate&quot; tab from the menu on the left and delete the Azure Queue Storage trigger. <img src="https://alexanderdevelopment.net/content/images/2016/11/04-2.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Click &quot;new trigger&quot; at the top, and then select &quot;timer&quot; from the list that appears. <img src="https://alexanderdevelopment.net/content/images/2016/11/05-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Give it a name (or leave the default) and set the schedule. The <code>0 */5 * * * *</code> value here will execute every five minutes, just like in my previous Node.js example. <img src="https://alexanderdevelopment.net/content/images/2016/11/06-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>After the function is configured to run on a timer trigger, you have to upload the Python Requests library that is used to make the web service calls. Open a new browser window to download the Requests library code from GitHub at <a href="https://github.com/kennethreitz/requests/releases/">https://github.com/kennethreitz/requests/releases/</a>. Select the most recent .zip file.</p>
</li>
<li>
<p>After it downloads, open it and extract the entire &quot;requests&quot; directory. It should contain a directory called &quot;packages.&quot;</p>
</li>
<li>
<p>Zip up just the &quot;requests&quot; directory in a separate &quot;requests.zip&quot; file.</p>
</li>
<li>
<p>Back in the Azure portal Function App blade, select &quot;function app settings&quot; from the menu on the left and then click &quot;Go to Kudu.&quot; <img src="https://alexanderdevelopment.net/content/images/2016/11/07-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>In the window that opens, click the &quot;site&quot; link at the top. <img src="https://alexanderdevelopment.net/content/images/2016/11/08-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Then click the &quot;wwwroot&quot; link. <img src="https://alexanderdevelopment.net/content/images/2016/11/09-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Then click the name of your function. In this case it is &quot;CrmWorkflowTimerPython.&quot; <img src="https://alexanderdevelopment.net/content/images/2016/11/10-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Go to the cmd prompt at the bottom of the page and type <code>mkdir site-packages</code>. Press Enter. You should see a new &quot;site-packages&quot; directory get created. <img src="https://alexanderdevelopment.net/content/images/2016/11/11-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Click on the new &quot;site-packages&quot; link to enter that directory.</p>
</li>
<li>
<p>Drag the &quot;requests&quot; .zip file you created in step 9 above into the &quot;site-packages&quot; directory as shown. <img src="https://alexanderdevelopment.net/content/images/2016/11/12.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Once it is uploaded, extract it by typing <code>unzip requests.zip</code> at the cmd prompt and pressing Enter. You should see something like this when it's complete. <img src="https://alexanderdevelopment.net/content/images/2016/11/13.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>At this point, you can close the Kudu window.</p>
</li>
<li>
<p>From the Function App blade, select your Python function and click &quot;develop.&quot; Highlight everything and delete it. <img src="https://alexanderdevelopment.net/content/images/2016/11/14.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Paste the Python code from the beginning of this post. The highlighted lines in the image are how the script knows how to find the Requests library you uploaded earlier. Set any specifics relative to your Dynamics 365 organization, and click save. At this point it should just start running on the schedule you set earlier. <img src="https://alexanderdevelopment.net/content/images/2016/11/15-1.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>You can see all the function invocation logs on the monitor tab. <img src="https://alexanderdevelopment.net/content/images/2016/11/chrome_2016-11-28_13-49-37.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
<li>
<p>Here are the notes that the workflow created in my Dynamics 365 online org. <img src="https://alexanderdevelopment.net/content/images/2016/11/chrome_2016-11-28_13-52-30.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Python"></p>
</li>
</ol>
<p>A few notes/caveats:</p>
<ol>
<li>My Python code has hardly any error handling right now. If the workflow execution call fails, the script only prints an error message and keeps going (see the sketch below for one way to tighten this up).</li>
<li>My CRM record retrieval is set to retrieve a maximum of 500 records. You would need to modify the Web API request logic to handle more.</li>
<li>Per the <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices">&quot;Best Practices for Azure Functions&quot;</a> guide:</li>
</ol>
<blockquote>
<p>Assume your function could encounter an exception at any time. Design your functions with the ability to continue from a previous fail point during the next execution.</p>
</blockquote>
<p>This means you should put logic in your workflow to make sure that duplicate executions are avoided (unless that's what you intend to happen).</p>
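<p>Regarding the first note, one low-effort improvement is to let Requests raise on HTTP errors with <code>raise_for_status()</code>. Here's a rough sketch of a stricter version of the workflow call (the <code>run_workflow_strict</code> name is just for illustration):</p>
<pre><code>#sketch: stricter error handling for the workflow call
def run_workflow_strict(token, entityid):
    crmwebapiworkflowpath = '/workflows('+workflowid+')/Microsoft.Dynamics.CRM.ExecuteWorkflow'
    requestheaders = {
        'Authorization': 'Bearer ' + token,
        'OData-MaxVersion': '4.0',
        'OData-Version': '4.0',
        'Accept': 'application/json',
        'Content-Type': 'application/json; charset=utf-8'
    }
    try:
        crmres = requests.post(crmwebapi+crmwebapiworkflowpath, headers=requestheaders, data=json.dumps({'EntityId': entityid}))
        crmres.raise_for_status() #raises requests.exceptions.HTTPError for 4xx/5xx responses
        print('    success ' + entityid)
    except requests.exceptions.RequestException as e:
        #log and continue so one bad record does not stop the whole batch
        print('    error ' + entityid + ' - ' + str(e))
</code></pre>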
<p>That's it for today. Until next time, happy coding!</p>
</div>]]></content:encoded></item><item><title><![CDATA[Scheduling Dynamics 365 workflows with Azure Functions and Node.js]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Earlier this week I showed an easy way to <a href="https://alexanderdevelopment.net/post/2016/11/23/dynamics-365-and-node-js-integration-using-the-web-api/">integrate a Node.js application with Dynamics 365 using the Web API</a>. Building on that example, I have created a scheduled workflow runner using Node.js and <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview">Azure Functions</a>. Here's how I did it.</p>
<p>First, I created a workflow in Dynamics</p></div>]]></description><link>https://alexanderdevelopment.net/post/2016/11/25/scheduling-dynamics-365-workflows-with-azure-functions/</link><guid isPermaLink="false">5a5837246636a30001b97866</guid><category><![CDATA[Dynamics 365]]></category><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Node.js]]></category><category><![CDATA[Azure]]></category><category><![CDATA[demonstrations]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Fri, 25 Nov 2016 17:00:03 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2016/11/04-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2016/11/04-1.png" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"><p>Earlier this week I showed an easy way to <a href="https://alexanderdevelopment.net/post/2016/11/23/dynamics-365-and-node-js-integration-using-the-web-api/">integrate a Node.js application with Dynamics 365 using the Web API</a>. Building on that example, I have created a scheduled workflow runner using Node.js and <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview">Azure Functions</a>. Here's how I did it.</p>
<p>First, I created a workflow in Dynamics 365 that creates a note on an account record. The screenshots below show what it looks like:</p>
<p><img src="https://alexanderdevelopment.net/content/images/2016/11/workflow01.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></p>
<p><img src="https://alexanderdevelopment.net/content/images/2016/11/workflow02.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></p>
<p>Next, I wrote Node.js code to do the following in an Azure Function.</p>
<ol>
<li>Request an OAuth token using a username and password.</li>
<li>Query the Dynamics 365 Web API for accounts with names that start with the letter &quot;F.&quot;</li>
<li>Execute a workflow for each record that was retrieved in the previous step.</li>
</ol>
<p><em>Most of this is regular Node.js, but there are a couple of nuances specific to Azure Functions. See the<br>
<a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-node">&quot;Azure Functions NodeJS developer reference&quot;</a> for more information.</em></p>
<pre><code>var https = require('https');

//set these values to retrieve the oauth token
//see http://alexanderdevelopment.net/post/2016/11/23/dynamics-365-and-node-js-integration-using-the-web-api/ for more details
var _crmorg = 'https://CRMORG.crm.dynamics.com'; //base url for your crm org
var _clientid = 'OAUTH CLIENT ID';  
var _username = 'CRM USERNAME';  
var _userpassword = 'CRM PASSWORD';  
var _tokenendpoint = 'OAUTH TOKEN ENDPOINT FROM EARLIER';

//set these values to query your crm data
var _apipath = '/api/data/v8.2'; //web api version
var _workflowid = 'DC8519EC-F3CE-4BC9-BB79-DF2AD70217A1'; //guid for the workflow you want to execute
var _crmwebapihost = 'XXXX.api.crm.dynamics.com'; //crm api url (without https://)
var _crmwebapiquerypath = &quot;/accounts?$select=name,accountid&amp;$filter=startswith(name,'f')&quot;; //web api query

var _counter = 0; //variable to keep track of how many records retrieved and workflows started

module.exports = function (context, myTimer) {
	//remove https from _tokenendpoint url
	_tokenendpoint = _tokenendpoint.toLowerCase().replace('https://','');

	//get the authorization endpoint host name
	var authhost = _tokenendpoint.split('/')[0];

	//get the authorization endpoint path
	var authpath = '/' + _tokenendpoint.split('/').slice(1).join('/');

	//build the authorization request
	var reqstring = 'client_id='+_clientid;
	reqstring+='&amp;resource='+encodeURIComponent(_crmorg);
	reqstring+='&amp;username='+encodeURIComponent(_username);
	reqstring+='&amp;password='+encodeURIComponent(_userpassword);
	reqstring+='&amp;grant_type=password';

	//set the token request parameters
	var tokenrequestoptions = {
		host: authhost,
		path: authpath,
		method: 'POST',
		headers: {
			'Content-Type': 'application/x-www-form-urlencoded',
			'Content-Length': Buffer.byteLength(reqstring)
		}
	};

	//make the token request
	context.log('starting token request');
	var tokenrequest = https.request(tokenrequestoptions, function(response) {
		//make an array to hold the response parts if we get multiple parts
		var responseparts = [];
		response.setEncoding('utf8');
		response.on('data', function(chunk) {
			//add each response chunk to the responseparts array for later
			responseparts.push(chunk);		
		});
		response.on('end', function(){
			//once we have all the response parts, concatenate the parts into a single string
			var completeresponse = responseparts.join('');
			//context.log('Response: ' + completeresponse);
			context.log('token response retrieved');
			
			//parse the response JSON
			var tokenresponse = JSON.parse(completeresponse);
			
			//extract the token
			var token = tokenresponse.access_token;
			//context.log(token);
			
			//pass the token to our data retrieval function
			getData(context, token);
		});
	});
	tokenrequest.on('error', function(e) {
		context.log(e);
		context.done();
	});

	//post the token request data
	tokenrequest.write(reqstring);

	//close the token request
	tokenrequest.end();
}

function getData(context, token){
	//set the web api request headers
	var requestheaders = { 
		'Authorization': 'Bearer ' + token,
		'OData-MaxVersion': '4.0',
		'OData-Version': '4.0',
		'Accept': 'application/json',
		'Content-Type': 'application/json; charset=utf-8',
		//a javascript object cannot hold duplicate keys, so both preferences go in one header value
		'Prefer': 'odata.maxpagesize=500,odata.include-annotations=OData.Community.Display.V1.FormattedValue'
	};
	
	//set the crm request parameters
	var crmrequestoptions = {
		host: _crmwebapihost,
		path: _apipath+_crmwebapiquerypath,
		method: 'GET',
		headers: requestheaders
	};
	
	//make the web api request
	context.log('starting data request');
	var crmrequest = https.request(crmrequestoptions, function(response) {
		//make an array to hold the response parts if we get multiple parts
		var responseparts = [];
		response.setEncoding('utf8');
		response.on('data', function(chunk) {
			//add each response chunk to the responseparts array for later
			responseparts.push(chunk);		
		});
		response.on('end', function(){
			//once we have all the response parts, concatenate the parts into a single string
			var completeresponse = responseparts.join('');
			
			//parse the response JSON
			var collection = JSON.parse(completeresponse).value;
			
			//set counter length = number of records
			_counter = collection.length;

			//loop through the results and call the workflow for each one
			collection.forEach(function (row, i) {
				callWorkflow(context, token, row['accountid']);
			});
		});
	});
	crmrequest.on('error', function(e) {
		context.log(e);
		context.done();
	});
	//close the web api request
	crmrequest.end();
}

function callWorkflow(context, token, entityid){
	var crmwebapiworkflowpath = _apipath + &quot;/workflows(&quot;+_workflowid+&quot;)/Microsoft.Dynamics.CRM.ExecuteWorkflow&quot;;

	//set the web api request headers
	var requestheaders = { 
		'Authorization': 'Bearer ' + token,
		'OData-MaxVersion': '4.0',
		'OData-Version': '4.0',
		'Accept': 'application/json',
		'Content-Type': 'application/json; charset=utf-8'
	};
	
	//set the crm request parameters
	var crmrequestoptions = {
		host: _crmwebapihost,
		path: crmwebapiworkflowpath,
		method: 'POST',
		headers: requestheaders
	};

	//create an object to post to the executeworkflow action
	var reqobj = {};
	reqobj[&quot;EntityId&quot;] = entityid;
	
	//turn it into a string
	var reqjson = JSON.stringify(reqobj);
	
	//calculate the length to set the content-length header
	crmrequestoptions.headers['Content-Length'] = Buffer.byteLength(reqjson);
	
	//make the web api request
	context.log('starting workflow request for ' + entityid);
	var crmrequest = https.request(crmrequestoptions, function(response) {
		//make an array to hold the response parts if we get multiple parts
		var responseparts = [];
		response.setEncoding('utf8');
		response.on('data', function(chunk) {
			//add each response chunk to the responseparts array for later
			responseparts.push(chunk);		
		});
		response.on('end', function(){
			//once we have all the response parts, concatenate the parts into a single string
			var completeresponse = responseparts.join('');
			context.log('success ' + entityid);
			
			//decrement the counter
			_counter = _counter-1;
			
			//if nothing is left to start, we are done
			if(_counter==0){
				context.log('all workflows started');
				context.done();
			}
		});
	});
	crmrequest.on('error', function(e) {
		context.log(e);
		context.done();
	});
	crmrequest.write(reqjson);

	//close the web api request
	crmrequest.end();
}
</code></pre>
<p>Then in the Azure Portal, I configured an Azure Function app to query accounts and execute the workflow every five minutes. Here are the detailed steps to replicate that.</p>
<ol>
<li>Create a new Function app via New-&gt;Compute-&gt;Function App. <img src="https://alexanderdevelopment.net/content/images/2016/11/01.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>Set the app name, resource group, etc. <img src="https://alexanderdevelopment.net/content/images/2016/11/02.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>Once the new Function app is provisioned, open it. <img src="https://alexanderdevelopment.net/content/images/2016/11/03.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>Select &quot;new function&quot; on the left. <img src="https://alexanderdevelopment.net/content/images/2016/11/04.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>Set language to &quot;JavaScript&quot; and scenario to &quot;Core.&quot; Find the &quot;TimerTrigger-JavaScript&quot; template and select it. <img src="https://alexanderdevelopment.net/content/images/2016/11/05.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>Give your function a name and set the schedule options. The <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer#schedule-examples">schedule value</a> is a CRON expression that includes six fields: {second} {minute} {hour} {day} {month} {day of the week}. You can accept the default value of every five minutes and change it later (the resulting <code>function.json</code> timer binding is shown after this list). Click &quot;create&quot; to create the new function. <img src="https://alexanderdevelopment.net/content/images/2016/11/06.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>Copy the Node.js code from above and paste it into the editor window. Set any specifics relative to your Dynamics 365 organization, and click save. (You can also use Git for deploying your code, but that's beyond the scope of today's post.) <img src="https://alexanderdevelopment.net/content/images/2016/11/07.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>On the &quot;integrate&quot; tab, you can modify the timer schedule. The schedule shown (0 */5 * * * *) will execute the function every five minutes. <img src="https://alexanderdevelopment.net/content/images/2016/11/08.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>The function will automatically execute at the next fifth minute, and the invocation log is available on the monitor tab. Selecting a specific invocation row shows detailed logging output on the right. <img src="https://alexanderdevelopment.net/content/images/2016/11/09.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>This screenshot shows the process sessions for when the workflow was executed in Dynamics 365. <img src="https://alexanderdevelopment.net/content/images/2016/11/10.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
<li>This screenshot shows the note records that were created by the workflow. <img src="https://alexanderdevelopment.net/content/images/2016/11/11.png#img-thumbnail" alt="Scheduling Dynamics 365 workflows with Azure Functions and Node.js"></li>
</ol>
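<p>For reference, here's roughly what the timer binding in the function's <code>function.json</code> file ends up looking like with the five-minute schedule. Your generated file may differ slightly:</p>
<pre><code>{
  &quot;bindings&quot;: [
    {
      &quot;name&quot;: &quot;myTimer&quot;,
      &quot;type&quot;: &quot;timerTrigger&quot;,
      &quot;direction&quot;: &quot;in&quot;,
      &quot;schedule&quot;: &quot;0 */5 * * * *&quot;
    }
  ],
  &quot;disabled&quot;: false
}
</code></pre>
<p>The <code>name</code> value is what gets passed as the second argument to the exported function (<code>myTimer</code> in the code above).</p>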
<p>A few notes/caveats:</p>
<ol>
<li>My Node.js code has hardly any error handling right now. If the workflow execution call returns an error, the Node.js code will not recognize it as an error (see the sketch below for a simple status code check).</li>
<li>My CRM record retrieval is set to retrieve a maximum of 500 records. You would need to modify the Web API request logic to handle more.</li>
<li>Per the <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices">&quot;Best Practices for Azure Functions&quot;</a> guide:</li>
</ol>
<blockquote>
<p>Assume your function could encounter an exception at any time. Design your functions with the ability to continue from a previous fail point during the next execution.</p>
</blockquote>
<p>This means you should put logic in your workflow to make sure that duplicate executions are avoided (unless that's what you intend to happen).</p>
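<p>Addressing the first caveat doesn't take much code: the status code is available on the response object, so the &quot;end&quot; handler in <code>callWorkflow</code> could check it before logging success. Here's a rough sketch of what that handler could look like:</p>
<pre><code>//sketch: check the status code in the callWorkflow response handler
response.on('end', function(){
	var completeresponse = responseparts.join('');
	if(response.statusCode &gt;= 200 &amp;&amp; response.statusCode &lt; 300){
		context.log('success ' + entityid);
	}
	else{
		context.log('error ' + entityid + ' - status ' + response.statusCode + ' - ' + completeresponse);
	}

	//decrement the counter
	_counter = _counter-1;

	//if nothing is left to start, we are done
	if(_counter==0){
		context.log('all workflows started');
		context.done();
	}
});
</code></pre>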
<p>This sample just scratches the surface of what's possible with Azure Functions and Dynamics 365, and I'm looking forward to working with Azure Functions more in the future. Have you looked at Azure Functions yet? What do you think? Please let me know in the comments.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Azure Text Analytics sentiment analysis with North52]]></title><description><![CDATA[<div class="kg-card-markdown"><p>For the last several months I've been working on an enterprise Dynamics CRM project where one of our goals is to minimize the amount of custom code we write by using <a href="http://www.north52.com/business-process-activities/">North52's Business Process Activities</a>. I had not been exposed to North52 before working on this project, but I have</p></div>]]></description><link>https://alexanderdevelopment.net/post/2016/05/17/azure-text-analytics-sentiment-analysis-with-north52/</link><guid isPermaLink="false">5a5837236636a30001b9782d</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[text analysis]]></category><category><![CDATA[integration]]></category><category><![CDATA[Azure]]></category><category><![CDATA[analytics]]></category><category><![CDATA[web services]]></category><category><![CDATA[North52]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Tue, 17 May 2016 13:19:37 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2016/05/2016-05-17_08-51-04.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2016/05/2016-05-17_08-51-04.png" alt="Azure Text Analytics sentiment analysis with North52"><p>For the last several months I've been working on an enterprise Dynamics CRM project where one of our goals is to minimize the amount of custom code we write by using <a href="http://www.north52.com/business-process-activities/">North52's Business Process Activities</a>. I had not been exposed to North52 before working on this project, but I have been pleasantly surprised with how much it has allowed our mostly functional resources to achieve without needing technical assistance.</p>
<p>While looking through North52's documentation a while back, I noticed it could be used to <a href="http://support.north52.com/knowledgebase/articles/488697-introduction-to-north52-s-webfusion">call a REST web service</a>. This got me thinking about how I could rework my <a href="https://alexanderdevelopment.net/post/2015/10/12/sentiment-analysis-in-dynamics-crm-using-azure-text-analytics/">Sentiment analysis in Dynamics CRM using Azure Text Analytics</a> sample using North52 instead of a custom workflow activity.</p>
<p>I found I was able to replace the custom workflow activity with a North52 <a href="http://www.north52.com/business-process-activities/process-genie-for-microsoft-dynamics-crm-xrm/">Process Genie</a>. It executes a Smart Flow to call the Azure Machine Learning <a href="https://datamarket.azure.com/dataset/amla/text-analytics">Text Analytics API</a> and then returns the result to the calling CRM dialog.</p>
<p>Here's the formula I used in my Process Genie:</p>
<pre><code>SmartFlow(
  SetVar('jsoninput', CreateJObject( 
     CreateJProperty('Inputs', 
           CreateJArray(CreateJObject(
                           CreateJProperty('Id', '1'),  
                           CreateJProperty('Text', [account.texttoanalyze]) 
                              )
                        )
                     )
               )
          ),

  CallRestAPI(
      SetRequestBaseURL('https://api.datamarket.azure.com/data.ashx/amla/text-analytics/v1'),
      SetRequestResource('/GetSentimentBatch'),
      SetRequestDetails('POST'),
      SetRequestHeaders(),
      SetRequestParams('RawContentTextJSON',GetVar('jsoninput')),
      SetRequestAuthenticationBasic('AccountKey','YOUR_AZURE_ML_API_KEY_HERE'),
      SetRequestFiles(),
      SetRequestExpected('OK'),
      SetRequestActionPass(SetVar('result', GetVarJsonValue('SentimentBatch{0}.Score'))),
      SetRequestActionFail(SetVar('result', 'ERROR' + GetVarJsonValue('Errors{0}.Message')))
    ), 

   SmartFlowReturn(GetVar('result'))
)
</code></pre>
<p>Here's a screenshot of my CRM dialog:<br>
<img src="https://alexanderdevelopment.net/content/images/2016/05/2016-05-16_19-30-01.png#img-thumbnail" alt="Azure Text Analytics sentiment analysis with North52"></p>
<p>To execute the Process Genie, you use a North52 N52 Process Genie step like this:<br>
<img src="https://alexanderdevelopment.net/content/images/2016/05/2016-05-16_19-32-09.png#img-thumbnail" alt="Azure Text Analytics sentiment analysis with North52"></p>
<p>The Formula ShortCode value is the short code of the North52 Process Genie. The Formula Parameter Xml value contains the text to analyze from the dialog input in the format expected by the formula:</p>
<p><code>&lt;account&gt;&lt;texttoanalyze&gt;{Response Text(Get text)}&lt;/texttoanalyze&gt;&lt;/account&gt;</code></p>
<p>If I were going to use this in a real solution, I would not hardcode the Azure ML API key directly in the formula. Other than that, I think this is a production-ready approach.</p>
<p>What do you think? Would you consider using something like this as an alternative to writing your own custom code?</p>
</div>]]></content:encoded></item><item><title><![CDATA[Webcast: Sentiment Analysis in Microsoft Dynamics CRM using Azure Text Analytics]]></title><description><![CDATA[<div class="kg-card-markdown"><p>On Monday, April 11, at 12 p.m. EDT, I will be presenting a <a href="https://msdynamicsworld.webex.com/msdynamicsworld/onstage/g.php?MTID=e9c658b74d1caa0e4f1063423421cd47c&amp;SourceId=la">webcast</a> at MSDynamicsWorld.com that will show how a custom integration with Microsoft Azure Machine Learning can be used to perform sentiment analysis on any data stored in Dynamics CRM.</p>
<p>Custom sentiment analysis integrations can enable</p></div>]]></description><link>https://alexanderdevelopment.net/post/2016/04/06/webcast-sentiment-analysis-in-microsoft-dynamics-crm-using-azure-text-analytics/</link><guid isPermaLink="false">5a5837236636a30001b97819</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[machine learning]]></category><category><![CDATA[Azure]]></category><category><![CDATA[text analysis]]></category><category><![CDATA[demonstrations]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Wed, 06 Apr 2016 14:37:20 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2016/04/sentiment-dialog.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2016/04/sentiment-dialog.png" alt="Webcast: Sentiment Analysis in Microsoft Dynamics CRM using Azure Text Analytics"><p>On Monday, April 11, at 12 p.m. EDT, I will be presenting a <a href="https://msdynamicsworld.webex.com/msdynamicsworld/onstage/g.php?MTID=e9c658b74d1caa0e4f1063423421cd47c&amp;SourceId=la">webcast</a> at MSDynamicsWorld.com that will show how a custom integration with Microsoft Azure Machine Learning can be used to perform sentiment analysis on any data stored in Dynamics CRM.</p>
<p>Custom sentiment analysis integrations can enable a number of interesting processes in Dynamics CRM, including:</p>
<ul>
<li>Routing emails to queues based on sentiment</li>
<li>Dynamically loading agent scripts</li>
<li>Reporting on interactions</li>
</ul>
<p>This webinar will include an introduction to Azure Machine Learning, an overview of a sample solution and a code deep dive. Programming knowledge will be helpful during the code deep dive, but it is not required for most of the discussion.</p>
<p>You can register for the session <a href="https://msdynamicsworld.webex.com/msdynamicsworld/onstage/g.php?MTID=e9c658b74d1caa0e4f1063423421cd47c&amp;SourceId=la">here</a>.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Predictions in Dynamics CRM with custom Azure Machine Learning integrations]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Earlier this year I wrote a <a href="https://alexanderdevelopment.net/post/2015/10/12/sentiment-analysis-in-dynamics-crm-using-azure-text-analytics/">post</a> that showed how to perform sentiment analysis in Dynamics CRM using <a href="https://datamarket.azure.com/dataset/amla/text-analytics">Microsoft Azure Text Analytics</a>. Azure Text Analytics makes it incredibly easy to use sentiment analysis (with English text only), but the full Azure Machine Learning offering is much more powerful. In today's</p></div>]]></description><link>https://alexanderdevelopment.net/post/2015/11/30/using-azure-machine-learning-predictive-data-models-in-dynamics-crm/</link><guid isPermaLink="false">5a5837236636a30001b977df</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[CRM 2015]]></category><category><![CDATA[Azure]]></category><category><![CDATA[machine learning]]></category><category><![CDATA[data mining]]></category><category><![CDATA[analytics]]></category><category><![CDATA[web services]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Tue, 01 Dec 2015 03:08:34 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2015/11/ml-03-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2015/11/ml-03-1.png" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"><p>Earlier this year I wrote a <a href="https://alexanderdevelopment.net/post/2015/10/12/sentiment-analysis-in-dynamics-crm-using-azure-text-analytics/">post</a> that showed how to perform sentiment analysis in Dynamics CRM using <a href="https://datamarket.azure.com/dataset/amla/text-analytics">Microsoft Azure Text Analytics</a>. Azure Text Analytics makes it incredibly easy to use sentiment analysis (with English text only), but the full Azure Machine Learning offering is much more powerful. In today's post I will show how to create a custom predictive web service in Azure ML and make predictions with it in Dynamics CRM.</p>
<h4 id="whyamidoingthis">Why am I doing this?</h4>
<p>One of the exciting announcements about Dynamics CRM 2016 is that it includes some sort of integration with Azure ML, so what's the point of this blog post? Actually, here are two reasons:</p>
<ol>
<li>It's not clear (as of late November 2015) what capabilities the standard CRM-Azure ML integration will have. The approach demonstrated here allows for total customization.</li>
<li>The approach below can be used right now in any version of CRM (online or on-premise) from CRM 2011 onward.</li>
</ol>
<h4 id="thedemodataset">The demo data set</h4>
<p>For this demonstration I am using data from the AdventureWorks data warehouse sample database to build a model to predict whether a contact in CRM is likely to be a bicycle buyer. I created a flat file of contacts that includes several independent variables and a yes/no flag for whether they purchased a bicycle. That sample file is included in my solution source linked at the bottom of this post.</p>
<p>You can also download the full AdventureWorks database from CodePlex <a href="https://msftdbprodsamples.codeplex.com/releases/view/125550">here</a>.</p>
<h4 id="gettingstartedwithazureml">Getting started with Azure ML</h4>
<p>As I was trying to get the hang of Azure ML, I found the <a href="https://bluewatersql.wordpress.com/">Bluewater SQL blog</a> tremendously helpful. If you're just getting started with Azure ML, I suggest you take a look at both of these posts before continuing. <em>(These two posts also use the AdventureWorks data warehouse sample database, though I had already decided to use it for my demo before I found this site. I guess the AdventureWorks data is an obvious choice for this sort of thing.)</em></p>
<ol>
<li><a href="https://bluewatersql.wordpress.com/2014/07/31/azure-machine-learning-first-look/">https://bluewatersql.wordpress.com/2014/07/31/azure-machine-learning-first-look/</a></li>
<li><a href="https://bluewatersql.wordpress.com/2014/08/01/azure-machine-learning-a-deeper-look/">https://bluewatersql.wordpress.com/2014/08/01/azure-machine-learning-a-deeper-look/</a></li>
</ol>
<h4 id="creatingthepredictivewebservice">Creating the predictive web service</h4>
<p>Now that the introductory bits are out of the way, here's an overview of how I set up my predictive web service in Azure ML. I've also shared my work in the Cortana Analytics Gallery, so if you just want to see how the experiments are set up without going through all the steps yourself, skip to the end of this post for the gallery links.</p>
<ol>
<li>Export your data set from CRM. I use a simple KingswaySoft SSIS package to do this. It's included in my sample project files.</li>
<li>Upload your data set to Azure ML Studio. I am using the Azure ML free tier, which requires me to manually upload the data set for the initial upload and any subsequent updates. If I were using the standard tier, it would be possible to upload the data set to Azure blob storage as part of my SSIS package. This would be useful for periodically retraining the model with updated data in the future. <img src="https://alexanderdevelopment.net/content/images/2015/11/ml-01.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></li>
<li>Create and run a new experiment to classify bike buyers. I basically followed the steps outlined in the two Bluewater SQL posts, so take a look at those if you want to see a more detailed explanation. <img src="https://alexanderdevelopment.net/content/images/2015/11/ml-02.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></li>
<li>Save the trained model from this experiment to use in your predictive web service.</li>
<li>Create a new experiment that will serve as the foundation for the predictive web service.</li>
<li>Add the following items to the new experiment canvas:
<ul>
<li>Data set</li>
<li>Trained model from the earlier experiment</li>
<li>Score model component</li>
<li>Project columns transformation between the data set and score model component to exclude the lpa_bikerbuyername column</li>
<li>Project columns transformation after score model component to only include score labels</li>
</ul>
</li>
<li>Run the experiment.</li>
<li>Create a predictive web service. <img src="https://alexanderdevelopment.net/content/images/2015/11/ml-03.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></li>
<li>Run the experiment.</li>
<li>Deploy the web service. <img src="https://alexanderdevelopment.net/content/images/2015/11/ml-04.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></li>
<li>Open the web service and copy the API key to use later. <img src="https://alexanderdevelopment.net/content/images/2015/11/ml-05.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></li>
<li>Open the request/response page.</li>
<li>Copy the post URL to use later. Leave off the &quot;&amp;details=true&quot; at the end. <img src="https://alexanderdevelopment.net/content/images/2015/11/ml-06.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></li>
<li>Scroll down to find the JSON request/response samples and copy them for review later. <img src="https://alexanderdevelopment.net/content/images/2015/11/ml-07.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></li>
</ol>
<h4 id="makingpredictionsfromcrm">Making predictions from CRM</h4>
<p>Once your predictive web service is set up, using it in Dynamics CRM is as simple as posting a JSON request to the web service and then parsing the JSON response. A few years ago, I wrote a <a href="https://code.msdn.microsoft.com/Postingprocessing-JSON-in-396ead03">code sample showing an easy way to make JSON requests from CRM custom assemblies</a> that serves as the foundation for the workflow activity I'm using in this demo. There are only a few changes required to that sample:</p>
<ol>
<li>Add an authorization header that includes the bearer API key you copied earlier.</li>
<li>Add input/output parameters that match the inputs and outputs for your predictive web service. I am also passing the API key and service endpoint as input parameters instead of hardcoding them.</li>
<li>Modify the JSON request/response classes to allow for serialization to/deserialization from the request and response messages. Because of how Azure ML passes the inputs/outputs, these are actually fairly generic, so you can probably use mine with no or minimal changes. (A sketch of the request side follows this list.)</li>
<li>Send the correct request parameters and correctly handle the response parameters.</li>
</ol>
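<p>To make the third point more concrete, here's a sketch of what such generic request classes can look like for the standard Azure ML request envelope. The class and member names here are illustrative rather than copied from my sample:</p>
<pre><code>using System.Collections.Generic;
using System.Runtime.Serialization;

//illustrative request classes for the azure ml request/response envelope -
//the names are not necessarily the ones used in the linked sample code
[DataContract]
public class ScoreRequest
{
	[DataMember]
	public Dictionary&lt;string, ScoreInput&gt; Inputs { get; set; }

	[DataMember]
	public Dictionary&lt;string, string&gt; GlobalParameters { get; set; }
}

[DataContract]
public class ScoreInput
{
	[DataMember]
	public string[] ColumnNames { get; set; }

	[DataMember]
	public string[][] Values { get; set; }
}
</code></pre>
<p>If you build the request with a DataContractJsonSerializer like my earlier code sample does, create it with <code>UseSimpleDictionaryFormat</code> enabled via DataContractJsonSerializerSettings so the dictionaries serialize as JSON objects rather than key/value pair arrays.</p>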
<p>To demonstrate the custom workflow activity, I created a sample CRM dialog.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/11/ml-08.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></p>
<p>Here's where I supply the input parameters.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/11/ml-09-1.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></p>
<p>Here's what the dialog looks like when I run it from a contact record.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/11/ml-10.png#img-thumbnail" alt="Predictions in Dynamics CRM with custom Azure Machine Learning integrations"></p>
<h4 id="thecode">The code</h4>
<p>You can get all the code for this demonstration from my GitHub repository <a href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmAzureMachineLearningDemo">here</a>. The sample code includes the following:</p>
<ol>
<li>CRM solution containing:
<ul>
<li>Contact entity with custom fields added and a system view for data export for Azure ML</li>
<li>Sample dialog (you would need to update the parameters with your own API key and endpoint details)</li>
<li>Compiled custom workflow activity to call the web service</li>
</ul>
</li>
<li>Contact data ready for import to CRM</li>
<li>SSIS package solution for data export</li>
<li>Custom workflow activity source code</li>
</ol>
<p>I've also shared my training and prediction experiments in the Cortana Analytics Gallery. Here are the links where you can open and import them into your own workspace:</p>
<ol>
<li><a href="http://gallery.azureml.net/Details/3f87e56096214a508274c4f61e989cfb">Training experiment</a></li>
<li><a href="http://gallery.azureml.net/Details/ec90eadce2c947608916c64fb3c8fb97">Prediction experiment</a></li>
</ol>
<p>What do you think about this approach? Does it seem like it would be useful to you in a real-world situation? Let me know in the comments!</p>
</div>]]></content:encoded></item><item><title><![CDATA[Sentiment analysis in Dynamics CRM using Azure Text Analytics]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Last year I created a <a href="https://github.com/lucasalexander/CRM-IdolOnDemand-Tools">proof-of-concept solution</a> that showed how to integrate Dynamics CRM with <a href="https://www.havenondemand.com/">HP Haven OnDemand</a> (then called HP IDOL OnDemand) to perform sentiment analysis and index records to support &quot;find similar&quot; queries. While I was working through the <a href="https://challenge.azurecon.com/">AzureCon challenge</a> a few weeks ago, I</p></div>]]></description><link>https://alexanderdevelopment.net/post/2015/10/12/sentiment-analysis-in-dynamics-crm-using-azure-text-analytics/</link><guid isPermaLink="false">5a5837226636a30001b97775</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[CRM 2015]]></category><category><![CDATA[text analysis]]></category><category><![CDATA[Azure]]></category><category><![CDATA[analytics]]></category><category><![CDATA[web services]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Mon, 12 Oct 2015 22:36:23 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2015/10/dialog-2.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2015/10/dialog-2.png" alt="Sentiment analysis in Dynamics CRM using Azure Text Analytics"><p>Last year I created a <a href="https://github.com/lucasalexander/CRM-IdolOnDemand-Tools">proof-of-concept solution</a> that showed how to integrate Dynamics CRM with <a href="https://www.havenondemand.com/">HP Haven OnDemand</a> (then called HP IDOL OnDemand) to perform sentiment analysis and index records to support &quot;find similar&quot; queries. While I was working through the <a href="https://challenge.azurecon.com/">AzureCon challenge</a> a few weeks ago, I thought it would be an interesting exercise to update my sentiment analysis code to work with the <a href="https://datamarket.azure.com/dataset/amla/text-analytics">Text Analytics</a> offering from the Microsoft Azure Marketplace.</p>
<h4 id="theapproach">The approach</h4>
<p>As with my Haven OnDemand solution, the approach I'm using with Azure relies on a custom workflow activity that does the following:</p>
<ol>
<li>Parse a supplied text input and strip any HTML tags using a helper function (a minimal sketch follows this list).
</li><li>Create a JSON sentiment analysis request and post it to Azure Text Analytics with an HttpWebRequest.
</li><li>Deserialize the JSON response returned by Azure Text Analytics to a custom class object using a DataContractJsonSerializer.
</li><li>Return the sentiment score to the calling process.
</li></ol>
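<p>The actual helper function is in the GitHub solution linked at the end of this post, but a minimal HTML-stripping helper can be as simple as a regular expression replace. Treat this as an illustrative sketch rather than the exact code from the solution:</p>
<pre><code>using System.Text.RegularExpressions;

//illustrative html-stripping helper - the real one is in the github solution
public static string StripHtml(string input)
{
	if (string.IsNullOrEmpty(input))
	{
		return string.Empty;
	}

	//replace anything that looks like a tag with a space, then decode entities
	string text = Regex.Replace(input, &quot;&lt;[^&gt;]+&gt;&quot;, &quot; &quot;);
	return System.Net.WebUtility.HtmlDecode(text).Trim();
}
</code></pre>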
<p>There are two main differences with my Azure Text Analytics solution:</p>
<ol>
<li>The Azure service only returns a sentiment score, so this custom workflow activity doesn't return a positive/negative string value.
</li><li>Instead of embedding an access key in the custom workflow activity code, I've made it a parameter, which means you can take the solution straight from GitHub and start using it in your CRM organization as soon as you sign up for the Text Analytics service.
</li></ol>
<h4 id="thesolutioninaction">The solution in action</h4>
<p>Here's a sample dialog I've created to demonstrate the use of the custom workflow activity.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/dialog.png#img-thumbnail" alt="Sentiment analysis in Dynamics CRM using Azure Text Analytics"></p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/dialog-01-input.png#img-thumbnail" alt="Sentiment analysis in Dynamics CRM using Azure Text Analytics"></p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/dialog-02-params.png#img-thumbnail" alt="Sentiment analysis in Dynamics CRM using Azure Text Analytics"></p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/dialog-03-output.png#img-thumbnail" alt="Sentiment analysis in Dynamics CRM using Azure Text Analytics"></p>
<p>Here's how the process works for the sample text &quot;I hate you.&quot;</p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/hate-01.png#img-thumbnail" alt="Sentiment analysis in Dynamics CRM using Azure Text Analytics"></p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/hate-02.png#img-thumbnail" alt="Sentiment analysis in Dynamics CRM using Azure Text Analytics"></p>
<p>Wrapping it up on a more positive note, here's the same dialog with &quot;I love you&quot; instead.</p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/love-01.png#img-thumbnail" alt="Sentiment analysis in Dynamics CRM using Azure Text Analytics"></p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/love-02.png#img-thumbnail" alt="Sentiment analysis in Dynamics CRM using Azure Text Analytics"></p>
<p>As you can see, the score for &quot;I hate you&quot; is about .06, and the score for &quot;I love you&quot; is about .91, which makes sense as scores closer to one are more positive, and scores closer to zero are more negative.</p>
<h4 id="thecode">The code</h4>
<p>You can download all the custom code and a CRM solution extract from my <a href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmAzureTextAnalysis">Crm-Sample-Code repository on GitHub</a>. Let me know what you think in the comments!</p>
</div>]]></content:encoded></item></channel></rss>