<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[RabbitMQ - Alexander Development]]></title><description><![CDATA[RabbitMQ - Alexander Development]]></description><link>https://alexanderdevelopment.net/</link><image><url>https://alexanderdevelopment.net/favicon.png</url><title>RabbitMQ - Alexander Development</title><link>https://alexanderdevelopment.net/</link></image><generator>Ghost 1.20</generator><lastBuildDate>Mon, 24 Aug 2020 19:54:15 GMT</lastBuildDate><atom:link href="https://alexanderdevelopment.net/tag/rabbitmq/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4]]></title><description><![CDATA[<div class="kg-card-markdown"><p>This is the final post in my series about building a service relay for Dynamics 365 CE with RabbitMQ and Python. In my previous <a href="https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/">post</a> in this series, I showed the Python code to make the service relay work. 
In today's post, I will show how you can use <a href="https://azure.microsoft.com/en-us/services/functions/">Azure</a></p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/02/07/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-4/</link><guid isPermaLink="false">5a788a53c86c8900016cf367</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[Python]]></category><category><![CDATA[RabbitMQ]]></category><category><![CDATA[Azure]]></category><category><![CDATA[C#]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Thu, 08 Feb 2018 04:00:42 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-2.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-2.png" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4"><p>This is the final post in my series about building a service relay for Dynamics 365 CE with RabbitMQ and Python. In my previous <a href="https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/">post</a> in this series, I showed the Python code to make the service relay work. In today's post, I will show how you can use <a href="https://azure.microsoft.com/en-us/services/functions/">Azure Functions</a> to make a consumer service proxy using C# so client applications don't have to access your RabbitMQ broker directly, and I will also discuss some general thoughts on security and scalability for this service relay architecture.</p>
<p>Although this simple service relay allows external consumers to get data from Dynamics 365 CE without needing to connect directly, the examples I've shown so far require that they can connect to a RabbitMQ broker. This may be problematic for a variety of reasons, so you would probably want external consumers to connect to a web service proxy that would write requests to and read responses from the RabbitMQ broker.</p>
<h4 id="buildingaserviceproxyfunction">Building a service proxy function</h4>
<p>You can build an Azure Functions service proxy with Python, but I don't recommend it for three reasons:</p>
<ol>
<li>Azure Functions Python support is still considered experimental.</li>
<li>Python scripts that use external libraries can run <a href="https://github.com/Azure/azure-functions-host/issues/1626">exceedingly slowly</a>.</li>
<li>Getting the environment set up is a bit of a hassle.</li>
</ol>
<p>On the other hand, building a service proxy function with C# was so much easier, and it performed much better than a comparable Python function (~.5 seconds for C# compared to 5+ seconds for Python).</p>
<p>Here are the steps I took to build my C# service proxy function:</p>
<ol>
<li>Create a C# HTTP trigger function.</li>
<li>Create and upload a project.json file with a dependency on the RabbitMQ client (see below).</li>
<li>Take the &quot;RpcClient&quot; class from the <a href="https://www.rabbitmq.com/tutorials/tutorial-six-dotnet.html">RabbitMQ .Net RPC tutorial</a> and call it from within my function.</li>
</ol>
<p>Here's my project.json file:</p>
<pre><code>{
  &quot;frameworks&quot;: {
    &quot;net46&quot;:{
      &quot;dependencies&quot;: {
        &quot;RabbitMQ.Client&quot;: &quot;5.0.1&quot;
      }
    }
  }
}
</code></pre>
<p>And here's my run.csx file:</p>
<pre><code>using System.Net;
using System;
using System.Collections.Concurrent;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

public static async Task&lt;HttpResponseMessage&gt; Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info(&quot;Processing request&quot;);

    // parse query parameter
    string query = req.GetQueryNameValuePairs()
        .FirstOrDefault(q =&gt; string.Compare(q.Key, &quot;query&quot;, true) == 0)
        .Value;

    // Get request body
    dynamic data = await req.Content.ReadAsAsync&lt;object&gt;();

    // Set name to query string or body data
    query = query ?? data?.query;

    var rpcClient = new RpcClient();
    
    log.Info(string.Format(&quot; [.] query start time {0}&quot;, DateTime.Now.ToString(&quot;MM/dd/yyyy hh:mm:ss.fff tt&quot;)));
    var response = rpcClient.Call(query);

    log.Info(string.Format(&quot; [.] query end time {0}&quot;, DateTime.Now.ToString(&quot;MM/dd/yyyy hh:mm:ss.fff tt&quot;)));
    rpcClient.Close();

    return req.CreateResponse(HttpStatusCode.OK, response);
}

public class RpcClient
{
    private readonly IConnection connection;
    private readonly IModel channel;
    private readonly string replyQueueName;
    private readonly EventingBasicConsumer consumer;
    private readonly BlockingCollection&lt;string&gt; respQueue = new BlockingCollection&lt;string&gt;();
    private readonly IBasicProperties props;

    public RpcClient()
    {
        var factory = new ConnectionFactory() { HostName = &quot;RABBITHOST&quot;, UserName=&quot;RABBITUSER&quot;, Password=&quot;RABBITUSERPASS&quot;  };

        connection = factory.CreateConnection();
        channel = connection.CreateModel();
        replyQueueName = channel.QueueDeclare().QueueName;
        consumer = new EventingBasicConsumer(channel);

        props = channel.CreateBasicProperties();
        var correlationId = Guid.NewGuid().ToString();
        props.CorrelationId = correlationId;
        props.ReplyTo = replyQueueName;

        consumer.Received += (model, ea) =&gt;
        {
            var body = ea.Body;
            var response = Encoding.UTF8.GetString(body);
            if (ea.BasicProperties.CorrelationId == correlationId)
            {
                respQueue.Add(response);
            }
        };
    }

    public string Call(string message)
    {
        var messageBytes = Encoding.UTF8.GetBytes(message);
        channel.BasicPublish(
            exchange: &quot;&quot;,
            routingKey: &quot;rpc_queue&quot;,
            basicProperties: props,
            body: messageBytes);

        channel.BasicConsume(
            consumer: consumer,
            queue: replyQueueName,
            autoAck: true);

        return respQueue.Take();
    }

    public void Close()
    {
        connection.Close();
    }
}
</code></pre>
<p>Here's a screenshot showing me calling the C# function with Postman.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/Postman_2018-02-05_22-02-52.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4"></p>
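<p>The parameter handling in the C# function (use the <code>query</code> query-string value if present, otherwise fall back to a <code>query</code> field in the request body - the <code>query ?? data?.query</code> line) can be modeled in a few lines of Python. This is an illustration only; the function name is mine, not part of any Azure Functions API:</p>

```python
import json
from urllib.parse import urlparse, parse_qs

def extract_query(url, body_json=None):
    """Return the 'query' parameter from the URL if present,
    otherwise fall back to a 'query' field in the JSON body."""
    params = parse_qs(urlparse(url).query)
    query = params.get("query", [None])[0]
    if query is None and body_json:
        query = json.loads(body_json).get("query")
    return query

print(extract_query("https://example.com/api/proxy?query=getcontacts"))          # getcontacts
print(extract_query("https://example.com/api/proxy", '{"query": "getaccounts"}'))  # getaccounts
```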
<p>Since I did actually build a Python function, I'll go ahead and share how I did it in case you're interested. Here are the steps I took:</p>
<ol>
<li>Create a Python HTTP trigger function.</li>
<li>Install Python 3.6 via site extensions (see steps 2.1-2.4 <a href="https://stackoverflow.com/a/47213859">here</a>).</li>
<li>Install the necessary libraries using pip via <a href="https://david-obrien.net/2016/07/azure-functions-kudu/">Kudu</a>.</li>
</ol>
<p>Here's the Python function code:</p>
<pre><code>import os
import sys
import json
import pika
import uuid
import datetime

class CrmRpcClient(object):
    def __init__(self):
        #RabbitMQ connection details
        self.rabbituser = 'RABBITUSERNAME'
        self.rabbitpass = 'RABBITUSERPASS'
        self.rabbithost = 'RABBITHOST' 
        self.rabbitport = 5672
        self.rabbitqueue = 'rpc_queue'
        rabbitcredentials = pika.PlainCredentials(self.rabbituser, self.rabbitpass)
        rabbitparameters = pika.ConnectionParameters(host=self.rabbithost,
                                    port=self.rabbitport,
                                    virtual_host='/',
                                    credentials=rabbitcredentials)

        self.rabbitconn = pika.BlockingConnection(rabbitparameters)

        self.channel = self.rabbitconn.channel()

        #create an anonymous exclusive callback queue
        result = self.channel.queue_declare(exclusive=True)
        self.callback_queue = result.method.queue

        self.channel.basic_consume(self.on_response, no_ack=True,
                                   queue=self.callback_queue)

    #callback method for when a response is received - note the check for correlation id
    def on_response(self, ch, method, props, body):
        if self.corr_id == props.correlation_id:
            self.response = body

    #method to make the initial request
    def call(self, n):
        self.response = None
        #generate a new correlation id
        self.corr_id = str(uuid.uuid4())

        #publish the message to the rpc_queue - note the reply_to property is set to the callback queue from above
        self.channel.basic_publish(exchange='',
                                   routing_key=self.rabbitqueue,
                                   properties=pika.BasicProperties(
                                         reply_to = self.callback_queue,
                                         correlation_id = self.corr_id,
                                         ),
                                   body=n)
        while self.response is None:
            self.rabbitconn.process_data_events()
        return self.response

#instantiate an rpc client
crm_rpc = CrmRpcClient()

#read the request body passed in by the Functions runtime
postreqdata = json.loads(open(os.environ['req']).read())
query = postreqdata['query']

print(&quot; [.] query start time %r&quot; % str(datetime.datetime.now()))
queryresponse = crm_rpc.call(query)
print(&quot; [.] query end time %r&quot; % str(datetime.datetime.now()))
response = open(os.environ['res'], 'w')
response.write(queryresponse.decode())
response.close()
</code></pre>
<p>Here's a screenshot showing me calling the Python function with Postman.<img src="https://alexanderdevelopment.net/content/images/2018/02/Postman_2018-02-05_22-10-20.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 4"></p>
<p>Note the difference in time between the two functions - 5.62 seconds for Python and .46 seconds for C#!</p>
<h4 id="securityandscalability">Security and scalability</h4>
<p>If you decide to use this approach in production, I'd suggest you carefully consider both security and scalability. Obviously the overall solution will only be as secure as your RabbitMQ broker and communications between the broker and its clients, so you'll want to look at best practices for access control and securing the communications with TLS. Here are some links for further reading on those subjects:</p>
<ul>
<li>TLS - <a href="https://www.rabbitmq.com/ssl.html">https://www.rabbitmq.com/ssl.html</a></li>
<li>Access control - <a href="https://www.rabbitmq.com/access-control.html">https://www.rabbitmq.com/access-control.html</a></li>
</ul>
<p>As for scalability, the approach I've shown creates a separate response queue for each consumer, but it can have problems scaling, especially if you are using a RabbitMQ cluster. You may want to look at the <a href="https://www.rabbitmq.com/direct-reply-to.html">&quot;direct reply-to&quot;</a> approach instead. For an interesting real-world overview of using direct reply-to, take a look at this <a href="https://facundoolano.wordpress.com/2016/06/26/real-world-rpc-with-rabbitmq-and-node-js/">blog post</a>.</p>
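<p>To make the difference concrete, here is a broker-free sketch (the function name is mine, for illustration) of how the reply-routing properties differ between the two styles. With direct reply-to, the consumer publishes with <code>reply_to</code> set to the pseudo-queue <code>amq.rabbitmq.reply-to</code> instead of declaring its own exclusive callback queue:</p>

```python
import uuid

def request_properties(use_direct_reply_to, callback_queue=None):
    """Build the reply-routing properties for an RPC request.
    With direct reply-to, RabbitMQ sends the response back over the
    requesting connection via the pseudo-queue 'amq.rabbitmq.reply-to',
    so no per-consumer queue has to be declared or cleaned up."""
    reply_to = "amq.rabbitmq.reply-to" if use_direct_reply_to else callback_queue
    return {"reply_to": reply_to, "correlation_id": str(uuid.uuid4())}

print(request_properties(True)["reply_to"])   # amq.rabbitmq.reply-to
```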
<h4 id="wrappingup">Wrapping up</h4>
<p>I hope you've enjoyed this series and that it has given you some ideas about how to implement service relays in your Dynamics 365 CE projects. As I worked through the examples, I certainly learned a few new things, especially when I created my Python service proxy in Azure Functions.</p>
<p>Here are links to all the previous posts in this series.</p>
<ol>
<li><a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">Part 1</a> - Series introduction</li>
<li><a href="https://alexanderdevelopment.net/post/2018/02/01/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-2/">Part 2</a> - Solution prerequisites</li>
<li><a href="https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/">Part 3</a> - Python code for the consumer and listener processes</li>
</ol>
<p>What do you think about this approach? Is it something you think you'd use in production? Let us know in the comments!</p>
</div>]]></content:encoded></item><item><title><![CDATA[Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3]]></title><description><![CDATA[<div class="kg-card-markdown"><p>In my last <a href="https://alexanderdevelopment.net/post/2018/02/01/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-2/">post</a> in this series, I walked through the prerequisites for building a simple service relay for Dynamics 365 CE with RabbitMQ and Python. In today's post I will show the Python code to make the service relay work.</p>
<p>As I described in the <a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">first post</a> in this</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/02/05/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-3/</link><guid isPermaLink="false">5a6cab4cc86c8900016cf352</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[Python]]></category><category><![CDATA[RabbitMQ]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Mon, 05 Feb 2018 17:57:29 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay-1.png" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"><p>In my last <a href="https://alexanderdevelopment.net/post/2018/02/01/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-2/">post</a> in this series, I walked through the prerequisites for building a simple service relay for Dynamics 365 CE with RabbitMQ and Python. In today's post I will show the Python code to make the service relay work.</p>
<p>As I described in the <a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">first post</a> in this series, this approach relies on a consumer process and a queue listener process that can both access a RabbitMQ message broker.</p>
<blockquote>
<p>A consumer writes a request to a cloud-hosted RabbitMQ request queue (either directly or through a proxy service) and starts waiting for a response. On the other end, a Python script monitors the request queue for inbound requests. When it sees a new one, it executes the appropriate request through the Dynamics 365 Web API and writes the response back to a client-specific RabbitMQ response queue. The consumer then picks up the response from the queue.</p>
</blockquote>
<p>This solution is based on the remote procedure call (RPC) approach shown <a href="https://www.rabbitmq.com/tutorials/tutorial-six-python.html">here</a>. The main difference is that I have added logic to the queue monitoring script to query the Dynamics 365 Web API based on the inbound request from the consumer.</p>
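<p>Before diving into the pika code, the overall flow can be modeled in-process with nothing but the standard library. This toy version (my naming, no broker involved) shows the moving parts: a shared request queue, a per-consumer reply queue, and the correlation id check:</p>

```python
import queue
import uuid

request_queue = queue.Queue()   # stands in for 'rpc_queue'
reply_queue = queue.Queue()     # stands in for the consumer's callback queue

def listener_step():
    # the listener reads one request, "processes" it, and writes the
    # response to whatever reply queue the request named in reply_to
    msg = request_queue.get()
    if msg['body'] == 'getcontacts':
        body = 'contact list'
    else:
        body = 'Operation not supported'
    msg['reply_to'].put({'correlation_id': msg['correlation_id'], 'body': body})

def consumer_call(request):
    corr_id = str(uuid.uuid4())
    request_queue.put({'body': request,
                       'reply_to': reply_queue,
                       'correlation_id': corr_id})
    listener_step()  # in the real system this runs in a separate process
    reply = reply_queue.get()
    # ignore replies whose correlation id does not match this request
    return reply['body'] if reply['correlation_id'] == corr_id else None

print(consumer_call('getcontacts'))  # contact list
```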
<h4 id="consumersample">Consumer sample</h4>
<p>The consumer does the following:</p>
<ol>
<li>Read the text of the request to write to the queue from a command-line argument.</li>
<li>Establish a connection to the RabbitMQ broker.</li>
<li>Create a new anonymous, exclusive callback queue.</li>
<li>Write a request message to a queue called &quot;rpc_queue.&quot; This message will include the callback queue as its &quot;reply_to&quot; property.</li>
<li>Monitor the callback queue for a response.</li>
</ol>
<p>There's no validation in this sample, so if you run it without a command-line argument, it will just throw an error and exit.</p>
<pre><code>import sys
import pika
import uuid
import datetime

class CrmRpcClient(object):
    def __init__(self):
        #RabbitMQ connection details
        self.rabbituser = 'crmuser'
        self.rabbitpass = 'crmpass'
        self.rabbithost = '127.0.0.1' 
        self.rabbitport = 5672
        self.rabbitqueue = 'rpc_queue'
        rabbitcredentials = pika.PlainCredentials(self.rabbituser, self.rabbitpass)
        rabbitparameters = pika.ConnectionParameters(host=self.rabbithost,
                                    port=self.rabbitport,
                                    virtual_host='/',
                                    credentials=rabbitcredentials)

        self.rabbitconn = pika.BlockingConnection(rabbitparameters)

        self.channel = self.rabbitconn.channel()

        #create an anonymous exclusive callback queue
        result = self.channel.queue_declare(exclusive=True)
        self.callback_queue = result.method.queue

        self.channel.basic_consume(self.on_response, no_ack=True,
                                   queue=self.callback_queue)

    #callback method for when a response is received - note the check for correlation id
    def on_response(self, ch, method, props, body):
        if self.corr_id == props.correlation_id:
            self.response = body

    #method to make the initial request
    def call(self, n):
        self.response = None
        #generate a new correlation id
        self.corr_id = str(uuid.uuid4())

        #publish the message to the rpc_queue - note the reply_to property is set to the callback queue from above
        self.channel.basic_publish(exchange='',
                                   routing_key=self.rabbitqueue,
                                   properties=pika.BasicProperties(
                                         reply_to = self.callback_queue,
                                         correlation_id = self.corr_id,
                                         ),
                                   body=n)
        while self.response is None:
            self.rabbitconn.process_data_events()
        return self.response

#instantiate an rpc client
crm_rpc = CrmRpcClient()

#read the request from the command line
request = sys.argv[1]

#make the request and get the response
print(&quot; [x] Requesting crm data(&quot;+request+&quot;)&quot;)
print(&quot; [.] Start time %s&quot; % str(datetime.datetime.now()))
response = crm_rpc.call(request)

#convert the response message body from the queue to a string 
decoderesponse = response.decode()

#print the output
print(&quot; [.] Received response: %s&quot; % decoderesponse)
print(&quot; [.] End time %s&quot; % str(datetime.datetime.now()))
</code></pre>
<h4 id="queuelistenersample">Queue listener sample</h4>
<p>The queue listener does the following:</p>
<ol>
<li>Establish a connection to the RabbitMQ broker</li>
<li>Monitor &quot;rpc_queue&quot; queue.</li>
<li>When a new message from the &quot;rpc_queue&quot; queue is delivered, decode the message body as a string, and determine what Web API query to execute. Note: This sample can return a list of contacts or accounts from Dynamics 365 CE based on the request the consumer sends (&quot;getcontacts&quot; or &quot;getaccounts&quot;). If any other request is received, the listener will return an error message to the consumer callback queue.</li>
<li>Execute the appropriate query against the Dynamics 365 Web API and write the response to the callback queue the client established originally.</li>
</ol>
<pre><code>import pika
import requests
from requests_ntlm import HttpNtlmAuth
import json

#NTLM credentials to access on-prem Dynamics 365 Web API
username = 'DOMAIN\\USERNAME'
userpassword = 'PASSWORD'

#full path to Web API
crmwebapi = 'http://33.0.0.16/lucastest02/api/data/v8.1'

#RabbitMQ connection details
rabbituser = 'crmuser'
rabbitpass = 'crmpass'
rabbithost = '127.0.0.1' 
rabbitport = 5672

#method to execute a Web API query based on the client request
def processquery(query):
    #set the Web API request headers
    crmrequestheaders = {
        'OData-MaxVersion': '4.0',
        'OData-Version': '4.0',
        'Accept': 'application/json',
        'Content-Type': 'application/json; charset=utf-8',
        #both preferences go in a single Prefer header - duplicate
        #dictionary keys would silently overwrite each other
        'Prefer': 'odata.maxpagesize=500, odata.include-annotations=OData.Community.Display.V1.FormattedValue'
    }

    #determine which Web API query to execute
    if query == 'getcontacts':
        crmwebapiquery = '/contacts?$select=fullname,contactid'
    elif query == 'getaccounts':
        crmwebapiquery = '/accounts?$select=name,accountid'
    else:
        #only handle 'getcontacts' or 'getaccounts' requests
        return 'Operation not supported'

    #execute the query
    crmres = requests.get(crmwebapi+crmwebapiquery, headers=crmrequestheaders,auth=HttpNtlmAuth(username,userpassword))
    
    #get the results json
    crmjson = crmres.json()

    #return the json
    return crmjson

#method to handle new inbound requests
def on_request(ch, method, props, body):
    #convert the message body from the queue to a string
    decodebody = body.decode('utf-8')

    #print the request
    print(&quot; [.] Received request: '%s'&quot; % decodebody)

    #process the request query
    response = processquery(decodebody)

    #publish the response back to 'reply-to' queue from the request message and set the correlation id
    ch.basic_publish(exchange='',
                     routing_key=props.reply_to,
                     properties=pika.BasicProperties(correlation_id = \
                                                         props.correlation_id),
                     body=json.dumps(response).encode(encoding=&quot;utf-8&quot;, errors=&quot;strict&quot;))
    ch.basic_ack(delivery_tag = method.delivery_tag)

print(&quot; [x] Awaiting RPC requests&quot;)

#connect to RabbitMQ broker
rabbitcredentials = pika.PlainCredentials(rabbituser, rabbitpass)
rabbitparameters = pika.ConnectionParameters(host=rabbithost,
                               port=rabbitport,
                               virtual_host='/',
                               credentials=rabbitcredentials)
rabbitconn = pika.BlockingConnection(rabbitparameters)
channel = rabbitconn.channel()

#declare the 'rpc_queue' queue
channel.queue_declare(queue='rpc_queue')

#set qos settings for the channel
channel.basic_qos(prefetch_count=1)

#assign the 'on_request' method as a callback for when new messages delivered from the 'rpc_queue' queue
channel.basic_consume(on_request, queue='rpc_queue')

#start listening for requests
channel.start_consuming()
</code></pre>
<h4 id="tryingitout">Trying it out</h4>
<p>As I mentioned in my last post, I initially wrote my code to use a RabbitMQ broker running on my local PC, so that's why the connections in the samples show 127.0.0.1 as the host. For a demo, I've spun up a copy of RabbitMQ in a Docker container in the cloud and updated my connection parameters accordingly, but I am still running my queue listener and consumer processes on my local PC.</p>
<p>When the listener first starts, it displays a simple status message.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/1_start_listener.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<p>Then I execute a &quot;getcontacts&quot; request from the consumer in a separate window.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/2_get_contacts.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<p>From the timestamps before and after the request, you can see the round-trip time is less than .2 seconds, which includes two round trips between my local PC and the cloud-based RabbitMQ broker <em>plus</em> the actual query processing time in my local Dynamics 365 CE org.</p>
<p>Then I execute a &quot;getaccounts&quot; request.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/4_get_accounts.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<p>This request was also fulfilled in less than .2 seconds.</p>
<p>Finally I execute an invalid request to show what the error response looks like.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/6_get_leads.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<p>You'll note the total time from request to response is only about .05 seconds less than the total time for the valid queries. That indicates most of the time used in these samples is being spent on the round trips between my local PC and the RabbitMQ broker, which is not surprising.</p>
<p>Meanwhile, the queue listener wrote a simple status update for every request it received. If I were using this in production, I would use more sophisticated logging.<br>
<img src="https://alexanderdevelopment.net/content/images/2018/02/7_listener_output.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 3"></p>
<h4 id="wrappingup">Wrapping up</h4>
<p>That's it for now. In my next (and final) post in this series, I will show how you can use <a href="https://azure.microsoft.com/en-us/services/functions/">Azure Functions</a> to make a consumer service proxy so consuming applications don't have to access your RabbitMQ broker directly, and I will also discuss some general thoughts on security and scalability for this service relay architecture.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 2]]></title><description><![CDATA[<div class="kg-card-markdown"><p>In my <a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">last post</a> in this series, I outlined an approach for building a simple service relay with <a href="https://www.rabbitmq.com/">RabbitMQ</a> and <a href="https://www.python.org/">Python</a> to easily expose an on-premises Dynamics 365 Customer Engagement organization to external consumers. In this post I will walk through the prerequisites for building this out. I'm assuming you</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/02/01/building-a-simple-service-relay-for-dynamics-365-ce-with-rabbitmq-and-python-part-2/</link><guid isPermaLink="false">5a6ca8fec86c8900016cf351</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[Python]]></category><category><![CDATA[RabbitMQ]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Fri, 02 Feb 2018 03:24:51 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/02/simple-service-relay.png" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 2"><p>In my <a href="https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/">last post</a> in this series, I outlined an approach for building a simple service relay with <a href="https://www.rabbitmq.com/">RabbitMQ</a> and <a 
href="https://www.python.org/">Python</a> to easily expose an on-premises Dynamics 365 Customer Engagement organization to external consumers. In this post I will walk through the prerequisites for building this out. I'm assuming you have access to a Dynamics 365 CE organization, so I'm going to skip the setup for that and focus on just RabbitMQ and Python today.</p>
<h4 id="settinguprabbitmq">Setting up RabbitMQ</h4>
<p>Back in 2015, when I first blogged about RabbitMQ and Dynamics 365, I wrote a <a href="https://alexanderdevelopment.net/post/2015/01/14/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-2/">detailed post</a> showing how to install and configure RabbitMQ. Since then I have discovered the joys of <a href="https://www.docker.com/">Docker</a>, which makes the process significantly easier. If you have access to Docker, I highly recommend using it. Once you have Docker running, you can use one of the <a href="https://hub.docker.com/_/rabbitmq/">official RabbitMQ images</a>. For this project, I initially used the rabbitmq:3-management image in <a href="https://www.docker.com/docker-windows">Docker for Windows</a> running on my local PC. After I got the basic functionality working, I then moved to an instance of Docker running in the cloud on a <a href="https://www.digitalocean.com" target="_blank">Digital Ocean</a> VPS.</p>
<p>If you don't want to use Docker, you can use a full RabbitMQ install like I showed <a href="https://alexanderdevelopment.net/post/2015/01/14/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-2/">previously</a>. The main thing to remember is that no matter how you set up your RabbitMQ server, if it is not accessible from the public internet, you will not be able to use it as a service relay between an on-premises Dynamics 365 org and external consumers.</p>
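<p>For reference, spinning up the management-enabled image boils down to a single Docker command (the container name and published ports below are my choices - adjust to taste):</p>

```shell
# run RabbitMQ with the management plugin; publish the AMQP port (5672)
# and the management UI port (15672) to the host
docker run -d --name crm-rabbit \
  -p 5672:5672 -p 15672:15672 \
  rabbitmq:3-management
```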
<h4 id="settinguppython">Setting up Python</h4>
<p>I'm assuming if you've gotten this far, you have a functional Python development environment (if not, give <a href="https://code.visualstudio.com/docs/languages/python">Visual Studio Code</a> a try), and the code I have written works in Python versions 2.7 or 3.x. In order to connect to both RabbitMQ and Dynamics 365, you will need a few additional packages. To connect to RabbitMQ, <a href="https://pika.readthedocs.io/en/0.11.2/">Pika</a> is the RabbitMQ team's recommended Python client, and you can get it using <a href="https://pypi.python.org/pypi/pip">pip</a>.</p>
<p>To communicate with Dynamics 365, you'll need to use the Web API, but authentication will be handled differently depending on whether you connect to an on-premises org or an online / IFD org. For online or IFD orgs, you can either use <a href="https://jlattimer.blogspot.com/2015/11/crm-web-api-using-python.html">ADAL</a> or this <a href="http://alexanderdevelopment.net/post/2016/11/27/dynamics-365-and-python-integration-using-the-web-api/">alternate approach</a> I described back in 2016. If you have an on-premises org, you can authenticate using the requests_ntlm package like I showed <a href="https://alexanderdevelopment.net/post/2018/01/15/connecting-to-an-on-premise-dynamics-365-org-from-python/">here</a>. As with the Pika client, all the packages you need to connect to Dynamics 365 are also available via pip.</p>
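<p>All of the packages mentioned above install with one pip command (swap <code>requests_ntlm</code> for <code>adal</code> if you are targeting an online/IFD org):</p>

```shell
# pika for RabbitMQ, requests + requests_ntlm for on-premises Web API calls
pip install pika requests requests_ntlm
```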
<h4 id="wrappingup">Wrapping up</h4>
<p>That's it for today. In my next post in this series I will show the Python code you need to make this service relay work.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 1]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Integrating with external systems is a common requirement in Dynamics 365 Customer Engagement projects, but when the project involves an on-premises instance of Dynamics 365, routing requests from external systems through your firewall can present an additional challenge. Over the course of the next few posts, I will show you</p></div>]]></description><link>https://alexanderdevelopment.net/post/2018/01/30/relaying-external-queries-to-on-premise-dynamics-365-ce-orgs-with-rabbitmq-and-python/</link><guid isPermaLink="false">5a636975e2df920001a88f8e</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[Dynamics 365]]></category><category><![CDATA[integration]]></category><category><![CDATA[Python]]></category><category><![CDATA[RabbitMQ]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Wed, 31 Jan 2018 01:01:10 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2018/01/simple-service-relay-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2018/01/simple-service-relay-1.png" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 1"><p>Integrating with external systems is a common requirement in Dynamics 365 Customer Engagement projects, but when the project involves an on-premises instance of Dynamics 365, routing requests from external systems through your firewall can present an additional challenge. Over the course of the next few posts, I will show how you can easily build a simple service relay with <a href="https://www.rabbitmq.com/">RabbitMQ</a> and <a href="https://www.python.org/">Python</a> to handle inbound requests from external data interface consumers.</p>
<p>Here's how my approach works. A consumer writes a request to a cloud-hosted RabbitMQ request queue (either directly or through a proxy service) and starts waiting for a response. On the other end, a Python script monitors the request queue for inbound requests. When it sees a new one, it executes the appropriate request through the Dynamics 365 Web API and writes the response back to a client-specific RabbitMQ response queue. The consumer then picks up the response from the queue. This way the consumer doesn't need to know anything other than how to write the initial request, and no extra inbound firewall ports need to be opened.</p>
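<p>To make this concrete, here is a minimal sketch (standard-library Python only) of the kind of request envelope a consumer might write to the request queue. The field names are hypothetical illustrations of the pattern, not a published contract:</p>

```python
import json
import uuid

def build_request(method, path, body=None):
    """Build a hypothetical request envelope for the relay.

    The correlation id lets the consumer match the eventual response,
    and reply_to names the client-specific response queue.
    """
    correlation_id = str(uuid.uuid4())
    envelope = {
        "correlation_id": correlation_id,
        "reply_to": "responses." + correlation_id,  # client-specific response queue
        "method": method,  # e.g. GET or POST against the Web API
        "path": path,      # e.g. an OData path like "accounts?$select=name"
        "body": body,
    }
    return json.dumps(envelope)

# The consumer would publish this string to the request queue,
# then wait on the queue named in reply_to.
request_json = build_request("GET", "accounts?$select=name")
print(request_json)
```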
<p>This diagram shows an overview of the process. <img src="https://alexanderdevelopment.net/content/images/2018/01/simple-service-relay.png#img-thumbnail" alt="Building a simple service relay for Dynamics 365 CE with RabbitMQ and Python - part 1"></p>
<p>Although my original goal was to accelerate the deployment of data interfaces for on-premises Dynamics 365 CE instances, a simple service relay like this could also be useful for IFD or Dynamics 365 online deployments if you don't want to allow direct access to your organization. Because the queue monitoring process is single-threaded, it's an easy way to throttle requests, but you can run multiple instances of the queue monitor script if you want to increase the number of concurrent requests the relay can process.</p>
<h4 id="whyusethisapproach">Why use this approach?</h4>
<p>There are lots of message brokers and service bus offerings (Azure Service Bus, IBM MQ, Amazon SQS, etc.) you could use to build a service relay. In fact, there's even an Azure offering called <a href="https://docs.microsoft.com/en-us/azure/service-bus-relay/relay-what-is-it">Azure Relay</a> that aims to solve exactly the same problem that my approach does, but not just for Dynamics 365, so &quot;why use this?&quot; is a great question.</p>
<p>First, I think RabbitMQ is just a great tool, and I previously wrote a <a href="https://alexanderdevelopment.net/post/2015/01/27/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-5/">five-part series</a> about using RabbitMQ with Dynamics 365 (back when it was still called Dynamics CRM). Second, using RabbitMQ instead of a cloud-specific service bus offering gives you maximum flexibility in where you host your request and response queues and how you choose to scale. For example, my RabbitMQ broker runs in a <a href="https://www.docker.com">Docker</a> container on a <a href="https://www.digitalocean.com/" target="_blank">Digital Ocean</a> VPS. If I ever decide to move off of Digital Ocean, I can easily switch to any IaaS or VPS provider. I can also configure a RabbitMQ cluster to achieve significantly faster throughput.</p>
<p>As for why I'm using Python instead of C#, which is probably more familiar to most Dynamics 365 developers, Python also makes this approach more flexible. Using Python means I'm not tied to the Dynamics 365 SDK client libraries or a Windows host for running my queue monitoring process, and I can easily package my monitoring process in a Docker image. <em>(Although I highly recommend Python, there are RabbitMQ clients for <a href="https://www.nuget.org/packages/RabbitMQ.Client">.Net</a>, and you can also find RabbitMQ tutorials for other languages including Java, Ruby and JavaScript <a href="https://www.rabbitmq.com/getstarted.html">here</a>.)</em></p>
<h4 id="wrappingup">Wrapping up</h4>
<p>That's it for now. In my next post in this series I will walk through the prerequisites for building the simple service relay.</p>
<p>How have you handled inbound data interfaces for on-premises Dynamics 365 CE organizations? Let us know in the comments!</p>
</div>]]></content:encoded></item><item><title><![CDATA[Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 5]]></title><description><![CDATA[<div class="kg-card-markdown"><p>This is the final post in my five-part series on creating loosely coupled data interfaces for Dynamics CRM using RabbitMQ. In <a href="https://alexanderdevelopment.net/post/2015/01/20/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-3">part 3</a> and <a href="https://alexanderdevelopment.net/post/2015/01/22/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-4">part 4</a> I showed two approaches for building a Dynamics CRM plug-in that publishes notification messages to a RabbitMQ exchange. In today’s post I will show</p></div>]]></description><link>https://alexanderdevelopment.net/post/2015/01/27/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-5/</link><guid isPermaLink="false">5a5837236636a30001b977c7</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[CRM 2015]]></category><category><![CDATA[C#]]></category><category><![CDATA[JSON]]></category><category><![CDATA[Node.js]]></category><category><![CDATA[RabbitMQ]]></category><category><![CDATA[integration]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Tue, 27 Jan 2015 18:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-1.png" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 5"><p>This is the final post in my five-part series on creating loosely coupled data interfaces for Dynamics CRM using RabbitMQ. 
In <a href="https://alexanderdevelopment.net/post/2015/01/20/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-3">part 3</a> and <a href="https://alexanderdevelopment.net/post/2015/01/22/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-4">part 4</a> I showed two approaches for building a Dynamics CRM plug-in that publishes notification messages to a RabbitMQ exchange. In today’s post I will show how to create a Windows console application that reads messages from a queue and writes the data to Dynamics CRM. The code for this application is available on <a target="_blank" href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmMessageQueuing" rel="nofollow">GitHub</a> in the LeadWriterSample project under the LucasCrmMessageQueueTools solution.</p>
<h4 id="theapproach">The approach</h4>
<p>This application is extraordinarily simple. On startup it prompts the user to supply connection information for the RabbitMQ queue that it will monitor as well as a Dynamics CRM connection string. It then monitors the queue for new JSON-formatted messages. When new messages arrive, it attempts to deserialize them into a lightweight &quot;leadtype&quot; object, and then it creates new lead records in CRM. Once a message is successfully processed and a lead is created, the application then sends a confirmation back to RabbitMQ so that the message can be removed from the queue.</p>
<p>The following code shows what happens after a connection to the RabbitMQ is established:<pre><code>//wait for some messages<br>
var consumer = new QueueingBasicConsumer(channel);<br>
channel.BasicConsume(_queue, false, consumer);<br>
 <br>
Console.WriteLine(&quot; [*] Waiting for messages. To exit press CTRL+C&quot;);<br>
 <br>
//instantiate crm org service<br>
using (OrganizationService service = new OrganizationService(_targetConn))<br>
{<br>
   while (true)<br>
   {<br>
     //get the message from the queue<br>
     var ea = (BasicDeliverEventArgs)consumer.Queue.Dequeue();<br>
 <br>
     var body = ea.Body;<br>
     var message = Encoding.UTF8.GetString(body);<br>
 <br>
     try<br>
     {<br>
       //deserialize message json to object<br>
       LeadType lead = JsonConvert.DeserializeObject&lt;LeadType&gt;(message);<br>
 <br>
       try<br>
       {<br>
         //create record in crm<br>
         Entity entity = new Entity(&quot;lead&quot;);<br>
         entity[&quot;firstname&quot;] = lead.FirstName;<br>
         entity[&quot;lastname&quot;] = lead.LastName;<br>
         entity[&quot;subject&quot;] = lead.Topic;<br>
         entity[&quot;companyname&quot;] = lead.Company;<br>
         service.Create(entity);<br>
 <br>
         //write success message to cli<br>
         Console.WriteLine(&quot;Created lead: {0} {1}&quot;, lead.FirstName, lead.LastName);<br>
 <br>
         //IMPORTANT - tell the queue the message was processed successfully so it doesn't get requeued<br>
         channel.BasicAck(ea.DeliveryTag, false);<br>
       }<br>
       catch (FaultException&lt;Microsoft.Xrm.Sdk.OrganizationServiceFault&gt; ex)<br>
       {<br>
         //return error - note no confirmation is sent to the queue, so the message will be requeued<br>
         Console.WriteLine(&quot;Could not create lead: {0} {1}&quot;, lead.FirstName, lead.LastName);<br>
         Console.WriteLine(&quot;Error: {0}&quot;, ex.Message);<br>
       }<br>
     }<br>
     catch(Exception ex)<br>
     {<br>
       //return error - note no confirmation is sent to the queue, so the message will be requeued<br>
       Console.WriteLine(&quot;Could not process message from queue&quot;);<br>
       Console.WriteLine(&quot;Error: {0}&quot;, ex.Message);<br>
     }<br>
   }<br>
}</code></pre></p>
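<p>The same control flow, stripped of the RabbitMQ and CRM specifics, can be condensed into a short runnable Python sketch. The queue and the CRM service here are stand-in stubs (not real clients), but the essential behavior is the same: ack only after a successful create, so failed messages stay on the queue:</p>

```python
import json

def process_messages(messages, create_lead):
    """Mimic the console app's loop: deserialize each message,
    create a lead, and ack only on success so failures are requeued."""
    acked, requeued = [], []
    for raw in messages:
        try:
            lead = json.loads(raw)   # deserialize the message JSON
            create_lead(lead)        # stand-in for service.Create(entity)
            acked.append(raw)        # equivalent of channel.BasicAck(...)
        except Exception:
            requeued.append(raw)     # no ack sent -> broker will requeue
    return acked, requeued

created = []
msgs = ['{"FirstName": "Ada", "LastName": "Lovelace"}', 'not json']
acked, requeued = process_messages(msgs, created.append)
print(len(acked), len(requeued))  # one good message, one bad one
```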
<p>If this were to be used in production, I would have created a Windows service instead of a console application, but I wanted to make it easy to try out different connection parameters.</p>
<h4 id="verifyingtheapplication">Verifying the application</h4>
<p>The queuewriter.js application in the node-app directory in the <a target="_blank" href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmMessageQueuing" rel="nofollow">GitHub repository</a> contains a sample web page that can be used to publish lead data to the CRM-Leads queue. If the application is running, you can access the web page at http://&lt;YOUR_SERVER_NAME&gt;:3000/leadform. When the form’s submit button is clicked, an AJAX call posts a JSON object to the Node.js POST endpoint I showed in my previous post. If the LeadWriterSample console application is running, it will take the message from the queue and you will see a new lead record created in CRM. The screenshots below show each piece working.</p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/5-01-lead-form.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 5"><br>
<em>The lead has been submitted via the web form, and a success message has been received from the Node.js endpoint.</em></p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/5-02-lead-queue.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 5"><br>
<em>The lead has landed in the CRM-Leads queue and is ready to be retrieved.</em></p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/5-03-lead-processed.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 5"><br>
<em>The console application has retrieved and processed the submitted lead message.</em></p>
<p><img src="https://alexanderdevelopment.net/content/images/2015/10/5-04-lead-crm.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 5"><br>
<em>The lead record has been created in CRM.</em></p>
<p>One caveat about the demo lead form is that it has the RabbitMQ credentials embedded in the HTML source, so this code should not be used in production. My approach was originally formulated with the thought that a server-side process would build the JSON message to post to Node.js, so sensitive information would not be exposed. If you decide to use an AJAX post operation like the one shown here, you would want to modify the queuewriter.js application to contain the credentials so they do not need to be passed from the end user’s web browser.</p>
<h4 id="wrappingup">Wrapping up</h4>
<p>That does it for this series, but I’ve just barely explored the capabilities of RabbitMQ. There’s so much more you can do with it than what I’ve shown here, and I hope I’ve piqued your interest about how you can use RabbitMQ or any other message broker in your Dynamics CRM projects. If you have any questions or want to continue the discussion, please share your thoughts in the comments.</p>
<p><em>A version of this post was originally published on the HP Enterprise Services Application Services blog.</em></p>
</div>]]></content:encoded></item><item><title><![CDATA[Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 4]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Welcome back to my five-part series on creating loosely coupled data interfaces for Dynamics CRM using RabbitMQ. In my <a href="https://alexanderdevelopment.net/post/2015/01/20/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-3">last post</a> I showed how to build a Dynamics CRM plug-in that publishes notification messages to a RabbitMQ exchange using the <a target="_blank" href="https://www.rabbitmq.com/dotnet.html" rel="nofollow">official RabbitMQ .Net client library</a>. Unfortunately, that plug-in can’t</p></div>]]></description><link>https://alexanderdevelopment.net/post/2015/01/22/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-4/</link><guid isPermaLink="false">5a5837236636a30001b977bf</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[CRM 2015]]></category><category><![CDATA[C#]]></category><category><![CDATA[JSON]]></category><category><![CDATA[Node.js]]></category><category><![CDATA[integration]]></category><category><![CDATA[RabbitMQ]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Thu, 22 Jan 2015 18:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-2.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-2.png" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 4"><p>Welcome back to my five-part series on creating loosely coupled data interfaces for Dynamics CRM using RabbitMQ. 
In my <a href="https://alexanderdevelopment.net/post/2015/01/20/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-3">last post</a> I showed how to build a Dynamics CRM plug-in that publishes notification messages to a RabbitMQ exchange using the <a target="_blank" href="https://www.rabbitmq.com/dotnet.html" rel="nofollow">official RabbitMQ .Net client library</a>. Unfortunately, that plug-in can’t successfully communicate with a RabbitMQ server if it’s executed inside the Dynamics CRM sandbox, so in today’s post I will show how to achieve the same results with a sandboxed plug-in. The code for this plug-in is available on <a target="_blank" href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmMessageQueuing" rel="nofollow">GitHub</a> in the MessageQueueSandboxPlugin project under the LucasCrmMessageQueueTools solution.</p>
<h4 id="theapproach">The approach</h4>
<p>As I mentioned in my previous post, last month I wrote a series of blog posts about how to create a near real-time streaming API using plug-ins and Node.js. That plug-in worked fine in the Dynamics CRM sandbox, and Node.js can easily publish messages to a RabbitMQ exchange, so today’s plug-in will post a JSON-formatted message to a Node.js application, and then that Node.js application will do the actual publishing to RabbitMQ. As a result, I only need to make a couple of minor modifications to <a href="https://alexanderdevelopment.net/post/2014/12/09/creating-a-near-real-time-streaming-interface-for-dynamics-crm-with-node-js-part-3/">my earlier Node.js message-posting plug-in</a> so that it can pass the RabbitMQ connection parameters to my Node.js application. Additionally, the Node.js application that I described in my earlier series only needs a few changes to publish the message to a RabbitMQ exchange instead of sending it to Socket.IO clients.</p>
<h4 id="theplugin">The plug-in</h4>
<p>The plug-in is registered for an operation (create, update, delete, etc.) with a FetchXML query in its unsecure configuration. When the plug-in step is triggered, its associated FetchXML query is executed, and then the resulting fields are serialized into a JSON object, which is then sent to a Node.js application called queuewriter.js via an HTTP POST request. The JSON object also needs to contain RabbitMQ connection details, so I pass them as part of the plug-in step’s unsecure configuration. Here’s the configuration XML fragment to enable case notifications:</p>
<pre><code>&lt;nodeendpoint&gt;http://lucas-ajax.cloudapp.net:3000/rabbit_post_endpoint&lt;/nodeendpoint&gt;
&lt;endpoint&gt;lucas-ajax.cloudapp.net&lt;/endpoint&gt;
&lt;exchange&gt;CRM&lt;/exchange&gt;
&lt;routingkey&gt;Case&lt;/routingkey&gt;
&lt;user&gt;rabbituser&lt;/user&gt;
&lt;password&gt;PASSWORDHERE&lt;/password&gt;
&lt;query&gt;&lt;![CDATA[
&lt;fetch mapping='logical'&gt;
&lt;entity name='incident'&gt;
&nbsp;&lt;attribute name='ownerid'/&gt;
&nbsp;&lt;attribute name='modifiedby'/&gt;
&nbsp;&lt;attribute name='createdby'/&gt;
&nbsp;&lt;attribute name='title'/&gt;
&nbsp;&lt;attribute name='incidentid'/&gt;
&nbsp;&lt;attribute name='ticketnumber'/&gt;
&nbsp;&lt;attribute name='createdon'/&gt;
&nbsp;&lt;attribute name='modifiedon'/&gt;
&nbsp;&lt;filter type='and'&gt;
&nbsp; &lt;condition attribute='incidentid' operator='eq' value='{0}' /&gt;
&nbsp;&lt;/filter&gt;
&lt;/entity&gt;
&lt;/fetch&gt;
]]&gt;
&lt;/query&gt;
&lt;/config&gt;</code></pre>
<p>Just like in my <a href="https://alexanderdevelopment.net/post/2014/12/09/creating-a-near-real-time-streaming-interface-for-dynamics-crm-with-node-js-part-3/">earlier Node.js plug-in</a>, the FetchXML is extracted from the configuration XML, and the query is executed against Dynamics CRM. The results are then serialized to JSON using <a target="_blank" href="http://james.newtonking.com/json" rel="nofollow">Json.NET</a> just like before, except the serialized CRM data is included as a &quot;message&quot; object that is part of a parent JSON object that includes the RabbitMQ connection parameters. Here’s an example of the structure:<pre><code>{<br>
   &quot;endpoint&quot;:&quot;lucas-ajax.cloudapp.net&quot;,<br>
   &quot;username&quot;:&quot;rabbituser&quot;,<br>
   &quot;password&quot;:&quot;XXXXXXXX&quot;,<br>
   &quot;exchange&quot;:&quot;CRM&quot;,<br>
   &quot;routingkey&quot;:&quot;Lead&quot;,<br>
   &quot;message&quot;:{<br>
     &quot;property1&quot;:&quot;value 1&quot;,<br>
     &quot;property2&quot;:&quot;value 2&quot;,<br>
     &quot;property3&quot;:&quot;value 3&quot;<br>
   }<br>
}</code></pre></p>
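<p>As an illustration of what the proxy does with this structure, here is how the envelope from the example above can be unpacked with standard-library Python. The field names match the JSON shown; this is a sketch of the idea, not the actual queuewriter.js logic:</p>

```python
import json

envelope_json = """
{
   "endpoint": "lucas-ajax.cloudapp.net",
   "username": "rabbituser",
   "password": "XXXXXXXX",
   "exchange": "CRM",
   "routingkey": "Lead",
   "message": {"property1": "value 1"}
}
"""

envelope = json.loads(envelope_json)

# Connection parameters are peeled off the outer object...
connection_params = {k: envelope[k] for k in ("endpoint", "username", "password")}

# ...and only the inner "message" object is what actually gets published.
payload = json.dumps(envelope["message"])
print(envelope["exchange"], envelope["routingkey"], payload)
```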
<p>Because this plug-in uses the Json.NET client library, it has to be merged with the plug-in assembly before registering it in Dynamics CRM. I’ve included a batch script called ilmerge.bat in the project directory on <a target="_blank" href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmMessageQueuing" rel="nofollow">GitHub</a>.</p>
<h4 id="thenodejsapplication">The Node.js application</h4>
<p>The Node.js application (queuewriter.js) waits to receive JSON messages via HTTP POST from a client. When it receives a POST request, it checks whether the message is valid JSON. If it is, the RabbitMQ connection parameters are extracted and then the notification &quot;message&quot; object is published to the RabbitMQ exchange. If everything is successful, it sends &quot;success&quot; back as a response to the client. If any errors are encountered, it sends back a descriptive error message. I am using the <a target="_blank" href="https://github.com/postwait/node-amqp" rel="nofollow">node-amqp</a> library for communicating with the RabbitMQ server, but the behavior isn’t that different from a .Net client. Here’s an extract with the relevant code:<pre><code>if (request.method == 'POST') {<br>
   request.on('data', function(chunk) {<br>
     //check if received data is valid json<br>
     if(IsJsonString(chunk.toString())){<br>
       //convert message to json object<br>
       var requestobject = JSON.parse(chunk.toString());<br>
      <br>
       //connect to rabbitmq<br>
       var connection = amqp.createConnection({ host: requestobject.endpoint<br>
       , port: 5672 //assumes default port<br>
       , login: requestobject.username<br>
       , password: requestobject.password<br>
       , connectionTimeout: 0<br>
       , authMechanism: 'AMQPLAIN'<br>
       , vhost: '/' //assumes default vhost<br>
       });<br>
      <br>
       //when connection is ready<br>
       connection.on('ready', function () {<br>
          //get the &quot;message&quot; property of the supplied request<br>
          var message = JSON.stringify(requestobject.message);<br>
         <br>
          //post it to the exchange with the supplied routing key<br>
           connection.exchange(requestobject.exchange, {passive: true, confirm: true }, function(exchange) {<br>
            exchange.publish(requestobject.routingkey, message, {mandatory: true, deliveryMode: 2}, function () {<br>
              //if successful, write message to console<br>
              console.log('Message published: ' + message);<br>
             <br>
              //send &quot;success&quot; back in response<br>
              response.write('success');<br>
             <br>
              //close the rabbitmq connection and end the response<br>
              connection.end();<br>
              response.end();<br>
            });<br>
          });<br>
       });<br>
      <br>
       //if an error occurs with rabbitmq<br>
       connection.on('error', function () {<br>
          //send error message back in response and end it<br>
          response.write('failure writing message to exchange');<br>
          response.end();<br>
       });<br>
     }<br>
     else {<br>
       //if request contains invalid json<br>
       //send error message back in response and end it<br>
       response.write(&quot;invalid JSON&quot;);<br>
       response.end();<br>
     }<br>
   });<br>
}</code></pre></p>
<p>The complete queuewriter.js application is contained in the node-app directory in the <a target="_blank" href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmMessageQueuing" rel="nofollow">GitHub repository</a>.</p>
<h4 id="wrappingup">Wrapping up</h4>
<p>In addition to registering the plug-in and registering a step to publish a notification message to RabbitMQ, you need to deploy and start the queuewriter.js application to publish messages. Once that’s done, you can verify everything is working as expected either by looking at the Queues tab in the RabbitMQ management web UI or running the CliConsumer sample application I showed in <a href="https://alexanderdevelopment.net/post/2015/01/14/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-2">part 2</a>.</p>
<p>Obviously using queuewriter.js as a message proxy adds an extra layer of complexity, and you have to make sure that the application is up and running in order to process messages, but it also offers a couple of advantages. First, by using queuewriter.js instead of a direct connection, you can easily use this same plug-in with different message brokers like Apache ActiveMQ and Microsoft’s Azure Service Bus. Second, the queuewriter.js application isn’t limited to just handling messages outbound from Dynamics CRM. You can also use it to process inbound messages without any changes. You just have to configure a client application to read messages from the queue and process them accordingly. A good example of this would be writing data submitted through a web form to Dynamics CRM via a RabbitMQ queue, and I will show that exact scenario in my next post!</p>
<p><em>A version of this post was originally published on the HP Enterprise Services Application Services blog.</em></p>
</div>]]></content:encoded></item><item><title><![CDATA[Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 3]]></title><description><![CDATA[<div class="kg-card-markdown"><p>This is the third post of a five-part series on creating loosely coupled data interfaces for Dynamics CRM using RabbitMQ.<br>
<a href="https://alexanderdevelopment.net/post/2015/01/14/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-2">Last time</a> I showed how to install and configure a RabbitMQ server to support passing messages to and from Dynamics CRM. Today I will show how to build a Dynamics</p></div>]]></description><link>https://alexanderdevelopment.net/post/2015/01/20/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-3/</link><guid isPermaLink="false">5a5837236636a30001b977b7</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[JSON]]></category><category><![CDATA[C#]]></category><category><![CDATA[Node.js]]></category><category><![CDATA[RabbitMQ]]></category><category><![CDATA[CRM 2015]]></category><category><![CDATA[integration]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Tue, 20 Jan 2015 18:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-3.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-3.png" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 3"><p>This is the third post of a five-part series on creating loosely coupled data interfaces for Dynamics CRM using RabbitMQ.<br>
<a href="https://alexanderdevelopment.net/post/2015/01/14/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-2">Last time</a> I showed how to install and configure a RabbitMQ server to support passing messages to and from Dynamics CRM. Today I will show how to build a Dynamics CRM plug-in that publishes notification messages to a RabbitMQ exchange using the <a target="_blank" href="https://www.rabbitmq.com/dotnet.html" rel="nofollow">official RabbitMQ .Net client library</a>. The code for this plug-in is available on <a target="_blank" href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmMessageQueuing" rel="nofollow">GitHub</a> in the MessageQueuePlugin project under the LucasCrmMessageQueueTools solution.</p>
<p>Before going any further, let’s get some bad news out of the way. Plug-ins that execute in the Dynamics CRM sandbox cannot use the RabbitMQ .Net client library to publish messages to a RabbitMQ server, so you can’t use today’s plug-in approach from a CRM Online organization. In my next post, I will be showing an alternate mechanism for publishing messages that you can use from a sandboxed plug-in, but today I want to focus on the most direct integration method. Now that we’re clear on the limitations of this approach, let’s get started!</p>
<h4 id="theapproach">The approach</h4>
<p>Last month I wrote a series of blog posts about how to create a near real-time streaming API using plug-ins and Node.js. For this plug-in I’m going to basically copy the logic I used for the plug-in in that series.</p>
<p><a href="https://alexanderdevelopment.net/post/2014/12/09/creating-a-near-real-time-streaming-interface-for-dynamics-crm-with-node-js-part-3/">This post</a> outlines the approach in detail, but if you don’t want to read the entire thing, the basic idea was to create a plug-in that is registered for an operation (create, update, delete, etc.) with a FetchXML query in its unsecure configuration. When the plug-in step is triggered, its associated FetchXML query is executed, and then the resulting fields are serialized into a JSON object, which is then sent to the Node.js application via an HTTP POST request. Today’s plug-in operates in the exact same way, except instead of sending the JSON object to a Node.js endpoint, the JSON object will be published as a message to a RabbitMQ exchange.</p>
<h4 id="configuringtheplugin">Configuring the plug-in</h4>
<p>To make the plug-in easily useable in any organization without needing to be recompiled, all the RabbitMQ connection parameters are stored in the unsecure configuration along with the FetchXML query for the data to retrieve. Here’s the configuration XML fragment to enable case notifications:</p>
<pre><code>&lt;config&gt;
&lt;endpoint&gt;lucas-ajax.cloudapp.net&lt;/endpoint&gt;
&lt;exchange&gt;CRM&lt;/exchange&gt;
&lt;routingkey&gt;Case&lt;/routingkey&gt;
&lt;user&gt;rabbituser&lt;/user&gt;
&lt;password&gt;PASSWORDHERE&lt;/password&gt;
&lt;query&gt;&lt;![CDATA[
&lt;fetch mapping='logical'&gt;
&lt;entity name='incident'&gt;
&nbsp;&lt;attribute name='ownerid'/&gt;
&nbsp;&lt;attribute name='modifiedby'/&gt;
&nbsp;&lt;attribute name='createdby'/&gt;
&nbsp;&lt;attribute name='title'/&gt;
&nbsp;&lt;attribute name='incidentid'/&gt;
&nbsp;&lt;attribute name='ticketnumber'/&gt;
&nbsp;&lt;attribute name='createdon'/&gt;
&nbsp;&lt;attribute name='modifiedon'/&gt;
&nbsp;&lt;filter type='and'&gt;
&nbsp; &lt;condition attribute='incidentid' operator='eq' value='{0}' /&gt;
&nbsp;&lt;/filter&gt;
&lt;/entity&gt;
&lt;/fetch&gt;
]]&gt;
&lt;/query&gt;
&lt;/config&gt;</code></pre>
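<p>The structure of this configuration is simple enough to illustrate with a few lines of standard-library Python. The C# plug-in does the equivalent parsing with .Net's XML classes; this is just a sketch of how the settings are laid out, with the FetchXML query element omitted for brevity (in the real config it sits in a CDATA section):</p>

```python
import xml.etree.ElementTree as ET

# A trimmed version of the unsecure configuration shown above.
config_xml = (
    "<config>"
    "<endpoint>lucas-ajax.cloudapp.net</endpoint>"
    "<exchange>CRM</exchange>"
    "<routingkey>Case</routingkey>"
    "<user>rabbituser</user>"
    "<password>PASSWORDHERE</password>"
    "</config>"
)

# Each child element of <config> becomes one named setting.
root = ET.fromstring(config_xml)
settings = {child.tag: child.text for child in root}
print(settings["endpoint"], settings["exchange"], settings["routingkey"])
```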
<h4 id="generatingthenotificationmessage">Generating the notification message</h4>
<p>Just like in my Node.js plug-in, the FetchXML is extracted from the configuration XML, and the query is executed against Dynamics CRM. The results are then serialized to JSON using <a target="_blank" href="http://james.newtonking.com/json" rel="nofollow">Json.NET</a>.</p>
<h4 id="publishingthemessage">Publishing the message</h4>
<p>The endpoint, exchange name, RabbitMQ user, RabbitMQ password and routing key values from the configuration XML are then used to establish a connection to RabbitMQ and publish the notification message to the exchange like so:</p>
<pre><code>try
{
    //connect to rabbitmq
    var factory = new ConnectionFactory();
    factory.UserName = _brokerUser;
    factory.Password = _brokerPassword;
    factory.VirtualHost = "/";
    factory.Protocol = Protocols.DefaultProtocol;
    factory.HostName = _brokerEndpoint;
    factory.Port = AmqpTcpEndpoint.UseDefaultPort;
    using (var connection = factory.CreateConnection())
    {
        using (var channel = connection.CreateModel())
        {
            //tell rabbitmq to send confirmation when messages are successfully published
            channel.ConfirmSelect();

            //prepare message to write to queue
            var body = Encoding.UTF8.GetBytes(jsonMsg);

            var properties = channel.CreateBasicProperties();
            properties.SetPersistent(true);

            //publish the message to the exchange with the supplied routing key
            channel.BasicPublish(_exchange, _routingKey, properties, body);

            //block until the broker confirms the publish (throws on failure)
            channel.WaitForConfirmsOrDie();
        }
    }
}
catch (Exception e)
{
    tracingService.Trace("Exception: {0}", e.ToString());
    throw;
}</code></pre>
<p>If any errors are encountered, the message is captured via the tracing service, and then an exception is thrown.</p>
<p>Because this plug-in uses both the RabbitMQ .NET and Json.NET client libraries, they have to be merged with the plug-in assembly before registering it in Dynamics CRM. I’ve included a batch script called ilmerge.bat in the project directory on <a target="_blank" href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmMessageQueuing" rel="nofollow">GitHub</a>.</p>
<h4 id="wrappingup">Wrapping up</h4>
<p>After you register the plug-in and add a step to publish a notification message to RabbitMQ, you can verify everything is working as expected either by looking at the Queues tab in the RabbitMQ management web UI or by running the CliConsumer sample application I showed in <a href="https://alexanderdevelopment.net/post/2015/01/14/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-2">part 2</a>.</p>
<p><em>A version of this post was originally published on the HP Enterprise Services Application Services blog.</em></p>
</div>]]></content:encoded></item><item><title><![CDATA[Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Welcome back to this five-part series on creating loosely coupled data interfaces for Dynamics CRM using RabbitMQ. In my <a href="https://alexanderdevelopment.net/post/2015/01/12/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-1">last post</a> I discussed why you would want to incorporate a message broker into your Dynamics CRM data interfaces, and today I will show how to install and configure RabbitMQ to</p></div>]]></description><link>https://alexanderdevelopment.net/post/2015/01/14/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-2/</link><guid isPermaLink="false">5a5837236636a30001b977af</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[CRM 2015]]></category><category><![CDATA[RabbitMQ]]></category><category><![CDATA[Node.js]]></category><category><![CDATA[C#]]></category><category><![CDATA[integration]]></category><category><![CDATA[JSON]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Wed, 14 Jan 2015 18:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-4.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-4.png" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2"><p>Welcome back to this five-part series on creating loosely coupled data interfaces for Dynamics CRM using RabbitMQ. 
In my <a href="https://alexanderdevelopment.net/post/2015/01/12/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-1">last post</a> I discussed why you would want to incorporate a message broker into your Dynamics CRM data interfaces, and today I will show how to install and configure RabbitMQ to support the examples I’ll be presenting in the rest of the series.</p>
<h4 id="installation">Installation</h4>
<p>First, you’ll need to download the installation files from here: <a target="_blank" href="http://www.rabbitmq.com/download.html" rel="nofollow">http://www.rabbitmq.com/download.html</a>. The RabbitMQ server runs on Windows, Linux, UNIX and Mac OS X, and there are installation guides for each supported platform. Because RabbitMQ is written in Erlang, you will need to install an Erlang VM before you can install RabbitMQ, but there is a download link provided in the installation guide. I set up my RabbitMQ server on a Windows Server 2012 machine, and I was up and running in less than 10 minutes.</p>
<p>Once you’ve installed RabbitMQ and started the server, the easiest way to manage it is via the <a target="_blank" href="http://www.rabbitmq.com/management.html" rel="nofollow">web-based management interface</a> that’s included with the server distribution. You can enable the management interface with the <a target="_blank" href="https://www.rabbitmq.com/man/rabbitmq-plugins.1.man.html" rel="nofollow">rabbitmq-plugins tool</a>. Run the following command to enable it: <em>rabbitmq-plugins enable rabbitmq_management</em>.</p>
<p>After the management plugin is enabled, you can access the web management UI from your server at <a href="http://localhost:15672">http://localhost:15672</a>. The default username is &quot;guest&quot; with &quot;guest&quot; as the password.</p>
<p>You’ll also need to configure any firewall rules necessary to allow access to your RabbitMQ server if it’s running on a server separate from your Dynamics CRM server. The default port is 5672, but that can be changed if you like. <a target="_blank" href="https://www.rabbitmq.com/configure.html" rel="nofollow">This page</a> discusses RabbitMQ configuration in great detail.</p>
<h4 id="settingupusersqueuesandexchanges">Setting up users, queues and exchanges</h4>
<p>The first thing you should do after the installation is complete is to change the default guest user&#8217;s password via the management UI. Then you can add additional users as necessary. For the examples in the rest of this series, you’ll need a user with full permissions on the default &quot;/&quot; virtual host. Here is what my &quot;rabbituser&quot; user account looks like:<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/2-00-user.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2"></p>
<p>Next you need to create the entities required to broker the messages between publishers and consumers. Before continuing, I recommend you take a moment to skim this <a target="_blank" href="https://www.rabbitmq.com/tutorials/amqp-concepts.html" rel="nofollow">Advanced Message Queuing Protocol (AMQP) overview document</a>. If nothing else, at least read through the &quot;hello, world&quot; example section because it’s a great introduction to concepts that will be important throughout the rest of this series.</p>
<p><u>Queues</u><br>
In the management UI, navigate to the Queues tab, and create two new durable queues named CRM-Cases and CRM-Leads. (You can create any queues you want, but my examples in the rest of this series use queues with those names.) The screenshot below shows the queues in my system.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/2-01-queues.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2"></p>
<p><u>Exchanges</u><br>
After your queues are created, you can create an exchange and bindings to your queues so messages get routed correctly. Navigate to the Exchanges tab and create a new, durable exchange named CRM. After your CRM exchange is created, you should see something like the screenshot below.</p>
<p>Next, click on the name of the CRM exchange to open its edit screen. Scroll to the &quot;add binding&quot; section toward the bottom of the page, add a binding to the CRM-Cases queue with a routing key value of &quot;Case&quot; as shown in the following picture and click &quot;bind.&quot;<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/2-02-exchanges-1.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2"></p>
<p>Do the same for the CRM-Leads queue with &quot;Lead&quot; as the routing key. You should then see the two queues bound to the exchange.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/2-02-exchanges-2.PNG" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2"></p>
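<p>To make the routing semantics concrete, here is a tiny in-memory model of what the bindings above set up, assuming the CRM exchange routes on exact routing-key matches (as a direct exchange does). This is not RabbitMQ client code, just an illustration of the behaviour.</p>

```python
class DirectExchange:
    """Toy model of direct-exchange routing: a binding maps a routing key to a
    queue, and a published message is copied to every queue bound with a
    matching key."""
    def __init__(self):
        self.bindings = {}   # routing key -> list of queue names
        self.queues = {}     # queue name -> list of messages

    def bind(self, queue, routing_key):
        self.queues.setdefault(queue, [])
        self.bindings.setdefault(routing_key, []).append(queue)

    def publish(self, routing_key, message):
        # messages with an unbound routing key are simply dropped
        for queue in self.bindings.get(routing_key, []):
            self.queues[queue].append(message)

crm = DirectExchange()
crm.bind("CRM-Cases", "Case")
crm.bind("CRM-Leads", "Lead")
crm.publish("Case", '{"ticketnumber": "CAS-00001"}')
print(crm.queues["CRM-Cases"])  # the Case message lands here
print(crm.queues["CRM-Leads"])  # [] - no Lead message was published
```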
<h4 id="checkingtheconfiguration">Checking the configuration</h4>
<p>At this point you should have everything in place to start publishing and consuming messages. You can verify your configuration works with the CliProvider and CliConsumer sample applications included in my <a target="_blank" href="https://github.com/lucasalexander/Crm-Sample-Code/tree/master/CrmMessageQueuing" rel="nofollow">GitHub repository</a> as part of the LucasCrmMessageQueueTools solution.</p>
<p>First, build and run the CliProvider application. You will be prompted to supply basic connection details, and then you can type a message to publish to your RabbitMQ server.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/2-03a-cliprovider.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2"></p>
<p>Once the message has been published, you can verify there’s a message waiting in the correct queue on the Queues tab of the management UI.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/2-03b-message-ready.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2"></p>
<p>Next, build and run the CliConsumer application. Once it connects to the CRM-Cases queue, the message will be retrieved and displayed.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/2-03c-cliconsumer.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2"></p>
<p>When the CliConsumer application processes a message, it sends an acknowledgement back to the queue, which triggers removal of the message from the queue. You can check the Queues tab in the management UI to verify that the CRM-Cases queue is empty.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/2-03d-no-message-ready.PNG#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 2"></p>
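<p>This confirm-and-remove behaviour can be modelled in a few lines. Again, this is an illustration of the at-least-once delivery idea, not RabbitMQ client code: a delivered message sits in an unacknowledged state until the consumer confirms it, and a negative acknowledgement puts it back in the queue for redelivery.</p>

```python
class AckQueue:
    """Toy model of acknowledgement-based delivery: a message is only
    removed once the consumer confirms it was processed."""
    def __init__(self):
        self.ready = []     # messages waiting for a consumer
        self.unacked = {}   # delivery tag -> message awaiting confirmation
        self._tag = 0

    def publish(self, message):
        self.ready.append(message)

    def get(self):
        self._tag += 1
        self.unacked[self._tag] = self.ready.pop(0)
        return self._tag, self.unacked[self._tag]

    def ack(self, tag):
        del self.unacked[tag]                        # confirmed - drop it

    def nack(self, tag):
        self.ready.insert(0, self.unacked.pop(tag))  # requeue for redelivery

q = AckQueue()
q.publish("case message")
tag, msg = q.get()
q.ack(tag)
print(len(q.ready), len(q.unacked))  # 0 0 - the queue is empty again
```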
<h4 id="wrappingup">Wrapping up</h4>
<p>That’s it for today. Your RabbitMQ server is now fully configured and ready for use with the examples in the rest of this series. Next time I will show how to send messages to a RabbitMQ exchange from a plug-in using the RabbitMQ .NET client library. See you then!</p>
<p><em>A version of this post was originally published on the HP Enterprise Services Application Services blog.</em></p>
</div>]]></content:encoded></item><item><title><![CDATA[Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 1]]></title><description><![CDATA[<div class="kg-card-markdown"><p>One of the things I love about Dynamics CRM is how easy it is to create data interfaces to enable integration with other systems. If you’ve worked with Dynamics CRM for any length of time, you’ve probably seen multiple web service integrations that enable interoperability with other line-of-business</p></div>]]></description><link>https://alexanderdevelopment.net/post/2015/01/12/using-rabbitmq-as-a-message-broker-in-dynamics-crm-data-interfaces-part-1/</link><guid isPermaLink="false">5a5837236636a30001b977a7</guid><category><![CDATA[Microsoft Dynamics CRM]]></category><category><![CDATA[CRM 2015]]></category><category><![CDATA[C#]]></category><category><![CDATA[JSON]]></category><category><![CDATA[Node.js]]></category><category><![CDATA[RabbitMQ]]></category><category><![CDATA[integration]]></category><dc:creator><![CDATA[Lucas Alexander]]></dc:creator><pubDate>Mon, 12 Jan 2015 18:00:00 GMT</pubDate><media:content url="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-5.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker-5.png" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 1"><p>One of the things I love about Dynamics CRM is how easy it is to create data interfaces to enable integration with other systems. If you’ve worked with Dynamics CRM for any length of time, you’ve probably seen multiple web service integrations that enable interoperability with other line-of-business and legacy systems. A typical pair of inbound and outbound integrations might look like the picture below.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound.png#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 1"></p>
<p>Using a tightly coupled connection between the source and target systems is usually the easiest (thus the quickest and cheapest) way to establish an integration, but this is often a bad idea. Consider the inbound scenario in which an external application is sending data to Dynamics CRM. What happens if the calling application misbehaves and starts sending thousands of requests per second? This has the potential to overload your CRM server and make it completely unusable. Now consider the outbound scenario in which a CRM plug-in calls an external web service. If the destination application’s web service is offline for a few minutes, the update from the CRM plug-in will not be received unless there’s some sort of error handling and retry logic built into the plug-in.</p>
<h4 id="analternateapproach">An alternate approach</h4>
<p>For these reasons, and lots of others (logging, security, scalability, just to name a few), it’s considered a best practice to create loosely coupled integrations that rely on a message broker that sits between the source and destination systems. Though the formal definition is more complicated, for our purposes a message broker can be thought of as a collection of queues that hold messages. Publishers write messages to queues, and then consumers pick up the messages and process them appropriately. Additionally, the message broker can be configured to keep messages in their queues until the consumers provide confirmation of successful processing.</p>
<p>Here’s an example of what the integrations I showed earlier would look like with a message broker.<br>
<img src="https://alexanderdevelopment.net/content/images/2015/10/inbound-outbound-broker.png#img-thumbnail" alt="Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 1"></p>
<p>For the outbound call from the CRM plug-in, the plug-in writes the message to a broker. The message is routed to a queue where it waits to be processed. A separate processing service application retrieves the message from the queue and sends it to the destination application. For the inbound call to CRM, the process works exactly the same, except the source and destination applications are reversed.</p>
<h4 id="whyisamessagebrokerbetter">Why is a message broker better?</h4>
<p>In the inbound call scenario, an effective message broker would typically be expected to handle a larger volume of inbound messages than Dynamics CRM because all it’s doing is receiving and routing the data without any additional processing. The processing service can then process the messages in the queue at a speed that doesn’t overload the Dynamics CRM server. In the case of the outbound call, the combination of a message broker and processing service can enable complex retry logic and custom logging without having to build that logic into the plug-in layer. As an added bonus to either scenario, a message broker can provide a guarantee that messages don’t get lost between the source and destination systems as long as the message is successfully published to the broker.</p>
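<p>As a sketch of the kind of retry logic a processing service can own instead of the plug-in, here is a minimal exponential-backoff loop in Python. The handler, message, and delays are all hypothetical; a real service would more likely nack the message or republish it to a delay queue rather than retry in-process.</p>

```python
import time

def process_with_retry(message, handler, max_attempts=3, base_delay=0.01):
    """Deliver a message to a handler, retrying with exponential backoff
    and re-raising only after the final attempt fails."""
    for attempt in range(1, max_attempts + 1):
        try:
            return handler(message)
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# A hypothetical destination that is "offline" for the first two calls.
calls = []
def flaky_endpoint(msg):
    calls.append(msg)
    if len(calls) < 3:
        raise ConnectionError("destination web service offline")
    return "delivered"

result = process_with_retry("case update", flaky_endpoint)
print(result)  # delivered
```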
<h4 id="wheredowegofromhere">Where do we go from here?</h4>
<p>Over the course of my next four blog posts, I will show how to use <a target="_blank" href="https://www.rabbitmq.com/" rel="nofollow">RabbitMQ</a> as a message broker in your Dynamics CRM data interfaces. I chose RabbitMQ for this series for several reasons:</p><ol><li>It’s open source.</li><li>It runs on multiple platforms.</li><li>It’s easy to install and configure.</li><li>It’s fast at processing messages.</li></ol><p></p>
<p>If you already have a different message broker in place in your organization or you would like to try a different message broker like Apache ActiveMQ or Microsoft’s Azure Service Bus, most of the approaches and a lot of the code I’m going to show in this series will still be applicable, with the notable exception of the post that discusses how to install and configure RabbitMQ.</p>
<p>Here’s the roadmap for the rest of the series:</p><ul><li>Part 2 – basic installation and configuration of a RabbitMQ server</li><li>Part 3 – creating a Dynamics CRM plug-in that publishes messages using the RabbitMQ .NET client library</li><li>Part 4 – creating a sandboxed Dynamics CRM plug-in that publishes messages to RabbitMQ via Node.js</li><li>Part 5 – reading messages from a queue and writing them to Dynamics CRM</li></ul><p></p>
<p>If you just can’t wait to dig into the code, I’ve already posted everything to my <a target="_blank" href="https://github.com/lucasalexander/Crm-Sample-Code#crmmessagequeuing" rel="nofollow">repository on GitHub</a>, so you can go ahead and take a look.</p>
<p>See you next time!</p>
<p><em>A version of this post was originally published on the HP Enterprise Services Application Services blog.</em></p>
</div>]]></content:encoded></item></channel></rss>