Using RabbitMQ as a message broker in Dynamics CRM data interfaces – part 4

Welcome back to my five-part series on creating loosely coupled data interfaces for Dynamics CRM using RabbitMQ. In my last post I showed how to build a Dynamics CRM plug-in that publishes notification messages to a RabbitMQ exchange using the official RabbitMQ .Net client library. Unfortunately, that plug-in can’t successfully communicate with a RabbitMQ server if it’s executed inside the Dynamics CRM sandbox, so in today’s post I will show how to achieve the same results with a sandboxed plug-in. The code for this plug-in is available on GitHub in the MessageQueueSandboxPlugin project under the LucasCrmMessageQueueTools solution.

The approach

As I mentioned in my previous post, last month I wrote a series of blog posts about creating a near real-time streaming API using plug-ins and Node.js. The plug-in from that series works fine in the Dynamics CRM sandbox, and Node.js can easily publish messages to a RabbitMQ exchange, so today's plug-in will post a JSON-formatted message to a Node.js application, and that Node.js application will do the actual publishing to RabbitMQ. As a result, I only need a couple of minor modifications to my earlier Node.js message-posting plug-in so that it can pass the RabbitMQ connection parameters along to the Node.js application, and the Node.js application from my earlier series only needs a few changes to publish the message to a RabbitMQ exchange instead of sending it to Socket.IO clients.

The plug-in

The plug-in is registered for an operation (create, update, delete, etc.) with a FetchXML query in its unsecure configuration. When the plug-in step is triggered, the associated FetchXML query is executed, the resulting fields are serialized into a JSON object, and that object is sent to a Node.js application called queuewriter.js via an HTTP POST request. The JSON object also needs to contain the RabbitMQ connection details, so I pass those as part of the plug-in step's unsecure configuration as well. Here's the configuration XML to enable case notifications:

<config>
<nodeendpoint>http://lucas-ajax.cloudapp.net:3000/rabbit_post_endpoint</nodeendpoint>
<endpoint>lucas-ajax.cloudapp.net</endpoint>
<exchange>CRM</exchange>
<routingkey>Case</routingkey>
<user>rabbituser</user>
<password>PASSWORDHERE</password>
<query><![CDATA[
<fetch mapping='logical'>
<entity name='incident'>
 <attribute name='ownerid'/>
 <attribute name='modifiedby'/>
 <attribute name='createdby'/>
 <attribute name='title'/>
 <attribute name='incidentid'/>
 <attribute name='ticketnumber'/>
 <attribute name='createdon'/>
 <attribute name='modifiedon'/>
 <filter type='and'>
  <condition attribute='incidentid' operator='eq' value='{0}' />
 </filter>
</entity>
</fetch>
]]>
</query>
</config>

Just like in my earlier Node.js plug-in, the FetchXML is extracted from the configuration XML and the query is executed against Dynamics CRM. The results are again serialized to JSON using Json.NET, except this time the serialized CRM data is included as a "message" object inside a parent JSON object that also carries the RabbitMQ connection parameters. Here's an example of the structure:

{
   "endpoint":"lucas-ajax.cloudapp.net",
   "username":"rabbituser",
   "password":"XXXXXXXX",
   "exchange":"CRM",
   "routingkey":"Lead",
   "message":{
     "property1":"value 1",
     "property2":"value 2",
     "property3":"value 3"
   }
}
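
Incidentally, nothing in this structure is specific to Dynamics CRM, so you can exercise queuewriter.js by hand before wiring up the plug-in. Here's a minimal sketch of a throwaway test client using Node's built-in http module (it isn't part of the GitHub project; the host, port, path, and payload values simply mirror the example configuration and sample payload above):

//test-post.js - posts a sample notification payload to queuewriter.js
var http = require('http');

var payload = JSON.stringify({
  endpoint: 'lucas-ajax.cloudapp.net',
  username: 'rabbituser',
  password: 'XXXXXXXX',
  exchange: 'CRM',
  routingkey: 'Lead',
  message: { property1: 'value 1', property2: 'value 2', property3: 'value 3' }
});

var options = {
  host: 'lucas-ajax.cloudapp.net',
  port: 3000,
  path: '/rabbit_post_endpoint',
  method: 'POST',
  headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(payload) }
};

var req = http.request(options, function (res) {
  res.setEncoding('utf8');
  //log whatever queuewriter.js sends back
  res.on('data', function (body) {
    console.log('Response: ' + body);
  });
});

req.write(payload);
req.end();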

Because this plug-in uses the Json.NET library, that assembly has to be ILMerged into the plug-in assembly before the plug-in is registered in Dynamics CRM. I've included a batch script called ilmerge.bat in the project directory on GitHub.

The Node.js application

The Node.js application (queuewriter.js) listens for JSON messages sent by clients via HTTP POST. When it receives a POST request, it checks whether the body is valid JSON. If it is, the RabbitMQ connection parameters are extracted and the notification "message" object is published to the RabbitMQ exchange. If everything succeeds, it sends "success" back to the client; if any errors are encountered, it sends back a descriptive error message instead. I am using the node-amqp library to communicate with the RabbitMQ server, but the behavior isn't that different from the .Net client. Here's an extract with the relevant code:

if (request.method == 'POST') {
   //accumulate the request body (it can arrive in more than one 'data' chunk)
   var body = '';
   request.on('data', function(chunk) {
     body += chunk;
   });

   request.on('end', function() {
     //check if received data is valid json
     if(IsJsonString(body)){
       //convert message to json object
       var requestobject = JSON.parse(body);
      
       //connect to rabbitmq
       var connection = amqp.createConnection({ host: requestobject.endpoint
       , port: 5672 //assumes default port
       , login: requestobject.username
       , password: requestobject.password
       , connectionTimeout: 0
       , authMechanism: 'AMQPLAIN'
       , vhost: '/' //assumes default vhost
       });
      
       //when connection is ready
       connection.on('ready', function () {
          //get the "message" property of the supplied request
          var message = JSON.stringify(requestobject.message);
         
          //post it to the exchange with the supplied routing key
           connection.exchange(requestobject.exchange, { passive: true, confirm: true }, function(exchange) {
            exchange.publish(requestobject.routingkey, message, {mandatory: true, deliveryMode: 2}, function () {
              //if successful, write message to console
              console.log('Message published: ' + message);
             
              //send "success" back in response
              response.write('success');
             
              //close the rabbitmq connection and end the response
              connection.end();
              response.end();
            });
          });
       });
      
       //if an error occurs with rabbitmq
       connection.on('error', function () {
          //send error message back in response and end it
          response.write('failure writing message to exchange');
          response.end();
       });
     }
     else {
       //if request contains invalid json
       //send error message back in response and end it
       response.write("invalid JSON");
       response.end();
     }
   });
}
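
The extract also calls an IsJsonString helper that isn't shown above; it's not much more than a try/catch around JSON.parse, something along these lines:

//returns true if the supplied string parses as valid JSON
function IsJsonString(str) {
  try {
    JSON.parse(str);
  }
  catch (e) {
    return false;
  }
  return true;
}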

The complete queuewriter.js application is contained in the node-app directory in the GitHub repository.

Wrapping up

In addition to registering the plug-in and adding a step that publishes a notification message to RabbitMQ, you need to deploy and start the queuewriter.js application so that messages can actually be published. Once that's done, you can verify everything is working as expected either by looking at the Queues tab in the RabbitMQ management web UI or by running the CliConsumer sample application I showed in part 2.
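
If you'd rather check from the Node.js side, a quick-and-dirty consumer built with the same node-amqp library can do the job too. This is just a sketch, not part of the GitHub project: the queue name is an arbitrary example, and the exchange and routing key match the case notification configuration shown earlier.

//consume-test.js - logs notification messages published to the CRM exchange
var amqp = require('amqp');

var connection = amqp.createConnection({ host: 'lucas-ajax.cloudapp.net'
, login: 'rabbituser'
, password: 'XXXXXXXX'
});

connection.on('ready', function () {
  //declare a queue and bind it to the exchange with the configured routing key
  connection.queue('crm-case-notifications', { durable: true, autoDelete: false }, function (queue) {
    queue.bind('CRM', 'Case');

    //log each message as it arrives
    queue.subscribe(function (message, headers, deliveryInfo) {
      var body = message.data ? message.data.toString() : JSON.stringify(message);
      console.log('Received on ' + deliveryInfo.routingKey + ': ' + body);
    });
  });
});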

Obviously, using queuewriter.js as a message proxy adds an extra layer of complexity, and you have to make sure the application is up and running in order to process messages, but it also offers a couple of advantages. First, by using queuewriter.js instead of a direct connection, you can easily use this same plug-in with other message brokers such as Apache ActiveMQ or Microsoft's Azure Service Bus. Second, queuewriter.js isn't limited to handling messages outbound from Dynamics CRM. You can also use it to process inbound messages without any changes; you just have to configure a client application to read messages from the queue and process them accordingly. A good example of this would be writing data submitted through a web form to Dynamics CRM via a RabbitMQ queue, and I will show that exact scenario in my next post!

A version of this post was originally published on the HP Enterprise Services Application Services blog.