Creating a near real-time streaming interface for Dynamics CRM with Node.js – part 3

This is the third post in my four-part series about creating a near real-time streaming interface for Microsoft Dynamics CRM using Node.js and Socket.IO. In my last post I showed how to create the Node.js component of the solution to process messages received from Dynamics CRM and send notifications to connected clients. In today’s post I will show plug-in code to send messages from CRM to the Node.js application. As I mentioned in part 1, I’ve already posted the code for the complete solution to GitHub if you want to skip ahead. You’ll find the files for today’s post in the “plugin-src” directory.

I had two goals when I set out to develop this plug-in.

  1. First, I wanted to be able to use the same plug-in code to send notifications for any operation (create, update, assign, etc.) for any Dynamics CRM entity.
  2. Second, I wanted to send JSON-formatted messages so that the clients of the Node.js application that relays the messages would be able to easily parse and process them.

I experimented with a few different approaches, but I ultimately chose to create a plug-in that is registered for an operation with a FetchXML query in its unsecure configuration. When the plug-in step is triggered, its associated FetchXML query is executed, and then the resulting fields are serialized into a JSON object, which is then sent to the Node.js application via an HTTP POST request.

Configuring the plug-in

Here is the text value of the plug-in's unsecure configuration property:

<config>
<endpoint>http://lucas-ajax.cloudapp.net:3000/post_endpoint</endpoint>
<query><![CDATA[
<fetch mapping='logical'>
<entity name='incident'> 
 <attribute name='ownerid'/> 
 <attribute name='modifiedby'/> 
 <attribute name='createdby'/> 
 <attribute name='title'/> 
 <attribute name='incidentid'/> 
 <attribute name='ticketnumber'/> 
 <attribute name='createdon'/> 
 <attribute name='modifiedon'/> 
 <filter type='and'> 
  <condition attribute='incidentid' operator='eq' value='{0}' /> 
 </filter> 
</entity> 
</fetch>
]]>
</query>
</config>

You’ll note that this XML contains not only the FetchXML query in a CDATA block but also the URL for the Node.js application endpoint. If I wanted to send notifications for newly created contacts, I would just create a new plug-in step for contact create events with the appropriate FetchXML in the query element. Now that I’ve shown how the plug-in steps are configured, let’s take a closer look at how the code actually works.
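For example, a hypothetical configuration for a contact create step might look like the following (the attribute list is illustrative; contactid, fullname, ownerid and createdon are standard contact attributes, and the endpoint is the same one used above):

```xml
<config>
<endpoint>http://lucas-ajax.cloudapp.net:3000/post_endpoint</endpoint>
<query><![CDATA[
<fetch mapping='logical'>
<entity name='contact'>
 <attribute name='contactid'/>
 <attribute name='fullname'/>
 <attribute name='ownerid'/>
 <attribute name='createdon'/>
 <filter type='and'>
  <condition attribute='contactid' operator='eq' value='{0}' />
 </filter>
</entity>
</fetch>
]]>
</query>
</config>
```

The {0} placeholder is filled in with the target record’s id at execution time, so the same pattern works for any entity.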

Parsing the plug-in configuration

First, the plug-in has to parse the endpoint and FetchXML from the step configuration. Here’s how that’s done:

private string webAddress;
private string fetchXml;
/// <summary>
/// The plug-in constructor.
/// </summary>
/// <param name="config"></param>
public Notifier(string config)
{
    if (String.IsNullOrEmpty(config))
    {
        throw new Exception("must supply configuration data");
    }
    else
    {
        XmlDocument doc = new XmlDocument();
        doc.LoadXml(config);
        XmlNodeList endpointnodes = doc.DocumentElement.SelectNodes("/config/endpoint");
        if (endpointnodes.Count == 1)
        {
            webAddress = endpointnodes[0].InnerText;
        }
        else
        {
            throw new Exception("config data must contain exactly one 'endpoint' element");
        }
        XmlNodeList querynodes = doc.DocumentElement.SelectNodes("/config/query");
        if (querynodes.Count == 1)
        {
            fetchXml = querynodes[0].InnerText;
        }
        else
        {
            throw new Exception("config data must contain exactly one 'query' element");
        }
    }
}

Executing the query

Once the FetchXML has been extracted, actually running the query and retrieving the results is trivial. Here is the relevant code from the plug-in’s Execute method:

// Obtain the target entity from the input parameters.
Entity entity = (Entity)context.InputParameters["Target"];
//set up the org service reference for our retrieve
IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);
//retrieve some results using the fetchxml supplied in the configuration
EntityCollection results = service.RetrieveMultiple(new Microsoft.Xrm.Sdk.Query.FetchExpression(string.Format(fetchXml, entity.Id.ToString())));
//we should have one and only one result
if (results.Entities.Count != 1)
{
    throw new Exception("query did not return a single result");
}
Entity retrieved = results.Entities[0];

Serializing the query results

Unfortunately, serializing the query results to JSON is more complicated. My initial inclination was to use System.Runtime.Serialization.Json.DataContractJsonSerializer, but, as the name implies, it requires data contracts. No matter how I tried to set up custom classes and data contracts to account for the different attributes returned by varying FetchXML queries, I was never able to get satisfactory results from the DataContractJsonSerializer.

I then put together a quick proof-of-concept console application that executed FetchXML queries and serialized the output with the System.Web.Script.Serialization.JavaScriptSerializer, and the serialized data looked exactly how I thought it should. When I built and deployed my plug-in, though, I ran into issues with the System.Web assembly not being available to plug-ins running in the sandbox. As I found from this thread on the Dynamics CRM forums, I wasn’t the first person to have this problem, and there is apparently no good solution.

In all my searching for solutions to my serialization problems, I kept turning up references to Json.NET. I had used Json.NET a few years back in a different project, but I was initially reluctant to use it here because I didn’t want to deal with the hassle of using ILMerge to create a single plug-in assembly for deployment. Seeing as I didn’t have any better options, though, I decided to give it a try, and it worked great. Here’s the code that turns the FetchXML results into proper JSON with Json.NET:

//set up our json writer
StringBuilder sb = new StringBuilder();
StringWriter sw = new StringWriter(sb);
JsonWriter jsonWriter = new JsonTextWriter(sw);
jsonWriter.Formatting = Newtonsoft.Json.Formatting.Indented;
jsonWriter.WriteStartObject();
//loop through the retrieved attributes
foreach (string attribute in retrieved.Attributes.Keys)
{
    //generate different output for different attribute types
    switch (retrieved[attribute].GetType().ToString())
    {
        //if we have a lookup, return the id and the name
        //if we have a lookup, return the id and the name
        case "Microsoft.Xrm.Sdk.EntityReference":
            jsonWriter.WritePropertyName(attribute);
            jsonWriter.WriteValue(((EntityReference)retrieved[attribute]).Id);
            jsonWriter.WritePropertyName(attribute + "_name");
            jsonWriter.WriteValue(((EntityReference)retrieved[attribute]).Name);
            break;
        //if we have an optionset value, return the value and the formatted value
        case "Microsoft.Xrm.Sdk.OptionSetValue":
            jsonWriter.WritePropertyName(attribute);
            jsonWriter.WriteValue(((OptionSetValue)retrieved[attribute]).Value);
            if (retrieved.FormattedValues.Contains(attribute))
            {
                jsonWriter.WritePropertyName(attribute + "_formatted");
                jsonWriter.WriteValue(retrieved.FormattedValues[attribute]);
            }
            break;
        //if we have money, return the value
        case "Microsoft.Xrm.Sdk.Money":
            jsonWriter.WritePropertyName(attribute);
            jsonWriter.WriteValue(((Money)retrieved[attribute]).Value);
            break;
        //if we have a datetime, return the value
        case "System.DateTime":
            jsonWriter.WritePropertyName(attribute);
            jsonWriter.WriteValue(retrieved[attribute]);
            break;
        //for everything else, return the value and a formatted value if it exists
        default:
            jsonWriter.WritePropertyName(attribute);
            jsonWriter.WriteValue(retrieved[attribute]);
            if (retrieved.FormattedValues.Contains(attribute))
            {
                jsonWriter.WritePropertyName(attribute + "_formatted");
                jsonWriter.WriteValue(retrieved.FormattedValues[attribute]);
            }
            break;
    }
}
//always write out the message name (update, create, etc.), entity name and record id
jsonWriter.WritePropertyName("operation");
jsonWriter.WriteValue(context.MessageName);
jsonWriter.WritePropertyName("entity");
jsonWriter.WriteValue(retrieved.LogicalName);
jsonWriter.WritePropertyName("id");
jsonWriter.WriteValue(retrieved.Id);
jsonWriter.WriteEndObject();
//generate the json string
string jsonMsg = sw.ToString();
jsonWriter.Close();
sw.Close();

As you can see, instead of serializing a fixed object graph to JSON, I am using Json.NET to build the JSON string dynamically, which offers maximum flexibility. I created a batch script called ilmerge.bat to merge the Json.NET assembly into the plug-in assembly prior to registration. It’s included in the solution code on GitHub.
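To make the output concrete, here is roughly what a serialized message for the incident query shown earlier would look like (the GUIDs, names and values here are made up for illustration):

```json
{
  "ownerid": "8c3f0f3a-2c4e-4b1a-9d6e-6f0c2a1b3d5e",
  "ownerid_name": "Lucas Alexander",
  "title": "Sample case",
  "ticketnumber": "CAS-01001-X1Y2Z3",
  "createdon": "2013-09-01T14:22:05Z",
  "operation": "Update",
  "entity": "incident",
  "id": "f1e2d3c4-b5a6-4789-8abc-def012345678"
}
```

Because the attribute names come straight from the FetchXML results, the Node.js application and its clients can consume any entity’s message with the same parsing logic.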

Sending the message

Once the JSON message is ready, the plug-in posts the message to the Node.js application like so:

//create the webrequest object and execute it (and post jsonMsg to it)
System.Net.WebRequest req = System.Net.WebRequest.Create(webAddress);
//must set the content type for json
req.ContentType = "application/json";
//must set method to post
req.Method = "POST";
//encode the message as UTF-8 - JSON defaults to UTF-8, and field values may contain non-ASCII characters
byte[] bytes = System.Text.Encoding.UTF8.GetBytes(jsonMsg);
req.ContentLength = bytes.Length;
System.IO.Stream os = req.GetRequestStream();
os.Write(bytes, 0, bytes.Length);
os.Close();
//get and dispose the response so the connection is released
using (System.Net.WebResponse resp = req.GetResponse())
{
}

Wrapping up

That’s all it takes to configure Dynamics CRM to start sending messages for near real-time notifications. In my next (and final) post in this series, I will show how to configure a client to receive and process notifications from the Node.js application, and I’ll also discuss some general considerations related to this solution.

As for the plug-in, I was pleasantly surprised with how simple it ended up being once I worked through the serialization issues. What do you think about my approach? Would you have done anything differently? Please share your thoughts in the comments!

A version of this post was originally published on the HP Enterprise Services Application Services blog.
