I had previously written about Social Web-to-Lead, a Node.js application running on Heroku that allows Facebook users to enter their contact information via a ‘Contact Me’ link. That information is then captured as a Lead record in Salesforce. That post described the basic application architecture that uses the nforce library for making REST API calls from a Node.js application. The entire application codebase is available here and you can also watch a recording of the Dreamforce 2012 session where I demoed and discussed the application.

The application described in that post uses real-time, synchronous integration between Heroku and Force.com. Every time a Facebook user hits the Submit button on the ‘Contact Me’ web page, a Lead record is created in Salesforce via a REST API call. While that design works well in most cases, it has a potentially fatal flaw: API request limits. As most Force.com developers know, you get a certain number of API calls in a given 24-hour period (the exact number depends on your Salesforce Edition and number of licenses). If the Social Web-to-Lead Facebook app were to go truly viral for a company (which, after all, is the whole point of a ‘social’ app), potentially thousands of people could click the ‘Contact Me’ button, with each click consuming an API call. The application would quite literally be a victim of its own popularity! Let’s see how we can tweak the application architecture to mitigate this potential API bottleneck.

My buddy Pat has written an excellent post on optimizing Force.com API calls from a Heroku application. In that post he described how he used the Heroku Memcache add-on to cache data on Heroku and avoid having to query Salesforce every time a user refreshes the web application. The use case for Social Web-to-Lead is a little different, though: instead of querying Salesforce data, we need to insert data (new Lead records) from the Heroku app. To avoid running into API limits, we need to insert that Lead data in bulk. The before-and-after architecture diagrams of Social Web-to-Lead show how we’ll implement this new asynchronous/bulk design.

As shown in the ‘after’ picture above, instead of invoking the REST API and inserting a new Lead record every time a user clicks ‘Submit’, we’ll save the contact information in a local Redis datastore on Heroku. A scheduled background process will then read the Redis datastore at predetermined intervals (say, every 10 minutes) and insert all the Lead records in bulk, using a single API call. Let’s walk through this design change in more detail.


Caching Lead information in Redis

Let’s start by adding the Redis To Go add-on to our Facebook application. Redis is one of the more popular NoSQL databases and stores its data as simple key/value pairs. It’s a perfect match for our use case, since the Lead information we need to cache/store on Heroku is very simple and doesn’t really require a full-featured relational database like Postgres. Adding a Redis datastore to your Heroku application is as simple as running the following command via the Heroku CLI tool (which you can install using the Heroku Toolbelt).

$ heroku addons:add redistogo

That command will provision a free Redis datastore (paid versions are also available for production use) for your application. The command will also add a REDISTOGO_URL Environment Variable that you can read in your application code for connecting to the datastore. Here’s a snippet from the web.js script for Social Web-to-Lead that shows how the Node.js application connects to and stores its Lead information in the Redis datastore.

if (process.env.REDISTOGO_URL) {
  var rtg   = require('url').parse(process.env.REDISTOGO_URL);
  var redis = require('redis').createClient(rtg.port, rtg.hostname);
  redis.auth(rtg.auth.split(':')[1]);
} else {
  var redis = require('redis').createClient();
}

app.post('/lead', function(request, response) {
  request.body.leadRec.Phone = request.body.Phone;
  addLead2Queue(request.body.leadRec, request, response);
});

function addLead2Queue(leadRec, request, response) {
  redis.lpush('leads', JSON.stringify(leadRec));
  response.send('');
}

Lines 1-7 show how simple it is to connect to a Redis datastore using the redis NPM module for Node.js. Note that you’ll first need to install the module (just run $ npm install redis from the command line) in order to use it in the application. Line 11 shows that, instead of synchronously inserting the Lead record into Salesforce each time a user hits the ‘Submit’ button, we simply store the contact information in Redis via a call to the addLead2Queue method. That method uses the lpush Redis command to add the new contact information to the list of Lead records stored under the ‘leads’ key.
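If you want to sanity-check what’s accumulating in the queue, you can read the list back without draining it. Here’s a quick sketch (the peekLeads helper is mine, not part of web.js; it assumes a connected node_redis client like the one above):

```javascript
// Hypothetical helper: peek at the queued Lead records without removing
// them. LRANGE 0 -1 returns every element currently in the 'leads' list.
function peekLeads(client, callback) {
  client.lrange('leads', 0, -1, function(err, entries) {
    if (err) return callback(err);
    // Each entry was stored with JSON.stringify, so parse it back
    callback(null, entries.map(function(e) { return JSON.parse(e); }));
  });
}
```

You can get the same view from a terminal with redis-cli lrange leads 0 -1.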


Adding a Scheduled worker process to clear the Redis cache

Now that we’re storing all the contact information entered via the ‘Contact Me’ form in Redis, we need a way to eventually push that data to Salesforce. We’ll do so by implementing a worker process on Heroku. A worker process is simply a piece of code that does some asynchronous processing in the background (as opposed to the application code that synchronously processes incoming web requests). For Social Web-to-Lead, I created worker.js to do this asynchronous processing. Here is a small snippet from that file.

function sfdcAuthenticate(callback){
  // authenticate using username-password oauth flow
  sfdcOrg.authenticate({
      username: process.env.FORCE_DOT_COM_USERNAME,
      password: process.env.FORCE_DOT_COM_PASSWORD },
      function(err, resp){
        if(err) {
          console.log('Error: ' + err.message);
        } else {
          oauth = resp;
          redis.llen('leads', function(err, total) {
            var leads = [];
            for(var i = 0; i < total; i++) {
              redis.lpop('leads', function(err, x) {
                leads.push(JSON.parse(x));
                if (leads.length == total){
                  insertLeads(leads);
                }
              });
            }
          });
        }
        if(callback){
          callback();
        }
      });
}

The sfdcAuthenticate method uses the nforce library to perform OAuth authentication with Salesforce (described in greater detail in my previous post). In the callback function that nforce invokes once authentication completes, I use the lpop command (line 14) to extract all the saved Lead records from Redis into the ‘leads’ array. The insertLeads method is then invoked to bulk insert those Lead records into Salesforce.
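One thing to be aware of with the lpop loop: new records can be LPUSHed while the drain is in flight. If that ever becomes a concern, the read-and-delete can be made atomic with a Redis MULTI/EXEC transaction. Here’s a sketch of that variation (drainLeads is a hypothetical helper, not what worker.js currently does):

```javascript
// Hypothetical variation: drain the 'leads' list atomically. LRANGE and
// DEL execute as one transaction, so concurrent LPUSHes land in a fresh
// list rather than interleaving with the drain.
function drainLeads(client, callback) {
  client.multi()
    .lrange('leads', 0, -1)  // replies[0]: every queued entry
    .del('leads')            // replies[1]: number of keys removed
    .exec(function(err, replies) {
      if (err) return callback(err);
      callback(null, replies[0].map(function(e) { return JSON.parse(e); }));
    });
}
```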

Next up, how to schedule worker.js to run at predetermined intervals. First, we need to add worker.js to the Procfile for the Heroku application.

web: node web.js
worker: node worker.js

Next, we’re going to use the free Heroku Scheduler add-on to run worker.js at set intervals and flush the Redis ‘cache’ of all saved Lead data. To install it, simply run

$ heroku addons:add scheduler:standard

from the Heroku CLI tool. You can then run

$ heroku addons:open scheduler

to open the Scheduler dashboard. Select ‘Add Job’, enter heroku run worker, and choose a frequency and next run time. Note that the Scheduler add-on only supports simple schedules (e.g. daily, or every 10 minutes). If you’d like to implement a more complex run schedule for worker.js, you can write some simple Node.js logic (say, in clock.js) that wakes up at predetermined intervals and executes worker.js. For testing, when you don’t want to wait for the next scheduled run of worker.js to kick in, you can also execute

$ heroku run worker

via the Heroku CLI tool to run a one-off Heroku process and execute the logic immediately.


Inserting bulk records in Salesforce using REST

Before breaking out the bubbly for Social Web-to-Lead, there is one last missing piece of the puzzle. The previous section showed how worker.js calls the insertLeads method to bulk insert Lead records into Salesforce. However, the Force.com REST API does not currently support bulk inserts or updates. We could use the SOAP API (which does support bulk inserts), but that would mean parsing WSDLs and processing XML in Node.js, which is not a task for the faint of heart. Instead, I implemented a custom Apex REST service that accepts an array of Lead records and inserts them into Salesforce. This is, in fact, a classic use case for Apex REST: an application requirement (in this case, bulk insert) is not met by the standard Force.com REST API, necessitating a custom RESTful interface/service.

Here is the simple Apex class that implements the bulk Lead insert. You can read this article for further details on creating custom Apex REST services.

@RestResource(urlMapping='/bulk_lead_insert/*')
global class BulkLeadInsertSvc {

    @HttpPost
    global static void insertLeads (List<Lead> leads) {
        Database.DMLOptions dlo = new Database.DMLOptions();
        dlo.optAllOrNone = true;
        Database.insert(leads, dlo);
    }
}
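A quick note on the wire format: with @HttpPost, Apex REST maps the top-level JSON keys of the request body to the method’s parameter names, so the body must wrap the records under a leads key. For example (the field values below are made up for illustration):

```javascript
// The request body BulkLeadInsertSvc expects: a top-level 'leads' key
// matching the List<Lead> parameter name of insertLeads
var body = {
  leads: [
    { FirstName: 'Jane', LastName: 'Doe', Company: 'Acme', Email: 'jane@example.com' },
    { LastName: 'Roe', Company: 'Initech' }
  ]
};
console.log(JSON.stringify(body));
```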

With that custom REST endpoint defined on Force.com, let’s switch back to the Heroku side. Here is a snippet from the insertLeads method of worker.js that invokes the custom Apex REST service defined above.

function insertLeads(leadRecords) {
  var data = '';
  var host = require('url').parse(oauth.instance_url).host;
  var options = {
    host: host,
    path: '/services/apexrest/bulk_lead_insert/',
    method: 'POST',
    headers: {
      'Host': host,
      'Authorization': 'OAuth ' + oauth.access_token,
      'Accept': 'application/jsonrequest',
      'Cache-Control': 'no-cache,no-store,must-revalidate',
      'Content-type': 'application/json'
    }
  };

  // Issue the Apex REST API call to add the Lead records
  var req = http.request(options, function(res) {
    res.on('data', function(_data) {
      data += _data;
    });

    res.on('end', function() {
      if (res.statusCode != 200) {
        // Force.com returned an error. Display it on the console
        console.log('Error from Force.com: ' + data);
      }
    });
  }).on('error', function(e) {
    console.log(e);
  });

  // Wrap the records under the 'leads' key expected by the Apex service
  var payload = { leads: leadRecords };
  req.write(JSON.stringify(payload));
  req.end();
}

The code is fairly straightforward, so I won’t delve into it further. This completes our new and improved design for Social Web-to-Lead. To recap: we implemented a more efficient asynchronous/bulk design in order to conserve API calls. Each time a Facebook user enters their contact information and hits ‘Submit’, we save the data in a local Redis datastore on Heroku. A scheduled background worker process then ‘flushes’ the Redis datastore periodically, and all Lead records it finds are bulk inserted into Salesforce (using a single API call) via a custom Apex REST service.

In the final installment of this series, I’ll explore how Social Web-to-Lead implemented real-time push notifications from Force.com to Heroku using WebSockets. Till then, comments and questions are welcome.
