Last time, I described how we passed a call to the REST API through ql.io, a ‘declarative, data-retrieval and aggregation gateway for quickly consuming HTTP APIs’ created by eBay’s platform engineering group. In this blog entry, I’ll explain how we used ql.io to mash up the REST API with Google’s Geocoding API.

We wanted to create a web service that, given an account name, returned its billing address in the usual form (for example, 1 Market St, San Francisco, CA 94105, USA) and as latitude/longitude coordinates. The docs provided a table definition for the Google Geocoding API, to which we added a test query:
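The original listing is not shown here, but a minimal sketch, assuming ql.io’s `create table` syntax and the Geocoding API’s JSON endpoint, might look like this (the URL, `resultset`, and test address are assumptions):

```
-- Hypothetical ql.io table definition for the Google Geocoding API
create table google.geocode
  on select get from "http://maps.googleapis.com/maps/api/geocode/json?address={address}&sensor=false"
  resultset 'results';

-- Test query
select * from google.geocode where address = '1 Market St, San Francisco, CA';
```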

We plugged this into the console (click here to try it yourself), and got a whole bunch of information:
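The Geocoding API’s JSON response has roughly this shape (abbreviated here, with illustrative values):

```
{
  "status": "OK",
  "results": [
    {
      "formatted_address": "1 Market St, San Francisco, CA 94105, USA",
      "address_components": [ ... ],
      "geometry": {
        "location": { "lat": 37.79, "lng": -122.39 },
        ...
      },
      ...
    }
  ]
}
```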

We extracted just the data we needed by being more specific in our query:
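A sketch of such a projection, assuming the table’s `resultset` is `'results'` so fields can be addressed directly:

```
select formatted_address, geometry.location from google.geocode
  where address = '1 Market St, San Francisco, CA';
```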


Now this worked just great for 1 Market St, SF, but we noticed that some addresses gave us more than we wanted:
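Illustratively (the address and coordinate values here are made up), an ambiguous address produces multiple rows:

```
[
  ["1 Main St, Springfield, IL 62701, USA", { "lat": 39.80, "lng": -89.65 }],
  ["1 Main St, Springfield, MA 01103, USA", { "lat": 42.10, "lng": -72.59 }]
]
```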

Looking at the full output from the select * query on this address (omitted here, but visible in the console by clicking this link), we saw that there are multiple locations matching this address. In this case, it’s because this is a fictitious address on a demo account, but we wanted to handle the possibility that Google would return multiple coordinates for any given account address. ql.io provides a ‘monkey patch’ mechanism that allowed us to do just this. Running the server locally, we defined the following patch in a file, geocode.js, in the tables directory:

So, if Google returns more than one set of coordinates, we simply take the first. Now we modified our table definition to use the patch, and saved it as google.geocode.ql, also in the tables directory:
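A sketch of the revised definition, assuming ql.io’s `using patch` clause for attaching the script:

```
-- Hypothetical google.geocode.ql, now routed through the monkey patch
create table google.geocode
  on select get from "http://maps.googleapis.com/maps/api/geocode/json?address={address}&sensor=false"
  using patch 'geocode.js'
  resultset 'results';
```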

Now the same query returns a single result:
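Illustratively, the projection now yields a single row:

```
[
  ["1 Market St, San Francisco, CA 94105, USA", { "lat": 37.79, "lng": -122.39 }]
]
```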

Next we turned our attention to getting the relevant account. Based on the earlier work to retrieve all accounts, we quickly developed a table definition to return the relevant fields from a single account, given its name. We saved this as force.account.ql in our tables directory:
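A sketch of what force.account.ql might contain, assuming ql.io’s `using headers` clause and the Force.com query resource (the API version and field list are assumptions; the SOQL is URL-encoded into the `q` parameter):

```
-- Hypothetical force.account.ql: fetch one account's billing fields by name
create table force.account
  on select get from "{instance_url}/services/data/v24.0/query?q=SELECT%20Id%2CName%2CBillingStreet%2CBillingCity%2CBillingState%2CBillingPostalCode%2CBillingCountry%20FROM%20Account%20WHERE%20Name%20%3D%20%27{name}%27"
  using headers 'Authorization' = '{authorization}'
  resultset 'records';
```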

That query parameter looks pretty complex, but if we run it through a URL decoder, all becomes clear:
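Decoded, the `q` parameter is plain SOQL along these lines (the exact field list is an assumption):

```
SELECT Id, Name, BillingStreet, BillingCity, BillingState,
       BillingPostalCode, BillingCountry
FROM Account
WHERE Name = '{name}'
```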

Now we set up an ad-hoc query in our console by defining authorization and instance_url, and providing name in the where clause:
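Such an ad-hoc query might look like this (the token, instance URL, and account name are placeholders):

```
select * from force.account
  where authorization = 'OAuth <access token>'
  and instance_url = 'https://<instance>.salesforce.com'
  and name = 'Acme';
```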

Now we needed to mash up those two services. This is really where ql.io comes into its own – it allows SQL-like joins of two tables. In our case, a little documentation reading and experimentation led to:
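A sketch of such a join, assuming ql.io’s comma-join syntax, with the geocoder keyed on the billing street for simplicity (the real query may well combine more of the billing fields into the geocoded address):

```
select a.Name, a.BillingStreet, a.BillingCity, a.BillingState,
       a.BillingPostalCode, a.BillingCountry, g.geometry.location
  from force.account as a, google.geocode as g
 where a.name = '{name}'
   and g.address = a.BillingStreet;
```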

The final step was to define a route to expose the query as a web service – we saved this as account.ql in our server’s routes directory:
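A sketch of what account.ql might contain, assuming ql.io’s `return … via route` syntax for exposing a query as an HTTP endpoint:

```
-- Hypothetical account.ql: expose the join as GET /account/{name}
return select a.Name, a.BillingStreet, a.BillingCity, a.BillingState,
              a.BillingPostalCode, a.BillingCountry, g.geometry.location
         from force.account as a, google.geocode as g
        where a.name = '{name}'
          and g.address = a.BillingStreet
via route '/account/{name}' using method get;
```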

Now we could query the server using curl from the command line, passing instance_url and authorization as HTTP headers, as before:
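An invocation along these lines (token, instance, port, and account name are placeholders; ql.io listens on port 3000 by default):

```
curl -H "Authorization: OAuth <access token>" \
     -H "instance_url: https://<instance>.salesforce.com" \
     "http://localhost:3000/account/Acme"
```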

With the result (formatted for clarity):
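The response would combine the account’s billing fields with the geocoded coordinates; illustratively (all values made up):

```
{
  "Name": "Acme",
  "BillingStreet": "1 Market St",
  "BillingCity": "San Francisco",
  "BillingState": "CA",
  "BillingPostalCode": "94105",
  "BillingCountry": "USA",
  "location": { "lat": 37.79, "lng": -122.39 }
}
```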

A successful mashup! In the third and final part of this series, we’ll explore user-defined functions and see how they allow us to augment the functionality of an existing web service, filtering Chatter posts on the body field. Watch this space!

Get the latest Salesforce Developer blog posts and podcast episodes via Slack or RSS.
