Bigger Apex Limits with Enhanced Futures

As a Salesforce platform developer, you know and love our transaction limits.  You had the initial furrowed-brow moment, and then took the limits as a challenge to your coding bona fides. You conquered the limits, because you, dear reader, you rock. You now go through most days not paying much attention to the limits, since they typically only catch inefficient code. You don’t write bad code like that, of course. For the most part, the limits are there to protect you from the other guy, and you appreciate them.

Sometimes, though, we catch some dolphins in the tuna nets. When you have a legitimate use case that does not fit neatly inside the existing transaction limits, it can be very frustrating.  (Also, PETA gets very upset.)

This One Goes To Eleven

What do you do when your transaction is at the limit? You might need to query for a large attachment and do something with it, which you can’t break into smaller transactions; this breaks the heap limit.  You may be doing heavy accounting processing which all needs to be committed in a single transaction; this breaks the CPU limit.  You may be doing a very large parent-child hierarchy insert / update, where the whole tree must be committed in a single transaction; this breaks the DML limit. You feel like Nigel Tufnel, who knew that playing at 10 wasn’t enough.

If you can identify these use cases, you can specify that your transaction needs to be granted additional limits.

Yes, you read that correctly. We will allow the occasional transaction to consume larger limits than the normal transaction. You want to do more, and we want to encourage that. For this purpose, we are piloting a new feature called Enhanced Futures.

The Boring Mechanical Details, Please

Enhanced Futures introduces a new parameter on the @future annotation. This parameter lets you specify a particular limit that you need to have increased for this transaction.  For example, you might do one of the following:

@future (limits=2xHEAP)
public static void myMemoryHog() { }

@future (limits=3xCPU)
public static void myIntenseLogicalProcessing() { }

When these methods are run, the value used for the specified limits will be larger.  For the first, the heap limit would be 24MB, double the usual 12MB asynchronous limit.  For the second, the CPU limit would be 180 seconds, triple the usual 60-second asynchronous limit. Now you can do what you need to do to make your logic happen.
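To make the heap case concrete, here is a sketch of what an enhanced future might look like under the pilot. The class, method, and variable names here are hypothetical, and the limits=2xHEAP syntax follows the pilot examples above, which may change before general availability.

```apex
// Hypothetical sketch: exporting a large Attachment requires holding the
// entire Blob (and a base64-encoded copy of it) in heap at once, so we
// request double the asynchronous heap limit up front.
public class AttachmentExporter {

    @future (limits=2xHEAP)
    public static void exportLargeAttachment(Id attachmentId) {
        Attachment att = [SELECT Name, Body FROM Attachment
                          WHERE Id = :attachmentId];

        // base64 encoding inflates the payload by about a third, and both
        // copies count against the heap -- the reason for the 2x request.
        String encoded = EncodingUtil.base64Encode(att.Body);

        // ...hand the encoded payload off to an external system here.
    }
}
```

The caller would enqueue it like any other future method, e.g. AttachmentExporter.exportLargeAttachment(att.Id); the enhanced limit applies only inside the asynchronous transaction, not in the caller's transaction.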

So Salesforce has spent years drilling into your programmer grey matter that limits were The Bomb. If we relax the limits, bad things will happen. And now, you’re saying that the limits are intentionally avoidable?  What’s next? The fifth dentist now also recommends Trident sugarless chewing gum? We SHOULD cross the streams?

We (Still) Must Protect This House

Enhanced Futures will allow you to go beyond the Apex transaction limits. If this causes you to clap while sitting at your desk, I understand!  That said, it does not provide a free-for-all. There are a few guidelines that you will have to follow. (Guidelines, not limits!)

  • You will only be allowed to increase one limit in a given transaction
  • You need to know in advance that you want larger transaction limits
  • You will only be allowed to do this on asynchronous methods

The guidelines here allow us to provide the same service protection usually provided by the limits, which you are being allowed to circumvent. Service protection is always top priority for us; we have an opportunity to relax a service protection boundary, but we need to ensure that other things are in place to protect the service.

You will only be allowed to increase one limit in a given transaction. The idea here is that you know that your logic will run close to a particular limit, so you ask for more.  This feature is not intended to allow you to just say “all=2x” so you can be sloppy with your code!

In addition, the cost to our service is magnified when multiple items are simultaneously increased. If you hold double the memory for three times as long, the average amount of memory in use is going to be larger, and we would like to make sure that there’s enough to go around.
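As a sketch of what this guideline implies (again using the hypothetical pilot syntax), each method picks exactly one limit to raise:

```apex
// Allowed under the pilot: exactly one limit raised per method.
@future (limits=3xCPU)
public static void recalculateLedger(Set<Id> accountIds) {
    // CPU-heavy accounting rollup that must commit in one transaction.
}

// Presumably NOT allowed: raising two limits at once, e.g.
// @future (limits=2xHEAP,3xCPU)
// The pilot grants one enhanced limit per transaction.
```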

You need to know in advance that you want larger transaction limits.  If you leave the house in the morning in your little 2-seater sports car, and later want to offer a few friends a ride somewhere, you cannot have your sports car magically transform into the SUV you left at home.  You need to know, before you leave the house, that you will need that SUV. If you didn’t know in advance, you are stuck deciding which of your friends you like the most.

The same will be true of Enhanced Futures. We cannot magically increase the limits on an in-flight transaction.  Well, we could, but not in a reliably safe way. To be fast about running your transactions, we don’t spend much time reporting on in-flight statistics across transactions. If a bunch of Mini Cooper transactions were simultaneously transformed into enhanced SUV transactions, we might permit an org to take up the whole internet. Even if we did report across transactions, the behavior would be that the first transformer @future call would get extra limits, and the second transformer call would be terminated to protect the service. That would be a bad experience. If you specify in advance, you’ll know that everything will succeed, and we’ll be able to protect the service.

You will only be allowed to do this on asynchronous methods. Just as we can offer much higher transaction limits in async due to the power of the message queue (“MQ”, to its close friends), we can offer enhanced limits through the power of MQ. If a synchronous trigger needed more limits, we couldn’t provide a safe and positive experience. Either we end up with lots of SUVs on the road for a single org (unsafe for the service), or we block execution of the save operation for some number of people (unpositive for them). With asynchronous, the message queue can throttle these requests like it does for all other @future methods.

For the pilot, this enhancement is only going to be made available on @future methods.  It will be difficult to utilize this on batch apex, given the previous restriction on needing to know in advance that you need larger limits.  You’d need to specify every execution as enhanced, which would make your large batch jobs awfully slow. We are looking at some alternatives to enable this enhancement on more asynchronous processing options.

Nom Nom Nom

In an Economics class I took once upon a time, we discussed the “tragedy of the commons” problem. Briefly, this is when common land gets trampled because everyone perceives that it is free for them to graze their flock. In each person’s eyes, there is no additional cost to using the common land, but when everyone does that, the land is ruined for all. This is why you can’t buy wild Atlantic Salmon anymore, and we can’t allow the multi-tenant service to end up like the Atlantic fisheries.

One answer to this problem is placing costs on an apparently free resource to ensure that it is appropriately priced against the alternatives. In this case, we will have a tighter throttle on the enhanced transactions than we do for regular old @future methods.  Exact numbers will fluctuate based on system resource availability and your org’s particular consumption pattern, but in general you can expect about ten times more throughput for regular @future methods than for enhanced futures.

In addition, when things get busy, enhanced future processing will be slowed or stopped while the service returns to normal levels. This will be done more aggressively than it will for other asynchronous processes, due to the larger potential impact of each transaction.

We want you to utilize this feature, but we don’t want you to think of it as an all-you-can-eat buffet. We didn’t want to limit how many of these you could do in a 24-hour period or something to that effect; we are trying to get rid of limits, not add them.  We instead wanted to let you use this as much as you want to, but with the expectation that these transactions may be more of an @distantfuture than the other @future methods.  Use them when you need to, but only when you actually need the excess limits.

Put Me In, Coach!

Now that I have scared you off, I will tell you about how to use the feature!  Enhanced Futures is a pilot program for Summer ‘14, so we can gather live statistics on how the feature might impact the service. As with all pilot programs, please consult the person you usually consult for requests like this to nominate yourself for consideration.

The Enhanced Futures pilot is running concurrently with the FlexQueue pilot program. These transactions are intended to be included in that structure, eventually. As such, we are working on how we can appropriately include these with correct relative costs. This may take some time; the general availability of this feature will wait for that calculation to be honed. The current hope is for Winter ‘15 or Spring ‘15, pending the outcome of the pilot program.

  • Daniel Ballinger

    No sign of a 2x callout multiplier for the current 10 call limit. Maybe sometime in the future 🙂 –

  • Samuel De Rycke

    It’s awesome to see this attention being given to Apex and the developers using it. Good work.

  • Robert Soesemann

    Any reason why this is only planned for @future methods and not Batch Apex? I would see high value in temporarily increasing limits there as well.

  • Kresimir Tonkovic

    Would be great if it were possible to track the future job by its id, like it is possible with batches. See

  • Jason Venable

    Curious about batch as well.

  • Cory Cowgill

    Has there been any update on this for the roadmap? This article was posted a year ago but haven’t seen it released in Spring 15. Is this being considered for Winter 16 or Spring 16? #safeHarbor