tinman44

Heap size?

All-
 
So I am getting this error in some of my Apex scripts: "Apex heap size too large: 216581". I am trying to update an object that is linked to many accounts in a trigger/class. It processes about 14 of the 100+ accounts and then fails. At this point the documentation is not very straightforward about this error. I see the limit is 100,000 bytes, but that is about it.
 
Anybody have any insight about this error? How to clear the heap? How you have gotten around it?
 
Thanks in advance for any help/direction!
Best Answer chosen by Admin (Salesforce Developers) 
Middha
I was able to fix my issue. In my trigger, I had a List of Strings where I was adding a huge chunk of data to each list element. Per the governor limit, we can only have 100,000 bytes at a time. I reduced the amount of data I was adding to the list, and it worked.

So it's like you cannot store more than 100,000 bytes of data across all your collections. You need to add logic in your code so that you process your collections before they reach the limit, then reinitialize them to make more space!
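A minimal sketch of that idea, assuming hypothetical helpers `buildPayload` and `processBuffer` and a hypothetical batch size of 50 (none of these names come from the thread):

```apex
// Instead of accumulating data for every record in one collection,
// flush the collection in chunks and clear it to release heap.
List<String> buffer = new List<String>();
for (Account acct : accountsToProcess) {   // accountsToProcess: assumed input
    buffer.add(buildPayload(acct));        // buildPayload: hypothetical helper
    if (buffer.size() >= 50) {             // flush before the heap grows too large
        processBuffer(buffer);             // hypothetical method that consumes the data
        buffer.clear();                    // frees the heap the list was holding
    }
}
if (!buffer.isEmpty()) {
    processBuffer(buffer);                 // handle the final partial chunk
}
```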


All Answers

mikef
tinman44:

I got this same error when my code logic didn't handle a corner data case.
The method just didn't know what to do with this "special" record.

To fix this I changed my method to return false if anything was different about the record.

I know this answer is vague but that's all I got.
kjpeterson
Has anyone else heard more about this?  I just recently started getting this with some of my scripts.
jpwagner

I have also gotten this message, actually in multiple scripts. The only time I avoided it was by rewriting what I did in a slightly different way. I cannot explain why this occurs, and any further insight from a Salesforce representative would be helpful...

 

Thanks.

jpizzala
I just received this error as well.  We have a class that is called from a trigger (so we are working with max heap size of 1,000,000 bytes, as opposed to a trigger's limit of 100,000 bytes).

As far as I can tell, our code is pretty compliant with the governor limitations, so I don't know if there is much we can do to make it more efficient (although I'm sure someone out here could devise a more efficient method).

Is there any way to batch the processing requests, taking the heap size into consideration?  Or, is this a limitation that we will have to live with for the time being?
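For what it's worth, one way to batch the processing as asked above is the Batch Apex `Database.Batchable` interface, where each `execute()` invocation gets a fresh set of governor limits; the class name and query here are illustrative only:

```apex
// Sketch of a Batch Apex job; heap resets between execute() invocations.
global class AccountUpdateBatch implements Database.Batchable<SObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Query defining the full record set to process (example query)
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }
    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        List<Account> accts = (List<Account>) scope;
        // ... per-chunk update logic here ...
        update accts;
    }
    global void finish(Database.BatchableContext bc) {
        // optional post-processing / notification
    }
}
```

You would then start it with a small scope size, e.g. `Database.executeBatch(new AccountUpdateBatch(), 50);`.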
Middha
I am facing the same issue. Did anyone find a reason/solution for this? Or it seems to be an issue with Apex, because no one from Salesforce is getting back! :smileywink:
kevinbnpower
Add me to the list of people getting this error from code that never used to produce this error.

Hello? Salesforce? Are you there???

jpwagner

Hey,

Is there a good way to check the heap size at a given time?

Or a method to check all governor limits at once? That would be sweet.

aalbert

Yes, you can check the heap size in runtime and check it against the limit.

Check out the Limits class in Apex, specifically the getHeapSize() and getLimitHeapSize() methods.
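For example, a one-line checkpoint you can drop into a trigger or class to see how close you are to the limit:

```apex
// Log current heap usage against the allowed maximum for this context.
System.debug('Heap used: ' + Limits.getHeapSize()
    + ' of ' + Limits.getLimitHeapSize() + ' bytes');
```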

 

 

Scott.M
Add me to the list of people running into the heap size limit. What's the best solution? Is there any way to clear part of the heap if the limit is close to being reached?