Sunny Narula

Map size error, need to bulkify the trigger.

I'm getting the following error when I try to build a Map of Account records. There are many Accounts in the org, but I need a Map of all of them so I can cross-check a condition in the trigger and update a few Accounts based on it.

I need the Account Map first to look up the existing Account records. Is there a way to get all the Accounts?

Sharing the code below.

Please let me know how to proceed with this.

 

public with sharing class AccountReleationShipTriggerHandler {
    public static void getAccountFromTrigger(List<Account_Relationships__c> lstAccountRelation)
    {
        System.debug('>>>>>>> lstAccountRelation : ' + lstAccountRelation.size());
        Id rtId = [SELECT Id FROM RecordType WHERE SobjectType = 'Account' AND DeveloperName = 'Site'].Id;
        Map<Id, Account> allAccountMap = new Map<Id, Account>([SELECT Id, Name, Portfolio_Type__c, Source_Zone__c, RecordTypeId FROM Account]);

        for (Account_Relationships__c ar : lstAccountRelation)
        {
            // will perform the operations here...
            // As here I need to refer to the Map to find the Account details and update
            System.debug('test ar Source_Account__c : ' + ar.Source_Account__c);
            System.debug('test ar Related_Account__c : ' + ar.Related_Account__c);
            System.debug('test ar Type__c : ' + ar.Type__c);
        }
    }
}

 

Getting the following error:

Error: Invalid Data.
Review all error messages below to correct your data.
Apex trigger trig_AccountUpdate caused an unexpected exception, contact your administrator: trig_AccountUpdate: execution of AfterInsert caused by: System.QueryException: Non-selective query against large object type (more than 200000 rows). Consider an indexed filter or contact salesforce.com about custom indexing. Even if a field is indexed a filter might still not be selective when: 1. The filter value includes null (for instance binding with a list that contains null) 2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times): Class.AccountReleationShipTriggerHandler.getAccountFromTrigger: line 6, column 1

Vidol Chalamov
I don't think it's possible to query more than 200,000 rows with a non-selective query. Even if you could, the amount of time it would take to iterate through them all would be huge.
If you are going to compare record values against other values, try adding a WHERE clause. If you are going to update something across all records, use Batch Apex, since its limits are much higher.
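For example, here is a minimal sketch of that WHERE-clause approach, assuming Source_Account__c and Related_Account__c are the Account lookups on Account_Relationships__c as in the code above. Collecting only the Account Ids referenced by the trigger records and filtering on the indexed Id field keeps the query selective:

```apex
public with sharing class AccountReleationShipTriggerHandler {
    public static void getAccountFromTrigger(List<Account_Relationships__c> lstAccountRelation)
    {
        // Collect only the Account Ids actually referenced by the trigger records
        Set<Id> accountIds = new Set<Id>();
        for (Account_Relationships__c ar : lstAccountRelation)
        {
            if (ar.Source_Account__c != null)  { accountIds.add(ar.Source_Account__c); }
            if (ar.Related_Account__c != null) { accountIds.add(ar.Related_Account__c); }
        }

        // Id IN :accountIds filters on the indexed Id field, so the query is selective
        Map<Id, Account> accountMap = new Map<Id, Account>(
            [SELECT Id, Name, Portfolio_Type__c, Source_Zone__c, RecordTypeId
             FROM Account WHERE Id IN :accountIds]);

        for (Account_Relationships__c ar : lstAccountRelation)
        {
            Account source  = accountMap.get(ar.Source_Account__c);
            Account related = accountMap.get(ar.Related_Account__c);
            // compare and update source/related here...
        }
    }
}
```

This only touches the handful of Accounts the trigger batch actually references, instead of loading every Account in the org.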
v varaprasad
Hi Sunny,

I think you can solve your issue using Batch Apex. Just pass your trigger records to the batch and do your logic in the execute method.

For more info, check the following link:
https://developer.salesforce.com/forums/?id=906F000000092RIIAY
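A minimal sketch of that approach (the class name and constructor are illustrative, assuming the same Account fields as the handler above): the trigger hands its Account Ids to a batch, and the batch queries and updates Accounts in manageable chunks.

```apex
// Illustrative batch: processes the related Accounts in chunks
public class AccountRelationshipBatch implements Database.Batchable<sObject> {
    private Set<Id> accountIds;

    public AccountRelationshipBatch(Set<Id> accountIds) {
        this.accountIds = accountIds;
    }

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // A QueryLocator can iterate over up to 50 million records
        return Database.getQueryLocator(
            [SELECT Id, Portfolio_Type__c, Source_Zone__c
             FROM Account WHERE Id IN :accountIds]);
    }

    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // apply your conditions to each chunk of up to 200 Accounts, then update
        update scope;
    }

    public void finish(Database.BatchableContext bc) {
        // post-processing / notification if needed
    }
}
```

The trigger handler would then kick it off asynchronously with something like `Database.executeBatch(new AccountRelationshipBatch(accountIds), 200);`.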


Thanks
Varaprasad
Pankaj P
Hi Sunny,
Your error itself has the answer to your query.
System.QueryException: Non-selective query against large object type (more than 200000 rows). 
Consider an indexed filter or contact salesforce.com about custom indexing. Even if a field is indexed a filter might still not be selective when: 
1. The filter value includes null (for instance binding with a list that contains null) 
2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times)
You should always filter the records you process in a trigger, at least for standard objects like Account, Contact, Lead, Opportunity, Case, etc., because these objects tend to hold, or grow to, huge amounts of data. Even if your trigger works today, it would fail in the future as the record count increases.

On the other hand, when you have a hard requirement to process all records of objects like these, always opt for an asynchronous process like Batch Apex, which can work through large volumes of data in the background.

Hope this helps you.
Thanks,
Pankaj.