Data Loader error on upsert - Duplicate value found
I am using the Data Loader to run batched upserts against five objects, keyed on external IDs that need to be unique so that every run of the upsert references the same Salesforce records; each time I run it, some of the records carry different values.
The problem I'm having is that sometimes the existing records are matched and updated as expected, and sometimes I get the error "duplicate value found". It's roughly a 50/50 split between the outcome I want and the error. In theory, I should never get errors when I re-run the upsert, since it should simply update the matched records with the new values.
Removing the unique restriction on the external ID is not an option for me: if more than one record can share the same value, it defeats the purpose of using an external ID at all.
I haven't been able to find anything on this so far, so any help would be greatly appreciated.
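For reference, the behavior being relied on here is standard upsert-by-external-ID matching: if a record with the given external ID value exists it is updated, otherwise a new one is created. A minimal sketch of the equivalent API call using the simple_salesforce Python library (the library choice, the Contact object, and the field name are assumptions, not the actual setup described above):

```python
# Hypothetical illustration of an upsert keyed on an external ID field.
# simple_salesforce, Contact, and Phone_Key__c are assumed names.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="***", security_token="***")

# If a Contact with Phone_Key__c = '5551234' already exists it is updated;
# otherwise a new Contact is inserted. Re-running hits the same record.
sf.Contact.upsert("Phone_Key__c/5551234", {"LastName": "Smith", "Email": "smith@example.com"})
```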
Does your dataset have duplicates in it? For example, does the external ID "abc" exist more than once in the file you are importing? That can cause this error to be thrown. Try setting the batch size to 1 (a setting in the Data Loader) to prove that is the issue.
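A quick way to check is to count how often each external ID value appears in the file before loading it. A minimal sketch in Python (the file name and column name are assumptions; adjust them to match your export):

```python
# Count duplicate external ID values in the CSV being upserted.
# "upsert_data.csv" and "External_Id__c" are assumed names.
import csv
from collections import Counter

with open("upsert_data.csv", newline="") as f:
    counts = Counter(row["External_Id__c"] for row in csv.DictReader(f))

dupes = {ext_id: n for ext_id, n in counts.items() if n > 1}
print(f"{len(dupes)} external ID value(s) appear more than once")
for ext_id, n in sorted(dupes.items()):
    print(f"  {ext_id}: {n} rows")
```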
I have run some tests with a couple of smaller data sets and that looks like the issue.
My data set does contain repeated external ID values (phone numbers). Is there any way to get them all through without errors and without manipulating the data before upserting it?
Set the Batch Size to 1 in the Data Loader settings. The import will take longer, since only one record is sent per API call, but it guarantees that duplicate external IDs are never submitted in the same upsert call.
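Batch size 1 avoids the error because each API call then contains a single record, so two rows that share an external ID can never collide inside the same call. If pre-processing the file is acceptable, another option is to collapse rows that share an external ID before the load, so a normal batch size (e.g. 200) also works. A minimal Python sketch (file and column names are assumptions, and keeping the last row per ID is just one possible merge rule):

```python
# Collapse rows that share an external ID so each ID appears once per load.
# "upsert_data.csv" and "External_Id__c" are assumed names.
import csv

with open("upsert_data.csv", newline="") as f:
    reader = csv.DictReader(f)
    fieldnames = reader.fieldnames
    rows = list(reader)

# Later rows overwrite earlier ones, so the last value for each ID wins.
deduped = {row["External_Id__c"]: row for row in rows}

with open("upsert_data_deduped.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(deduped.values())
```

Either approach ensures the same external ID value is never sent twice within one upsert call.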
It seems that has taken care of it.
Thanks for the help!
Thanks! Helped me too.
Hi All,
Without setting the batch size to 1, is there any other option to get rid of this error when updating through an external ID field?
I have the same situation here: Account Key (an external ID) is being updated via upsert in a trigger, and it gives a duplicate Account Key error.
Thanks in advance; any help would be much appreciated.