Most Common Load Errors on Salesforce and How to Handle Them

Pratima Shrivastav
Jul 20, 2021

Who is this article for?

This is for you if you are starting a large data migration to Salesforce or are curious about these errors. You will also find it helpful if you are a Data Architect who does not necessarily work on the Salesforce platform.

Here are the most common errors I have come across on large migration projects. I am excluding validation errors and other required-field errors, and focusing instead on system and governor-limit errors.

System.LimitException: Apex CPU time limit exceeded:

This is one of the most common errors during a data load in a complex org with multiple automations running at once, and it can occur even when deleting or updating.

The most logical thing to do is make a list of the automations you can turn off to reduce the CPU time. Talk to your Salesforce admin or developer.

The easiest way to solve this is by turning off the majority of the automation, both code and config.

Note: there might be triggers that can't be turned off in your production instance unless they are written with a bypass mechanism, and sometimes the business doesn't want them turned off for business reasons. In such cases, you can also solve this issue by reducing the batch size of your data load. Whether you are using Boomi, another ETL tool, or Data Loader, there is always an option to set the batch size.

Try different sizes, starting from a large number like 1,000, then 500, 200, or sometimes even 50 or below. And yes, this will increase the load time substantially, but that is quite normal for a complex org. Keep track of the load times for each object so you can give a better estimate of the timeline. It's about finding the right batch size while eliminating the errors.
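
If you are scripting the load yourself rather than relying on a tool's setting, the idea is the same. Here is a minimal Python sketch of the batching concept; submit_batch is a hypothetical stand-in for whatever actually performs the load (a Boomi connector, a Bulk API job, a Data Loader run):

```python
# Minimal sketch of the "batch size" knob: split the full record set into
# smaller chunks and submit them one at a time.

BATCH_SIZE = 200  # try 1000, 500, 200, or even 50 until the CPU-time errors stop

def submit_batch(batch):
    # Placeholder: in a real run this would hand the chunk to your load tool.
    print(f"submitting {len(batch)} records")

def chunked(records, size):
    """Yield successive slices of `size` records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

records = [{"Name": f"Account {n}"} for n in range(1, 1001)]  # dummy data

for batch in chunked(records, BATCH_SIZE):
    submit_batch(batch)
```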

Always perform mock tests and loads in Full sandboxes.

CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:

This error is quite common, but by itself it is not specific enough to tell you where the problem is coming from.

A bulk load through an ETL tool doesn't give you a specific error message (in my case the tool was Boomi), so it's absolutely fine to ask your Boomi expert to run the job in a few smaller batches so you can get a specific error. This is just a status code, not an error message by itself.

It usually comes combined with errors like 'Too many query rows: 500001', 'Unable to lock row', a trigger failure, or even a Process Builder failure.

UNABLE_TO_LOCK_ROW: unable to obtain exclusive access to this record:

This error most commonly occurs when you're loading child records related to a parent object (master-detail). The error message means exactly what it says. When you load a child record, Salesforce usually places a brief lock on the parent record for ownership calculation, roll-up summary fields, and any other processes that might exist. When you create a single record this is not very significant, and most of the time you won't notice it unless you have a very complex org. But when you load a huge dataset of child records, it becomes quite significant. Refer to the record-locking cheat sheet to learn which objects have a high chance of locking, such as Opportunity, Contact, AccountTeamMember, and Case.

Here are a few things you can try to solve this error.

If the owner (OwnerId) of the parent record is an integration user, make sure this user is placed at the top of the role hierarchy, or even better has no role assigned, so that sharing recalculation doesn't happen.

If you are using the Bulk API, enable serial mode so that records are not updated in parallel, which reduces the chance of two child records being inserted against the same parent at the same time. This will increase load time considerably because you are not utilizing the parallel load feature, so it should be used as a last resort.
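
For reference, serial mode is just a flag on the Bulk API job itself. Below is a minimal Python sketch of creating a Bulk API (1.0) job with concurrencyMode set to Serial; the instance URL, API version, and session Id are placeholders you would supply from your own login, and most ETL tools and Data Loader expose the same setting as a checkbox or property:

```python
import requests

# Placeholders: supply your own org's instance URL and a valid session Id
# (for example from a prior OAuth or SOAP login).
INSTANCE_URL = "https://yourInstance.my.salesforce.com"
API_VERSION = "52.0"
SESSION_ID = "<session id>"

# Bulk API 1.0 job definition with serial concurrency instead of parallel.
job_xml = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>insert</operation>
  <object>Contact</object>
  <concurrencyMode>Serial</concurrencyMode>
  <contentType>CSV</contentType>
</jobInfo>"""

response = requests.post(
    f"{INSTANCE_URL}/services/async/{API_VERSION}/job",
    headers={
        "X-SFDC-Session": SESSION_ID,
        "Content-Type": "application/xml; charset=UTF-8",
    },
    data=job_xml,
)
print(response.status_code, response.text)  # jobInfo XML echoes the mode back
```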

Again, reducing the batch size helps here as well. If you reduce the batch size to 500 or so, it definitely reduces the chance of a lock error, and this can be used in combination with the option above. Be careful, though: a smaller batch size means more batches, and you can quickly end up consuming the limit of 10,000 batches per 24 hours (for example, 5 million records at a batch size of 500 is already 10,000 batches). And if you are making your batch size less than 200, you don't need the Bulk API at all, because Salesforce already performs the operations in chunks of 200.

If the objects have workflows, Process Builders, or triggers, deactivate them during the load if at all possible, because when there is a field update on a related record, Salesforce locks that record as well.

If there are lookup relationships, consider relaxing the lookup field's deletion behavior, for example by allowing the lookup value to be cleared instead of blocking deletion of the referenced record. This actually removes the lock on insert, delete, or update.

I highly recommend having a ticket open with Salesforce support during a huge data migration so they can help you increase or temporarily relax the limits. I have learned this the hard way.

If possible, sort the records you are loading by their parent Ids. This reduces the chance of the same parent being updated in different batches at the same time.
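
Here is a small, hypothetical Python sketch of that idea: sort the child rows on their parent reference field before cutting them into batches, so rows that share a parent tend to land in the same batch:

```python
from operator import itemgetter

# Hypothetical child rows (Contacts) about to be loaded; in practice these
# would come from your extract file or staging table.
records = [
    {"LastName": "Doe",   "AccountId": "001xx0000001AAA"},
    {"LastName": "Smith", "AccountId": "001xx0000002BBB"},
    {"LastName": "Lee",   "AccountId": "001xx0000001AAA"},
]

# Sort on the parent reference so rows sharing a parent end up together,
# lowering the odds of two batches competing for the same parent lock.
records.sort(key=itemgetter("AccountId"))

BATCH_SIZE = 200
batches = [records[i:i + BATCH_SIZE] for i in range(0, len(records), BATCH_SIZE)]
print(f"{len(batches)} batch(es), first parent: {batches[0][0]['AccountId']}")
```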

If you are still with me, here are some helpful links:

