Question

I'm trying to write an "after update" trigger that does a batch update on all child records of the record that has just been updated. This needs to handle 15k+ child records at a time. Unfortunately, the limit appears to be 100, which is so far below my needs that it's not even close to acceptable. I haven't tried splitting the records into batches of 100 each, since that would still cap me at 10k updates per trigger execution. (Maybe I could just daisy-chain triggers together? Ugh.)

Does anyone know what series of hoops I can jump through to overcome this limitation?

Edit: I tried calling the following @future function in my trigger, but it never updates the child records:

global class ParentChildBulkUpdater
{
    @future
    public static void UpdateChildDistributors(String parentId) {
        // Fetch every child account of the parent that fired the trigger
        Account[] children = [SELECT Id FROM Account WHERE ParentId = :parentId];

        for (Account child : children) {
            child.Site = 'Bulk Updater Fired';
        }
        update children;
    }
}

Solution

It's worse than that: you're not even going to be able to get those 15k records in the first place, because there is a 1,000-row query limit within a trigger. (That limit scales with the number of rows the trigger is being called for, but that probably doesn't help here.)

I guess your only way to do it is with the @future annotation - read up on that in the docs, as it gives you much higher limits. You can only make a limited number of @future calls per day, though, so you may need to somehow keep track of which parent objects have had their children updated, and then process the remainder offline.
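For illustration, here is a hedged sketch of that pattern (class, method, and trigger names are hypothetical; @future methods accept only primitives and collections of primitives, hence the Set<Id> parameter rather than the records themselves):

global class ChildDistributorUpdater {
    @future
    public static void updateChildren(Set<Id> parentIds) {
        // One query covers the children of every parent in the trigger batch
        List<Account> children = [SELECT Id FROM Account WHERE ParentId IN :parentIds];
        for (Account child : children) {
            child.Site = 'Bulk Updater Fired';
        }
        update children;
    }
}

trigger ParentAfterUpdate on Account (after update) {
    // Guard: updating the children re-fires this trigger inside the future
    // context, where calling another @future method is not allowed
    if (!System.isFuture()) {
        ChildDistributorUpdater.updateChildren(Trigger.newMap.keySet());
    }
}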

A final option may be to use the API via some external tool. But you'll still have to make sure everything in your code is batched up.

I thought these limits were draconian at first, but actually you can do a hell of a lot within them if you batch things correctly - we regularly update thousands of rows from triggers. And from an architectural point of view, anything much bigger than that is really batch processing anyway, which isn't normally kicked off by a trigger. One thing's for sure: they make you jump through hoops to do it.

OTHER TIPS

The best (and easiest) route to take with this problem is to use Batch Apex: you can create a batch class and fire it from the trigger. Like @future, it runs in a separate thread, but it can process up to 50,000,000 records!

You'll need to pass some information to your batch class before calling Database.executeBatch so that it has the list of parent IDs to work with, or you could just get all of the accounts, of course ;)
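As a rough sketch of what that could look like (the class and variable names are made up; the Site field update mirrors the question's code):

global class ChildAccountBatch implements Database.Batchable<SObject> {
    private Set<Id> parentIds;

    global ChildAccountBatch(Set<Id> parentIds) {
        this.parentIds = parentIds;
    }

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // A QueryLocator can stream up to 50 million rows through the job
        return Database.getQueryLocator(
            [SELECT Id FROM Account WHERE ParentId IN :parentIds]);
    }

    global void execute(Database.BatchableContext bc, List<Account> scope) {
        // Each chunk (200 records by default) gets its own governor limits
        for (Account child : scope) {
            child.Site = 'Bulk Updater Fired';
        }
        update scope;
    }

    global void finish(Database.BatchableContext bc) {}
}

The trigger then just hands over the parent IDs:

// In the after update trigger on the parent accounts
Database.executeBatch(new ChildAccountBatch(Trigger.newMap.keySet()));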

I've only just noticed how old this question is, but hopefully this answer will help others.

I think Codek is right - going the API / external tool route is a good way to go. The governor limits still apply, but they are much less strict for API calls. Salesforce recently revamped their DataLoader tool, so that might be something to look into.

Another thing you could try is using a Workflow rule with an Outbound Message to call a web service on your end. Just send over the parent object and let a process on your end handle the child record updates via the API. One thing to be aware of with outbound messages: it's best to queue up the process on your end somehow and respond to Salesforce immediately, otherwise Salesforce will resend the message.

@future doesn't work (does not update records at all)? Weird. Did you try exercising your function in an automated test? It should work, and the annotation should be ignored (during the test it will be executed instantly; test methods have higher limits). I suggest you investigate this a bit more - it seems like the best solution for what you want to accomplish.
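A minimal test sketch (the class and method names come from the question's code; everything else is assumed), using Test.startTest()/Test.stopTest() to force the queued @future call to complete before the assertion runs:

@isTest
private class ParentChildBulkUpdaterTest {
    static testMethod void updatesChildRecords() {
        Account parent = new Account(Name = 'Parent');
        insert parent;
        Account child = new Account(Name = 'Child', ParentId = parent.Id);
        insert child;

        Test.startTest();
        ParentChildBulkUpdater.UpdateChildDistributors(parent.Id);
        Test.stopTest(); // the @future method finishes here

        child = [SELECT Site FROM Account WHERE Id = :child.Id];
        System.assertEquals('Bulk Updater Fired', child.Site);
    }
}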

Also - maybe try to call it from your class, not the trigger?

Daisy-chaining triggers together will not work; I've tried it in the past.

Your last option might be Batch Apex (available since the Winter '10 release, so all organisations should have it by now). It's meant for mass data update/validation jobs - the kind of thing you'd typically run overnight in a normal database (and it can be scheduled). See http://www.salesforce.com/community/winter10/custom-cloud/program-cloud-logic/batch-code.jsp and the release notes PDF.
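For the overnight flavour, a hedged sketch of the scheduling side (reusing the hypothetical ChildAccountBatch from the earlier answer, and assuming the parents are the top-level accounts):

global class NightlyChildUpdate implements Schedulable {
    global void execute(SchedulableContext sc) {
        // Assumption: parents are accounts with no parent of their own
        Set<Id> parentIds = new Map<Id, Account>(
            [SELECT Id FROM Account WHERE ParentId = null]).keySet();
        Database.executeBatch(new ChildAccountBatch(parentIds));
    }
}

// One-off setup from anonymous Apex: run nightly at 1 a.m.
// System.schedule('Nightly child update', '0 0 1 * * ?', new NightlyChildUpdate());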

I believe the 1,000-row limit was removed in version 18 of the API (so the documentation says, but in some cases I still hit a limit).

So you may be able to use Batch Apex with a single Apex update statement. Something like:

List<childObject__c> children = new List<childObject__c>();

// (query elided in the original - filter on the parent as appropriate)
for (childObject__c c : [SELECT Id, foo__c FROM childObject__c]) {
    c.foo__c = 'bar';
    children.add(c);
}
update children;

Be sure to bulkify your trigger as well - see http://sfdc.arrowpointe.com/2008/09/13/bulkifying-a-trigger-an-example/
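A rough example of the bulkified shape (names are hypothetical; note that updating child accounts re-fires the trigger, so real code would also guard against recursion, e.g. with a static flag):

trigger UpdateChildSites on Account (after update) {
    // One SOQL query and one DML statement for the whole trigger batch,
    // no matter how many parent rows were updated at once
    List<Account> children = [SELECT Id FROM Account
                              WHERE ParentId IN :Trigger.newMap.keySet()];
    for (Account child : children) {
        child.Site = 'Bulk Updater Fired';
    }
    if (!children.isEmpty()) {
        update children;
    }
}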

Maybe a change to your data model is the better option here. Consider creating a formula field on the child object that reads the data from the parent; that would probably be far more efficient.
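For instance (assuming the Account parent/child setup from the question), a text formula field on the child defined as Parent.Site would always show the parent's current Site value. Cross-object formulas read the parent at display time rather than copying data, so no trigger, SOQL, or DML is involved at all.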

Licensed under: CC-BY-SA with attribution