I've created a trigger on the Opportunity object that updates a lookup field with the value of the OwnerId field.
Here is the trigger code:
trigger trig_Opportunity_CreateOppOwner on Opportunity (before insert, before update) {
    // Collect the Ids of Opportunities whose owner lookup should be refreshed
    List<Id> oppsId = new List<Id>();

    for (Opportunity opp : Trigger.new) {
        if (Trigger.isInsert) {
            // Trigger.oldMap is null on insert, so skip the old-value check to avoid a NullPointerException
            oppsId.add(opp.Id);
        } else if (Trigger.oldMap.get(opp.Id).Op_Owner__c != Trigger.oldMap.get(opp.Id).OwnerId) {
            // On update, only re-process records whose lookup did not already match the owner
            oppsId.add(opp.Id);
        }
    }

    List<Opportunity> oppsFromDb = [SELECT Id, OwnerId FROM Opportunity WHERE Id IN :oppsId];
    for (Opportunity useOpp : oppsFromDb) {
        Trigger.newMap.get(useOpp.Id).Op_Owner__c = useOpp.OwnerId;
    }
}
This trigger works, but when I test it with a Talend batch job that updates 2,000 Opportunity records per request (about 10,000 Opportunities in total), I get a "Too many SOQL queries: 101" error, because the trigger fires automatically on each load and the transaction exceeds the 100 SOQL query governor limit.
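As far as I understand, the standard Limits methods can show how close a transaction gets to that limit; a debug line like the following inside the trigger (just for diagnosis, not part of the fix) would print the usage:

    System.debug('SOQL queries used: ' + Limits.getQueries() + ' of ' + Limits.getLimitQueries());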
I was thinking of fixing this with a Batch Apex process. Here is roughly what the pseudocode would look like:
if (oppsId.size() <= 100) {
    List<Opportunity> oppsFromDb = [SELECT Id, OwnerId FROM Opportunity WHERE Id IN :oppsId];
    for (Opportunity useOpp : oppsFromDb) {
        Trigger.newMap.get(useOpp.Id).Op_Owner__c = useOpp.OwnerId;
    }
} else {
    // Call a batch Apex class that would handle it
}
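For the else branch, the batch class I have in mind would look roughly like this (OppOwnerBatch and its constructor parameter are placeholder names I made up, not existing code):

public class OppOwnerBatch implements Database.Batchable<sObject> {
    private Set<Id> oppIds;

    public OppOwnerBatch(Set<Id> oppIds) {
        this.oppIds = oppIds;
    }

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // Query only the Opportunities collected by the trigger
        return Database.getQueryLocator(
            [SELECT Id, OwnerId, Op_Owner__c FROM Opportunity WHERE Id IN :oppIds]);
    }

    public void execute(Database.BatchableContext bc, List<Opportunity> scope) {
        // Copy the owner into the lookup field and save the records
        for (Opportunity opp : scope) {
            opp.Op_Owner__c = opp.OwnerId;
        }
        update scope;
    }

    public void finish(Database.BatchableContext bc) {
    }
}

The else branch would then kick it off with something like Database.executeBatch(new OppOwnerBatch(new Set<Id>(oppsId)), 200); (the scope size of 200 is just an example).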
Would this be a correct approach to work around this problem? Thanks