Tuesday, November 15, 2016

Salesforce SOQL tricks

General Governor Limits

Max query rows: 50,000
Max DML rows: 10,000
Max batch size: 2,000
SOQL for loop chunk size: 200
Selective query threshold (large objects): 200,000
Number of SOQL queries (synchronous): 100
Number of SOQL queries (asynchronous): 200


For testing, we have 200,000+ records in the EventQueue__c table.


Scenario 1

Query Editor
Select id from EventQueue__c

Anonymous Block
List<EventQueue__c> eventQueues = [Select id from EventQueue__c];

Issue
  • The query runs fine from the Query Editor
  • The anonymous Apex block errors out with "Too many query rows: 50001" because the query returns more than 50,000 records.

Possible Fix

Limit
List<EventQueue__c> eventQueues = [Select id from EventQueue__c limit 50000];

SOQL for loop
A SOQL for loop is better because it retrieves records in chunks of 200; the debug logs show it uses query cursors (queryMore) internally.
for(List<EventQueue__c> eventQueues : [Select id from EventQueue__c limit 50000] ) {
     System.debug(' eventQueues ' + eventQueues.size() );
}




Scenario 2

Query Editor
select count() from EventQueue__c

Anonymous Block
Integer eventCount = Database.countQuery('select count() from EventQueue__c');

Issue
  • The query runs fine from the Query Editor
  • The anonymous Apex block errors out because count() counts every matching row against the 50,000-row limit.

Possible Fix

Limit
Integer eventCount = Database.countQuery('select count() from EventQueue__c limit 50000');




Scenario 3

Anonymous Block
List<EventQueue__c> eventQueues = new List<EventQueue__c>();
for(List<EventQueue__c> eventQueuesBatch : [Select id from EventQueue__c limit 50000] ) {
     eventQueues.addAll( eventQueuesBatch );
}
update eventQueues;

Issue
  • The anonymous Apex block errors out on the update statement because we cannot process more than 10,000 records via DML in a single transaction.

Possible Fix

Limit to 10k
List<EventQueue__c> eventQueues = new List<EventQueue__c>();
for(List<EventQueue__c> eventQueuesBatch : [Select id from EventQueue__c limit 10000] ) {
     eventQueues.addAll( eventQueuesBatch );
}
update eventQueues;





Scenario 4

Query Editor
select count() from EventQueue__c where deleteFlag__c = true limit 1

Anonymous Block
Integer eventCount = Database.countQuery('select count() from EventQueue__c where deleteFlag__c = true limit 1');

Issue
  • The query runs fine from the Query Editor
  • The anonymous Apex block errors out because, when an object has more than 200,000 records, the query filter must be selective. In the query above, deleteFlag__c is not indexed and more than half of the records have deleteFlag__c = true.
  • Error message: Non-selective query against large object type (more than 200000 rows). Consider an indexed filter or contact salesforce.com about custom indexing. Even if a field is indexed a filter might still not be selective when: 1. The filter value includes null (for instance binding with a list that contains null) 2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times)

Possible Fix
As the error suggests, filter on an indexed column (e.g. a lookup field, an external ID, or Id) to constrain the returned results.
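A hedged sketch of what that can look like (CreatedDate is indexed by default; the date filter here is purely illustrative):

```apex
// Constrain the scan with an indexed field (CreatedDate) so the
// query planner can use the index; the unindexed deleteFlag__c
// filter is then applied to a much smaller row set.
Integer eventCount = Database.countQuery(
    'select count() from EventQueue__c' +
    ' where CreatedDate = LAST_N_DAYS:7 and deleteFlag__c = true limit 1'
);
```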

Thursday, October 27, 2016

Apex Describe Methods

In dynamic SOQL, it is critical to check that the object exists and that the related field also exists. Nothing fancy, but the code below comes in quite handy to check whether an object and a field exist:

      public static Boolean fieldExists(String objectName, String fieldName) {
           try {
                Schema.SObjectType salesforceObject = Schema.getGlobalDescribe().get(objectName);
                // The keys of the field map are lowercase API names, so a
                // case-insensitive check is just a containsKey on the lowercased name
                return salesforceObject.getDescribe().fields.getMap().containsKey(fieldName.toLowerCase());
           } catch(Exception e) {
                // An unknown object name leaves salesforceObject null and lands here
                System.debug(e);
                return false;
           }
      }
      public static Boolean relationshipExists(String objectName, String relationshipName) {
           try {
                Schema.SObjectType salesforceObject = Schema.getGlobalDescribe().get(objectName);
                Map<String, Schema.SObjectField> fields = salesforceObject.getDescribe().fields.getMap();
                for(Schema.SObjectField field : fields.values() ) {
                     Schema.DescribeFieldResult describe = field.getDescribe();
                     // A relationship is a Reference-type field whose relationship name matches
                     if( describe.getType() == Schema.DisplayType.Reference
                             && describe.getRelationshipName() != null
                             && describe.getRelationshipName().equalsIgnoreCase(relationshipName) ) {
                          return true;
                     }
                }
           } catch(Exception e) {
                System.debug(e);
           }
           return false;
      }


To Verify:

 Boolean b1 = relationshipExists('Account','Owner');  
 Boolean b2 = relationshipExists('Quote','Owner');  
 Boolean b3 = relationshipExists('ACCOUNT','OWNER');  
 Boolean b4 = relationshipExists('ACCOUNT','OWNER-');  
 System.debug(' b1 ' + b1 );  
 System.debug(' b2 ' + b2 );  
 System.debug(' b3 ' + b3 );  
 System.debug(' b4 ' + b4 );  

Results:

 b1 true  
 b2 false  
 b3 true  
 b4 false  

Wednesday, September 7, 2016

Salesforce Batch Processing

Problem statement

Every time there is a change (create/update/delete) to Salesforce data, we have to do quite a bit of processing and then update an external system over a web-service or HTTP callout. When a record is updated, we might be in a normal trigger, a Visualforce controller, a scheduled job, a batch, a queueable, or a future method. Sometimes we cannot call the external system (e.g. a trigger has uncommitted DML pending, or we are in a scheduled job), and sometimes we should not, because it would delay the current operation and hurt the user experience. It is preferable to do it asynchronously. We also have to support large batches of changes: e.g. 5,000 accounts can be updated at once, and we will need to do the processing and update the external system accordingly.

Current Salesforce Solution Limitation 



Batch Processing
Let's say we use batch processing in combination with a trigger: upon update of records, we start a batch job.

  • We can only have 100 batch jobs queued at a time.
  • If the change occurs in a future method, we cannot start a batch directly.

Similar concerns apply to most approaches. Hence, we used the approach below to do the processing.

Approach
  • We created an EventQueue table (custom object)
  • When a record changes, we push a row for it to the EventQueue table
  • We wrote a batch job to process the data from the EventQueue table
  • At the end of the batch (the finish() method), we restart the batch if the EventQueue still has data
Approach Add On
  • We also wanted to start the batch automatically when we insert records into the EventQueue table, so we don't have to start or stop the batch manually.


  • The code can be initiated from a trigger, or someone can call our API directly with a list of record ids
  • If there is more than one record
    • Insert all records into the Event Queue (1 row per record)
    • Start batch processing
  • Else if the current context is a queueable ( System.isQueueable() == true )
    • Insert the record into the Event Queue
    • Start batch processing
  • Else if the current context is a scheduled job ( System.isScheduled() == true )
    • Insert the record into the Event Queue
    • Start batch processing
  • Else if we are in a batch
    • If it is our batch (the Event Queue processing batch)
      • Run the main code to do the processing and call the external system
    • Else
      • Insert the record into the Event Queue
      • Start batch processing
  • Else if we are in a future method
    • Insert the record into the Event Queue
    • Start batch processing
  • Else
    • Call the main code via a future method to do the processing

This ensures the main code runs either in a separate future call (one at a time) or in a batch (one at a time). The caller is never blocked by this processing.
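The dispatch logic above can be sketched roughly as follows; stageAndStartBatch, isEventQueueBatch, processAndCallExternalSystem, and processViaFuture are illustrative names, not the actual implementation:

```apex
public static void handleChange(List<Id> recordIds) {
    if (recordIds.size() > 1) {
        stageAndStartBatch(recordIds);               // 1 EventQueue row per record
    } else if (System.isQueueable() || System.isScheduled()) {
        stageAndStartBatch(recordIds);
    } else if (System.isBatch()) {
        if (isEventQueueBatch()) {
            processAndCallExternalSystem(recordIds); // our own batch: run the main code
        } else {
            stageAndStartBatch(recordIds);           // someone else's batch: stage the work
        }
    } else if (System.isFuture()) {
        stageAndStartBatch(recordIds);
    } else {
        processViaFuture(recordIds);                 // normal context: hand off to @future
    }
}
```

The System.isQueueable(), System.isScheduled(), System.isBatch(), and System.isFuture() checks are the standard Apex context tests.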


Batch Processing

start()
Queries the EventQueue table for all records

execute()
Calls the entry point via the API (shown as "Direct call" in the figure above)

finish()
If the EventQueue still has records (more were inserted while we were processing), initiates another job.
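A minimal batch skeleton along these lines (the class name is illustrative, and the actual entry-point call is elided):

```apex
public class EventQueueBatch implements Database.Batchable<sObject> {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // All pending rows from the EventQueue table
        return Database.getQueryLocator('select Id from EventQueue__c');
    }

    public void execute(Database.BatchableContext bc, List<sObject> scope) {
        // Call the entry point via the API ("Direct call" in the figure),
        // then remove the processed rows so finish() can detect new work
        delete scope;
    }

    public void finish(Database.BatchableContext bc) {
        // Rows inserted while we were running? Kick off another job.
        if ([select count() from EventQueue__c limit 1] > 0) {
            Database.executeBatch(new EventQueueBatch());
        }
    }
}
```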



Approach Add On - Implementation
  • We also wanted to start the batch automatically when we insert records into the EventQueue table, so we don't have to start or stop the batch manually.
This turned out to be quite complex, because you cannot start batch processing from every context.
E.g. if we are in a future context, we can add records to the EventQueue table, but cannot start batch processing directly (a Salesforce limitation).

Similarly, if a custom batch calls our API with, say, 20 records, we put them in the EventQueue but cannot start the batch.

Hence, we used the algorithm below to solve the problem:


Start Batch Processing Call

If we are in a future method or a batch
  • We cannot start a batch directly, hence use an indirect route via a queueable job
  • Check the EventQueue record count
  • Check AsyncApexJob to see whether the queueable job already exists
  • Check AsyncApexJob to see whether the batch already exists
  • If count > 0 and no batch and no queueable exists
    • Enqueue the queueable job
Else
  • Check AsyncApexJob to see whether the batch already exists
  • Check the EventQueue table count
  • If count > 0 and the batch doesn't exist, start batch processing

Start Queue Processing Call
  • Check AsyncApexJob to see whether the batch already exists
  • Check the EventQueue table count
  • If count > 0 and the batch doesn't exist, start batch processing

This ensures that as soon as we put data in the EventQueue table, the batch starts processing those records, and once all records are processed, the batch finishes.
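The guard logic can be sketched as below; EventQueueBatch and EventQueueStarterQueueable are hypothetical class names, while JobType, ApexClass.Name, and Status are the standard AsyncApexJob fields:

```apex
public static void startBatchProcessing() {
    Integer pending = [select count() from EventQueue__c limit 1];
    Integer running = [select count() from AsyncApexJob
                       where JobType = 'BatchApex'
                         and ApexClass.Name = 'EventQueueBatch'
                         and Status in ('Queued', 'Preparing', 'Processing')];
    if (System.isFuture() || System.isBatch()) {
        // Cannot start a batch here; go through a queueable instead
        Integer queued = [select count() from AsyncApexJob
                          where JobType = 'Queueable'
                            and ApexClass.Name = 'EventQueueStarterQueueable'
                            and Status in ('Queued', 'Processing')];
        if (pending > 0 && running == 0 && queued == 0) {
            System.enqueueJob(new EventQueueStarterQueueable());
        }
    } else if (pending > 0 && running == 0) {
        Database.executeBatch(new EventQueueBatch());
    }
}
```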

Tuesday, August 16, 2016

SOSL in Org with namespace or managed package

When we had some SOSL queries in a managed package, they failed with the error below:


An internal server error has occurred
An error has occurred while processing your request. The salesforce.com support team has been notified of the problem. If you believe you have additional information that may be of help in reproducing or correcting the error, please contact Salesforce Support. Please indicate the URL of the page you were requesting, any error id shown on this page as well as any other related information. We apologize for the inconvenience. 

Thank you again for your patience and assistance. And thanks for using salesforce.com! 

Error ID: 368384005-116910 (1415744368)

If your org has a namespace prefix, SOQL issued from Apex already has the prefix applied; e.g. the queries below work just fine and don't throw any error:

List<Account> accounts = [select id, name from Account limit 5];
List<logMessage__c> logs = [select id from logMessage__c limit 5];
System.debug( accounts );
System.debug( logs );

However, if your org (or managed package) has a namespace prefix, SOSL issued from Apex does not get the prefix applied automatically; e.g. the code below breaks:

List<List<sObject>> soslResults1 = search.query('FIND \'Exception*\' IN ALL FIELDS RETURNING LogMessage__C(Id, name, message__c)');
List<List<sObject>> soslResults2 = [FIND 'Exception*' IN ALL FIELDS RETURNING LogMessage__C(Id, name, message__c)];
logMessage__c [] log1 = ((List<LogMessage__c>)soslResults1[0]);
logMessage__c [] log2 = ((List<LogMessage__c>)soslResults2[0]);
System.debug( log1 );
System.debug( log2 );



The fix is to add the namespace prefix:

List<List<sObject>> soslResults1 = search.query('FIND \'Exception*\' IN ALL FIELDS RETURNING mynamespace__LogMessage__C(Id, name, message__c)');
List<List<sObject>> soslResults2 = [FIND 'Exception*' IN ALL FIELDS RETURNING mynamespace__LogMessage__C(Id, name, message__c)];
logMessage__c [] log1 = ((List<LogMessage__c>)soslResults1[0]);
logMessage__c [] log2 = ((List<LogMessage__c>)soslResults2[0]);
System.debug( log1 );
System.debug( log2 );



And of course, we can add the namespace prefix programmatically, so we don't need to hard-code it:

// Derive the namespace prefix from any class in the package
// ('TestClass' is a placeholder for one of your own class names)
ApexClass cs = [select NamespacePrefix from ApexClass where Name = 'TestClass'];
String nameSpacePrefix = cs.NamespacePrefix;
if( String.isNotBlank(nameSpacePrefix) ) {
    nameSpacePrefix += '__';
} else {
    nameSpacePrefix = '';
}