Friday, June 23, 2017

Post Install Script Framework

As an ISV, we often have to write post install scripts for upgrades. Salesforce provides a facility for this: https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_install_handler.htm

However, there are quite a few limitations with this approach:

  • It is hard to tell whether the post install code has already been executed or not
  • When developing an upgrade script, we don't know what the version number will be when it gets published
  • It is hard to stop the execution or retry it on failure


Hence we implemented the same idea using changesets, integrated with the Salesforce Post Install Handler. Changesets have been an industry-wide practice for a long time, and here are some of the benefits:


  • Multiple post install scripts
  • Execution of post install scripts in order - and only once
  • On error in any script:
    • Stop/halt the execution
    • Send an error email
  • On successful completion of all scripts:
    • Send a summary email covering all scripts
  • Each script gets a full set of governor limits
    • In the case of Salesforce, an entire batch job is devoted to a given script


With the above points in mind, we created the framework below, where an ISV can plug in any post install script with minimal effort:


Framework



1. Entry point class - which implements the Salesforce interface InstallHandler
   This class just calls PostInstallService.startService
 
2. PostInstallService.startService
   This method scans for all classes that extend PostInstallScriptTemplate in the current namespace
   and inserts them into the PostInstallScript__c object, if they don't already exist.

3. PostInstallService.startService calls PostInstallService.executeNextScript
 
4. PostInstallService.executeNextScript
   Based on the data in PostInstallScript__c, it calls the next PostInstallScriptTemplate(N)
   PostInstallScriptTemplate implements the batch interface, so execution happens asynchronously
   [Note: There is a callback to PostInstallService when PostInstallScriptTemplate(N) completes or errors out]
 
5. Once PostInstallScriptTemplate(N) completes or errors out,
   it updates the PostInstallScript__c object with the Status and Execution Log
 
6. PostInstallScriptTemplate(N) then calls back into the framework via PostInstallService.executeNextScript

7. PostInstallService.executeNextScript
   Based on the data in PostInstallScript__c, it will either:
   a) Halt execution (if there was an error)
   b) Move on to the next script
   c) Finish, if all post install scripts have completed successfully

8. A user interface displays currently Pending/Completed/Errored scripts along with their Execution Logs

9. The user interface also provides a facility to resubmit a script if it Errored
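
Concretely, the template that all scripts extend can be pictured as an abstract base class. Below is a minimal, hypothetical sketch (the hook names match the sample scripts in the next section; the real plumbing around PostInstallScript__c is omitted), not the exact framework source:

```apex
/**
 * Hypothetical sketch of the framework's base class. A concrete script only
 * overrides the hooks it needs; the framework drives it as a Salesforce batch.
 */
public abstract class PostInstallScriptTemplate implements
        Database.Batchable<SObject>, Database.Stateful {

    // Required hooks: every script supplies its order, description, and log.
    public abstract Integer getSequenceNumber();
    public abstract String getDescription();
    public abstract String getExecutionLog();
    public abstract void executeScript(Database.BatchableContext bc, List<SObject> sObjects);

    // Optional hooks with defaults; override them when a real batch query is needed.
    public virtual Integer getBatchSize() { return 200; }
    public virtual Database.QueryLocator startScript(Database.BatchableContext bc) {
        // Default: a single dummy row, so executeScript runs exactly once.
        return Database.getQueryLocator('SELECT Id FROM Organization LIMIT 1');
    }
    public virtual void finishScript(Database.BatchableContext bc) {}

    // Batchable plumbing delegates to the hooks and reports back to the service.
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return startScript(bc);
    }
    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        executeScript(bc, scope);
    }
    public void finish(Database.BatchableContext bc) {
        finishScript(bc);
        // Here the framework would record Status/Execution Log on
        // PostInstallScript__c and call PostInstallService.executeNextScript.
    }
}
```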
 

Post Install Scripts



 
An ISV can write a post install script in two ways:

1) If Batch context is not needed, the post install script can be written as below.
The description, sequence number, and execution log are stored in the database;
the actual post install logic lives in the executeScript method.


/**
 * Created by cshah on 5/30/2017.
 */

public with sharing class PostInstallScript1 extends PostInstallScriptTemplate {

    private String executionLog;

    public override void executeScript(Database.BatchableContext bc, List<SObject> sObjects) {
        System.debug('PostInstallScript1 : execute : hoping to get executed only once ');
        executionLog  = 'script 1 done. ';
    }

    public override Integer getSequenceNumber() {
        System.debug('PostInstallScript1 : getSequenceNumber ');
        return 1;
    }

    public override String getExecutionLog() {
        System.debug('PostInstallScript1 : getExecutionLog ');
        return executionLog;
    }

    public override String getDescription() {
        System.debug('PostInstallScript1 : getDescription ');
        return 'Script 1 Description ';
    }
}


2) If we need to query 50k+ records or update 10k+ records, we have to use Batch context, and here is another way to write a post install script. In this case, we also override the start and finish methods, just as we do for any Salesforce Batch:


/**
 * Created by cshah on 5/30/2017.
 */

public without sharing class PostInstallScript3 extends PostInstallScriptTemplate {

    private String executionLog;
    private Integer processedCount = 0;

    public override void executeScript(Database.BatchableContext bc, List<SObject> sObjects) {
        System.debug('PostInstallScript3 : execute : hoping to get executed only once ');
        executionLog  = 'script 3 done. ';
        processedCount += sObjects.size();
    }

    public override Integer getSequenceNumber() {
        System.debug('PostInstallScript3 : getSequenceNumber ');
        return 3;
    }

    public override String getExecutionLog() {
        System.debug('PostInstallScript3 : getExecutionLog ');
        executionLog = ' Processed ' + processedCount + ' records ';
        return executionLog;
    }

    public override String getDescription() {
        System.debug('PostInstallScript3 : getDescription ');
        return 'Script 3 Description ';
    }

    public override Integer getBatchSize() {
        System.debug('PostInstallScript3 : getBatchSize ');
        return 1;
    }

    public override Database.QueryLocator startScript(Database.BatchableContext bc) {
        System.debug('PostInstallScript3 : startScript ');
        return Database.getQueryLocator('select id, name from account limit 201');
    }

    public override void  finishScript(Database.BatchableContext bc) {
        System.debug('PostInstallScript3 : finishScript ');
    }

}



User Interface


The user interface allows viewing the post install scripts and their execution logs.
It also allows resubmitting a script in case of an error.

Source code 

1. It can be found on GitHub: https://github.com/c-shah/PostInstallScriptFramework
2. Or as an unmanaged package: https://login.salesforce.com/packaging/installPackage.apexp?p0=04tf400000099bW

Monday, May 1, 2017

OData/Heroku with Salesforce - Integrate differently

Below is the standard pattern we usually come across when an external app talks to Salesforce or Salesforce talks to an external application. We use various APIs to talk to Salesforce, and use workflow outbound messages or REST/SOAP calls for outbound calls.

Below is a different approach using OData; in many cases it can make the integration very simple, with minimal to no code on Salesforce.

What is OData?
It is a standard way to represent data. Details can be found at http://www.odata.org/documentation/; at a very high level, it is a web service way of representing data, much like a relational database. Main features:

Metadata:
There is a metadata endpoint ($metadata) to get information about all schemas, tables, columns, and procedures.

SQL-like operations
We can perform SQL-like operations instead of creating a new operation for each query type (e.g. get employee by first name, by last name, etc.)
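
For instance, a lookup that needs a WHERE clause in SQL becomes a $filter query option on the entity set URL; one endpoint serves every filter combination. The Employee table and its columns here are illustrative:

```
-- Classic SQL
SELECT Id, LastName FROM Employee WHERE FirstName = 'John'

-- OData 4.0 equivalent
GET /odata.svc/Employees?$select=Id,LastName&$filter=FirstName eq 'John'
```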

Here is the naming convention (left: classic relational database, right: OData 4.0 naming):



Heroku
Heroku is very well known and a lot of documentation can be found at https://www.heroku.com/, so I will focus on only two items:

Heroku Connect
A one-way or bi-directional link from Salesforce objects/fields to Heroku tables/columns. Any change in Salesforce is migrated to the Heroku Postgres database over an extremely fast and efficient SQL link, and if a bi-directional link is configured, any change on Heroku is posted back to Salesforce.




Heroku App Engine
We can host custom Java/Node (and other supported language) applications on Heroku with just the click of a button. Hence, I wrote a custom Java app using Apache Olingo and hosted it on the Heroku platform.


This app generates metadata, connects to Heroku Postgres to get data, and exposes everything as an OData service using the Apache Olingo framework.

Notes:
  • Had to use Tomcat (instead of http://sparkjava.com), as Olingo requires a servlet container
  • Need to implement two interfaces
    • Metadata interface (to render schema, entity, entity set)
    • Data processor (to fetch and return the data)
  • Had to remove the HTTP Accept header, as it was causing issues with Apache Olingo

Putting everything together

  • Once it is exposed as an OData service, Salesforce can connect to it, and every EntitySet exposed in OData becomes available as an external object in Salesforce (ending with __x)
  • We can use SOQL, SOSL, and indirect lookups on those Salesforce objects
  • This requires zero code on Salesforce; on the Heroku side, we can get data from a cache, Postgres, or an external app using REST or SOAP APIs

On the Salesforce side, it is just a matter of providing the URL; Salesforce automatically lists all the entity sets and creates external objects for the ones selected.
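
Once the external objects exist, they can be queried like any other object. A hypothetical example, assuming the OData service exposes an Employees entity set that Salesforce maps to Employee__x (the field names are illustrative):

```apex
// External objects end with __x; their custom fields still end with __c.
// ExternalId is a standard field on every external object.
List<Employee__x> employees =
    [SELECT ExternalId, FirstName__c, LastName__c
     FROM Employee__x
     WHERE LastName__c = 'Smith'
     LIMIT 10];
System.debug(employees);
```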




Final Take



  • Reiterating the first diagram, Heroku Connect provides an alternative to accessing and updating data via API, and it is super helpful if the app lives on Heroku or wants to connect directly to a Postgres database
  • Salesforce can connect to an external app via OData on Heroku, which reduces the code in the Salesforce org and promotes a clicks-over-code approach

Code for the Apache Olingo implementation can be found at:
https://github.com/spring-work/odata

Heroku App:
http://odata-cshah.herokuapp.com/odata.svc/

Thursday, February 2, 2017

Call Salesforce REST API from Apex

Nothing new here, but the example below helps make calls to the Salesforce REST API from Apex. If you need to know org limits at run time, most of them are available via the Limits class, but some are only available via the REST API at /services/data/v37.0/limits (e.g. daily async limits, email or bulk email limits, etc.). It is easy to view this information from workbench.developerforce.com; however, if you need it in code, here it is:


Add your org URL to Remote Site Settings.
Note: if you don't know it, you can look at the browser address bar or run the call below:
System.debug( URL.getSalesforceBaseUrl().toExternalForm() );


Run the anonymous block below, which is broken down into three pieces:
1. Get the base URL
2. Get the auth token
3. Make the actual HTTP request

 /* 1. get base URL */  
 public static String getSalesforceInstanceUrl() {  
       return URL.getSalesforceBaseUrl().toExternalForm();
 }  
   
 public static String getRestResponse(String url) {   
       HttpRequest httpRequest = new HttpRequest();  
       httpRequest.setEndpoint(url);  
       httpRequest.setMethod('GET');  
       /* 2. set the auth token (a single Bearer header is sufficient) */
       httpRequest.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());  
       try {  
             Http http = new Http();  
             /* initiate the actual call */  
             HttpResponse httpResponse = http.send(httpRequest);  
             if (httpResponse.getStatusCode() == 200 ) {  
                   return JSON.serializePretty( JSON.deserializeUntyped(httpResponse.getBody()) );  
             } else {  
                   System.debug(' httpResponse ' + httpResponse.getBody() );  
                   throw new CalloutException( httpResponse.getBody() );  
             }   
       } catch( System.Exception e) {  
             System.debug('ERROR: '+ e);  
             throw e;  
       }  
       return null;  
 }  
   
 System.debug(' -- limit method code block -- : start ');  
 String baseUrl = getSalesforceInstanceUrl();  
 System.debug(' -- baseUrl-- : ' + baseUrl );  
 String limitsUrl = baseUrl + '/services/data/v37.0/limits';  
 System.debug(' -- limitsUrl-- : ' + limitsUrl );  
 String response = getRestResponse(limitsUrl);  
 System.debug(' -- response-- : ' + response );  
   

Tuesday, November 15, 2016

Salesforce SOQL tricks

General Governor Limits

Max query rows (per transaction): 50,000
Max DML rows (per transaction): 10,000
Max batch size: 2,000
SOQL for-loop chunk size: 200
Selective query threshold: 200,000
Number of SOQL queries (synchronous): 100
Number of SOQL queries (asynchronous/batch): 200


For testing, we have 200,000+ records in the EventQueue__c table.


Scenario 1

Query Editor
Select id from EventQueue__c

Anonymous Block
List<EventQueue__c> eventQueues = [Select id from EventQueue__c];

Issue
  • The query from the Query Editor works fine
  • The Apex anonymous block errors out, as it would return more than 50k records

Possible Fix

Limit
List<EventQueue__c> eventQueues = [Select id from eventQueue__c limit 50000];  

SOQL for loop
This is better, as it retrieves records in chunks of 200, which you can see in the logs.
for(List<EventQueue__c> eventQueues : [Select id from EventQueue__c limit 50000] ) {
     System.debug(' eventQueues ' + eventQueues.size() );
}




Scenario 2

Query Editor
select count() from EventQueue__c

Anonymous Block
Integer eventCount = Database.countQuery('select count() from EventQueue__c');

Issue
  • The query from the Query Editor works fine
  • The Apex anonymous block errors out, as count() has to work against 50,000+ rows

Possible Fix

Limit
Integer eventCount = Database.countQuery('select count() from EventQueue__c limit 50000');




Scenario 3

Anonymous Block
List<EventQueue__c> eventQueues = new List<EventQueue__c>();
for(List<EventQueue__c> eventQueuesBatch : [Select id from EventQueue__c limit 50000] ) {
 eventQueues.addAll( eventQueuesBatch );
}
update eventQueues;

Issue
  • The Apex anonymous block errors out on the update statement, because we cannot update more than 10k records via DML in one transaction.

Possible Fix

Limit to 10k
List<EventQueue__c> eventQueues = new List<EventQueue__c>();
for(List<EventQueue__c> eventQueuesBatch : [Select id from EventQueue__c limit 10000] ) {
 eventQueues.addAll( eventQueuesBatch );
}
update eventQueues;





Scenario 4

Query Editor
select count() from EventQueue__c where deleteFlag__c = true limit 1

Anonymous Block
Integer eventCount = Database.countQuery('select count() from EventQueue__c where deleteFlag__c = true limit 1');

Issue
  • The query from the Query Editor works fine
  • The Apex anonymous block errors out: when an object has more than 200,000 records, the query filter has to be selective. In the query above, deleteFlag__c is not indexed, and more than half of the records have deleteFlag__c set.
  • Error Message : Non-selective query against large object type (more than 200000 rows). Consider an indexed filter or contact salesforce.com about custom indexing. Even if a field is indexed a filter might still not be selective when: 1. The filter value includes null (for instance binding with a list that contains null) 2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times) 

Possible Fix
As the error suggests, use an indexed column (e.g. a lookup field - a foreign key - to constrain the returned results).

Thursday, October 27, 2016

Apex Describe Methods

In dynamic SOQL, it is critical to check whether an object exists and whether a related field exists. Nothing fancy, but the code below comes in quite handy for checking whether an object and a field exist:

      public static boolean fieldExists(String objectName, String fieldName) {  
           try {  
                Schema.SObjectType salesforceObject = Schema.getGlobalDescribe().get(objectName);  
                Map<String, Schema.SObjectField> fields = salesforceObject.getDescribe().fields.getMap();  
                for(String field : fields.keySet() ) {  
                     if( field.equalsIgnoreCase(fieldName) ) {  
                          return true;  
                     }  
                }  
           } catch(Exception e) {  
                System.debug(e);  
                return false;  
           }  
           return false;  
      }  
      public static boolean relationshipExists(String objectName, String relationshipName) {  
           try {  
                Schema.SObjectType salesforceObject = Schema.getGlobalDescribe().get(objectName);  
                Map<String, Schema.SObjectField> fields = salesforceObject.getDescribe().fields.getMap();  
                for(String field : fields.keySet() ) {  
                     if( fields.get(field).getDescribe().getType() == Schema.DisplayType.Reference && fields.get(field).getDescribe().getRelationshipName() != null && fields.get(field).getDescribe().getRelationshipName().equalsIgnoreCase(relationshipName) ) {  
                          return true;  
                     }  
                }  
           } catch(Exception e) {  
                System.debug(e);  
                return false;  
           }  
           return false;  
      }  


To Verify:

 Boolean b1 = relationshipExists('Account','Owner');  
 Boolean b2 = relationshipExists('Quote','Owner');  
 Boolean b3 = relationshipExists('ACCOUNT','OWNER');  
 Boolean b4 = relationshipExists('ACCOUNT','OWNER-');  
 System.debug(' b1 ' + b1 );  
 System.debug(' b2 ' + b2 );  
 System.debug(' b3 ' + b3 );  
 System.debug(' b4 ' + b4 );  

Results:

 b1 true  
 b2 false  
 b3 true  
 b4 false  

Wednesday, September 7, 2016

Salesforce Batch Processing

Problem statement

Every time there is a change (create/update/delete) to Salesforce data, we had to do quite a bit of processing and then update an external system over a WS or HTTP callout. When the record is updated, we might be in a normal trigger, a Visualforce controller, a scheduled job, a batch, a queueable, or a future method. Sometimes we cannot update the external system (e.g. the trigger has an update pending, or we are in a scheduled job), and sometimes we should not, as it would delay the current operation and hurt the user experience. It is preferable to do this in an asynchronous manner. We also have to support huge batches of changes: e.g. 5,000 accounts can be updated at once, and we need to do the processing and update the external system accordingly.

Current Salesforce Solution Limitation 



Batch Processing
Let's say we use batch processing in combination with a trigger: upon update of records, we start batch processing.

  • We can only queue a limited number of batches (100 in the flex queue).
  • If the change occurs in a future method, then we cannot call a batch directly

Similar concerns apply to the majority of approaches. Hence, we used the approach below to do the processing.

Approach
  • We created an EventQueue table
  • When there is a change in a record, we push the record to the EventQueue table
  • We wrote a batch process to process the data from the EventQueue table
  • At the end of the batch process (the finish method), we restart the batch if there is still data in the EventQueue
Approach Add On
  • We also wanted to start the batch when we insert records into the EventQueue table, so we don't have to start or stop the batch manually.


  • Code initiation can be either from a trigger, or someone can call our API directly with a list of record ids
  • If there is more than 1 record
    • Insert all records in the Event Queue (1 row per record)
    • Start Batch Processing
  • Else if the current context is Queueable ( System.isQueueable() == true )
    • Insert the record in the Event Queue
    • Start Batch Processing
  • Else if the current context is Scheduled ( System.isScheduled() == true )
    • Insert the record in the Event Queue
    • Start Batch Processing
  • Else if we are in a Batch
    • If it is our batch (the Event Queue processing batch)
      • Run the main code to do the processing and call the external system
    • Else
      • Insert the record in the Event Queue
      • Start Batch Processing
  • Else if we are in a Future call
    • Insert the record in the Event Queue
    • Start Batch Processing
  • Else
    • Call the main code via a future call to do the processing
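
The decision tree above can be sketched using the standard context checks on the System class. Everything other than the System.is* methods (the helper names and the batch class) is hypothetical:

```apex
public static void process(List<Id> recordIds) {
    if (recordIds.size() > 1) {
        insertEventQueue(recordIds);    // 1 row per record
        startBatchProcessing();
    } else if (System.isQueueable() || System.isScheduled()) {
        insertEventQueue(recordIds);
        startBatchProcessing();
    } else if (System.isBatch()) {
        if (isEventQueueProcessingBatch()) {
            runMainCode(recordIds);     // we ARE the processing batch already
        } else {
            insertEventQueue(recordIds);
            startBatchProcessing();
        }
    } else if (System.isFuture()) {
        insertEventQueue(recordIds);
        startBatchProcessing();
    } else {
        runMainCodeAsFuture(recordIds); // @future entry point; caller is not blocked
    }
}
```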

This ensures the main code runs either in a separate future call (one at a time) or in a batch (one at a time). The caller is never blocked by this processing.


Batch Processing

start()
Queries the EventQueue table for all records

execute()
Calls the entry point via the API (shown as the direct call in the figure above)

finish()
If the EventQueue still has records (more records got inserted while we were processing), initiates another job.



Approach Add On - Implementation
  • We also wanted to start the batch when we insert records into the EventQueue table, so we don't have to start or stop the batch manually.
This turned out to be quite complex, as you cannot always start batch processing from any context.
E.g. if we are in a future context, we can add records to the EventQueue table but cannot start batch processing - a Salesforce limitation.

Similarly, if a custom batch is calling our API with, say, 20 records, we put them in the EventQueue but cannot start a batch.

Hence, we used the algorithm below to solve the problem:


Start Batch Processing Call

If we are in a Future or Batch context
  • We cannot start a batch directly, hence we use an indirect route and start the batch via a Queueable
  • Check the EventQueue count
  • Check AsyncApexJob to see if the Queueable already exists
  • Check AsyncApexJob to see if the Batch already exists
  • If count > 0 and there is no Batch and no Queueable
    • Enqueue the Queueable
Else
  • Check AsyncApexJob to see if the Batch already exists
  • Check the EventQueue table count
  • If count > 0 and the Batch doesn't exist, start Batch processing

Start Queue Processing Call
  • Check AsyncApexJob to see if the Batch already exists
  • Check the EventQueue table count
  • If count > 0 and the Batch doesn't exist, start Batch processing
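
The "already exists" checks can be done by querying AsyncApexJob. A sketch, assuming the processing batch is a class named EventQueueProcessingBatch (hypothetical name):

```apex
// Is our processing batch already queued or running?
Integer running = [
    SELECT count()
    FROM AsyncApexJob
    WHERE JobType = 'BatchApex'
      AND ApexClass.Name = 'EventQueueProcessingBatch'
      AND Status IN ('Holding', 'Queued', 'Preparing', 'Processing')
];

// Is there anything left to process?
Integer pending = [SELECT count() FROM EventQueue__c];

if (pending > 0 && running == 0) {
    Database.executeBatch(new EventQueueProcessingBatch());
}
```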

This ensures that as soon as we put data into the EventQueue table, a batch is started to process those records, and once all records are processed, the batch finishes.

Tuesday, August 16, 2016

SOSL in Org with namespace or managed package

When we had some SOSL in a managed package, we got the error below:


An internal server error has occurred
An error has occurred while processing your request. The salesforce.com support team has been notified of the problem. If you believe you have additional information that may be of help in reproducing or correcting the error, please contact Salesforce Support. Please indicate the URL of the page you were requesting, any error id shown on this page as well as any other related information. We apologize for the inconvenience. 

Thank you again for your patience and assistance. And thanks for using salesforce.com! 

Error ID: 368384005-116910 (1415744368)

If your org has a namespace prefix, SOQL issued from Apex already has the prefix applied; e.g. the code below works just fine and doesn't throw any error:

List<Account> accounts = [select id, name from Account limit 5];
List<logMessage__c> logs = [select id from logMessage__c limit 5];
System.debug( accounts );
System.debug( logs );

However, if your org (or managed package) has a namespace prefix, SOSL issued from Apex doesn't get the prefix applied automatically; e.g. the code below breaks:

List<List<sObject>> soslResults1 = search.query('FIND \'Exception*\' IN ALL FIELDS RETURNING LogMessage__C(Id, name, message__c)');
List<List<sObject>> soslResults2 = [FIND 'Exception*' IN ALL FIELDS RETURNING LogMessage__C(Id, name, message__c)];
logMessage__c [] log1 = ((List<LogMessage__c>)soslResults1[0]);
logMessage__c [] log2 = ((List<LogMessage__c>)soslResults2[0]);
System.debug( log1 );
System.debug( log2 );



The fix is to add the prefix:

List<List<sObject>> soslResults1 = search.query('FIND \'Exception*\' IN ALL FIELDS RETURNING mynamespace__LogMessage__C(Id, name, message__c)');
List<List<sObject>> soslResults2 = [FIND 'Exception*' IN ALL FIELDS RETURNING mynamespace__LogMessage__C(Id, name, message__c)];
logMessage__c [] log1 = ((List<LogMessage__c>)soslResults1[0]);
logMessage__c [] log2 = ((List<LogMessage__c>)soslResults2[0]);
System.debug( log1 );
System.debug( log2 );



And of course, we can determine the namespace prefix programmatically, so we don't need to hard-code it:

ApexClass cs = [select NamespacePrefix from ApexClass where Name = 'TestClass'];
String nameSpacePrefix = cs.NamespacePrefix;
if( nameSpacePrefix != null && nameSpacePrefix != '' ) {
    nameSpacePrefix += '__';
} else {
    nameSpacePrefix = '';
}
// use the resolved prefix when building the SOSL query
List<List<sObject>> soslResults = Search.query('FIND \'Exception*\' IN ALL FIELDS RETURNING ' + nameSpacePrefix + 'LogMessage__c(Id, Name, Message__c)');