Wednesday, October 11, 2017

Salesforce Streaming API Summary

I was just working with the Streaming API, so here are my thoughts on its different variations:


  1. Push Topic
  2. Generic Topic
  3. Platform Event 

Even though, under the covers, they all use the same technology stack, they provide very different features.


Comparison

Overview
  • Push Topic : used for SOQL-based subscriptions
  • Generic Topic : used to publish and subscribe to arbitrary events
  • Platform Events : used for structured publish and subscribe; there is a lot more native support for both publish and subscribe, and full control over the structure (payload) of the event

Replay
When clients disconnect and reconnect, they can replay events from the last 24 hours, starting from the replay id where they left off. Special ids: -2 replays from the beginning (all retained events), -1 delivers only new events.
  • Push Topic : supported (version 36.0+)
  • Generic Topic : supported (version 36.0+)
  • Platform Events : supported (version 36.0+)

Create (Setup)
  • Push Topic : 1) using Apex insert statements, or 2) Workbench (it defaults all the parameters except the SOQL query)
  • Generic Topic : 1) using the Streaming Channels tab, or 2) Workbench
  • Platform Events : create an __e object in Setup

Support for Trigger Subscription
  • Push Topic : No
  • Generic Topic : No
  • Platform Events : Yes

How to Publish
  • Push Topic : no direct publish; events fire upon data changes matching the SOQL
  • Generic Topic : using the REST API
  • Platform Events : EventBus.publish in Apex; the REST API (POST like any sObject, e.g. /services/data/v41.0/sobjects/Low_Ink__e/); Process Builder; Flow

How to Subscribe
  • Push Topic : Workbench (version 36.0; later versions were causing problems); Java (CometD lib); JavaScript (CometD lib)
  • Generic Topic : Workbench; Java (CometD lib); JavaScript (CometD lib)
  • Platform Events : Java (CometD); JavaScript (CometD); Visualforce (CometD); Process Builder; Flow; Trigger; Workbench

Channel Name (needed when we subscribe using the CometD library)
  • Push Topic : /topic/<topic name>, e.g. /topic/AccountUpdatePushTopic
  • Generic Topic : /u/<channel name>, e.g. /u/GenericUpdateTopic
  • Platform Events : /event/<event name>, e.g. /event/UpdateObjectEvent__e


Push Topic


Push Topic Setup


Workbench
We can set up a push topic using Workbench, but it doesn't allow granular control over the topic configuration.



APEX

  • NotifyForFields can be Referenced (fields referenced in the query - the default), Where (fields in the WHERE clause), or All (all field changes)
  • Once the topic is inserted, we can update it through the PushTopic sObject (see the snippet after the insert example below)
 PushTopic pushTopic = new PushTopic();  
 pushTopic.Name = 'AccountUpdatePushTopic';  
 pushTopic.Query = 'SELECT Id, Name, AccountNumber from Account';  
 pushTopic.ApiVersion = 40.0;  
 pushTopic.NotifyForOperationCreate = true;  
 pushTopic.NotifyForOperationUpdate = true;  
 pushTopic.NotifyForOperationUndelete = true;  
 pushTopic.NotifyForOperationDelete = true;  
 pushTopic.NotifyForFields = 'Referenced';  
 insert pushTopic;  
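
For instance, to later switch the topic to notify on all field changes (a small sketch against the PushTopic sObject):

 // Fetch the existing topic and flip its NotifyForFields setting  
 PushTopic pt = [SELECT Id, NotifyForFields FROM PushTopic WHERE Name = 'AccountUpdatePushTopic' LIMIT 1];  
 pt.NotifyForFields = 'All';  
 update pt;  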
   



Push Topic Publish

There is no API support for publishing. Any data change that matches the topic's configuration (e.g. the notify operations and NotifyForFields setting) fires an event on the topic.
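
For example, with the AccountUpdatePushTopic defined above, an ordinary DML operation that matches the query is all it takes to publish (a minimal sketch):

 Account a = [ SELECT Id, Name FROM Account LIMIT 1 ];  
 a.Name = a.Name + ' updated';  
 update a;  // subscribers on /topic/AccountUpdatePushTopic receive this update event  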


Push Topic Subscribe

Workbench

  • This is a quick way to test, and also to record the channel name, which we will need later for the CometD library

Java/JavaScript
covered later


Generic Topic



Generic Topic Setup

  • We must set up a generic topic using the Salesforce UI (the Streaming Channels tab)
  • There is no Apex or Workbench support for creating a generic topic



Generic Topic Publish

Rest API (Workbench)

URL : /services/data/v<version>/sobjects/StreamingChannel/<channel id>/push

We can find the streaming channel id with a query (SELECT Id, Name FROM StreamingChannel)

e.g. /services/data/v40.0/sobjects/StreamingChannel/0M61I000000TN1FSAW/push

payload : 

{
  "pushEvents": [
      {
          "payload": "Broadcast message to all subscribers",
          "userIds": []
      }
   ]

}
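
The same publish can also be done from Apex with a simple callout (a sketch; it assumes the /u/GenericUpdateTopic channel from above exists and that the org's own URL has been added to Remote Site Settings):

 // Look up the channel id, then POST the pushEvents payload to the push endpoint  
 StreamingChannel channel = [ SELECT Id FROM StreamingChannel WHERE Name = '/u/GenericUpdateTopic' LIMIT 1 ];  
 HttpRequest req = new HttpRequest();  
 req.setEndpoint(URL.getSalesforceBaseUrl().toExternalForm()  
     + '/services/data/v40.0/sobjects/StreamingChannel/' + channel.Id + '/push');  
 req.setMethod('POST');  
 req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());  
 req.setHeader('Content-Type', 'application/json');  
 req.setBody('{ "pushEvents": [ { "payload": "Broadcast message to all subscribers", "userIds": [] } ] }');  
 HttpResponse res = new Http().send(req);  
 System.debug(res.getStatusCode() + ' ' + res.getBody());  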



Generic Topic Subscribe

Workbench


Java/JavaScript:
Covered Later


Platform Events


Platform Event Setup


  • Essentially, I created two attributes (ObjectName__c and RecordId__c) so that we can publish events carrying those attributes


Platform Event Publish

Apex


  • The publish call doesn't throw on failure, hence we have to go through the results
  • Also, the publish call doesn't participate in the transaction, meaning that if the transaction fails and rolls back, the event is still published


 List<UpdateObjectEvent__e> events = new List<UpdateObjectEvent__e>();  
 UpdateObjectEvent__e event = new UpdateObjectEvent__e(objectName__c='Account', recordId__c='1234');  
 events.add( event );  
 List<Database.SaveResult> results = EventBus.publish(events);  
 if( results != null && results.size() > 0 ) {  
   for (Database.SaveResult sr : results) {  
     if (sr.isSuccess()) {  
       System.debug('Successfully published event. ' + results.size() );  
     } else {  
       for(Database.Error err : sr.getErrors()) {  
         System.debug('Error returned: ' + err.getStatusCode() + ' - ' + err.getMessage());  
       }  
     }  
   }  
 } else {  
   System.debug(' Nothing is published. ');  
 }  
   


Rest API

Endpoint  : /services/data/v40.0/sobjects/UpdateObjectEvent__e/
Payload    : { "RecordId__c" : "123455634343", "ObjectName__c" : "Account" }



Soap API
Similar to the REST API; it is just a create (insert) call on the event sObject.

Process Builder 
not covered, but straightforward

Visual Flow 
not covered, but straightforward


Platform Event Subscribe

Trigger

  • Only 'after insert' is supported

 trigger UpdateObjectEventTrigger on UpdateObjectEvent__e (after insert) {  
   System.debug(' UpdateObjectEventTrigger ' );  
     
   for (UpdateObjectEvent__e event : Trigger.New) {  
     System.debug(' : ' + event.RecordId__c + ' ' + event.ObjectName__c );  
   }  
     
 }  

Process Builder 
not covered, but straightforward

Visual Flow 
not covered, but straightforward

Java/JavaScript 
Covered later


Platform Event Debug

  • Debug statements in a platform event trigger don't show up in the regular debug logs; platform event triggers run as the Automated Process entity, so we need to enable a trace flag for that entity to see them







Generic Java Subscriber Client



We need to download and build the EMP connector from Salesforce:
  • Download: https://github.com/forcedotcom/EMP-Connector
  • Unzip, and run "mvn clean install"
  • Use emp-connector-0.0.1-SNAPSHOT-phat.jar in the new project that we are going to create
  • Create a new project (Java 1.8) and use the code below
  • Please note that the channel name changes according to which topic we are subscribing to



 package com.spring.client;  
   
 import com.salesforce.emp.connector.BayeuxParameters;  
 import com.salesforce.emp.connector.EmpConnector;  
 import com.salesforce.emp.connector.TopicSubscription;  
   
 import java.util.Map;  
 import java.util.concurrent.TimeUnit;  
 import java.util.function.Consumer;  
   
 import static com.salesforce.emp.connector.LoginHelper.login;  
   
 public class StreamingClient {  
   
   public static void main(String args[]) throws Exception {  
     long replayFrom = EmpConnector.REPLAY_FROM_EARLIEST;  
     BayeuxParameters params = login("streaming@springsoa.com", "Welcome1");  
     EmpConnector connector = new EmpConnector(params);  
     Consumer<Map<String, Object>> consumer = event -> System.out.println(String.format("Received:\n%s", event));  
     connector.start().get(5, TimeUnit.SECONDS);  
     TopicSubscription subscription = connector.subscribe("/event/UpdateObjectEvent__e", replayFrom, consumer ).get(5, TimeUnit.SECONDS);  
     System.out.println(String.format("Subscribed: %s", subscription));  
     //subscription.cancel();  
     //connector.stop();  
   }  
 }  
   



Generic Javascript Subscriber Client


We can use the CometD library to listen to the event. I had to write a custom wrapper (cometdCustom.js) to greatly simplify the Visualforce page. We can also use this in an independent HTML page, as long as we can get an OAuth session id.






Source Code

Java code                       :   https://github.com/c-shah/streaming-java-client    
Salesforce and JS code  :   https://github.com/c-shah/salesforce-streaming

Refresh Visualforce Page with Platform Events

Problem Statement 

A Visualforce page is embedded inside the standard page layout to display additional information from a third party application, as below.

A few use cases:
- Let's say there is a change inside the third party application content and we want to refresh the entire page
- The third party application is making changes to Salesforce data on this page using the API and we need to refresh the page



A few failed solutions

1) If the third party application is rendered inside an iframe, it cannot access the parent Salesforce page.

E.g. if we try one of the below, we get the error message shown, as Salesforce prevents a third party iframe from accessing the parent page.

 window.parent.location.href = URL  
 parent.location.href=parent.location.href  
 parent.location.reload();  
 window.parent.location.href = window.parent.location.href  

error message:

 Unsafe JavaScript attempt to initiate navigation for frame with URL 'https://c.na59.visual.force.com/servlet/servlet.Integration?lid=066f4000001yE8F&ic=1&...'.   
 The frame attempting navigation is neither same-origin with the target, nor is it the target's parent or opener  


2) Polling : in the parent Visualforce page we could constantly poll the server side to listen for changes and refresh the VF page as needed. Polling in general is not a good idea and is not very scalable.




Platform Events to the Rescue

An event based solution comes in quite handy here: we can publish an event on the server side, and the Visualforce page can listen for the event and, on the right criteria, alert the end user about the change or refresh the page so that we see fresh data.

Platform Events are a glorified version of the Streaming API; more details are in the separate post here.



Solution:

1. Create a platform event
2. Upon a change, publish the event using the REST API or Apex
3. On the Visualforce page, listen for those events and refresh the page




Create platform event 

  • Essentially, I created two attributes (ObjectName__c and RecordId__c) so that we can publish events carrying those attributes





Publish the event using the REST API or Apex


1) Publish via Apex
  • The publish call doesn't throw on failure, hence we have to go through the results
  • Also, the publish call doesn't participate in the transaction, meaning that if the transaction fails and rolls back, the event is still published

 List<UpdateObjectEvent__e> events = new List<UpdateObjectEvent__e>();  
 UpdateObjectEvent__e event = new UpdateObjectEvent__e(objectName__c='Account', recordId__c='1234');  
 events.add( event );  
 List<Database.SaveResult> results = EventBus.publish(events);  
 if( results != null && results.size() > 0 ) {  
   for (Database.SaveResult sr : results) {  
     if (sr.isSuccess()) {  
       System.debug('Successfully published event. ' + results.size() );  
     } else {  
       for(Database.Error err : sr.getErrors()) {  
         System.debug('Error returned: ' + err.getStatusCode() + ' - ' + err.getMessage());  
       }  
     }  
   }  
 } else {  
   System.debug(' Nothing is published. ');  
 }  
   

2) Publish via Rest API

Endpoint  : /services/data/v40.0/sobjects/UpdateObjectEvent__e/
Payload    : { "RecordId__c" : "123455634343", "ObjectName__c" : "Account" }





Listen to the event on the Visualforce page

We can use the CometD library to listen to the event. I had to write a custom wrapper (cometdCustom.js) to greatly simplify the Visualforce page.




Source code 
It can be found at : https://github.com/c-shah/salesforce-streaming


Monday, September 18, 2017

Environment Hub - One stop shop

It is always a nightmare to keep track of usernames / passwords / tokens for different dev orgs and sandboxes:

- Keeping an eye on different usernames / passwords for dev orgs and sandboxes
- Giving others access to your sandbox / dev org
- Whitelisting IPs to avoid being prompted for tokens
- Sandbox refreshes - changing emails and verification
- Scripts to reset passwords or profiles or emails, etc.

I believe Environment Hub is quite a sleek solution.

Install Environment Hub Application


  • You will need to contact customer support to have the app installed
  • In the case of a production org (non-ISV), it should be installed in the production org
  • For ISVs, it is more flexible, but it is preferable to have it in the same place as the LMA org


Configuration


  • Select the Environment Hub app
  • Add the Environment Hub tab
  • In the case of a production org, all sandboxes should be auto discovered
  • In the case of an ISV, we might want to register different developer orgs with the Environment Hub
  • We should give all users who need the Environment Hub the appropriate permissions on their profile:
    • Manage Environment Hub
    • Connect Organization to Environment Hub


Sandboxes


  • Sandboxes are auto discovered by the Environment Hub
  • We should enable SSO on them
  • Once SSO is enabled, the sandbox needs to be refreshed
  • Once that is done, any production user (with the Connect Organization permission) will be able to log in to that org!
  • No more email resets, password resets, IP whitelisting, ...



Development Orgs


  • We can connect any dev org to the Environment Hub
  • We should enable SSO on it
  • There are three different methods to map an Environment Hub user to a dev org user:
    • User name mapping - we can manually map a user name from the Env Hub to the dev org
    • Federation Id - in the case of SSO, as long as the federation id matches between the dev org and the Env Hub org
    • User name formula field - apply a user name formula so that any Env Hub user can be converted to one of the dev org users
In most cases, there is only one user that we care about, hence I use the third approach (formula field) to give all users in the Environment Hub access to the dev org.

E.g. if the dev org user is dev2@ot.com, I will make the formula field evaluate to "dev2@ot.com"; hence all Environment Hub users will map to "dev2@ot.com" and have full access to my dev org.





Salesforce Big Objects

I was just experimenting with Salesforce Big Objects and found them quite interesting. They are mainly used for big data (100M+ records) analytics, mostly for asynchronous data crunching in the style of Hadoop. However, there are some very critical distinctions to weigh before going with Big Objects.


  • Currently, Big Objects only support fields and permissions, and that's about it
  • We cannot have:
    • triggers
    • page layouts
    • extensive SOQL (indexed SOQL is supported, but that is extremely limited - which makes sense, as we are dealing with a humongous data set)
    • workflows, process builders, etc.
    • reports
  • Basically, it is completely non-UI, just a back end data store for big data analytics - and that's about it.

Use case

In an org, we can run surveys on an object record (e.g. Account, Opportunity, etc.), and we may want to store that survey data in a Big Object and analyze it later.


How to use it:


1. Create Big Object

  • There is no user interface to create a Big Object and its fields. We must use the Metadata API (via the Ant Migration Tool or Workbench) to create such artifacts. Obviously, Workbench makes it a lot easier.
  • Create the object file
  • Create the permission set file
  • Create the package.xml file
  • Bundle them in a .zip file with the right directory structure (you can download it from here)
  • Use Workbench to deploy the .zip file, and the Big Object should look like below
  • You can assign the permission set (Survey_BigObject) to the right users so they can query and update the data



  • Pay close attention to the indexed fields; the index is used when inserting records - to identify duplicates - and for issuing synchronous SOQL queries.

2. Insert Data

  • We can insert data just like we do in Apex for any other object, or we can use the Bulk API
  • There is no upsert operation; Salesforce automatically checks the record being inserted against the index, and if the index values are the same it does an update, otherwise an insert
  • Upon failure, no exception is thrown; we just need to look at the SaveResult(s)

 Account a = [ select id, name from account limit 1 ];  
 Survey__b survey = new Survey__b();  
 survey.WhatID__c = a.id;  
 survey.WhatTime__c = System.today() + 1;  
 survey.WhatObject__c = 'Account';  
 survey.Question1__c = 'What is the rating';  
 survey.Answer1__c = '1';  
 Database.SaveResult saveResult = database.insertImmediate(survey);  
 System.debug( ' success ' + saveResult.isSuccess() + ' ' + saveResult );  
   



3. Query Data

  • Querying data is quite tricky with Big Objects. Either you query all records, which will most probably fail once there are millions of records,
  • or you issue a synchronous SOQL against the indexed fields only. The indexed fields must also appear in order in the query. See below for an example:
 List<Survey__b> surveys = [ select id, WhatId__c, WhatObject__c, WhatTime__c, Question1__c, Answer1__c from Survey__b ];  
 for(Survey__b survey : surveys ) {  
   System.debug( survey );  
 }  
   
   
 System.debug(' -------------- indexed query -------------- ');  
 /** no gap is allowed and only indexed field in exact order can be used for query , we can skip but no gap is allowed, e.g. below  
 *  [select id from Survey__b] is fine  
 *  [select id from Survey__b where WhatID__c = :a.id ] is fine  
 *  [select id from Survey__b where WhatTime__c = :System.today() ] is NOT fine, as you can't jump to index2 without having index1 in the query.  
 * **/  
 Account a = [ select id, name from account limit 1 ];  
 List<Survey__b> surveys2 = [ select id, WhatId__c, WhatObject__c, WhatTime__c, Question1__c, Answer1__c from Survey__b where WhatID__c = :a.id and WhatTime__c = :System.today() ];  
 for(Survey__b survey : surveys2 ) {  
   System.debug( survey );  
 }  
   
   

4. Asynchronous SOQL


  • Asynchronous SOQL is only supported via the REST API
  • We have to provide the asynchronous SOQL and a custom object to store the result
  • It seems like only one Async SOQL job can run at any given time - at least in the org I worked on

4.1 Create Custom Object To Store Async Result


  • I created a survey analysis object (Survey_Analysis__c) to store the analysis of the query, with counts

4.2 Run Asynchronous SOQL

  • Below is what an asynchronous SOQL job looks like; we need to provide the SOQL, the target table, and the mapping between the selected fields and the target table's fields.

 {  
   "query": "select Question1__c, Answer1__c, count(whatId__c) c from Survey__B where WhatObject__c = 'Account' group by Question1__c, Answer1__c",  
   "operation": "insert",  
   "targetObject": "Survey_Analysis__c",  
   "targetFieldMap": {  
     "Question1__c": "Question1__c",  
     "Answer1__c": "Answer1__c",  
     "c":"Count__c"  
   }  
 }  



  • We can execute it using the REST API in Workbench, or from Apex as sketched below
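
A minimal Apex sketch of submitting the job (this assumes the Async SOQL REST endpoint /services/data/vXX.0/async-queries/ and that the org's own URL is listed in Remote Site Settings):

 HttpRequest req = new HttpRequest();  
 req.setEndpoint(URL.getSalesforceBaseUrl().toExternalForm() + '/services/data/v40.0/async-queries/');  
 req.setMethod('POST');  
 req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());  
 req.setHeader('Content-Type', 'application/json');  
 // Same JSON body as above: query, operation, targetObject, targetFieldMap  
 req.setBody('{ "query": "select Question1__c, Answer1__c, count(whatId__c) c from Survey__b where WhatObject__c = \'Account\' group by Question1__c, Answer1__c", "operation": "insert", "targetObject": "Survey_Analysis__c", "targetFieldMap": { "Question1__c": "Question1__c", "Answer1__c": "Answer1__c", "c": "Count__c" } }');  
 HttpResponse res = new Http().send(req);  
 System.debug(res.getStatusCode() + ' ' + res.getBody());  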


Once the asynchronous SOQL job is completed, we can query the Survey_Analysis__c object for the accumulated result.
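
For example, a quick sketch to inspect the accumulated rows (using the target fields mapped above):

 for (Survey_Analysis__c row : [ SELECT Question1__c, Answer1__c, Count__c FROM Survey_Analysis__c ]) {  
   System.debug(row.Question1__c + ' / ' + row.Answer1__c + ' : ' + row.Count__c);  
 }  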


Sunday, September 17, 2017

Salesforce Platform Cache

Nothing new, just a short summary on Platform Cache.

To oversimplify, Platform Cache is a glorified hash map. It is first divided into partitions. These are hard partitions: cache usage in one partition will not overflow into another. Usually, different partitions are used for different projects.

Partitions are further divided into an Org cache (available to all users) and a Session cache (scoped to a user's session). These are again hard partitions: Org cache will not overflow into Session cache. The minimum size for an Org or Session cache is 5MB.




To store a key in the Cache

 User u = [ select id, username from user where username = 'chintan_shah@abc.com' ];  
 Cache.Org.put( 'local.MyPartition1.key1', u );  
 System.debug(' key1 is stored ');  
   
 String name = 'Chintan';  
 Cache.OrgPartition myPartition1 = Cache.Org.getPartition('local.MyPartition1');  
 myPartition1.put('key2', name );  
 System.debug(' key2 is stored ');  

  • We can either put a key directly using Cache.Org.put, or get access to a specific partition using Cache.Org.getPartition
  • To work with a Session partition, Org simply changes to Session in the above code.


To retrieve a key from the Cache

 Object u = Cache.Org.get( 'local.MyPartition1.key1');  
 System.debug(' key1 is stored ' + u );  
   
 Cache.OrgPartition myPartition1 = Cache.Org.getPartition('local.MyPartition1');  
 System.debug(' key2 is stored ' + myPartition1.get('key2' ) );  




To clean the Cache


  • The cache is not a guaranteed persistent store; it can be cleaned at any time by Salesforce
  • It can also be wiped out during code deployment
  • The Session cache max TTL is 8 hours, and the Org cache's is 24 hours
  • Internally it uses LRU eviction when it hits the size limit, to clean up old data
Hence, we don't ever have to do the cleanup ourselves, but if we need to for some reason, we can use the code below. It has limitations if the cache was stored using a cache builder.

 for(String key : Cache.Org.getKeys() ) {  
   Cache.Org.remove(key);  
 }  
   
 for(String key : Cache.Session.getKeys() ) {  
   Cache.Session.remove(key);  
 }  



Cache Builder


Instead of explicitly storing and retrieving cache entries, it is better to provide a loading strategy to Platform Cache, so that upon a cache miss, Salesforce automatically calls our class to load the cache for that key. This reduces code and handles cache misses much more gracefully.

We specify the cache loading strategy as a class. Below is a small class which loads user information based on username. The idea is that the username is the key, and we load the user data if it is not already in the cache.


 /**  
  * Created by chshah on 9/14/2017.  
  */  
 public class UserInfoCache implements Cache.CacheBuilder {  
   public Object doLoad(String usernameKey) {  
     String username = keyToUserName(usernameKey);  
     System.debug(' UserInfoCache load usernameKey ' + usernameKey + ' userName ' + username );  
     User u = (User)[SELECT Id, firstName, LastName, IsActive, username FROM User WHERE username =: username];  
     return u;  
   }  
   public static String usernameToKey(String username) {  
     return username.replace('.','DOT').replace('@','ATRATE');  
   }  
   public static String keyToUserName(String key) {  
     return key.replace('DOT','.').replace('ATRATE','@');  
   }  
 }  


1:  String usernameKey = UserInfoCache.usernameToKey('chintan_shah@abc');  
2:  User u = (User) Cache.Org.get(UserInfoCache.class, 'local.MyPartition1.' + usernameKey );  
3:  System.debug( ' u ' + u );  


  • The reason for converting the username to usernameKey is that special characters are not allowed in platform cache keys (keys must be alphanumeric).
  • In line 2, we provide the cache loading strategy (UserInfoCache.class), hence if the key is not found, Salesforce will call our class to load it.


Considerations


  • An ISV (managed package) can supply its own cache, which will use its own namespace rather than "local"
  • Cache puts follow the same transaction boundary as DML, so a rollback due to failure will not put data in the cache
  • Mind the cache TTL limits (8/24 hours), plus the limit on how much data we can put per transaction (usually 1MB); where an explicit lifetime matters, see the snippet below
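
A small sketch of setting an explicit TTL: put also accepts a lifetime in seconds (capped at the platform maximums above):

 User u = [ SELECT Id, Username FROM User LIMIT 1 ];  
 // Keep this entry for one hour instead of relying on the default TTL  
 Cache.Org.put('local.MyPartition1.key1', u, 3600);  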



Saturday, September 16, 2017

Bad @Future

When writing Apex methods, it is very tempting to go for @future if we want some work to be done asynchronously in a separate transaction. @future is definitely easy to use, but comes with a lot of limitations.

Let's say I want to expose an API to all developers and want to do some work which is quite resource intensive, hence I break down my code and put the long-running part, which might need higher governor limits, in an @future method.

 /**  
  * Created by chshah on 9/15/2017.  
  */  
 public with sharing class MyCoolApex {  
   /**  
    * I plan to provide this method to the rest of the developers to consume.  
    */  
   public static void myExposedMethod() {  
     // do task 1  
     // do task 2  
     // do task 3  
     // do future task  
     System.debug(LoggingLevel.INFO, 'Inside myExposedMethod - Calling future task for additional work.');  
     doFutureTask();  
   }  
   @future  
   private static void doFutureTask() {  
     System.debug(LoggingLevel.INFO, 'Doing Future Resource Intensive Task');  
   }  
 }  

Here I am exposing MyCoolApex.myExposedMethod for other developers to consume, and calling doFutureTask to defer some resource intensive processing.

It works fine most of the time (until it doesn't), e.g. if someone calls:

 MyCoolApex.myExposedMethod();  

However, the problem comes when this method is called from a Batch, Scheduled, or Future context. A future method cannot be called from a batch or future context (nested futures are not allowed), so Salesforce halts the processing with a System.AsyncException. E.g. client code:


 /**  
  * Created by chshah on 9/16/2017.  
  */  
 public with sharing class TestInBatch implements Database.Batchable<sObject> {  
   public Database.QueryLocator start(Database.BatchableContext BC) {  
     return Database.getQueryLocator('select id from user limit 1');  
   }  
   public void execute(Database.BatchableContext BC, List<sObject> accounts) {  
     MyCoolApex.myExposedMethod();  
   }  
   public void finish(Database.BatchableContext BC) {  
   }  
 }  


 TestInBatch tb = new TestInBatch();  
 Database.executeBatch(tb);  


This results in an error. To work around it, we could put some guards inside our API, e.g. below, but that doesn't really solve the problem.

   public static void myExposedMethod() {  
     // do task 1  
     // do task 2  
     // do task 3  
     // do future task  
     if( System.isBatch() || System.isFuture() || System.isScheduled() ) {  
       System.debug(LoggingLevel.INFO, 'Inside myExposedMethod - Unable to call future method.');  
     } else {  
       System.debug(LoggingLevel.INFO, 'Inside myExposedMethod - Calling future task for additional work.');  
       doFutureTask();  
     }  
   }  


Correct Solution (Queueable) 

The right way to solve the problem is to use Queueable. Queueable has the fewest restrictions: you can enqueue a Queueable from Future, Batch, Schedulable, and even Queueable contexts. In a Developer Edition org we can chain Queueables up to a depth of 5; in an Enterprise org there is no limit on chaining.


 /**  
  * Created by chshah on 9/15/2017.  
  */  
 public with sharing class MyCoolApex {  
   /**  
    * I plan to provide this method to the rest of the developers to consume.  
    */  
   public static void myExposedMethod() {  
     // do task 1  
     // do task 2  
     // do task 3  
     // do future task  
     System.debug(LoggingLevel.INFO, 'Calling Queueable');  
     System.enqueueJob( new MyCoolApexQueuable() );  
   }  
 }  

 /**  
  * Created by chshah on 9/16/2017.  
  */  
 public with sharing class MyCoolApexQueuable implements Queueable, Database.AllowsCallouts {  
   public void execute(QueueableContext context) {  
     doFutureTask();  
   }  
   private static void doFutureTask() {  
     System.debug(LoggingLevel.INFO, 'Doing Resource Intensive Task');  
   }  
 }  


In the above example, the actual API method (myExposedMethod) just enqueues a job to defer the resource intensive work asynchronously. Now we don't have to worry about who (or from which context) calls our method.




Thursday, September 14, 2017

Ten Commandments for Salesforce Test

1. Thou shalt not use "Test.isRunningTest" in actual code
  • Use mocks, e.g. Test.setMock(HttpCalloutMock.class, new MyMockHttpService())
  • There might be an extreme case, but it is always best practice to keep the actual code clean, without Test.isRunningTest


2. Thou shalt use Test.startTest and Test.stopTest
  • They reset governor limits!
  • In some cases they are necessary, e.g. if you need to do DML to set up data and the actual test code makes a callout
  • All asynchronous code gets executed right away at Test.stopTest - and we never know which code will have asynchronous pieces, now or in the future


3. Thou shalt use System.runAs whenever possible
        It makes sure the code runs correctly under the expected profile.

 Profile p = [SELECT Id FROM Profile WHERE Name='Standard User'];  
 User u = new User(Alias = 'standt', Email='standarduser@testorg.com', EmailEncodingKey='UTF-8', LastName='Testing', LanguageLocaleKey='en_US', LocaleSidKey='en_US', ProfileId = p.Id, TimeZoneSidKey='America/Los_Angeles', UserName='standarduser@testorg.com');  
 System.runAs(u) {  
   System.debug('Current User: ' + UserInfo.getUserName());  
   System.debug('Current Profile: ' + UserInfo.getProfileId());  
 }  


4. Thou shalt use System.assert - after Test.stopTest - with a meaningful message

  • Must use System.assert (or System.assertEquals)
  • It should always come after Test.stopTest, as asynchronous code gets executed at the Test.stopTest line
  • Have a meaningful error message: System.assertEquals(A, B, 'Message');


5. Thou shalt use @testSetup
  • Only one method can have @testSetup
  • It is called in a separate transaction, so don't try to set static variables there; use it only for data preparation


6. Thou shalt use private for class
  • class should be marked @isTest and private 


7. Thou shalt use private for method
  • Method should be marked @isTest and private


8. Thou shalt use a Test Data Factory
  • Instead of creating test data in the test class, it should be a separate routine - as chances are the same data will be needed again.
  • We can use Test.loadData(Account.sObjectType, 'myResource') or create data based on parameters, but it is good to externalize it


9. Thou shalt never use @seeAllData
  • obviously!


10. Thou shalt test behavior over coverage



Based on the above, here is the template I use:

My Sample Class:

 /**  
  * Created by chshah on 9/14/2017.  
  */  
 public with sharing class My {  
   public static List<Contact> changeContactName(List<Contact> contacts) {  
     for(Contact c : contacts ) {  
       c.firstName = c.firstName.toUpperCase();  
     }  
     update contacts;  
     return contacts;  
   }  
 }  


Test Class:

 /**  
  * Created by chshah on 9/14/2017.  
  */  
 @isTest  
 private class MyTest {  
   @testSetup  
   private static void testSetup() {  
   }  
   @isTest  
   private static void testChangeContactName() {  
     Profile p = [SELECT Id FROM Profile WHERE Name='Standard User'];  
     User u = new User(Alias = 'standt', Email='standarduser@testorg.com', EmailEncodingKey='UTF-8', LastName='Testing', LanguageLocaleKey='en_US', LocaleSidKey='en_US', ProfileId = p.Id, TimeZoneSidKey='America/Los_Angeles', UserName='standarduser@testorg.com.testorg');  
     System.runAs(u) {  
       Contact c = MyTestFactory.createContact('Chintan','Shah');  
       Test.startTest();  
       List<Contact> contacts = My.changeContactName( new List<Contact> {c} );  
       Test.stopTest();  
       for(Contact con : contacts ) {  
         System.assertEquals(con.firstName, con.firstName.toUpperCase(), ' firstName must be in upper case ' + con.firstName );  
       }  
     }  
   }  
 }  


Data Factory:


 /**  
  * Created by chshah on 9/14/2017.  
  */  
 @isTest  
 public class MyTestFactory {  
   @TestVisible  
   private static Contact createContact(String firstName, String lastName) {  
     Contact c = new Contact(firstName = firstName, lastName = lastName );  
     insert c;  
     return c;  
   }  
 }  


Monday, August 14, 2017

Salesforce send admin email for error/success

Simple reference code for sending success/failure emails in Salesforce:

 public static List<String> toAddresses = new List<String>();  
 /** to cache transaction data and reduce SOQLs  
  */  
 public static List<String> getNotificationEmailAddress() {  
   if( toAddresses == null || toAddresses.size() == 0 ) {  
     Profile sysAdminProfile = [SELECT Id FROM Profile WHERE Name = 'System Administrator' LIMIT 1];  
     List<User> sysAdmins = [SELECT id, Email FROM User WHERE ProfileId = :sysAdminProfile.id];  
     for( User sysAdmin : sysAdmins ) {  
       toAddresses.add ( sysAdmin.Email );  
     }  
   }  
   return toAddresses;  
 }  
 public static void sendMail(String subject, String body, List<String> recipients) {  
   try {  
     Messaging.SingleEmailMessage message = new Messaging.SingleEmailMessage();  
     message.toAddresses = recipients;  
     message.optOutPolicy = 'FILTER';  
     message.subject = subject;  
     message.plainTextBody = body;  
     Messaging.SingleEmailMessage[] messages =  new List<Messaging.SingleEmailMessage> {message};  
     Messaging.SendEmailResult[] results = Messaging.sendEmail(messages);  
     if (results[0].success) {  
       System.debug(LoggingLevel.INFO, 'The email was sent successfully . to ' + recipients + ' subject ' + subject );  
     } else {  
       System.debug(LoggingLevel.ERROR, 'The email failed to send: ' + results[0].errors[0].message + ' to ' + recipients + ' subject ' + subject );  
     }  
   } catch(Exception e) {  
     System.debug(LoggingLevel.ERROR, 'The email failed to send: ' + ' to ' + recipients + ' subject ' + subject + ' body ' + body );  
     System.debug( e.getMessage() + '\n' + e.getStackTraceString() );  
   }  
 }  
 sendMail('test email','test body', new List<String> { 'chintanjshah@gmail.com' } );  
 sendMail('test email 2','test body 2', getNotificationEmailAddress() );  

Monday, August 7, 2017

Trigger framework with hierarchical kill switches

An enhancement on the existing, mature trigger framework from Hari K.

It is a very common scenario to need trigger logic disabled for a certain user (e.g. a batch user), a profile, or the entire org. I have enhanced the trigger framework, and the source code is available at:

https://github.com/c-shah/trigger-framework

How to use it:

The framework already comes with a hierarchical custom setting: TriggerFrameworkSettings__c.AllTriggersDisabled, which is false by default. If you need to disable all triggers, you just check this checkbox and all triggers will be disabled for the given user/profile/org.

This mechanism can also be extended to individual sObjects.
You can add an <sObject>TriggerDisabled (e.g. AccountTriggerDisabled) custom setting field under TriggerFrameworkSettings__c, and that setting will be read dynamically for that sObject for the given user/profile/org. The trigger for the individual sObject will be enabled/disabled based on the flag value, along the lines of the sketch below.
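
A minimal sketch of what the check inside the framework's trigger entry point might look like (field API names are assumed here; hierarchy custom settings resolve user-level, then profile-level, then org-wide defaults):

 // Inside the trigger entry point, before dispatching to handlers  
 TriggerFrameworkSettings__c settings = TriggerFrameworkSettings__c.getInstance();  
 Boolean allDisabled = settings != null && settings.AllTriggersDisabled__c == true;  
 Boolean objectDisabled = settings != null  
     && (Boolean) settings.get('AccountTriggerDisabled__c') == true;  // read dynamically per sObject  
 if (allDisabled || objectDisabled) {  
   return;  // kill switch engaged for this user/profile/org - skip all trigger logic  
 }  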

Friday, June 23, 2017

Post Install Script Framework

As an ISV, many times we have to write post install scripts for upgrades. Salesforce provides a facility to do that: https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_install_handler.htm

However, there are quite a few limitations with this:

  • It is hard to make sure whether the post install code has already executed or not
  • When developing an upgrade script, we don't know what the version will be when it gets published
  • It is hard to stop the execution or do a retry


Hence we implemented the same approach using changesets, integrated with the Salesforce post install handler. Changesets are an industry-wide practice that has been used for a long time, and here are some of the benefits:


  • Multiple Post Install Scripts
  • Execution of Post Install Scripts in Order - and only once
  • On Error in any Script
    • Stop/Halt the execution
    • Send Error Email
  • On Successful completion of all scripts
    • send summary email of all scripts
  • Each script gets a full set of governor limits
    • In the case of Salesforce, an entire batch execution is devoted to a given script


With the above points in mind, we created the framework below, where an ISV can plug in any post install script with minimal effort:


Framework



1. Entry point class - which implements the Salesforce interface InstallHandler
   This class just calls PostInstallService.startService
 
2. PostInstallService.startService
   This method scans the current namespace for all classes which extend PostInstallScriptTemplate
   It inserts them into the PostInstallScript__c object, if they don't already exist.

3. PostInstallService.startService calls PostInstallService.executeNextScript
 
4. PostInstallService.executeNextScript
   Based on the data in PostInstallScript__c, it will call the next PostInstallScriptTemplate(N)
   PostInstallScriptTemplate implements the batch interface, so execution happens asynchronously
   [Note: there will be a callback to PostInstallService when PostInstallScriptTemplate(N) is completed/errored]
 
5. Once the PostInstallScriptTemplate(N) is completed/errored
   It will update the PostInstallScript__c object with Status and Execution Log
 
6. PostInstallScriptTemplate(N) will call back the framework PostInstallService.executeNextScript

7. PostInstallService.executeNextScript
   Based on the data in PostInstallScript__c, it will either:
a) Halt execution (if there was an error)
b) Move on to the next script
c) Move on to finish, if all post install scripts have completed successfully

8. A user interface displays the currently pending/completed/errored scripts along with their execution logs

9. The user interface provides a facility to resubmit a script if it errored
 

Post Install Scripts



 
An ISV can write a post install script in two ways:

1) If a Batch context is not needed, we can write the post install script as below.
The description, sequence number, and execution log are stored in the database.
The actual post install logic goes in the executeScript method.


/**
 * Created by cshah on 5/30/2017.
 */

public with sharing class PostInstallScript1 extends PostInstallScriptTemplate {

    private String executionLog;

    public override void executeScript(Database.BatchableContext bc, List<SObject> sObjects) {
        System.debug('PostInstallScript1 : execute : hoping to get executed only once ');
        executionLog  = 'script 1 done. ';
    }

    public override Integer getSequenceNumber() {
        System.debug('PostInstallScript1 : getSequenceNumber ');
        return 1;
    }

    public override String getExecutionLog() {
        System.debug('PostInstallScript1 : getExecutionLog ');
        return executionLog;
    }

    public override String getDescription() {
        System.debug('PostInstallScript1 : getDescription ');
        return 'Script 1 Description ';
    }
}


2) If we need to query 50k+ records or update 10k+ records, we have to use a Batch context, and here is the other way to write a post install script. In this case, we also override the start and finish methods, just like we do for any Salesforce batch.


/**
 * Created by cshah on 5/30/2017.
 */

public without sharing class PostInstallScript3 extends PostInstallScriptTemplate {

    private String executionLog;
    private Integer processedCount = 0;

    public override void executeScript(Database.BatchableContext bc, List<SObject> sObjects) {
        System.debug('PostInstallScript3 : execute : hoping to get executed only once ');
        executionLog  = 'script 3 done. ';
        processedCount += sObjects.size();
    }

    public override Integer getSequenceNumber() {
        System.debug('PostInstallScript3 : getSequenceNumber ');
        return 3;
    }

    public override String getExecutionLog() {
        System.debug('PostInstallScript3 : getExecutionLog ');
        executionLog = ' Processed ' + processedCount + ' records ';
        return executionLog;
    }

    public override String getDescription() {
        System.debug('PostInstallScript3 : getDescription ');
        return 'Script 3 Description ';
    }

    public override Integer getBatchSize() {
        System.debug('PostInstallScript3 : getBatchSize ');
        return 1;
    }

    public override Database.QueryLocator startScript(Database.BatchableContext bc) {
        System.debug('PostInstallScript3 : startScript ');
        return Database.getQueryLocator('select id, name from account limit 201');
    }

    public override void  finishScript(Database.BatchableContext bc) {
        System.debug('PostInstallScript3 : finishScript ');
    }

}



User Interface


The user interface allows viewing the post install scripts and their execution logs.
It also allows resubmitting a script in case of error.

Source code 

1. It can be found at github : https://github.com/c-shah/PostInstallScriptFramework
2. Or as unmanaged package : https://login.salesforce.com/packaging/installPackage.apexp?p0=04tf400000099bW

Monday, May 1, 2017

OData/Heroku with Salesforce - Integrate differently

As we usually come across, below is the standard pattern when we integrate an external app talking to Salesforce, or Salesforce talking to an external application. We use different APIs to talk to Salesforce, and use workflow outbound messages or REST/SOAP calls to make outbound calls.

Below is a different approach using OData; in many cases it can make the integration very simple, with minimal to no code on the Salesforce side.

What is OData?
It is a standard way to represent data. Details can be found at http://www.odata.org/documentation/; at a very high level, it is a web service way of representing data like a relational database. Main features:

Metadata:
There is a metadata document ($metadata) to get information about all schemas, tables, columns, and procedures.

SQL-like operations
We can do SQL-like operations instead of creating a new operation for each access path (e.g. get employee by first name, by last name, etc.), as in the examples below.
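
For example, a single (hypothetical) Employees entity set can serve many lookups purely through standard query options:

 GET /odata.svc/Employees?$filter=FirstName eq 'John'&$select=FirstName,LastName&$top=10  
 GET /odata.svc/Employees?$filter=LastName eq 'Smith'&$orderby=FirstName  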

Here is the naming convention (left: classic relational database; right: OData 4.0 naming):



Heroku
Heroku is very well known and a lot of documentation can be found at https://www.heroku.com/, so I will focus on only two items:

Heroku Connect
A uni- or bi-directional link from Salesforce tables/fields to Heroku tables/fields. Any change in Salesforce is migrated to the Heroku Postgres database over an extremely fast and efficient SQL link. And if a bidirectional link is configured, any change on Heroku is posted back to Salesforce.




Heroku App Engine
We can host custom Java/Node (and other supported language) applications on Heroku with just the click of a button. Hence, I wrote a custom Java app using Apache Olingo to host on the Heroku platform.


This app generates the metadata, connects to Heroku Postgres to get the data, and exposes everything as an OData service using the Apache Olingo framework.

Notes:
  • Had to use Tomcat (instead of http://sparkjava.com) as Olingo requires a servlet container
  • Need to implement two interfaces:
    • a metadata interface (to render the schema, entities, and entity sets)
    • a data processor (to fetch and return the data)
  • Had to remove the http Accept header, as it was causing issues with Apache Olingo

Putting everything together

  • Once it is exposed as an OData service, Salesforce can connect to it, and all the entity sets exposed in OData become available as external objects in Salesforce (ending with __x)
  • We can run SOQL, SOSL, and indirect lookups on those Salesforce objects
  • This means zero code on Salesforce; on the Heroku side, we can source the data from a cache, Postgres, or an external app using REST or SOAP APIs

On the Salesforce side, it is just a matter of providing the URL; Salesforce automatically lists all the entity sets and creates external objects as selected, which can then be queried as in the sketch below.
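
For instance, if an entity set named Employee is selected, it surfaces as an Employee__x external object, and ordinary SOQL runs against the live external data (a sketch; the custom field names depend on what the OData service exposes):

 List<Employee__x> employees = [ SELECT ExternalId, DisplayUrl, FirstName__c FROM Employee__x LIMIT 10 ];  
 for (Employee__x e : employees) {  
   System.debug(e.FirstName__c + ' -> ' + e.DisplayUrl);  
 }  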




Final Take



  • Reiterating the first diagram, Heroku Connect provides an alternative to accessing and updating data via APIs, and it is super helpful if the app lives on Heroku or wants to connect directly to a Postgres database
  • Salesforce can connect to an external app via OData on Heroku, which reduces the code in the Salesforce org and promotes a clicks-over-code approach

Code for Apache Olingo implementation can be found at :
https://github.com/spring-work/odata

Heroku App:
http://odata-cshah.herokuapp.com/odata.svc/

Thursday, February 2, 2017

Call Salesforce REST API from Apex

Nothing new, but the example below makes a call to the Salesforce REST API from Apex. If you need to know org limits at run time, most of them are available via the Limits class, but some are only available via the REST API at /services/data/v37.0/limits (e.g. daily async limits, email and bulk email limits, etc.). It is easy to get this information from workbench.developerforce.com; however, if you need it in code, below is how:


Add your org URL to Remote Site Settings.
Note: if you don't know it, you can either look at the browser or use the call below:
System.debug( URL.getSalesforceBaseUrl().toExternalForm() );


Run the anonymous block below, which is broken down into three pieces:
1. Get the base URL
2. Set the auth token
3. Make the actual HTTP request

 /* 1. get base URL */  
 public static String getSalesforceInstanceUrl() {  
       return URL.getSalesforceBaseUrl().toExternalForm();
 }  
   
 public static String getRestResponse(String url) {   
       HttpRequest httpRequest = new HttpRequest();  
       httpRequest.setEndpoint(url);  
       httpRequest.setMethod('GET');  
       /* 2. set the auth token; either the 'OAuth' or the 'Bearer' prefix works */
       httpRequest.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());  
       try {  
             Http http = new Http();  
             /* initiate the actual call */  
             HttpResponse httpResponse = http.send(httpRequest);  
             if (httpResponse.getStatusCode() == 200 ) {  
                   return JSON.serializePretty( JSON.deserializeUntyped(httpResponse.getBody()) );  
             } else {  
                   System.debug(' httpResponse ' + httpResponse.getBody() );  
                   throw new CalloutException( httpResponse.getBody() );  
             }   
       } catch( System.Exception e) {  
             System.debug('ERROR: '+ e);  
             throw e;  
       }  
       return null;  
 }  
   
 System.debug(' -- limit method code block -- : start ');  
 String baseUrl = getSalesforceInstanceUrl();  
 System.debug(' -- baseUrl-- : ' + baseUrl );  
 String limitsUrl = baseUrl + '/services/data/v37.0/limits';  
 System.debug(' -- limitsUrl-- : ' + limitsUrl );  
 String response = getRestResponse(limitsUrl);  
 System.debug(' -- response-- : ' + response );