Monday, December 11, 2017

SOSL - platform encryption

Salesforce Platform Encryption encrypts the data at rest, hence we cannot filter on encrypted fields in SOQL. SOSL is a good workaround but has its own limitations (side note: SOSL also has a limit of 2,000 rows returned).

Enable Platform Encryption
Encrypt Fields
Create Account
SOQL
SOSL


Enable Platform Encryption 

a) create permission set




b) assign system permission for this permission set




c) Assign permission set to current user





Encrypt Fields

Fields to encrypt (Setup -> Platform Encryption)



Create Account

Created account with name : My Account



SOQL

If we run a SOQL query that filters on the encrypted Name field, it now returns an error.

 select id, name, description from Account where name like '%My%'  

 description from Account where name like '%My%'  
                                ^  
 ERROR at Row:1:Column:49  
 encrypted field 'Account.name' cannot be filtered in a query call  



SOSL

If we run the SOSL below, we can find our account:

 FIND {My} IN ALL FIELDS RETURNING Account(Name, Description, AnnualRevenue)  
 FIND {M*} IN ALL FIELDS RETURNING Account(Name, Description, AnnualRevenue)  






However, issues come when we want to narrow down the results. If we apply a WHERE clause to the SOSL, it breaks. I believe this is because, behind the scenes, the SOSL WHERE clause still runs as a SOQL filter.

 FIND {M*} IN ALL FIELDS RETURNING Account(Name, Description, AnnualRevenue WHERE Name LIKE 'M%')  

Error Message

 Description, AnnualRevenue WHERE Name LIKE 'M%')  
                                                                         ^  
 ERROR at Row:1:Column:82  
 encrypted field 'Account.Name' cannot be filtered in a query call     


So, we need to be careful when we write SOSL or SOQL WHERE clauses (especially in a managed package), given that fields could be encrypted. One workaround is to narrow the results in Apex after the SOSL returns, as sketched below.
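A minimal sketch (the field names and filter are illustrative): run the SOSL without a WHERE clause on the encrypted field, then filter the rows in Apex. Encrypted fields can be read back; they just cannot be filtered in the query itself.

 // Search without filtering on the encrypted field, then narrow in memory  
 List<List<SObject>> searchResults =  
     [FIND 'M*' IN ALL FIELDS RETURNING Account(Name, Description, AnnualRevenue)];  
 List<Account> accounts = (List<Account>) searchResults[0];  
 List<Account> filtered = new List<Account>();  
 for (Account a : accounts) {  
   // Name is readable even when encrypted; startsWith runs in Apex, not in SOQL  
   if (a.Name != null && a.Name.startsWith('M')) {  
     filtered.add(a);  
   }  
 }  
 System.debug(filtered);  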


Saturday, November 11, 2017

Salesforce Canvas : what and how?

When we are integrating a third-party UI application, there are quite a few options:

  • Including an iframe inside a Visualforce page or Lightning component
    • mostly one-way integration (Salesforce to UI application)
    • security issues
    • sizing problems
  • Canvas
    • works in Lightning and Classic
    • seamless, two-way integration
  • Lightning container component (Winter '18)
    • only for Lightning

Canvas is by far the most feature-rich and seamless integration between Salesforce and a third-party UI application, especially because the interaction is two-way: Salesforce passes data to the third-party application, and the application can update or create data back in Salesforce.
How to configure it

  • Create Connected application

Step 1) Connected App -> Allow users to install canvas personal apps

Step 2) Create New App






Step 3) Configure Canvas App Settings  (Note: check Publisher and Create Actions, if we plan to publish canvas app there)



Step 4)



Step 5) Grab the consumer secret and key



Step 6) Assign the canvas app to users or profiles







Step 7) Canvas App Previewer






  • Sample Third Party application
    • From the Canvas App Previewer, we can create the canned "heroku quick start" app; however, I decided to create my own so that I can test out all the features in a controlled environment
    • Create the heroku application, and make sure to sign the request for the URL provided in the canvas connected app



Main Features

Security
  • Salesforce sends a signed request to the third-party application, which can be verified and decoded using the consumer secret. Hence the request is secure (see the sketch below).
  • Also, when the JSON request payload is decoded, we get the session id and all the information about the page context  
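A minimal sketch of the verification, assuming the standard signature.payload layout (an HMAC-SHA256 over the base64-encoded envelope, signed with the consumer secret). In practice this check runs inside the third-party app (the Canvas SDK does it for you); the Apex below just illustrates the algorithm, and the variable values are placeholders.

 // signedRequest and consumerSecret are hypothetical placeholders  
 String signedRequest = 'abc123...';   // raw signed_request POST parameter  
 String consumerSecret = 'secret...';  // from the connected app  
 String encodedSignature = signedRequest.substringBefore('.');  
 String encodedEnvelope = signedRequest.substringAfter('.');  
 // Recompute the HMAC-SHA256 over the encoded envelope and compare  
 Blob expected = Crypto.generateMac('hmacSHA256',  
     Blob.valueOf(encodedEnvelope), Blob.valueOf(consumerSecret));  
 if (EncodingUtil.base64Encode(expected) == encodedSignature) {  
   // Signature checks out: the envelope JSON carries the OAuth token,  
   // session information, and page context  
   String envelopeJson = EncodingUtil.base64Decode(encodedEnvelope).toString();  
   System.debug(envelopeJson);  
 }  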






Two way event
- Events can be raised from Visualforce and sent to the third-party app, and the third-party app can send events back to Salesforce






Resize
- The third-party app can use the resize API to resize the canvas in Salesforce. When we put the canvas in Visualforce and then in a layout, we are restricted by the iframe width and height, but if the canvas is put directly in the page layout, we have much better control over the size.







Api calls (e.g. chatter)

The third-party application can use the OAuth token provided in the JSON request to make any API call. Below is an example of a Chatter post.
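For illustration, here is roughly what that call looks like, sketched in Apex (in practice the Heroku app makes the same HTTP request from its own runtime). The instanceUrl, oauthToken, and recordId values are placeholders that would come from the decoded signed request.

 // Placeholders the decoded signed request would supply  
 String instanceUrl = 'https://na59.salesforce.com';  
 String oauthToken = '00D...session';  
 String recordId = '001...';  // record whose feed we post to  
   
 HttpRequest req = new HttpRequest();  
 req.setEndpoint(instanceUrl + '/services/data/v41.0/chatter/feed-elements');  
 req.setMethod('POST');  
 req.setHeader('Authorization', 'Bearer ' + oauthToken);  
 req.setHeader('Content-Type', 'application/json');  
 req.setBody(JSON.serialize(new Map<String, Object>{  
   'feedElementType' => 'FeedItem',  
   'subjectId' => recordId,  
   'body' => new Map<String, Object>{  
     'messageSegments' => new List<Object>{  
       new Map<String, Object>{ 'type' => 'Text', 'text' => 'Posted from the canvas app' }  
     }  
   }  
 }));  
 HttpResponse res = new Http().send(req);  
 System.debug(res.getStatusCode() + ' ' + res.getBody());  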




Where canvas can be used 
Once it is configured correctly, it can be used in many places:


Canvas App Previewer : this is just for testing your canvas app



Page layout via Visual Force
Canvas can be added to a Visualforce page; however, if we go that route, when we add the page to a layout we will be restricted by the iframe size.



Visualforce code:




Page Layout Directly
This is very advantageous, as we can resize and don't have to use an iframe.




Lightning Component
We can add the canvas app in a Lightning component.







Chatter / Publish Action
We can also put the canvas app on a Chatter publisher action, as shown below.





Source Code

Heroku code 
https://github.com/c-shah/canvasly

Salesforce code
https://github.com/c-shah/sf.canvasly

Thursday, October 19, 2017

Community User Test Data

It is always a hassle to create a community user for test data, so I thought I'd put it out here:

Normal user creation:

User communityUser = new User(
 ProfileId = [SELECT Id FROM Profile WHERE Name = 'Interconnection Community User Plus User'].Id,
 FirstName = 'first',
 LastName = 'last',
 Email = 'test.test@test.com',
 Username = 'test.' + System.currentTimeMillis() + '@test.com',
 Title = 'Title',
 Alias = 'alias',
 TimeZoneSidKey = 'America/Los_Angeles',
 EmailEncodingKey = 'UTF-8',
 LanguageLocaleKey = 'en_US',
 LocaleSidKey = 'en_US'
);


But we get the below errors:

System.DmlException: Insert failed. First exception on row 0; first error: INVALID_CROSS_REFERENCE_KEY, Cannot create a portal user without contact: [ContactId]

System.DmlException: Insert failed. First exception on row 0; first error: UNKNOWN_EXCEPTION, portal account owner must have a role: []

System.DmlException: Insert failed. First exception on row 0; first error: MIXED_DML_OPERATION, DML operation on setup object is not permitted after you have updated a non-setup object (or vice versa): Account, original object: User: []


To solve it, use the below approach:

1) Create Portal Owner

   private static User createPortalAccountOwner() {  
     UserRole portalRole = new UserRole(DeveloperName = 'MyCustomRole', Name = 'My Role', PortalType='None' );  
     insert portalRole;  
     System.debug('portalRole is ' + portalRole);  
     Profile sysAdminProfile = [Select Id from Profile where name = 'System Administrator'];  
     User portalAccountOwner = new User(  
         UserRoleId = portalRole.Id,  
         ProfileId = sysAdminProfile.Id,  
         Username = 'portalOwner' + System.currentTimeMillis() + '@test.com',  
         Alias = 'Alias',  
         Email='portal.owner@test.com',  
         EmailEncodingKey='UTF-8',  
         Firstname='Portal',  
         Lastname='Owner',  
         LanguageLocaleKey='en_US',  
         LocaleSidKey='en_US',  
         TimeZoneSidKey = 'America/Los_Angeles'  
     );  
     Database.insert(portalAccountOwner);  
     return portalAccountOwner;  
   }  

2) Create Community User

   private static void createCommunityUser(User portalAccountOwner) {  
     System.runAs ( portalAccountOwner ) {  
       //Create account  
       Account portalAccount = new Account(  
           Name = 'portalAccount',  
           OwnerId = portalAccountOwner.Id  
       );  
       Database.insert(portalAccount);  
       //Create contact  
       Contact portalContact = new Contact(  
           FirstName = 'portalContactFirst',  
           Lastname = 'portalContactLast',  
           AccountId = portalAccount.Id,  
           Email = 'portalContact' + System.currentTimeMillis() + '@test.com'  
       );  
       Database.insert(portalContact);  
       User communityUser = new User(  
           ProfileId = [SELECT Id FROM Profile WHERE Name = 'Interconnection Community User Plus User'].Id,  
           FirstName = 'CommunityUserFirst',  
           LastName = 'CommunityUserLast',  
           Email = 'community.user@test.com',  
           Username = 'community.user.' + System.currentTimeMillis() + '@test.com',  
           Title = 'Title',  
           Alias = 'Alias',  
           TimeZoneSidKey = 'America/Los_Angeles',  
           EmailEncodingKey = 'UTF-8',  
           LanguageLocaleKey = 'en_US',  
           LocaleSidKey = 'en_US',  
           ContactId = portalContact.id  
       );  
       Database.insert(communityUser);  
     }  
   }  

3) We can use the below code in a @testSetup or test method.

 User portalAccountOwner = createPortalAccountOwner();  
 createCommunityUser(portalAccountOwner);  
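Putting it together, a minimal sketch of a complete test class (the class name and assertion are illustrative; the two helpers are the ones defined above):

 @isTest  
 private class CommunityUserTestData {  
   
   // createPortalAccountOwner() and createCommunityUser() as defined above  
   
   @testSetup  
   static void setup() {  
     User portalAccountOwner = createPortalAccountOwner();  
     createCommunityUser(portalAccountOwner);  
   }  
   
   @isTest  
   static void runsAsCommunityUser() {  
     User communityUser = [SELECT Id FROM User WHERE Username LIKE 'community.user.%' LIMIT 1];  
     System.runAs(communityUser) {  
       // exercise the community-facing code under test here  
       System.assertNotEquals(null, UserInfo.getUserId());  
     }  
   }  
 }  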

Wednesday, October 11, 2017

Salesforce Streaming API Summary

I was just working with the Streaming API, so I'm putting my thoughts together on its different variations:


  1. Push Topic
  2. Generic Topic
  3. Platform Event 

Even though, under the covers, they all use the same technology stack, they provide very different features.


Push Topic vs. Generic Topic vs. Platform Events:

Overview
  • Push Topic: used for SOQL-based subscriptions
  • Generic Topic: used to subscribe to and publish arbitrary events
  • Platform Events: used for structured publish and subscribe; there is a lot more native support for both publishing and subscribing, and we get full control over the structure (payload) of the event

Replay
  • Supported by all three (version 36.0+)
  • When clients disconnect and reconnect, they can replay events from the last 24 hours using the replay id where they left off
  • Special ids: -2 replays from the earliest available event, -1 delivers only new events

Create (Setup)
  • Push Topic: 1) Apex insert statements, or 2) Workbench (it defaults every parameter except the SOQL query)
  • Generic Topic: 1) the Streaming Channels tab, or 2) Workbench
  • Platform Events: create an __e object in Setup

Support for Trigger Subscription
  • Push Topic: No
  • Generic Topic: No
  • Platform Events: Yes

How to Publish
  • Push Topic: fires automatically when data matching the SOQL changes; there is no explicit publish call
  • Generic Topic: using the REST API
  • Platform Events: EventBus.publish in Apex; the REST API (POST like an sObject to /services/data/v41.0/sobjects/Low_Ink__e/); Process Builder; Flow

How to Subscribe
  • Push Topic: Workbench (version 36.0; later versions are causing problems), Java (cometd lib), JavaScript (cometd lib)
  • Generic Topic: Workbench, Java (cometd lib), JavaScript (cometd lib)
  • Platform Events: Java (cometd), JavaScript (cometd), Visualforce (cometd), Apex trigger, Process Builder, Flow, Workbench

Channel Name (useful when we subscribe using the cometd library)
  • Push Topic: /topic/<topic name>, e.g. /topic/AccountUpdatePushTopic
  • Generic Topic: /u/<channel name>, e.g. /u/GenericUpdateTopic
  • Platform Events: /event/<event name>, e.g. /event/UpdateObjectEvent__e


Push Topic


Push Topic Setup


Workbench
We can set up a push topic using Workbench, but it doesn't allow granular control over the topic configuration.



APEX

  • NotifyForFields can be Referenced (fields referenced in the query - the default), Where (fields in the WHERE clause), or All (all fields)
  • Once the topic is inserted, we can use the PushTopic object to make any updates
 PushTopic pushTopic = new PushTopic();  
 pushTopic.Name = 'AccountUpdatePushTopic';  
 pushTopic.Query = 'SELECT Id, Name, AccountNumber from Account';  
 pushTopic.ApiVersion = 40.0;  
 pushTopic.NotifyForOperationCreate = true;  
 pushTopic.NotifyForOperationUpdate = true;  
 pushTopic.NotifyForOperationUndelete = true;  
 pushTopic.NotifyForOperationDelete = true;  
 pushTopic.NotifyForFields = 'Referenced';  
 insert pushTopic;  
   



Push Topic Publish

There is no API support for publishing. Any change in the data that matches the configured conditions (notify operations and fields) fires an event on the topic, as below.
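A minimal sketch, assuming the AccountUpdatePushTopic created above: a plain DML change that matches the topic's query is all it takes to notify subscribers.

 // No explicit publish call: updating a record covered by the topic's SOQL  
 // pushes a notification to /topic/AccountUpdatePushTopic  
 Account a = [SELECT Id, Name FROM Account LIMIT 1];  
 a.Name = a.Name + ' (updated)';  
 update a;  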


Push Topic Subscribe

Workbench

  • This is a quick way to test, and also to record the channel name, as we will need it later for the cometd library

Java/JavaScript
covered later


Generic Topic



Generic Topic Setup

  • We must set up the generic topic using the Salesforce UI
  • There is no Apex or Workbench support to create a generic topic



Generic Topic Publish

Rest API (Workbench)

URL : /services/data/v<version>/sobjects/StreamingChannel/<channel id>/push

We can find streaming channel id using query (SELECT Name, ID FROM StreamingChannel)

e.g. /services/data/v40.0/sobjects/StreamingChannel/0M61I000000TN1FSAW/push

payload : 

{
  "pushEvents": [
      {
          "payload": "Broadcast message to all subscribers",
          "userIds": []
      }
   ]

}
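If we want to publish from Apex instead of Workbench, here is a hedged sketch of the same REST call (it assumes a remote site setting allowing callouts to the org's own domain, and that the running session is API-enabled):

 // Look up the channel id, then POST the same payload the REST API expects  
 StreamingChannel channel =  
     [SELECT Id FROM StreamingChannel WHERE Name = '/u/GenericUpdateTopic' LIMIT 1];  
 HttpRequest req = new HttpRequest();  
 req.setEndpoint(URL.getSalesforceBaseUrl().toExternalForm()  
     + '/services/data/v40.0/sobjects/StreamingChannel/' + channel.Id + '/push');  
 req.setMethod('POST');  
 req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());  
 req.setHeader('Content-Type', 'application/json');  
 req.setBody('{"pushEvents":[{"payload":"Broadcast message to all subscribers","userIds":[]}]}');  
 HttpResponse res = new Http().send(req);  
 System.debug(res.getStatusCode() + ' ' + res.getBody());  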



Generic Topic Subscribe

WorkBench


Java/JavaScript:
Covered Later


Platform Events


Platform Event Setup


  • Essentially, I created two fields (ObjectName__c and RecordId__c) so that we can publish events carrying those attributes


Platform Event Publish

Apex


  • The publish call itself doesn't throw on failure, hence we have to go through the save results
  • Also, the publish call doesn't participate in the transaction, meaning that if the transaction fails, the event is still published


 List<UpdateObjectEvent__e> events = new List<UpdateObjectEvent__e>();  
 UpdateObjectEvent__e event = new UpdateObjectEvent__e(objectName__c='Account', recordId__c='1234');  
 events.add( event );  
 List<Database.SaveResult> results = EventBus.publish(events);  
 if( results != null && results.size() > 0 ) {  
   for (Database.SaveResult sr : results) {  
     if (sr.isSuccess()) {  
       System.debug('Successfully published event. ' + results.size() );  
     } else {  
       for(Database.Error err : sr.getErrors()) {  
         System.debug('Error returned: ' + err.getStatusCode() + ' - ' + err.getMessage());  
       }  
     }  
   }  
 } else {  
   System.debug(' Nothing is published. ');  
 }  
   


Rest API

Endpoint  : /services/data/v40.0/sobjects/UpdateObjectEvent__e/
Payload    : { "RecordId__c" : "123455634343", "ObjectName__c" : "Account" }



Soap API
Similar to the REST API; it is just a call to insert the event sObject.

Process Builder 
not covered, but straightforward

Visual Flow 
not covered, but straightforward


Platform Event Subscribe

Trigger

  • only after insert is supported

 trigger UpdateObjectEventTrigger on UpdateObjectEvent__e (after insert) {  
   System.debug(' UpdateObjectEventTrigger ' );  
     
   for (UpdateObjectEvent__e event : Trigger.New) {  
     System.debug(' : ' + event.RecordId__c + ' ' + event.ObjectName__c );  
   }  
     
 }  

Process Builder 
not covered, but straightforward

Visual Flow 
not covered, but straightforward

Java/JavaScript 
Covered later


Platform Event Debug

  • Debug statements in the trigger don't show up in our user's debug logs, because platform event triggers run under the Automated Process entity; we need to enable a trace flag for that entity, as below







Generic Java Subscriber Client



We need to download and build the EMP connector from salesforce 
  • download : https://github.com/forcedotcom/EMP-Connector
  • unzip, and run "mvn clean install"
  • Use emp-connector-0.0.1-SNAPSHOT-phat.jar in the new project that we are going to create
  • Create a new project (Java 1.8) and use the below code. 
  • Please note that the channel name changes depending on which topic we are subscribing to



 package com.spring.client;  
   
 import com.salesforce.emp.connector.BayeuxParameters;  
 import com.salesforce.emp.connector.EmpConnector;  
 import com.salesforce.emp.connector.TopicSubscription;  
   
 import java.util.Map;  
 import java.util.concurrent.TimeUnit;  
 import java.util.function.Consumer;  
   
 import static com.salesforce.emp.connector.LoginHelper.login;  
   
 public class StreamingClient {  
   
   public static void main(String args[]) throws Exception {  
     long replayFrom = EmpConnector.REPLAY_FROM_EARLIEST;  
     BayeuxParameters params = login("streaming@springsoa.com", "Welcome1");  
     EmpConnector connector = new EmpConnector(params);  
     Consumer<Map<String, Object>> consumer = event -> System.out.println(String.format("Received:\n%s", event));  
     connector.start().get(5, TimeUnit.SECONDS);  
     TopicSubscription subscription = connector.subscribe("/event/UpdateObjectEvent__e", replayFrom, consumer ).get(5, TimeUnit.SECONDS);  
     System.out.println(String.format("Subscribed: %s", subscription));  
     //subscription.cancel();  
     //connector.stop();  
   }  
 }  
   



Generic Javascript Subscriber Client


We can use the cometd library to listen to the events. I had to write a custom wrapper (cometdCustom.js) to greatly simplify the Visualforce page. We can also use this in an independent HTML page, as long as we can get an OAuth session id.






Source Code

Java code                       :   https://github.com/c-shah/streaming-java-client    
Salesforce and JS code  :   https://github.com/c-shah/salesforce-streaming

Refresh VisualForce Page with Platform Events

Problem Statement 

A Visualforce page is embedded inside the standard page layout to display additional information from a third-party application, as below.

A few use cases:
- Let's say there is a change inside the third-party application content and we want to refresh the entire page
- The third-party application is making changes to Salesforce data on this page using the API, and we need to refresh the page to see them



A few failed solutions

1) If the third-party application is rendered inside an iframe, we cannot access the parent Salesforce page.

E.g., if we try one of the below, we get an error message, as Salesforce prevents a third-party iframe from navigating the parent page.

 window.parent.location.href = URL  
 parent.location.href=parent.location.href  
 parent.location.reload();  
 window.parent.location.href = window.parent.location.href  

error message:

 Unsafe JavaScript attempt to initiate navigation for frame with URL 'https://c.na59.visual.force.com/servlet/servlet.Integration?lid=066f4000001yE8F&ic=1&...'.   
 The frame attempting navigation is neither same-origin with the target, nor is it the target's parent or opener  


2) Polling: in the parent Visualforce page, we can constantly poll the server side to listen for changes and refresh the VF page as needed. Polling in general is not a good idea and is not very scalable.




Platform Event to Rescue

An event-based solution comes in quite handy here: we publish an event on the server side, and the Visualforce page listens for the event and, on the right criteria, alerts the end user about the change or refreshes the page so that we see fresh data.

Platform Events are a glorified version of the Streaming API; more details are in a separate post here.



Solution:

1. Create a platform event
2. Upon change, publish the event using the REST API or Apex
3. On the Visualforce page, listen to those events and refresh the page




Create platform event 

  • Essentially, I created two fields (ObjectName__c and RecordId__c) so that we can publish events carrying those attributes





Publish event using REST API or Apex


1) Publish via Apex
  • The publish call itself doesn't throw on failure, hence we have to go through the save results
  • Also, the publish call doesn't participate in the transaction, meaning that if the transaction fails, the event is still published

 List<UpdateObjectEvent__e> events = new List<UpdateObjectEvent__e>();  
 UpdateObjectEvent__e event = new UpdateObjectEvent__e(objectName__c='Account', recordId__c='1234');  
 events.add( event );  
 List<Database.SaveResult> results = EventBus.publish(events);  
 if( results != null && results.size() > 0 ) {  
   for (Database.SaveResult sr : results) {  
     if (sr.isSuccess()) {  
       System.debug('Successfully published event. ' + results.size() );  
     } else {  
       for(Database.Error err : sr.getErrors()) {  
         System.debug('Error returned: ' + err.getStatusCode() + ' - ' + err.getMessage());  
       }  
     }  
   }  
 } else {  
   System.debug(' Nothing is published. ');  
 }  
   

2) Publish via Rest API

Endpoint  : /services/data/v40.0/sobjects/UpdateObjectEvent__e/
Payload    : { "RecordId__c" : "123455634343", "ObjectName__c" : "Account" }





Listen to event on Visualforce Page 

We can use the cometd library to listen to the events. I had to write a custom wrapper (cometdCustom.js) to greatly simplify the Visualforce page. 




Source code 
It can be found at : https://github.com/c-shah/salesforce-streaming


Monday, September 18, 2017

Environment Hub - One stop shop

It is always a nightmare to keep working with usernames / passwords / tokens across different dev orgs and sandboxes:

- Keeping an eye on different usernames / passwords for dev orgs and sandboxes
- Giving access to your sandbox / dev org to others
- Whitelisting IPs to avoid being asked for tokens
- Sandbox refresh - changing emails and verification
- Scripts to reset passwords, profiles, emails, etc.

I believe Environment Hub is quite a sleek solution.

Install Environment Hub Application


  • You will need to contact customer support to have the app installed
  • In the case of a production org (non-ISV), it should be installed in the production org
  • For an ISV, it is more flexible, but it is preferable to have it in the same org as the LMA


Configuration


  • Select Environment Hub App
  • Add Environment Hub tab
  • In the case of a production org, all sandboxes should be auto-discovered 
  • In the case of an ISV, we might want to register the different developer orgs with the Environment Hub
  • We should give all users who need to use Environment Hub the appropriate access on their profile
    • Manage Environment Hub
    • Connect Organization to Environment Hub


Sandboxes


  • Sandboxes are auto discovered by Environment Hub
  • We should enable the SSO on it
  • Once SSO is enabled, a refresh of the sandbox is required
  • Once that is done, any production user (with the Connect Organization permission) will be able to log in to that org!
  • No more email resets, password resets, IP whitelisting, ...



Development Orgs


  • We can connect any dev org to Environment Hub
  • We should enable SSO on it
  • Now there are 3 different methods to map an Environment Hub user to a dev org user
    • Username mapping: manually map the username from the Env Hub to the dev org
    • Federation Id: in the case of SSO, as long as the federation id matches between the dev org and the Env Hub org
    • Username formula field: apply a username formula so that any Env Hub user is converted to one of the dev org users

In most cases, there is only one user that we care about, hence I use the third approach (formula field) to give all users in Environment Hub access to the dev org.

E.g. if the dev org user is dev2@ot.com, I make the formula field evaluate to "dev2@ot.com"; hence every Environment Hub user resolves to "dev2@ot.com" and has full access to my dev org.





Salesforce Big Objects

I was just experimenting with Salesforce Big Objects and found them quite interesting. They are mainly used for big data (100M+ rows) analytics, mostly for asynchronous data crunching like Hadoop. However, there are critical distinctions to weigh before going with Big Objects.


  • Currently, Big Objects only support fields and permissions, and that's about it
  • We can not have
    • triggers
    • page layouts
    • extensive SOQL (indexed SOQL is supported, but it is extremely limited - which makes sense, as we are dealing with a humongous data set)
    • workflows, process builders, etc.
    • reports
  • Basically, it is completely non-UI: just a back-end data store for big data analytics - and that's about it.

Use case

In an org, we can have surveys on an object record (e.g. Account, Opportunity, etc.), and we would want to store the survey data in a Big Object and analyze it later. 


How to use it:


1. Create Big Object

  • There is no user interface to create a Big Object and its fields. We must use the Metadata API (via the Ant Migration Tool or Workbench) to create such artifacts. Obviously, Workbench makes it a lot easier.
  • Create the object file 
  • Create the permission set file
  • Create the package.xml file
  • Bundle them nicely in a .zip file with the right directory structure (you can download it from here)
  • Use Workbench to deploy the .zip file, and the Big Object should look like below
  • You can assign the permission set (Survey_BigObject) to the right users so they can query and update the data



  • Pay close attention to the indexed fields; the index is used when inserting records (identifying duplicates) and for issuing synchronous SOQL queries. 

2. Insert Data

  • We can insert data just like we do in Apex for any other object, or we can use the Bulk API
  • There is no upsert operation; Salesforce automatically checks the record being inserted against the indexed values, and if the index values are the same it updates, otherwise it inserts.
  • Upon failure, no exception is thrown; we just need to look at the saveResult.

 Account a = [ select id, name from account limit 1 ];  
 Survey__b survey = new Survey__b();  
 survey.WhatID__c = a.id;  
 survey.WhatTime__c = System.today() + 1;  
 survey.WhatObject__c = 'Account';  
 survey.Question1__c = 'What is the rating';  
 survey.Answer1__c = '1';  
 Database.SaveResult saveResult = database.insertImmediate(survey);  
 System.debug( ' success ' + saveResult.isSuccess() + ' ' + saveResult );  
   



3. Query Data

  • Querying data is quite tricky with Big Objects. Either we query all records, which will most probably fail once we have millions of records,
  • or we issue a synchronous SOQL query against the indexed fields only. The indexed fields must also appear in order in the query. See below for an example:
 List<Survey__b> surveys = [ select id, WhatId__c, WhatObject__c, WhatTime__c, Question1__c, Answer1__c from Survey__b ];  
 for(Survey__b survey : surveys ) {  
   System.debug( survey );  
 }  
   
   
 System.debug(' -------------- indexed query -------------- ');  
 /** Only indexed fields, in their exact order, can be used in the query filter.  
 *  We can stop early, but no gap is allowed, e.g.:  
 *  [select id from Survey__b] is fine  
 *  [select id from Survey__b where WhatID__c = :a.id ] is fine  
 *  [select id from Survey__b where WhatTime__c = :System.today() ] is NOT fine, as you can't jump to index 2 without having index 1 in the query.  
 * **/  
 Account a = [ select id, name from account limit 1 ];  
 List<Survey__b> surveys2 = [ select id, WhatId__c, WhatObject__c, WhatTime__c, Question1__c, Answer1__c from Survey__b where WhatID__c = :a.id and WhatTime__c = :System.today() ];  
 for(Survey__b survey : surveys2 ) {  
   System.debug( survey );  
 }  
   
   

4. Asynchronous SOQL


  • Asynchronous SOQL is only supported via the REST API
  • We have to provide the asynchronous SOQL query and a custom object to store the results
  • It seems like only one Async SOQL job can run at any given time - at least in the org I worked on

4.1 Create Custom Object To Store Async Result


  • I created a survey analysis object to store the analysis of the query, with counts

4.2 Run Asynchronous SOQL

  • Below is what an asynchronous SOQL request looks like; we need to provide the SOQL query, the target table, and a mapping between the selected fields and the target table's fields.

 {  
   "query": "select Question1__c, Answer1__c, count(whatId__c) c from Survey__B where WhatObject__c = 'Account' group by Question1__c, Answer1__c",  
   "operation": "insert",  
   "targetObject": "Survey_Analysis__c",  
   "targetFieldMap": {  
     "Question1__c": "Question1__c",  
     "Answer1__c": "Answer1__c",  
     "c":"Count__c"  
   }  
 }  



  • We can execute it using the REST API in Workbench


Once the asynchronous SOQL job is completed, we can query the Survey_Analysis__c object for the accumulated results.
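A minimal sketch of reading the accumulated rows (field names per the mapping above):

 // Read back the aggregated results produced by the async SOQL job  
 for (Survey_Analysis__c row :  
     [SELECT Question1__c, Answer1__c, Count__c FROM Survey_Analysis__c]) {  
   System.debug(row.Question1__c + ' / ' + row.Answer1__c + ' : ' + row.Count__c);  
 }  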


Sunday, September 17, 2017

Salesforce Platform Cache

Nothing new, just a short summary on Platform Cache:

To oversimplify, Platform Cache is a glorified hash map. Platform Cache is first divided into different partitions. These are hard partitions: cache usage in one partition will not overflow into another partition. Usually, different partitions are used for different projects.


Partitions are further divided into an Org cache (available to all users) and a Session cache (scoped to a user session). These are again hard partitions: Org cache will not overflow into Session cache. The minimum cache size is 5 MB for an Org or Session cache.




To store key in the Cache

 User u = [ select id, username from user where username = 'chintan_shah@abc.com' ];  
 Cache.Org.put( 'local.MyPartition1.key1', u );  
 System.debug(' key1 is stored ');  
   
 String name = 'Chintan';  
 Cache.OrgPartition MyPartition1 = Cache.Org.getPartition('local.MyPartition1');  
 MyPartition1.put('key2', name );  
 System.debug(' key2 is stored ');  

  • We can either put key directly using Cache.Org.put or get access to a specific partition using Cache.Org.getPartition
  • To work with Session partition, Org would just change to Session in above code.


To retrieve the key from Cache

 Object u = Cache.Org.get( 'local.MyPartition1.key1');  
 System.debug(' key1 is retrieved ' + u );  
   
 Cache.OrgPartition MyPartition1 = Cache.Org.getPartition('local.MyPartition1');  
 System.debug(' key2 is retrieved ' + MyPartition1.get('key2' ) );  




To clean the Cache


  • Cache is not a guaranteed persistent store; it can be cleaned at any time by Salesforce 
  • It can also be wiped out during code deployment
  • Session cache max TTL is 8 hours, and Org cache is 24 hours
  • Internally, it uses an LRU policy to clean up old data when it hits the size limit

Hence, we don't ever have to do the cleanup, but if we need to for some reason, we can use the below code. It also has limitations if the cache was stored using a cache builder.

 for(String key : Cache.Org.getKeys() ) {  
   Cache.Org.remove(key);  
 }  
   
 for(String key : Cache.Session.getKeys() ) {  
   Cache.Session.remove(key);  
 }  



Cache Builder


Instead of storing and retrieving cache entries manually, it is better to provide a loading strategy to Platform Cache, so that upon a cache miss, Salesforce automatically calls our class to load the value for that key. This reduces the code and handles cache misses much more gracefully.

We specify the cache loading strategy as a class implementing Cache.CacheBuilder. Below is a small class which loads user information based on username. The idea is that the username is the key, and we load the user data if it is not already in the cache.


 /**  
  * Created by chshah on 9/14/2017.  
  */  
   
 public class UserInfoCache implements Cache.CacheBuilder {  
   
   public Object doLoad(String usernameKey) {  
     String username = keyToUserName(usernameKey);  
     System.debug(' UserInfoCache load usernameKey ' + usernameKey + ' userName ' + username );  
     User u = (User)[SELECT Id, firstName, LastName, IsActive, username FROM User WHERE username =: username];  
     return u;  
   }  
   
   public static String usernameToKey(String username) {  
     return username.replace('.','DOT').replace('@','ATRATE');  
   }  
   
   public static String keyToUserName(String key) {  
     return key.replace('DOT','.').replace('ATRATE','@');  
   }  
 }  


1:  String usernameKey = UserInfoCache.usernameToKey('chintan_shah@abc');  
2:  User u = (User) Cache.Org.get(UserInfoCache.class, 'local.MyPartition1.' + usernameKey );  
3:  System.debug( ' u ' + u );  


  • The reason for converting the username to a key is that special characters are not allowed in Platform Cache keys (keys must be alphanumeric). 
  • In line 2, we provide the cache loading strategy (the class), hence if the key is not found, Salesforce will call our class to load it.


Consideration


  • An ISV (managed package) can supply its own cache, which uses the package namespace rather than "local"
  • Cache puts follow the same transaction boundary as SOQL updates: any rollback due to failure will not put data in the cache
  • Cache TTL limits apply (8/24 hours), plus a limit on how much data we can store per transaction (usually 1 MB)