Getting Started

To get started with SmartUpload, you will need the following:

  • A Salesforce org for which you are a Salesforce Admin

  • An Excel file format that you wish to process in Salesforce (either for inserting or for updating data)

  • Knowledge of the Salesforce object you wish to import into, including the API names of its fields

Once you have this information, setting up SmartUpload is a simple two-step procedure:

  • First, create a 'Mapping' (see the documentation below) that defines how to map your Excel file format to Salesforce

  • Second, invite your users to SmartUpload, so they can use your pre-defined mapping to upload their data.

Mapping Format

A mapping has a format like this:

{
  "targetSobject": "UploadTester__c",
  "operation": "update",
  "batchSize": 1,
  "mapping": [
    {
      "columnName": "Tester Name",
      "targetField": "Name",
      "primary": true,
      "type": "text",
      "regexp": "/^UT-[0-9]+$/"
    },
    {
      "columnName": "Birthdate",
      "targetField": "Date__c",
      "type": "date"
    },
    {
      "columnName": "Text",
      "targetField": "Text__c",
      "type": "text"
    }
  ]
}

The following sections describe the different elements in detail.


targetSobject

This is the target object in Salesforce that the insert or update is performed against. Regular Salesforce permissions apply: if a user tries to use a mapping whose targetSobject refers to an object they don't have permission for, the upload will fail.


batchSize

This is the batch size to use during the insert or update operation. Typically, if you are importing into a simple object that does not have many workflows, process builders, validation rules, or triggers attached, you can use a batch size of up to 200.

However, if you have processes attached to field updates (or record inserts) that consume a lot of Salesforce CPU, you might run into Apex CPU limits when importing with large batch sizes. This is because all the operations within a batch count towards the limits of the same transaction. In these cases, you should use smaller batch sizes; note, however, that SmartUpload will then need to perform more API calls.
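For example, a mapping for a trigger-heavy object might reduce the batch size. This is a sketch; the object and field names here are illustrative, not part of the example above:

```json
{
  "targetSobject": "Invoice__c",
  "operation": "update",
  "batchSize": 10,
  "mapping": [
    { "columnName": "Invoice Number", "targetField": "Name", "primary": true, "type": "text" },
    { "columnName": "Amount", "targetField": "Amount__c", "type": "double" }
  ]
}
```

The trade-off is arithmetic: uploading 2,000 rows at a batch size of 10 requires 200 batches (and correspondingly more API calls), whereas a batch size of 200 would need only 10.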


operation

This can be either 'update' or 'insert'.


Update

When you use the update operation, you need to have exactly one mapping field with the property 'primary' set to 'true'. If this field is not the 'Id' field, SmartUpload will first run a query to find the Id of the record based on the given value. In the example provided at the top, SmartUpload will, for every row, try to find an 'UploadTester__c' record whose 'Name' matches the value in the column 'Tester Name' of the Excel file.
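As a sketch, if your Excel file already contains Salesforce record Ids, you could map the 'Id' field directly as the primary field, which avoids the extra lookup query (the 'Record Id' column name here is illustrative):

```json
{
  "targetSobject": "UploadTester__c",
  "operation": "update",
  "batchSize": 200,
  "mapping": [
    { "columnName": "Record Id", "targetField": "Id", "primary": true, "type": "text" },
    { "columnName": "Text", "targetField": "Text__c", "type": "text" }
  ]
}
```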


Insert

When you use the insert operation, it is important not to include any fields that might be auto-numbered. For instance, if you have a 'Name' field that is auto-populated in Salesforce, you should not include it in your mapping file.
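For instance, if 'Name' were an auto-number field on 'UploadTester__c', an insert mapping would leave it out entirely. A sketch based on the example at the top (note that insert mappings do not need a 'primary' field, since no existing record has to be looked up):

```json
{
  "targetSobject": "UploadTester__c",
  "operation": "insert",
  "batchSize": 200,
  "mapping": [
    { "columnName": "Birthdate", "targetField": "Date__c", "type": "date" },
    { "columnName": "Text", "targetField": "Text__c", "type": "text" }
  ]
}
```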


mapping

Every entry in the 'mapping' array corresponds to a column in your source Excel file. There are several fields you can provide in a mapping entry.


columnName

This is the name of the column in the Excel file.


targetField

This is the field on the 'targetSobject' that the value from the Excel file will be written to.


type

The type of the field. The following data types are supported:

  • text

  • integer

  • date

  • double
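One mapping entry per supported type could look like the following sketch (the 'Quantity' and 'Amount' columns and their target fields are illustrative, not part of the example at the top):

```json
[
  { "columnName": "Tester Name", "targetField": "Name", "type": "text" },
  { "columnName": "Quantity", "targetField": "Quantity__c", "type": "integer" },
  { "columnName": "Birthdate", "targetField": "Date__c", "type": "date" },
  { "columnName": "Amount", "targetField": "Amount__c", "type": "double" }
]
```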


regexp

For text fields, a regular expression can be used to validate the contents. You can use this to check for specific patterns, to disallow certain characters, or to reject entries where a text field is too long.
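Sketches of mapping entries using 'regexp' for each of these purposes (only the first pattern comes from the example at the top; the other columns, fields, and patterns are illustrative):

```json
[
  { "columnName": "Tester Name", "targetField": "Name", "type": "text", "regexp": "/^UT-[0-9]+$/" },
  { "columnName": "Nickname", "targetField": "Nickname__c", "type": "text", "regexp": "/^[^;,]*$/" },
  { "columnName": "Code", "targetField": "Code__c", "type": "text", "regexp": "/^.{1,10}$/" }
]
```

The first entry enforces a specific pattern (values like 'UT-42'), the second disallows semicolons and commas, and the third rejects values longer than 10 characters.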


primary

If you choose the 'update' operation, exactly one of the fields must be marked 'primary', so that SmartUpload can determine which records in Salesforce to update.