Archiving Data in Salesforce Using Big Objects

December 22, 2021 | Vasan Sampath

When Salesforce was introduced in 1999, it was simply a software solution for automating the sales process. Since then, Salesforce has come a long way from those humble beginnings to become an enterprise-wide solution adopted by leading companies, with product offerings that have expanded to include Sales, Marketing, Service, CPQ, Billing, Analytics, Data, Community, Custom Solutions, and IoT.

If your organization has been using Salesforce applications for a few years, you may have realized that data management has become complex and messy, with a ton of data that is no longer used. Because of this, you may have already started running into Salesforce data storage limitations.

Given that additional storage can cost up to $1,500 per year for 500 MB, it can become an expensive affair if you have a large number of users and applications generating data. Redundant and unused data also creates clutter for users and can lead to breached governor limits. If you are at this stage already, you need an effective and potent data archival strategy for your Salesforce infrastructure.

Salesforce provides capabilities within its application that can be used to archive data and free up storage space.

Why Should You Archive Data in Salesforce?

Large data volumes can result in slower query performance and impact your user experience. By cleaning up unwanted data, you can reduce clutter for users and drive better adoption. 

Archiving gives your organization greater control of your information processes. You can also reduce storage costs by archiving production data. And, last but not least, archiving keeps your data safe.

Planning for Archiving Data in Salesforce

Every organization using Salesforce needs an effective data archival strategy to manage storage in the long run. Before you begin this process, form a dynamic archiving policy and consider the input of all your stakeholders.

Storage and Limits

Keep track of how much of your total available storage you're currently using. Salesforce does provide extra storage, but it comes at a cost. Also, get a sense of how your data storage grows over time so you can project when you may start exceeding limits.
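
If you want to check these numbers programmatically, the System.OrgLimits Apex class (which mirrors the REST /limits resource) exposes current usage against each limit. Here's a minimal sketch in anonymous Apex, assuming the 'DataStorageMB' key is available in your org's API version:

    // Read the org's data storage usage and limit via System.OrgLimits.
    // 'DataStorageMB' mirrors the REST /limits resource; confirm the key
    // exists in your org before relying on it.
    System.OrgLimit storage = OrgLimits.getMap().get('DataStorageMB');
    if (storage != null && storage.getLimit() > 0) {
        System.debug('Data storage used (MB):  ' + storage.getValue());
        System.debug('Data storage limit (MB): ' + storage.getLimit());
        Decimal pctUsed = 100.0 * storage.getValue() / storage.getLimit();
        System.debug('Percent used: ' + pctUsed.setScale(1) + '%');
    }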

Usage Trends

Review storage usage metrics in Salesforce to identify which objects are responsible for the most data consumption. Use suitable tools and dashboards to determine which data needs to be archived.
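
As a quick first pass, an aggregate SOQL query can size up archival candidates on a heavy object. A hedged sketch using the standard Case object and a two-year cutoff (substitute whichever objects and retention window your own analysis points to):

    // Count closed Cases older than two years as potential archival candidates.
    // Case, IsClosed, and ClosedDate are standard; the two-year cutoff is an
    // illustrative assumption, not a recommendation.
    Integer staleCases = [
        SELECT COUNT()
        FROM Case
        WHERE IsClosed = true AND ClosedDate < LAST_N_YEARS:2
    ];
    System.debug('Closed Cases older than 2 years: ' + staleCases);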

Implications

You never know which data may come in handy in the future, so it's necessary to consult your organization's legal team before deleting any of it. Data integrity implications, such as removed fields and broken parent-child relationships, might also arise down the road; it's always better to be on the safe side.

Data Archival Framework

When creating your data archival framework, you'll need to determine where to store the data and how frequently it may need to be accessed. Your IT department can help with this task. Bring together the data owners and your organization's legal team to provide relevant context on the content of the data.

Make use of the tools and dashboards at your disposal to determine which data is necessary and which is not. Next, classify the stored data and define a set of rules governing which data should be retained, deleted, or archived off the system. This policy will help you choose storage locations based on the ease and level of access required.

Once you implement the data archiving policy, set aside time to review it periodically so it remains appropriate as your organization and external influences change.

Archiving Data Using Big Objects

What Are Big Objects?

Big objects store and manage massive amounts of data within Salesforce without affecting its performance, scaling to billions of records. Organizations can use standard or custom big objects to solve their large-data issues.

The advantage of storing data in big objects is that the data remains in Salesforce, where it stays queryable and is easy to retrieve on demand.

How to Use Custom Big Objects

Custom big objects are defined and deployed using the Metadata API. These objects allow you to store data and information unique to your organization.

  • Define and create the big object using the Metadata API or through Setup.
  • Create the object file that contains the definition, fields, and index, along with a permission set file and a package.xml manifest (see the sketch below).
  • Select ‘Active’ to activate the particular record type.
  • Change the deployment status to ‘Deployed’.
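
For illustration, here is a minimal sketch of what the object file for a hypothetical Case_Archive__b big object might look like; the object and field names are assumptions, so see Salesforce's Big Objects Implementation Guide for the authoritative format. The file would live at objects/Case_Archive__b.object in your Metadata API package:

    <?xml version="1.0" encoding="UTF-8"?>
    <CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
        <deploymentStatus>Deployed</deploymentStatus>
        <fields>
            <fullName>Closed_Date__c</fullName>
            <label>Closed Date</label>
            <type>DateTime</type>
            <required>true</required>
        </fields>
        <fields>
            <fullName>Case_Number__c</fullName>
            <label>Case Number</label>
            <type>Text</type>
            <length>32</length>
            <required>true</required>
        </fields>
        <indexes>
            <fullName>CaseArchiveIndex</fullName>
            <label>Case Archive Index</label>
            <fields>
                <name>Closed_Date__c</name>
                <sortDirection>DESC</sortDirection>
            </fields>
            <fields>
                <name>Case_Number__c</name>
                <sortDirection>ASC</sortDirection>
            </fields>
        </indexes>
        <label>Case Archive</label>
        <pluralLabel>Case Archives</pluralLabel>
    </CustomObject>

You would pair this file with a package.xml manifest listing the CustomObject member and a permission set granting object and field access, then deploy with your preferred Metadata API tool.
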
Key Considerations to Take into Account

Before you get started with big objects, keep the following considerations in mind. Big objects support only object and field permissions, and you can't edit or delete a big object's index once it's created. Big objects are accessed from custom Salesforce Lightning and Visualforce components rather than standard UI elements (home pages, detail pages, list views, and so on).

You can create up to 100 big objects per org. The limits for big object fields are similar to the limits on custom objects and depend on your organization's license type. To support the scale of data in a big object, you can't use triggers, flows, processes, or the Salesforce mobile app with them. You can access data in big objects using Async SOQL or standard SOQL, depending on the volume of data to query and the need for real-time information, as in the sketch below.
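
As a rough sketch, reading and writing the hypothetical Case_Archive__b object defined above could look like this in Apex. Database.insertImmediate is the big-object DML call, and standard SOQL filters must use the index fields in their defined order:

    // Write an archived record into the big object. Big-object DML is
    // synchronous and bypasses triggers, which big objects don't support.
    Case_Archive__b archived = new Case_Archive__b(
        Closed_Date__c = System.now().addYears(-3),
        Case_Number__c = '00001234'
    );
    Database.SaveResult sr = Database.insertImmediate(archived);
    System.debug('Archived successfully: ' + sr.isSuccess());

    // Standard SOQL read: filter on the leading index field(s); operators
    // like != or LIKE aren't allowed, but a range on the last filtered field is.
    List<Case_Archive__b> records = [
        SELECT Case_Number__c, Closed_Date__c
        FROM Case_Archive__b
        WHERE Closed_Date__c >= :Datetime.newInstance(2018, 1, 1)
          AND Closed_Date__c <  :Datetime.newInstance(2019, 1, 1)
    ];
    System.debug('Retrieved ' + records.size() + ' archived records.');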

Save on Storage Costs & Secure Your Data

Ready to save up to 85 percent on storage costs and secure your Salesforce data? Salesforce provides plenty of features for storing and archiving data, but the best approach will depend on your business needs.