Organizations running Salesforce at scale face a predictable problem: data accumulates faster than it gets managed. Closed cases from two years ago, converted leads, completed tasks nobody queries: all of it silently inflates your storage bill and degrades org performance. Following the right Salesforce data archiving best practices keeps your org lean, your costs controlled, and your compliance defensible.
This guide is for Salesforce admins and architects who need a practical roadmap for data archiving in Salesforce, not a sales pitch. By the end, you will know why Salesforce archiving matters and which best practices every business should follow.
Salesforce data archiving best practices include defining a clear data retention policy, identifying inactive data, choosing the right storage method (Big Objects or external storage), automating archiving based on rules, preserving parent-child relationships, and regularly testing data restore. These practices help reduce storage costs, improve performance, and ensure compliance.
What is Data Archiving in Salesforce?
Data archiving in Salesforce is the process of moving inactive or historical records out of your active org into a secondary, cost-optimised storage layer, while keeping them accessible when needed. It is not a deletion. It is not a backup.
| | Archiving | Backup | Deletion |
|---|---|---|---|
| Data accessible? | Yes | Yes (restore) | No |
| Reduces live storage? | Yes | No | Yes |
| Compliance-safe? | Yes | Yes | Depends |
A well-defined Salesforce archive strategy ensures historical data is retained, retrievable, and cost-efficient, without cluttering your live CRM environment.
Why Salesforce Archiving Matters
Salesforce allocates 10 GB of base data storage plus 20 MB per user license. For most orgs, that fills up faster than expected. Unmanaged data growth creates three compounding problems:
- Performance degradation: bloated objects slow SOQL queries, reports, list views, and automation runs.
- Compliance exposure: GDPR, HIPAA, and SOX require defined retention windows; unmanaged data is uncontrolled data.
- Rising storage costs: once you exceed your allocation, additional Salesforce storage must be purchased at a premium.
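As a quick sizing check, the allocation figures above can be turned into a back-of-the-envelope estimate. A minimal sketch, assuming the standard 10 GB base plus 20 MB per user license (your edition's allocation may differ):

```python
# Back-of-the-envelope storage estimate, assuming the standard
# allocation of 10 GB base data storage + 20 MB per user license.
BASE_GB = 10
PER_USER_MB = 20

def storage_allocation_gb(user_licenses: int) -> float:
    """Total base data storage in GB for a given license count."""
    return BASE_GB + user_licenses * PER_USER_MB / 1024

# A 500-user org gets roughly 19.8 GB of data storage; at ~2 KB per
# record, that is on the order of 10 million records before overage.
print(round(storage_allocation_gb(500), 1))
```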
What Data Should You Archive in Salesforce?
A good Salesforce archive strategy starts with identifying cold data — records no longer part of day-to-day operations. These objects are the highest-priority candidates:
- Cases: closed cases older than 12–24 months.
- Opportunities: closed-won/lost records beyond your active reporting window.
- Leads: converted or disqualified leads older than 6–12 months.
- Tasks & Activities: completed activities older than 90–180 days.
- Files & Attachments: documents attached to records that have already been archived.
Do not archive: open cases, active pipeline records, or any data referenced by live automation or active reports.
Salesforce Data Archiving Best Practices
These Salesforce archiving best practices are what separate a mature archival strategy from a reactive storage cleanup. Follow them in sequence.
1. Define a Data Retention Policy Before You Archive
A retention policy defines, per object, how long records stay live, when they move to the archive, and when (if ever) they are purged. Map it against your regulatory obligations:
- GDPR: data minimisation principle — retain only what is necessary, for as long as necessary.
- HIPAA: covered records require a minimum 6-year retention window.
- SOX: financial data must be retained for 7 years.
Without a policy, your archiving is guesswork, and guesswork does not hold up in an audit.
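In practice, a retention policy is just a per-object rule set your archive jobs can read. A minimal sketch (the objects, windows, and the `HealthRecord__c` custom object are illustrative assumptions, not prescriptive values):

```python
# Illustrative per-object retention policy: months a record stays live,
# months it is retained in the archive, and the regulation driving each.
RETENTION_POLICY = {
    "Case":            {"live_months": 18, "archive_months": 66, "driver": "internal"},
    "Opportunity":     {"live_months": 24, "archive_months": 60, "driver": "SOX"},
    "HealthRecord__c": {"live_months": 12, "archive_months": 72, "driver": "HIPAA"},
}

def total_retention_months(obj: str) -> int:
    """Total time a record exists anywhere before it may be purged."""
    rule = RETENTION_POLICY[obj]
    return rule["live_months"] + rule["archive_months"]

# SOX requires 7 years (84 months) for financial data: 24 + 60 = 84.
print(total_retention_months("Opportunity"))
```

Writing the policy down as data, rather than burying it in job logic, is what makes the annual review in practice 7 possible.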
2. Audit Your Org Storage Before Designing Any Archive Job
Navigate to Setup → Storage Usage. Before you archive a single record, understand what is consuming your storage, which objects are growing fastest, and whether files and attachments are a larger cost driver than record data. This audit is step one of any serious Salesforce data archiving project.
3. Choose the Right Method: Big Objects vs. External Storage
Salesforce gives you two main archiving paths; choose based on your scale:

Big Objects
Store historical records inside the Salesforce Big Objects platform. Familiar to end users, no external dependency. Limitations: restricted SOQL (no aggregate functions, and filters must follow the defined index), and records consume a separate Big Object storage allocation rather than eliminating storage costs entirely. Best for moderate volumes.

External Storage (AWS, Azure, GCP, Heroku, on-prem)
Moves data entirely off-platform as cloud/on-prem archiving. Reduces storage costs significantly and scales without governor limit constraints. Requires a connector to maintain native Salesforce accessibility. Best for enterprise volumes and regulated data.
4. Automate Archiving with Rules-Based Triggers
Manual archiving is error-prone and unsustainable at scale. Automate based on:
- Record age (e.g., Cases with Status = Closed and CloseDate > 18 months ago)
- Activity status or last modified date
- Custom field flags set by your business logic
5. Preserve Parent-Child Record Relationships
This is where most DIY archiving projects break down. Archiving a Case without its related EmailMessages, Attachments, and child tasks creates orphaned records useless for audits. A production-ready Salesforce archival solution must archive parent and child records together, in the correct dependency order, without data loss.
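One way to sketch this is bundling a parent with its children into a single archive unit, so they move (and restore) together in dependency order. The data structures here are illustrative; `ParentId` follows standard Salesforce conventions:

```python
def build_archive_unit(parent: dict, children: list[dict]) -> dict:
    """Bundle a parent record with its child records so they are
    archived atomically and restored in dependency order
    (parent first, then the children that reference it)."""
    orphans = [c for c in children if c.get("ParentId") != parent["Id"]]
    if orphans:
        raise ValueError(
            f"{len(orphans)} child records do not belong to {parent['Id']}")
    return {
        "parent": parent,
        "children": children,
        # Restore order matters: the parent must exist before its
        # children can be re-inserted with valid lookups.
        "restore_order": [parent["Id"]] + [c["Id"] for c in children],
    }

case = {"Id": "500A1", "Status": "Closed"}
kids = [{"Id": "02sB1", "ParentId": "500A1"},   # EmailMessage
        {"Id": "00TC1", "ParentId": "500A1"}]   # child Task
unit = build_archive_unit(case, kids)
print(unit["restore_order"])
```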
6. Test Your Restore Process Before You Need It
If you have never tested a restore, you do not actually know whether your archive works. Before going live, verify that archived records can be retrieved to the live org with their original structure and relationships intact. Build this test into your go-live checklist and repeat it annually. This step is consistently skipped and consistently regretted.
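A restore test can be as simple as a round-trip assertion: archive a sample record, restore it, and verify the structure survives unchanged. A minimal sketch with in-memory dicts standing in for the live org and the archive store:

```python
def archive_record(live: dict, store: dict, record_id: str) -> None:
    """Move a record from the live org to the archive store."""
    store[record_id] = live.pop(record_id)

def restore_record(live: dict, store: dict, record_id: str) -> None:
    """Move an archived record back into the live org unchanged."""
    live[record_id] = store.pop(record_id)

live = {"500A1": {"Status": "Closed", "Priority": "High"}}
snapshot = dict(live["500A1"])       # capture pre-archive structure
archive_store: dict = {}

archive_record(live, archive_store, "500A1")
assert "500A1" not in live           # record left live storage
restore_record(live, archive_store, "500A1")
assert live["500A1"] == snapshot     # structure intact after round trip
print("restore test passed")
```

The real test runs against a sandbox with your actual archival tool, but the shape is the same: snapshot, archive, restore, compare.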
7. Monitor Archived Data with Audit Logs and Compliance Reports
Archiving is not set-and-forget. Track who accesses archived records, run scheduled compliance reports against your retention policy, and review audit logs for anomalies as part of your quarterly audit cycle.
Common Salesforce Archiving Mistakes to Avoid
- Archiving without child records — breaks data integrity and makes audit trails incomplete.
- No SOQL testing on Big Objects — Big Objects have query restrictions; test at volume before automating.
- Skipping the restore test — the most common oversight in any Salesforce data archiving project.
- No annual policy review — retention rules go stale when regulations or business processes change.
- Archiving active data — always validate record status before any automated job executes.
Native Salesforce Archiving vs. Third-Party Solutions
Salesforce Big Objects: Honest Assessment
Big Objects are the native option for archiving Salesforce data. They are free to start with, stay inside the platform, and require no external dependencies. But for enterprise orgs they come with real constraints: limited query support, manual configuration overhead for multi-object archiving, and no built-in way to extend retention windows beyond what your automation logic defines. They work for moderate volumes with simple use cases.
Where DataArchiva Fits
- Automated archiving to Big Objects or external storage (AWS, Azure, Heroku, GCP, on-prem) based on custom rules.
- Relationship integrity — archives parent and child records together, in the correct order.
- Instant Salesforce access — archived records remain visible in the standard UI; no workflow disruption.
- AI-powered search across live and archived data.
- End-to-end encryption and role-based access for regulated industries.
- One-click restore with full data structure and relationships preserved.
If you are managing large volumes or regulated data, or need to extend retention windows in Salesforce without hitting storage or governor limits, DataArchiva is the most complete native solution available.
FAQs
What are the main options for archiving Salesforce data?
The main options for archiving Salesforce data are:
- Salesforce Big Objects — native, free, limited SOQL support, best for moderate volumes
- DataArchiva — AppExchange-native, supports Big Objects and external storage, automated, compliance-ready; best overall for enterprise orgs
- Custom ETL pipelines — flexible but high-maintenance; requires dedicated development resources
For most Salesforce orgs, a dedicated Salesforce archival solution like DataArchiva provides the best balance of automation, native accessibility, and compliance coverage.
How do you archive records in Salesforce?
To archive records in Salesforce effectively:
- Audit storage — Setup > Storage Usage to identify high-volume objects.
- Define a retention policy per object with compliance requirements mapped.
- Choose your storage method — Big Objects for moderate volume, external storage for enterprise scale.
- Build automated archive jobs using record age, status, or field-value triggers.
- Validate parent-child relationship handling in your chosen archival solution.
- Run a restore test before going live.
- Schedule quarterly compliance audits and an annual policy review.
Can you archive Salesforce data without deleting it?
Yes, you can archive Salesforce data by moving inactive or historical records to external or lower-cost storage without deleting them. Solutions like DataArchiva help archive data natively, securely, and with full access.
Is there a limit to how much data you can archive in Salesforce?
There’s no hard cap, but archived data still counts toward storage unless it is moved to optimized storage or external systems.
Let DataArchiva Handle All Your Salesforce Data Management Woes!



