Two Storage Pools, One Billing Problem
Salesforce separates storage into two distinct pools: data storage and file storage. Data storage covers records — accounts, contacts, opportunities, cases, and every custom object row your teams create. File storage covers attachments, documents, Chatter files, Salesforce CRM Content, and assets uploaded through Experience Cloud or Knowledge articles. Most organisations hit data storage limits long before they expect to, because every piece of platform activity — emails logged through Einstein Activity Capture, Agentforce interaction records, Data Cloud-synced profiles — writes rows to the data store.
Under Enterprise Edition, each org receives a 10 GB data storage baseline plus 20 MB per user licence, and a 10 GB file storage baseline plus 2 GB per user licence. Unlimited Edition raises the per-user data allocation to 120 MB while keeping the 2 GB per-user file allocation. Professional Edition shares the 10 GB data baseline and 20 MB per-user increment but provides only 612 MB of file storage per user licence, which creates pressure at relatively modest user counts. Essentials Edition orgs receive flat allocations of 10 GB of data storage and 1 GB of file storage, regardless of headcount.
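To see how those allocations translate into an org-level entitlement, here is a minimal Python sketch using the baseline figures above. The numbers reflect Salesforce's published allocations at the time of writing and should be verified against your own contract before planning around them.

```python
# Rough storage-entitlement calculator based on the per-edition
# allocations described above. Verify figures against your contract;
# Salesforce revises baselines from time to time.

ALLOCATIONS = {
    # edition: (data_base_gb, data_per_user_mb, file_base_gb, file_per_user_mb)
    "professional": (10, 20, 10, 612),
    "enterprise":   (10, 20, 10, 2048),
    "unlimited":    (10, 120, 10, 2048),
}

def entitlement_gb(edition: str, users: int) -> tuple[float, float]:
    """Return (data_gb, file_gb) entitlement for an org."""
    data_base, data_mb, file_base, file_mb = ALLOCATIONS[edition]
    data_gb = data_base + users * data_mb / 1024
    file_gb = file_base + users * file_mb / 1024
    return round(data_gb, 1), round(file_gb, 1)

print(entitlement_gb("enterprise", 200))  # (13.9, 410.0)
```

Note the asymmetry this exposes: a 200-user Enterprise org holds hundreds of gigabytes of file storage but under 14 GB of data storage, which is why data storage is almost always the pool that runs out first.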
What catches organisations off-guard is the interaction between third-party integrations and storage consumption. Every MuleSoft flow that writes transformed records back into Salesforce objects consumes data storage. Every CPQ quote document stored as a Salesforce File consumes file storage. If your MuleSoft vCore sizing is generous — meaning flows run frequently and write verbose payloads — storage consumption will scale accordingly. This is one reason we always examine integration architecture when conducting a Salesforce licence review.
What Happens When You Hit the Limit
Salesforce does not enforce a hard cut-off at 100% capacity. The system allows you to reach 110% before automations begin to fail. At that threshold, Flow automations cannot complete, users cannot upload new files or email attachments, and Agentforce AI agents lose the ability to ingest new documents into their context windows. If you rely on Einstein or Data Cloud for personalisation pipelines, those processes also stall — because they depend on being able to write staging records before processing.
The practical consequence is an operational disruption that looks like an IT problem but is in reality a licensing problem. Salesforce account teams will arrive quickly with a quote for additional storage capacity or a licence upgrade. The quote will look reasonable in isolation. What buyers rarely scrutinise in that moment is that the new storage SKU, like every other add-on, carries the standard 8–10% annual uplift clause through to the next renewal. A $30,000 storage purchase signed under pressure at 110% capacity will cost closer to $36,000 by the end of a three-year term if the uplift clause is accepted without negotiation.
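The arithmetic behind that figure is worth making explicit: a three-year term carries two renewal uplifts, and at 10% they compound rather than add. A short sketch:

```python
# Compound an annual price through successive renewal uplifts.
def price_after_uplifts(initial: float, uplift: float, renewals: int) -> float:
    return initial * (1 + uplift) ** renewals

# A three-year term sees two renewal uplifts after year one.
print(price_after_uplifts(30_000, 0.10, 2))  # 36300.0
print(price_after_uplifts(30_000, 0.08, 2))  # 34992.0
```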
Understanding the API Limit Structure
Salesforce enforces daily API request limits at the org level, not the individual user level. For Enterprise Edition orgs, the baseline is 100,000 API calls per day, plus an additional 1,000 calls per user licence. An Enterprise org with 200 users therefore has a theoretical daily limit of 300,000 calls — though in practice, many orgs operate primarily on the base 100,000 unless headcount is large.
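A minimal sketch of that entitlement formula, assuming the Enterprise Edition figures above:

```python
# Daily API request entitlement for an Enterprise Edition org:
# a 100,000-call base plus 1,000 calls per user licence
# (figures as described above; confirm against your edition's documentation).
BASE_CALLS = 100_000
CALLS_PER_LICENCE = 1_000

def daily_api_limit(user_licences: int) -> int:
    return BASE_CALLS + user_licences * CALLS_PER_LICENCE

print(daily_api_limit(200))  # 300000
```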
These limits apply across all REST, SOAP, Bulk, and Streaming API requests. Every connected application counts against the same pool: marketing automation tools, ERP integrations, custom portals, MuleSoft flows, AppExchange applications, and internal developer tooling. Organisations that have grown their integration landscape organically — adding tools without a centralised API governance model — frequently find themselves approaching or exceeding daily limits without being able to identify which applications are responsible.
The daily limit is technically a soft limit. Salesforce will not cut off API calls the moment you cross the threshold in a single burst. However, if you consistently breach the daily allocation, their system protections will eventually block subsequent calls until the 24-hour window resets. The operational impact varies: a blocked MuleSoft batch sync might mean stale data in downstream systems; a blocked CPQ integration might prevent opportunity records from closing correctly, directly impacting revenue recognition.
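Before a breach becomes an outage, current consumption is easy to check: the REST API's limits resource reports the day's maximum and remaining request counts. A minimal sketch, assuming you already hold an OAuth access token and know your instance URL (both are placeholders here):

```python
import requests

# Placeholders: substitute your own instance URL and a valid OAuth token.
INSTANCE_URL = "https://yourorg.my.salesforce.com"
ACCESS_TOKEN = "00D...your_session_token"
API_VERSION = "v60.0"

resp = requests.get(
    f"{INSTANCE_URL}/services/data/{API_VERSION}/limits",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# The limits resource returns a JSON map; DailyApiRequests carries
# the org-wide daily pool described above.
api = resp.json()["DailyApiRequests"]
used = api["Max"] - api["Remaining"]
print(f"Daily API requests: {used:,} used of {api['Max']:,} "
      f"({used / api['Max']:.0%} of today's allocation)")
```

Polling this endpoint on a schedule and alerting at, say, 80% of the daily pool gives you the early warning that the Setup UI alone does not.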
Purchasing Additional API Calls
Salesforce provides a mechanism to purchase additional API capacity through the “Additional API Calls — 10,000 per day” SKU, accessible through the My Account self-service portal within your org. This add-on allows you to increase your daily limit in increments rather than forcing a full edition upgrade. For organisations that have hit limits due to a specific integration project — a Data Cloud activation campaign, for instance, or a major MuleSoft rollout — this is a more cost-effective path than upgrading all user licences.
However, three contractual issues arise when purchasing API add-ons. First, the add-on is typically structured as an annual commitment, meaning you pay for 365 days of elevated capacity whether you need it for three months or twelve. Second, the renewal uplift clause applies, so the add-on becomes incrementally more expensive each year without explicit negotiation. Third, Salesforce often uses the API limit conversation as an opportunity to propose edition upgrades — framing Performance Edition as a way to "solve the problem permanently." Performance Edition pricing is not published and is consistently higher than the Enterprise Edition equivalent, making the API limit discussion a gateway to a significantly more expensive commercial conversation.
Data Cloud and the Storage Multiplication Effect
For organisations using Salesforce Data Cloud (rebranded as Data 360 in late 2025), storage complexity increases substantially. Data Cloud operates on a credit consumption model: credits are consumed by data ingestion, identity resolution, and segment activation. The Data Cloud data store is separate from the core Salesforce org storage and is licensed through a combination of Data Cloud credits and Salesforce Data Cloud user licences.
The critical planning issue is that Data Cloud writes unified profile data back into core Salesforce objects for use by Sales Cloud, Service Cloud, and Agentforce. Those write-backs consume core Salesforce data storage. Organisations that activate large customer segments through Data Cloud often see core storage consumption increase by 20–40% within the first year, without any corresponding increase in CRM users. When this triggers a storage overage, both the Data Cloud credits renewal and the core storage add-on are up for renegotiation simultaneously — a position that strengthens Salesforce's ability to tie the two into a bundle at a higher combined price point.
Agentforce adds another layer to this equation. Each Agentforce interaction — priced on a per-conversation basis — generates interaction logs, context records, and follow-up task records stored in the core data store. High-volume Agentforce deployments with thousands of daily interactions can consume several gigabytes of data storage per month purely from agent activity records, which were typically not modelled in the original storage planning assumptions.
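Because Salesforce counts most records as a flat 2 KB against data storage regardless of their actual field contents, agent-driven growth can be estimated from interaction volume alone. A back-of-envelope sketch; the interaction volume and records-per-interaction figures below are illustrative assumptions, not measured values:

```python
# Back-of-envelope data-storage growth from agent activity records.
# Salesforce counts most records as a flat 2 KB against data storage;
# the volumes below are illustrative assumptions for a busy deployment.
KB_PER_RECORD = 2
RECORDS_PER_INTERACTION = 3   # e.g. interaction log + context + follow-up task
INTERACTIONS_PER_DAY = 20_000

monthly_gb = (
    INTERACTIONS_PER_DAY * RECORDS_PER_INTERACTION * KB_PER_RECORD * 30
) / 1024**2
print(f"~{monthly_gb:.1f} GB of data storage per month")  # ~3.4 GB
```

At that rate, an org with 14 GB of data storage entitlement burns through its remaining headroom in a few months from agent activity alone.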
Five Practical Controls Before Your Next Renewal
Based on our work with enterprise Salesforce customers across 500+ engagements, five controls have the highest impact on storage and API cost management:
- Run a storage audit 90 days before renewal. Pull the storage usage report from Setup → Storage Usage and categorise consumption by object type; the first sketch after this list shows one way to approximate that breakdown programmatically. You will almost always find large volumes of old email logs, archived ContentDocument records, and custom object rows from deprecated processes that can be safely archived or deleted.
- Implement a retention policy for ContentDocument and EmailMessage objects. These two object types account for the majority of unexpected data storage growth in mature orgs. Salesforce's native archiving tools are limited; consider a third-party archiving solution before paying for additional storage capacity. The second sketch after this list shows what a minimal retention sweep looks like.
- Audit API call sources before accepting a limit increase. Use the API Usage and Limits report under Setup to identify which connected apps consume the most calls. You will frequently find deprecated integrations or over-polling scheduled jobs that can be reduced without any functional impact, eliminating the need for the additional API SKU entirely.
- Negotiate storage and API limits as part of the master contract, not as add-ons. Storage and API capacity purchased as separate SKUs renew at the full list price and carry the uplift clause independently. Folding them into the master agreement gives you a single negotiated rate, a single renewal point, and leverage against the annual uplift that isolated add-on purchases never provide.
- Model the MuleSoft impact explicitly. If you are deploying MuleSoft, size your vCore count against integration volume, and simultaneously model the Salesforce data storage impact of the flows you intend to build. MuleSoft integrations that write enriched records back to Salesforce are the single fastest path to an unexpected storage overage in a well-designed technical architecture.
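For the storage audit in the first control, per-object row counts give a workable approximation of data storage share, since most records count roughly 2 KB each. A hedged sketch using the simple_salesforce library; the credentials are placeholders and the object list is illustrative, so extend it with your own custom objects:

```python
from simple_salesforce import Salesforce

# Placeholders: substitute real credentials (or an OAuth session).
sf = Salesforce(
    username="admin@example.com",
    password="...",
    security_token="...",
)

# Common data-storage consumers; extend with your own custom objects.
OBJECTS = ["EmailMessage", "Task", "Case", "Opportunity", "Lead"]

for obj in OBJECTS:
    # COUNT() queries return the total in totalSize with no record payload.
    count = sf.query(f"SELECT COUNT() FROM {obj}")["totalSize"]
    # Most records count ~2 KB against data storage.
    print(f"{obj:<14} {count:>10,} rows  ~{count * 2 / 1024**2:.2f} GB")
```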
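And for the retention policy in the second control, a minimal sweep of aged EmailMessage records might look like the following. This is a sketch, not production code: the three-year cutoff is an assumption, deletions are destructive, and sharing rules or parent-record constraints can block individual rows, so test in a sandbox first.

```python
from datetime import datetime, timedelta, timezone
from simple_salesforce import Salesforce

# Placeholders: substitute real credentials (or an OAuth session).
sf = Salesforce(username="admin@example.com", password="...", security_token="...")

# Illustrative retention horizon: three years.
cutoff = (datetime.now(timezone.utc) - timedelta(days=3 * 365)).strftime(
    "%Y-%m-%dT%H:%M:%SZ"
)

# SOQL datetime literals are unquoted. For very large volumes,
# prefer a Bulk API query over query_all to avoid holding all IDs in memory.
old = sf.query_all(
    f"SELECT Id FROM EmailMessage WHERE CreatedDate < {cutoff}"
)["records"]

if old:
    # The Bulk API batches large delete volumes automatically.
    sf.bulk.EmailMessage.delete([{"Id": r["Id"]} for r in old])
print(f"Removed {len(old)} EmailMessage records predating {cutoff}")
```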
Renewal Positioning: Storage as a Negotiation Lever
Salesforce's fiscal year ends January 31. The period between November and mid-January is when account teams are under maximum pressure to close renewals and deliver against annual targets. If your storage consumption projections suggest you will exceed limits within 12 months, surfacing this discussion in November — with a clear ask to increase entitlements within the current contract pricing — is tactically superior to waiting until the overage occurs. A Salesforce AE with quota pressure in January will accept a lower per-unit storage price far more readily than the same AE in March operating with a full pipeline and no urgency.
The same logic applies to API limits. If your integration roadmap includes a major Data Cloud activation or a new MuleSoft project, quantify the expected API call increase and bring a proposal to Salesforce before the project goes live. Buying API capacity as part of a broader expansion deal, where you are also adding user licences or a new cloud, provides cross-product leverage that a standalone add-on SKU purchase never delivers.
Storage and API management are not IT housekeeping tasks. They are contract management tasks. The organisations that control their Salesforce costs most effectively treat storage consumption as a commercial variable that belongs in the same conversations as user count, edition mix, and uplift clauses — not as an operational problem that IT resolves by clicking “buy” in the self-service portal. Every unplanned storage purchase Salesforce closes is a contract they wrote on their terms. The alternative is to write it on yours.