SAP Datasphere Negotiation Guide: Capacity Units, Business Data Cloud Transition and Enterprise Pricing Strategy
Navigate SAP's cloud data warehouse licensing transformation. Master the Capacity Unit model, understand the 2025-2026 migration to SAP Business Data Cloud, and secure enterprise contracts with aggressive negotiation tactics.
Executive Summary
SAP Datasphere is SAP's cloud-native data management and analytics platform, competing directly with Snowflake, Azure Synapse, and Google BigQuery. Since its release in 2023, Datasphere has been priced under SAP's Capacity Unit (CU) model—a consumption-based metric covering compute, storage, integration, and data catalog services.
In July 2025, SAP fundamentally restructured Datasphere licensing. The platform was unbundled from RISE Premium Plus and became a separate add-on. More critically, effective January 1, 2026, Datasphere is removed from BTPEA, CPEA, and PAYG eligibility. From that date, SAP no longer sells Datasphere as a standalone cloud service; instead, it is folded into SAP Business Data Cloud (BDC)—a bundled offering that combines Datasphere with SAP Analytics Cloud under a unified CU metric.
This transition creates significant negotiating leverage for enterprise buyers, but only if you act before December 31, 2025. After that deadline, your only path to Datasphere is through BDC, which forces a strategic decision: bundle analytics with your data warehouse, or migrate to competing platforms.
Organizations on BTPEA with remaining credits should deploy Datasphere immediately, before December 31, 2025. After that date, Datasphere subscriptions cannot be renewed; only BDC is available.
This white paper covers the Capacity Unit pricing model, the BDC transition timeline, right-sizing strategies, and the negotiation tactics that have delivered 15-30% savings in enterprise deals. We also outline the contract protections every buyer must demand and provide a 90-day action plan.
What is SAP Datasphere and How is it Priced
SAP Datasphere is a unified, cloud-based data platform designed to ingest, catalog, govern, and analyze data at enterprise scale. Unlike traditional on-premises data warehouses, Datasphere is purely cloud-native and runs on SAP's BTP (Business Technology Platform) infrastructure.
Core Datasphere Capabilities
- Data Integration Layer: Connect to 100+ data sources (SAP ERP, Salesforce, Workday, databases, APIs). Schedule ingestion pipelines, transform data in-flight, and manage incremental updates.
- SQL Data Lake: Store structured and semi-structured data in a proprietary columnar format. Query via SQL, optimize compression. No external Hadoop or Spark clusters required.
- Data Catalog & Governance: Discover datasets, lineage tracking, metadata search, access controls. Critical for large enterprises managing 100+ data assets.
- BW Bridge: Real-time virtualization of SAP BW/BW4HANA data without copying. Deprecated BW systems can continue to serve reports through Datasphere's query engine.
- Analytics Foundation: Semantic layer for creating business models. SAP Analytics Cloud (now bundled in BDC) consumes these models for reporting and BI.
Capacity Units: The Pricing Metric
Datasphere does not bill on CPU, GB of storage, or query count. Instead, it uses Capacity Units (CU)—a hybrid metric that bundles:
- Compute allocation (vCPU equivalents available per month)
- Storage quota (GB of data lake capacity)
- Integration tasks (number of active pipelines)
- Data catalog operations (metadata ingestion, lineage tracking)
- Analytics model complexity
One CU is priced at approximately USD 12-15 per month at enterprise scale (with volume discounts). Smaller commitments pay 18-20% more; very large commitments (50,000+ CU annually) negotiate down to USD 10-11 per CU.
Many buyers assume Capacity Units map 1:1 to the consumption units of competing platforms (e.g., Snowflake Credits). They do not. SAP's CU is a bundled metric that cannot be directly compared without detailed workload analysis. Budget conservatively and pilot for 6 months before committing long-term.
SAP publishes a CU consumption calculator, but it is notoriously inaccurate. Actual consumption typically runs 1.3-1.8x the estimate, especially in the first 12 months. This is a known pain point and a major negotiating lever.
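Given that gap, the practical move is to widen the calculator's output by the observed miss factor before budgeting. A minimal sketch; the 1.3-1.8x multipliers are the field figures cited in this paper, not official SAP guidance:

```python
def cu_budget_range(calculator_estimate_cu, low_mult=1.3, high_mult=1.8):
    """Widen SAP's calculator output by the 1.3-1.8x miss factor
    observed in first-year deployments (figures cited in this paper)."""
    return (calculator_estimate_cu * low_mult,
            calculator_estimate_cu * high_mult)

low, high = cu_budget_range(5000)  # calculator says 5,000 CU/month
print(f"Budget for {low:,.0f}-{high:,.0f} CU/month, not 5,000")
```

If the calculator says 5,000 CU/month, plan (and negotiate overage protections) for 6,500-9,000.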
The Capacity Unit Model Explained
Understanding how CUs are consumed is essential to negotiation. Unlike cloud storage (billed per GB used), Datasphere CU consumption is complex and opaque.
What Consumes Capacity Units?
- Compute tasks: Data integration jobs (ETL), SQL queries against the data lake. High-complexity transformations consume more CUs than simple selections.
- Storage: Provisioned quota in the data lake, measured in GB. Actual data usage must stay within the quota or overages incur surcharges.
- Integration capacity: Number of concurrent pipelines, frequency of runs (hourly, daily, real-time). Datasphere allocates compute slots; exceeding your limit throttles jobs.
- Data catalog: Number of tables, columns, and lineage relationships tracked. Scanning 10,000 tables consumes significantly more CU than 500.
- Analytics models: Semantic layer complexity. Simple star schemas cost less than multi-hierarchy OLAP models.
| Workload Type | Typical CU Consumption | Example Scenario |
|---|---|---|
| Light (Dev/Test) | 500-1,000 CU/month | 5 data sources, 20 tables, 10 dashboards |
| Medium (Mid-market) | 2,000-5,000 CU/month | 15 data sources, 500 tables, 100 users, daily refreshes |
| Large (Enterprise) | 8,000-20,000 CU/month | 40+ data sources, 2,000+ tables, 24/7 real-time pipelines, 1,000+ users |
| Very Large (Global) | 20,000-50,000+ CU/month | 100+ data sources, 5,000+ tables, multi-region, AI/ML workloads |
Minimum Commitments and Tiering
Effective April 2026, SAP's minimum monthly tenant allowance is 1,532 CU. This applies even if your workload needs only 500 CU. Annual commitments for Datasphere historically started around 4,300 CU (before BDC integration).
For BDC (Datasphere + SAP Analytics Cloud bundled), minimums for new subscriptions typically run 5,000-10,000 CU/month, with 2-3 year commitments landing in the six- to seven-figure range (USD 720,000–2,160,000+, depending on scale).
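The contract-value range quoted above falls straight out of the minimums. A quick sanity check, assuming the roughly USD 12/CU enterprise rate discussed earlier:

```python
def contract_value(cu_per_month, rate_usd=12.0, months=12):
    """Total committed value: monthly CU commitment x per-CU rate x term."""
    return cu_per_month * rate_usd * months

# 5,000 CU/month minimum at ~USD 12/CU:
print(contract_value(5000, months=12))  # one-year term
print(contract_value(5000, months=36))  # three-year term
```

At the 5,000 CU floor this yields USD 720K for one year and USD 2.16M over three; a 10,000 CU commitment doubles both.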
Push SAP for a "pilot minimum"—a reduced CU tier (e.g., 800 CU/month) valid for 6-9 months before escalating to the standard minimum. This limits your exposure while you measure actual consumption.
The 2025-2026 Transition to SAP Business Data Cloud
In mid-2025, SAP announced a strategic consolidation of its data and analytics cloud portfolio. The timeline and impact have been severe for buyers caught unaware.
Timeline of Changes
- July 2025: Datasphere unbundled from RISE Premium Plus. Became a separate, optional add-on. Customers already on RISE could continue Datasphere; new RISE customers needed to select Datasphere explicitly (and pay additional)
- December 31, 2025: Last day to renew existing Datasphere standalone subscriptions under legacy terms. After this date, all new and renewal subscriptions fall under the BDC umbrella.
- January 1, 2026: Datasphere removed from BTPEA (Business Technology Platform Enterprise Agreement), CPEA (Cloud Platform Enterprise Agreement), and PAYG (Pay-As-You-Go) programs. Organizations using free or promotional credits can no longer apply them to Datasphere; a direct subscription becomes required.
- January 1, 2026 onwards: Datasphere available only as part of the SAP Business Data Cloud bundle. Analytics Cloud (formerly sold separately) also moves to BDC only.
What is SAP Business Data Cloud?
BDC is SAP's answer to the "data + analytics" bundling question. Rather than letting customers buy Datasphere for data warehousing and Tableau/Power BI for analytics, SAP bundles Datasphere + SAP Analytics Cloud under a single CU commitment.
- Datasphere component: All data integration, storage, governance, and catalog features
- SAP Analytics Cloud component: Real-time dashboards, storytelling, predictive analytics, AI-powered insights. Previously sold as a separate SKU.
- Single CU pool: Both products draw from one monthly CU allocation. Cannot split or reallocate independently.
- Unified data model: SAC consumes Datasphere's semantic layer natively. No ETL between systems.
Organizations already using third-party analytics (Tableau, Power BI, Looker) must now either: (a) adopt bundled SAC and absorb the extra cost, (b) pay for Datasphere + separate analytics license, or (c) migrate to competing data warehouses. SAP has effectively eliminated the "data-only" purchase path.
This is a forcing move by SAP to increase Oracle-style stack lock-in. Buyers who were on the fence about SAP's analytics product are now forced to negotiate BDC as a bundle.
What BDC Means for Your Existing Datasphere Contract
The transition to BDC affects different customer segments in different ways.
If You're Currently On Datasphere (Not Yet Renewed)
Your existing contract remains valid until the renewal date. You have three options:
- Renew standalone Datasphere before Dec 31 2025. Lock in the old terms, pricing, and minimum commitments. SAP may pressure you to migrate, but they cannot force it if you act before the deadline. If you're on BTPEA with unused credits, deploy Datasphere now to exhaust the credits before they become worthless.
- Migrate to Business Data Cloud. Negotiate a new 2-3 year BDC contract with aggressive discounts (see Section 08). You inherit SAC whether you want it or not. Push for a "data-only" discount if you plan to use Tableau/Power BI instead.
- Evaluate competing platforms. Snowflake, Azure Synapse, Google BigQuery, and Databricks all offer data lakehouse / warehouse capabilities without forced bundling. Migration takes 6-18 months but may deliver 20-40% lower TCO.
If You're On BTPEA (SAP's Free/Promotional Program)
This is critical: Your Datasphere usage under BTPEA credits becomes ineligible on January 1, 2026. Any consumption after that date will be charged at commercial rates unless you have a direct, paid Datasphere subscription.
- If you have unused BTPEA credits allocated to Datasphere, you must consume them by December 31, 2025. After that, they cannot be applied to Datasphere.
- BDC subscriptions are not eligible for BTPEA credits. If you want to continue post-Jan 1, you must buy BDC outright.
- This is a hard deadline with no exceptions. Enterprises that miss it face unexpected billing shocks in Q1 2026.
If you're on BTPEA with Datasphere, audit your remaining credits immediately. If you have 1,000+ CU remaining, you have until December 31, 2025 to activate a production Datasphere instance (even a small one) to protect the credits. Consult your SAP account team on the timeline.
If You're Not Yet On Datasphere
All new Datasphere subscriptions must be purchased as part of BDC. There is no standalone Datasphere offering anymore. You cannot opt for "Datasphere only"—you must accept the SAC bundle and pay for it, even if you don't use Analytics Cloud.
Paradoxically, the forced bundle creates negotiating leverage in your favor. Use it: if SAC adds USD 300,000/year and you won't use it, demand a "data platform discount" or threaten to evaluate Snowflake. SAP would rather discount the bundle than lose the account to a competitor.
Competitive Alternatives
SAP's forced bundling of Datasphere and Analytics Cloud has created unprecedented negotiating leverage for buyers willing to evaluate alternatives. Here's how the market compares:
| Platform | Pricing Model | Analytics Included | Migration Effort | Typical Enterprise Cost (1,000 Users) |
|---|---|---|---|---|
| SAP Datasphere + SAC (BDC) | CU (blended) | Yes (forced) | Low | USD 1.2–2.4M / 3yr |
| Snowflake | Credits (compute) + storage | No (buy separate) | Medium (4-6mo) | USD 600K–1.2M / 3yr |
| Azure Synapse | vCore per sec + storage | No (Power BI separate) | Medium (3-5mo) | USD 500K–1M / 3yr |
| Google BigQuery | Bytes scanned | No (Looker extra) | Medium (4-6mo) | USD 400K–900K / 3yr |
| Databricks Lakehouse | DBUs (unit of compute) | No (buy separate) | Medium-High (6-12mo) | USD 700K–1.5M / 3yr |
Key Takeaway for Negotiation
SAP's BDC pricing (USD 1.2–2.4M for a 3-year enterprise commitment) is at the high end of the market. Snowflake, Azure Synapse, and BigQuery all offer 15-40% cost advantages when bundled with your choice of analytics tool. Use this in your negotiation: "We've modeled Snowflake + Tableau, and it's 25% cheaper. What discount can you offer to keep us on Datasphere?"
Most SAP account executives will immediately offer 15-25% off the list price when faced with credible competitive pressure. Push hard here.
CU Sizing — How to Right-Size Your Commitment
The single biggest mistake enterprise buyers make is over-committing to CU before understanding actual workload requirements. SAP's CU calculator is notoriously inaccurate, often by 40-60%.
The Sizing Methodology
Step 1: Audit Current Data Footprint
- Count all data sources: SAP ERP, CRM, HR, external databases, cloud applications, APIs
- Measure current data volumes: tables, columns, GB per source
- Map ingestion frequency: batch (daily, hourly) vs. real-time vs. event-driven
- Calculate 3-year growth: data typically grows 25-40% annually
Step 2: Model Integration Workload
- For each source, estimate pipeline complexity: a simple SELECT + UNION (roughly 1 CU) vs. complex ETL with joins and lookups (3-5 CU)
- Concurrent pipelines: if 20 pipelines run daily, worst-case concurrent = 5 (CU cost scales with concurrency)
- Real-time integrations: 2-3x more expensive than scheduled batches
Step 3: Calculate Storage Requirements
- Raw data ingestion: typically 3-5x the original file size (decompressed, partitioned)
- Retention: if you keep 7 years of history, storage scales linearly
- Compression: Datasphere's columnar format achieves 5-10x compression on structured data
Step 4: Estimate Analytics Layer Cost
- Count semantic models (cubes, star schemas): 10 models = baseline cost
- Measure query volume: 1,000 queries/day across 100 users = low cost; 100,000 queries/day across 10,000 users = high cost
- Real-time vs. cached: real-time queries cost 3-5x more CU than cached/pre-aggregated
Industry benchmark: 1,000 active users consuming real-time analytics from a data lake with 50-100 tables typically require 3,000-8,000 CU/month. Mid-market firms (100-500 users, batch workflows) need 1,500-3,000 CU/month. Size conservatively; overages are expensive.
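The four sizing steps above can be rolled into a rough estimator. Every weight in this sketch is an illustrative assumption, not SAP's metering formula; calibrate the coefficients against real pilot consumption before relying on any output:

```python
def estimate_monthly_cu(pipelines, storage_gb, models, daily_queries,
                        realtime_share=0.0):
    """Rough CU/month estimate following the four sizing steps above.
    All weights are illustrative assumptions, NOT SAP's metering
    formula; calibrate against pilot consumption before trusting this."""
    # Step 2: integration workload (complexity ~1 for simple, 3-5 for heavy ETL)
    integration = sum(p["complexity"] * p["runs_per_month"] for p in pipelines)
    storage = storage_gb * 0.05           # Step 3: assumed CU per provisioned GB
    analytics = models * 10               # Step 4: assumed baseline CU per model
    queries = daily_queries * 30 * 0.001  # Step 4: assumed CU per query, per month
    total = integration + storage + analytics + queries
    # Real-time work costs roughly 2-3x batch (this paper's figure);
    # scale the real-time share by a 2.5x midpoint.
    return total * (1 + realtime_share * 1.5)

pipelines = [
    {"complexity": 1, "runs_per_month": 30},  # simple daily SELECT + UNION
    {"complexity": 4, "runs_per_month": 30},  # complex daily ETL with joins
]
print(round(estimate_monthly_cu(pipelines, storage_gb=2000,
                                models=10, daily_queries=1000)))
```

Run across your Conservative, Moderate, and Aggressive scenarios, the same function produces the three commitment candidates discussed in the pilot approach below.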
The Pilot Approach (Recommended)
Rather than commit 3 years upfront, negotiate a 6-9 month pilot at a reduced CU tier:
- Pilot minimum: 800-1,000 CU/month (negotiate below SAP's standard 1,532 minimum)
- Run production workloads, measure actual consumption for 6 months
- After pilot, sign a 2-year production contract at negotiated CU level + pricing discount
- Build in annual "true-down" rights (see Section 09)
This protects you from the 1.3-1.8x estimation misses endemic to SAP's calculators.
Negotiation Playbook
Enterprise negotiation with SAP follows predictable patterns. Here are the moves that work, with real pricing outcomes from our engagements.
Move 1: Reframe as Bundled Analytics Cost (Not Datasphere Cost)
SAP's sales team leads with "Datasphere pricing is USD 12-15/CU," which sounds reasonable. Reframe: "We're buying Datasphere + SAC bundle, which at 5,000 CU/month × USD 13 × 36 months = USD 2.34M. For that price, Snowflake + Tableau is 25% cheaper."
This pivot immediately triggers SAP's discount authority. Most account executives have 20-30% discretionary discount available. Expect a counteroffer of 15-20% off list within 2-3 calls.
Move 2: Demand Tiered CU Pricing
SAP quotes a flat CU rate, but large commitments get volume discounts. Standard playbook:
- 0-3,000 CU/month: USD 14/CU
- 3,001-6,000 CU/month: USD 13/CU
- 6,001+ CU/month: USD 11.50/CU
If you're committing to 8,000 CU/month, negotiate for the top tier to kick in at 4,000 CU (not 6,000). That moves 2,000 CU from USD 13 to USD 11.50—roughly USD 36,000/year on a deal this size.
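A blended-rate calculator makes the tier-threshold math concrete. This sketch uses the standard tiers quoted above and a hypothetical negotiated variant where the top tier starts at 4,000 CU:

```python
def monthly_cost(cu, tiers):
    """Blended monthly cost: each tier's rate applies only to the CU
    falling inside that tier. tiers = [(upper_bound_cu, usd_per_cu), ...],
    with float('inf') as the final bound."""
    cost, prev = 0.0, 0
    for bound, rate in tiers:
        band = min(cu, bound) - prev
        if band <= 0:
            break
        cost += band * rate
        prev = bound
    return cost

standard   = [(3000, 14), (6000, 13), (float("inf"), 11.50)]
negotiated = [(3000, 14), (4000, 13), (float("inf"), 11.50)]  # top tier at 4,000

saving = (monthly_cost(8000, standard) - monthly_cost(8000, negotiated)) * 12
print(f"Annual saving at 8,000 CU/month: USD {saving:,.0f}")
```

At 8,000 CU/month the standard tiers bill USD 104,000/month versus USD 101,000/month with the negotiated thresholds: USD 3,000/month, or USD 36,000/year.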
Move 3: Pilot-to-Production Transition
Lock SAP into a pre-agreed commitment before your pilot ends. Script: "We'll sign a 2-year production contract at 6,000 CU/month + 18% discount, effective Day 1 of production, contingent on pilot consumption not exceeding 4,000 CU/month average."
This gives SAP certainty (locked in 2-year deal) and you certainty (production discount locked in if pilot succeeds). Most executives accept because it de-risks their quota.
Move 4: Negotiate CU Reallocation Flexibility
Real scenario: You commit to 5,000 CU/month, but Datasphere uses 3,000 and Analytics Cloud uses 2,000. Six months in, you need more Datasphere (4,500) and less Analytics (500). Can you reallocate?
SAP's default: No. You must pay for overages and they keep the 500 unused Analytics CU. Aggressive negotiating language: "We need monthly reallocation rights between Datasphere and Analytics Cloud components within the total CU commitment. Unused CU roll to the next month (max 20% carryforward)."
Outcome: Most deals include this now; it's table stakes for enterprise accounts.
Move 5: Push AI Units into Main Deal
If you're modernizing to RISE (SAP's cloud ERP) and buying Datasphere, you likely want to pilot AI/ML workloads on SAP's AI Core. SAP tries to sell this as a separate line item. Don't accept.
Negotiation script: "AI units are strategic to our transformation. Include 2,000 AI units in the BDC commitment at no incremental cost, or we'll prototype Databricks ML instead."
Outcome: 50-60% of enterprise BDC deals now include bundled AI units at 10-15% discount vs. standalone AI Core pricing.
Move 6: Three-Year Term = 20%+ Discount
SAP prefers longer terms (improves their renewal forecasting). Use this:
- 1-year deal: List price (no discount)
- 2-year deal: 12-15% discount
- 3-year deal: 20-25% discount
Most strategic accounts lock in 3-year terms with BDC for this reason. Longer terms are also your protection against price increases (locked in for 3 years).
Move 7: True-Down and Scaling Rights
See Section 09 for detailed contract language, but in negotiation, demand:
- Annual true-down rights (reset CU commitment lower if consumption proves lower)
- Scaling flexibility (request to increase CU mid-contract at then-current volume pricing, not overage rates)
- CU "cap" in overage clause (overages capped at 120% of committed CU, not unlimited)
Typical Outcome Matrix
Our engagements show:
- Light negotiation (1-2 calls): 10-15% discount on list price
- Moderate negotiation (3-4 calls, reference Snowflake/BigQuery): 18-25% discount
- Aggressive negotiation (formal RFP, multi-vendor evaluation): 25-35% discount + pilot-to-production pricing lock
Contract Protections Every Buyer Needs
Once pricing is negotiated, lock it into airtight contract language. SAP's standard terms are vendor-favorable. You need the following provisions:
1. CU True-Down Rights
Language: "At each annual renewal, if Customer's actual CU consumption averages 20% or more below the committed CU level (calculated monthly across the prior 12 months), Customer may reduce the committed CU for the subsequent year to 110% of the trailing 12-month average consumption. SAP will not dispute this calculation or require re-baseline."
Why it matters: Over-sizing is inevitable; this clause lets you correct it without penalty. Without this, you're locked into 5,000 CU/month even if you only use 3,000.
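The clause reduces to a few lines of arithmetic. A sketch of the 20% trigger and 110% reset exactly as written above:

```python
def true_down(committed_cu, trailing_12m_avg_cu):
    """Apply the true-down clause: if trailing 12-month average consumption
    ran 20% or more below the commitment, reset next year's commitment to
    110% of that average. Returns the new committed CU (unchanged if the
    trigger is not met)."""
    if trailing_12m_avg_cu <= committed_cu * 0.80:
        return round(trailing_12m_avg_cu * 1.10)
    return committed_cu

print(true_down(5000, 3800))  # trigger met: commitment drops
print(true_down(5000, 4500))  # only 10% under: commitment unchanged
```

With a 5,000 CU commitment and a 3,800 CU trailing average, the reset lands at 4,180 CU; at 4,500 CU average (10% under), nothing changes, which is why the 20% threshold itself is worth negotiating down.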
2. Overage Cap
Language: "Any CU consumption above the committed monthly amount shall be billed at 1.25x the negotiated per-CU rate for that month only. Under no circumstances shall CU overages exceed 20% of the committed amount in any single month. If actual consumption exceeds committed CU by more than 20% for two consecutive months, Customer may trigger a mid-contract scaling negotiation at then-current volume pricing (not overage rates)."
Why it matters: Without this, SAP bills overages at punitive rates (often 1.5-2.0x per-CU, much higher than your committed rate). This locks you into unaffordable cost escalation.
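To see what the cap is worth, here is the capped-overage billing logic as arithmetic. Treating consumption beyond the 20% cap as throttled rather than billed is our assumption for the sketch, not contract language:

```python
def overage_bill(committed_cu, actual_cu, rate):
    """One month's bill under the capped-overage clause: overage CU are
    billed at 1.25x the negotiated rate, and billable overage is capped
    at 20% of the committed amount (consumption beyond the cap is
    assumed throttled, not billed -- an assumption for this sketch)."""
    overage = max(0, actual_cu - committed_cu)
    billable_overage = min(overage, committed_cu * 0.20)
    return committed_cu * rate + billable_overage * rate * 1.25

# 5,000 CU committed at USD 11.50; an 800 CU overage month
print(overage_bill(5000, 5800, 11.50))
```

An 800 CU overage adds USD 11,500 to the USD 57,500 base; without the 1.25x cap, SAP's 1.5-2.0x punitive rates would make the same overage USD 13,800-18,400.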
3. CU Reallocation Flexibility
Language: "Within each month, Customer may reallocate committed CU between Datasphere and SAP Analytics Cloud components. Reallocation requests must be submitted with 5 business days' notice. Unused CU from one component may roll forward to the next month up to 20% of the monthly commitment (max carryforward 12 months). Unused CU expire at contract end; no credit for unspent capacity."
Why it matters: If you commit 5,000 CU (3,000 Datasphere, 2,000 SAC) but only use 1,500 SAC, you want to use the 500 freed-up SAC CU for Datasphere compute instead of losing them.
4. Scaling Flexibility (Mid-Contract Uplifts)
Language: "If Customer requires additional CU above the committed amount, Customer may request an uplift by written notice. SAP shall respond within 10 business days with pricing at the then-current volume rate for the new CU tier (not overage rates). This uplift is binding for the remainder of the term. Not more than two uplift requests per contract year."
Why it matters: Your data may grow faster than expected. This prevents SAP from charging punitive overage rates; instead, you upgrade at standard volume pricing.
5. Definition of Capacity Unit Consumption
Language: "SAP shall provide a detailed, monthly CU consumption report breaking down consumption by component (Datasphere compute, storage, integration, data catalog, SAC). Reports shall be provided within 5 business days of month-end. Customer has 30 days to dispute the calculation; disputes shall be arbitrated by an independent cloud accounting firm at SAP's expense if the dispute exceeds 5% of the monthly bill."
Why it matters: CU consumption is opaque. Without this, SAP can bill whatever its meters report and you have no recourse. This clause forces transparency and gives dispute resolution real teeth.
6. Performance SLA / Service Credits
Language: "SAP shall maintain 99.5% monthly uptime for Datasphere and SAP Analytics Cloud. Unplanned downtime exceeding 4 hours in any month triggers a 10% service credit against that month's fees. Cumulative service credits cannot exceed 20% of annual contract value."
Why it matters: Most enterprises demand 99.9%+ uptime; 99.5% is weak but standard for Datasphere. Service credits are your only recourse if SAP has a major outage.
7. Price Increase Cap
Language: "For the initial contract term, SAP shall not increase the negotiated per-CU rate. Upon renewal, per-CU rate increases shall not exceed 3% annually, with no increase in Year 1 of any renewal term."
Why it matters: Without this, SAP can raise per-CU rates 10%+ at renewal, making your deal uneconomical. A 3% cap is standard for multi-year deals.
8. Termination for Convenience (with Penalty)
Language: "Customer may terminate this agreement for convenience upon 120 days' written notice, with termination fee equal to 30% of the remaining committed contract value. If termination is triggered by SAP material breach (uptime below 95% for 30 consecutive days, or failure to provision committed CU), termination fee is waived."
Why it matters: SAP strongly dislikes this clause, but large customers demand it. A 30% termination fee is reasonable; it gives you an exit if SAP's service quality collapses.
Case Study: Manufacturing Firm, 2,500 Users
Company Profile: Global discrete manufacturing firm, USD 8B revenue, 2,500 enterprise users across 15 countries. Data sources: SAP S/4HANA, Salesforce, Workday, external market data feeds, supply chain partner APIs. Current analytics: SAP BusinessObjects (legacy, slated for retirement).
Initial SAP Proposal
SAP's account executive proposed BDC (Datasphere + SAC) with the following terms:
- Committed CU: 8,000 CU/month (based on SAP's calculator)
- Per-CU rate: USD 13/CU
- Term: 3 years
- Total contract value: USD 3.744M (8,000 CU × USD 13 × 36 months)
Redress Intervention
We audited their workload and found:
- Actual data footprint: 120 sources, but only 40 in scope for Year 1 (others phased in 2027-2028)
- Datasphere consumption estimate: 3,500 CU/month (compute + storage)
- Analytics Cloud consumption estimate: 1,200 CU/month (dashboards, real-time models)
- Year 1-2 usage: average 4,700 CU/month; Year 3 growth to 6,200 CU/month
SAP's calculator overestimated by 41%.
Negotiation Strategy
- RFP to Snowflake and BigQuery. We modeled both. Snowflake + Tableau: USD 1.8M / 3 years. BigQuery + Looker: USD 1.5M / 3 years. These became our anchors.
- Phased commitment approach: Year 1-2: 5,000 CU/month (lower than SAP's 8,000). Year 3: 6,500 CU/month (scaled as data grows). This reduces upfront commitment exposure.
- Competitive reference: We told SAP's VP Sales: "Snowflake + Tableau gets us to USD 1.8M TCO. Your USD 3.744M quote is more than double that. What's your move?"
- Three moves: True-down rights (annual reset if consumption is 20%+ lower), CU reallocation flexibility (move unused SAC CU to Datasphere), and pricing escalation cap (3% annual increase max).
Final Agreement
- Year 1-2: 5,000 CU/month @ USD 11.50/CU (18% discount) = USD 1.38M / 2 years
- Year 3: 6,500 CU/month @ USD 11.50/CU (locked in) = USD 897K
- Total 3-year contract value: USD 2.277M
- Savings vs. initial proposal of USD 3.744M: USD 1.467M (39% discount)
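The phased figures above can be verified directly:

```python
def phased_contract_value():
    """Recompute the final agreement from the case study."""
    years_1_2 = 5000 * 11.50 * 24  # 5,000 CU/month for 24 months
    year_3    = 6500 * 11.50 * 12  # 6,500 CU/month for 12 months
    return years_1_2, year_3, years_1_2 + year_3

y12, y3, total = phased_contract_value()
print(f"Years 1-2: {y12:,.0f}  Year 3: {y3:,.0f}  Total: {total:,.0f}")
```

The phased structure is the point: committing to Year 3 growth up front, rather than paying overage rates when it arrives, is what locked the USD 11.50 rate across all 36 months.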
Key wins:
- Annual true-down rights (they reset to 4,100 CU in Year 2 when actual consumption proved lower)
- CU reallocation flexibility (5% of budget freed up by moving unused SAC CU to Datasphere)
- Included 1,000 AI Core units at no incremental cost (for their ML pilots)
- 120-day termination for convenience with 20% fee (vs. SAP's standard 50% penalty)
Post-Signature Lessons
Year 1 actual consumption: 3,850 CU/month (23% below commitment). They triggered true-down rights and reset to 4,100 CU for Year 2, avoiding roughly 900 wasted CU/month × 12 months × USD 11.50 ≈ USD 124K in annual overspend. By month 18, they had consumed the 1,000 AI Core units and worked with SAP's AI team to expand the pilot by 5,000 additional units.
Total realized savings: USD 967K (includes true-down reset, avoided AI Core overage charges, and avoided reallocation penalties).
90-Day Action Plan
Use this timeline to move from initial evaluation to contract signature.
Audit your current data stack: What data sources, how many tables, which users depend on real-time access? How many SAP systems (S/4HANA, BW, SuccessFactors)? Do you have BTPEA credits expiring Dec 31? Document findings in a one-page scorecard: data footprint, user count, integration frequency, current analytics tool. If you're on BTPEA with Datasphere, calculate your remaining credits and burn rate. This is a hard deadline.
Use Section 07 methodology to size CU requirements. Build a detailed workload model: per-source complexity, ingestion frequency, storage 3-year forecast. Do NOT use SAP's CU calculator in isolation; cross-check with Redress methodology. Create three sizing scenarios: Conservative (50th percentile), Moderate (75th percentile), Aggressive (95th percentile). Conservative is your contract commitment target.
Parallel-path three vendors: Snowflake, Azure Synapse, Google BigQuery. Issue RFP with your workload model. Emphasize: data volume, ingestion patterns, user count, analytics tool preference (if you're bringing your own Tableau/Power BI). Require 3-year TCO quotes from each. SAP should be one of the RFPs; make sure your SAP account team knows you're shopping. This creates urgency and discount authority.
Tabulate all RFP responses: pricing, terms, SLAs, contract flexibility. Create a TCO scorecard comparing SAP BDC, Snowflake, Azure Synapse, and BigQuery. Include migration effort (3-6 months, ~100 person-days). Most importantly: identify your "walk-away" price (the point where migration to Snowflake becomes rational). If SAP's BDC quote is above this walk-away, you have real leverage.
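The walk-away calculation described above is simple enough to script. The figures below are hypothetical, chosen only for illustration:

```python
def walk_away_leverage(sap_tco, best_alt_tco, migration_cost):
    """Compare SAP's 3-year TCO against the best alternative plus its
    one-time migration cost. A positive gap means the alternative is
    cheaper all-in, so the walk-away threat is credible."""
    alt_all_in = best_alt_tco + migration_cost
    gap = sap_tco - alt_all_in
    return gap, gap / sap_tco  # absolute gap, and the discount SAP must match

# Hypothetical figures: USD 2.4M SAP quote, USD 1.5M alternative, USD 300K migration
gap, discount = walk_away_leverage(2_400_000, 1_500_000, 300_000)
print(f"All-in gap: USD {gap:,.0f} ({discount:.0%} discount needed from SAP)")
```

The second return value is your opening ask: the discount SAP must concede just to match the alternative's all-in cost, before any negotiation premium.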
Schedule a meeting with SAP's account executive and solutions architect. Present your TCO analysis. Open: "We've evaluated Snowflake, BigQuery, and you. Snowflake comes in at 22% cheaper. What can you do?" Most account executives will ask for time to get pricing authority. Expect them to return with a 12-15% discount within 1 week. If they do, move to Round 2. If not, escalate to their manager.
Iterate. If SAP's first counter is insufficient (less than 15% discount or no contract protections), propose the pilot-to-production approach: 6 months at reduced CU (pilot rate), then a 2-year production commitment at agreed pricing with annual true-down rights. This usually unlocks an additional 5-10% discount because it de-risks SAP's forecast.
Once pricing is agreed, have your legal team review SAP's standard terms. Insert the contract protections from Section 09: true-down rights, overage cap, reallocation flexibility, termination for convenience. SAP will resist some language; negotiate the six "must-haves": true-down, overage cap, reallocation, CU consumption reporting, SLA/service credit, and price increase cap. Do not sign without these.
If you have BTPEA credits, you must activate Datasphere by Dec 31, 2025, to preserve credit value. This front-loads your 90-day timeline. Start your assessment in October 2025 if you're in this boat.
About Redress Compliance
Redress Compliance is a Gartner-recognized, 100% buyer-side enterprise software licensing advisory firm. We have no commercial relationships with any software vendor—our only client is the enterprise buyer. Our advisors have completed 500+ enterprise licensing engagements across 11 vendor practices (SAP, Oracle, Microsoft, Salesforce, IBM, Broadcom, AWS, Google Cloud, ServiceNow, Workday, and Cisco).
Our SAP licensing advisory practice specializes in S/4HANA, RISE, Datasphere, Analytics Cloud, and SuccessFactors negotiations. We typically engage 9-18 months before renewal to allow sufficient time for workload sizing, competitive benchmarking, and multi-vendor RFP management.
In SAP Datasphere and BDC negotiations, we have achieved:
- Average discount: 22% off list price (range: 10-35% depending on negotiation depth)
- Contract protection wins: True-down rights (95% of deals), CU reallocation (88% of deals), termination for convenience (65% of deals)
- Typical engagement outcome: USD 400K-1.2M savings over 3-year contract term, plus non-price wins (AI units included, extended pilot periods, scaling flexibility)