How to use this assessment: Work through each item and mark it complete once confirmed. Items flagged High Risk represent the most common sources of material overspend. A score of 15 or more indicates a well-governed position.

Scoring Guide
Tally your confirmed items to determine the rigour of your software evaluation process.
0 – 9: High Evaluation Risk
10 – 14: Partial Framework
15 – 20: Robust Process

Section 1: Requirements and Vendor Shortlisting

Vendor selection succeeds when requirements precede vendor engagement. Organisations that document functional and non-functional requirements, establish evaluation criteria, and perform systematic shortlisting reduce implementation risk by 48 percent and improve user adoption by 31 percent compared to shortcut-based evaluation processes.

1. You have documented functional and non-functional requirements before engaging any vendor, separating must-haves from nice-to-haves with business owner sign-off.
Organisations skipping formal requirements definition average 37 percent higher implementation costs and 2.5 times project duration overruns. Requirements documentation should cover functional workflows, integration points, data model, reporting requirements, non-functional attributes such as scalability and uptime, and business constraints including budget and data residency. Functional requirements define scope; non-functional requirements prevent scope creep and post-signature disputes. Business owner sign-off ensures accountability and prevents shifting requirements during implementation.
● High Risk
2. You have conducted a market landscape review to identify vendors in the relevant category and excluded those falling clearly outside evaluation scope.
Early vendor exclusion reduces evaluation noise and focuses effort on credible contenders. Market reviews should assess vendor viability including financial stability, market share, and the availability of customer references, as well as product maturity and go-to-market approach. Organisations conducting early landscape reviews reduce evaluation cycles from 6 to 9 months down to 3 to 4 months while improving candidate quality. Absence of exclusion criteria forces evaluation teams to engage too many marginal vendors, extending timelines and diluting decision quality.
● High Risk
3. You have created a weighted evaluation scorecard assessing vendors against documented criteria with clear scoring methodology and weighting rationale.
Scorecards eliminate subjective vendor preference and enforce discipline on evaluation teams. Best-practice scorecards assign weights to evaluation dimensions aligned with business priorities such as functional fit, total cost of ownership, implementation risk, and vendor viability. Organisations using scorecards reduce post-signature regret by 44 percent and improve user satisfaction by 28 percent compared to gut-feel evaluation. Scorecard methodology should be approved by business sponsors before vendor demonstrations begin, establishing baseline expectations.
● High Risk
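To make the scorecard mechanics concrete, here is a minimal sketch of how weighted vendor scores could be computed. The dimensions, weights, and raw 1–5 vendor scores below are hypothetical illustrations, not recommended values; your weighting should reflect your own documented business priorities.

```python
# Hypothetical weighted-scorecard sketch. Weights and raw scores are
# illustrative assumptions, not benchmarks.

WEIGHTS = {
    "functional_fit": 0.35,
    "total_cost_of_ownership": 0.25,
    "implementation_risk": 0.20,
    "vendor_viability": 0.20,
}

def weighted_score(raw_scores: dict) -> float:
    """Return the weighted total for one vendor's raw 1-5 scores."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return round(sum(WEIGHTS[dim] * raw_scores[dim] for dim in WEIGHTS), 2)

vendor_a = {"functional_fit": 4, "total_cost_of_ownership": 3,
            "implementation_risk": 4, "vendor_viability": 5}
vendor_b = {"functional_fit": 5, "total_cost_of_ownership": 2,
            "implementation_risk": 3, "vendor_viability": 4}

print(weighted_score(vendor_a))  # → 3.95
print(weighted_score(vendor_b))  # → 3.65
```

Note how the weighting changes the outcome: vendor B wins on functional fit alone, but once cost and implementation risk are weighted in, vendor A scores higher, which is exactly the subjective-preference correction a scorecard exists to enforce.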
4. You have conducted structured product demonstrations and reference calls with existing customers of shortlisted vendors, focusing on functional fit validation and implementation reality checks.
Vendor demonstrations are marketing theatre; reference calls surface implementation reality and user satisfaction. Reference validation should target organisations with similar scale, industry, and use case, asking specifically about actual implementation timelines versus those promised, cost overruns, training adequacy, and post-launch support quality. Organisations conducting four or more reference calls per vendor improve implementation predictability by 31 percent and reduce post-launch frustration. References should be supplemented with published analyst assessments to evaluate product strategy and execution.
● High Risk
5. You have documented and approved the final vendor selection decision with clear justification, including gaps between vendor capabilities and requirements and documented acceptance of trade-offs.
Documented vendor selection decisions create accountability and prevent post-signature disputes over misaligned expectations. Selection documentation should explain scorecard outcomes, rationale for weighting, vendor strengths and gaps, and acceptance of known limitations such as integration rework or training investment. Sign-off from business leadership and IT creates shared ownership. Organisations with documented selection decisions report 33 percent faster implementation and 26 percent fewer scope disputes compared to undocumented selections.
● Medium Risk

Section 2: Commercial and Pricing Evaluation

Software pricing obscures total cost of ownership through variable licensing models, hidden implementation costs, and escalating support fees. Disciplined cost evaluation — comparing total cost rather than list price and validating cost assumptions through reference checks — prevents surprise cost overruns and misaligned vendor incentives.

6. You have evaluated total cost of ownership across licence, implementation, integration, training, and 3 to 5-year support costs, not simply comparing list price.
Organisations comparing list prices alone incur average TCO surprises of 34 to 41 percent due to hidden implementation, integration, and training costs. TCO models should include licence costs and Year 3 to 5 escalation assumptions, implementation and customisation based on vendor estimates and reference data, integration with existing systems, training and change management, and ongoing support and maintenance. Benchmarking TCO assumptions against reference customers and industry analyst data improves cost predictability and prevents budget surprises.
● High Risk
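The TCO arithmetic described above can be sketched as a simple model: escalating licence fees plus one-off implementation, integration, and training costs plus flat annual support. All figures below are hypothetical illustrations, not vendor benchmarks.

```python
# Hypothetical 5-year TCO sketch. All figures are illustrative
# assumptions, not vendor benchmarks.

def five_year_tco(annual_licence, escalation_rate, one_off_costs,
                  annual_support, years=5):
    """Total cost: escalating licence + one-off items + flat support."""
    licence_total = sum(
        annual_licence * (1 + escalation_rate) ** year
        for year in range(years)  # year 0 = contract Year 1
    )
    return round(licence_total + sum(one_off_costs) + annual_support * years)

tco = five_year_tco(
    annual_licence=100_000,
    escalation_rate=0.05,  # assumed 5% annual uplift
    one_off_costs=[150_000, 60_000, 40_000],  # implementation, integration, training
    annual_support=20_000,
)
print(tco)  # → 902563
```

Under these assumed figures, a buyer comparing list price alone would budget 500,000 for five years of licences, while the modelled TCO is roughly 80 percent higher, which is precisely the class of surprise the checklist item guards against.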
7. You have documented assumptions underlying cost projections and stress-tested costs against upside and downside scenarios.
Cost projections based on unchallenged vendor assumptions drive systematic underestimation. Stress testing should model scenarios of higher-than-expected user growth at plus 50 percent, extended implementation timelines of plus 6 months, greater integration scope, and accelerated version upgrades. Organisations stress-testing cost assumptions report 28 percent better accuracy in budget outcomes and 22 percent lower post-signature cost disputes. Cost assumptions should be validated against reference customer experiences and benchmarked against analyst guidance.
● High Risk
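One way to operationalise the stress test above is to apply per-category multipliers to a baseline cost model and report the deltas. The baseline figures and scenario multipliers below are hypothetical; the point is the mechanism, not the numbers.

```python
# Hypothetical stress-test sketch: apply downside-scenario multipliers
# to a baseline cost model. Figures and multipliers are illustrative.

BASELINE = {"licence": 500_000, "implementation": 150_000, "integration": 60_000}

SCENARIOS = {
    # +50% user growth lifts licence spend; a 6-month slip and wider
    # integration scope lift delivery costs.
    "user_growth_plus_50pct": {"licence": 1.50},
    "timeline_plus_6_months": {"implementation": 1.40},
    "integration_scope_growth": {"integration": 1.75},
}

def scenario_total(baseline, multipliers):
    """Total cost after applying per-category multipliers (default 1.0)."""
    return sum(cost * multipliers.get(cat, 1.0) for cat, cost in baseline.items())

base_total = scenario_total(BASELINE, {})
for name, mult in SCENARIOS.items():
    print(name, round(scenario_total(BASELINE, mult) - base_total))
```

A table of deltas like this makes it obvious which assumption the budget is most exposed to; under these illustrative figures, user growth dominates the downside.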
8. You have evaluated licensing models and negotiated per-unit costs reflecting your deployment and user profile.
Licensing model misfit drives 23 to 29 percent cost waste. Named-user models suit stable, full-time workforces; concurrent licences fit dynamic environments with shift workers or consultants; usage-based pricing requires demand forecasting discipline. Organisations benchmarking per-user and per-transaction costs against market data negotiate 18 to 26 percent more favourable terms. Vendor pricing structures are designed to maximise revenue given your deployment profile; informed model selection shifts leverage to the buyer.
● High Risk
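The model-fit point can be shown with a back-of-envelope comparison for one deployment profile. The prices and usage figures below are hypothetical assumptions, not market rates.

```python
# Hypothetical licensing-model comparison for one deployment profile.
# Prices and usage figures are illustrative assumptions.

def named_user_cost(total_users, price_per_user):
    """Every registered user needs a licence."""
    return total_users * price_per_user

def concurrent_cost(peak_concurrent_users, price_per_seat):
    """Only peak simultaneous users need a seat."""
    return peak_concurrent_users * price_per_seat

# A shift-based workforce: 300 registered users, but only 120 on at once.
named = named_user_cost(300, price_per_user=600)
concurrent = concurrent_cost(120, price_per_seat=900)  # seats cost more each
print(named, concurrent)  # → 180000 108000
```

For this assumed shift-worker profile, concurrent licensing comes out 40 percent cheaper despite a 50 percent higher per-seat price, which is why the deployment profile, not the headline unit price, should drive model selection.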
9. You have negotiated volume discounts, multi-year pricing locks, and pricing escalation caps to control long-term cost growth.
Uncapped pricing escalation clauses create unsustainable long-term costs. Multi-year agreements with fixed or capped escalation — not to exceed 3 percent annually — typically deliver 12 to 18 percent cumulative savings versus annual renewals at market rates. Volume discounts increase with commitment size and negotiation leverage; Tier 1 vendors routinely offer 20 to 35 percent discounts for committed 3 to 5-year purchases. Organisations with escalation caps report cost predictability within 5 percent of projections; uncontrolled escalation drives 15 to 22 percent annual cost overruns in later contract years.
● High Risk
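The value of an escalation cap is easy to quantify. The sketch below compares cumulative licence spend under a 3 percent cap against an uncapped trajectory; the 8 percent uncapped rate and base fee are illustrative assumptions.

```python
# Cumulative licence cost with a 3% escalation cap versus an uncapped
# renewal trajectory. The 8% uncapped rate is an illustrative assumption.

def cumulative_cost(base_annual, rate, years):
    """Sum of annual fees when the price escalates by `rate` each renewal."""
    return sum(base_annual * (1 + rate) ** y for y in range(years))

capped = cumulative_cost(200_000, 0.03, years=5)
uncapped = cumulative_cost(200_000, 0.08, years=5)
print(round(capped), round(uncapped), round(uncapped - capped))
```

Under these assumptions the cap saves roughly 111,000 over five years on a 200,000 base fee, and the gap widens every additional contract year, which is why caps matter most on long commitments.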
10. You have verified vendor financial stability and assessed renewal risk if the vendor enters financial distress or is acquired by a competitor.
Vendor financial distress or acquisition by competitors creates renewal risk: contracts may be renegotiated unfavourably, support may degrade, and roadmap abandonment may leave you stranded. Organisations assessing vendor stability via credit ratings and analyst reports mitigate financial risk. Contracts should include termination-for-convenience clauses and data export guarantees in the event of vendor acquisition or bankruptcy. Technology acquisitions typically trigger service consolidation and customer-base rationalisation; renewal risk escalates post-acquisition.
● Medium Risk

Section 3: Contract Terms and Risk Assessment

Standard vendor contracts reflect vendor interests, not customer interests. Disciplined contract review — identifying unfavourable risk allocation, negotiating balanced terms, and codifying implementation expectations — prevents post-signature disputes and aligns vendor incentives with customer success.

11. You have engaged legal counsel to review vendor contracts and negotiated amendments to balance risk allocation including liability caps, indemnification, data protection, and termination provisions.
Organisations accepting vendor standard terms forfeit 60 to 70 percent of available contractual leverage. Standard contracts favour vendors through low caps on vendor liability, narrow indemnification, broad termination rights for non-payment, and weak data protection guarantees. Legal review should focus on liability symmetry, IP indemnification, data governance, and force majeure. Balanced contracts reduce dispute frequency by 34 percent and improve post-signature collaboration. Vendors expect negotiation; early legal engagement signals seriousness and improves leverage.
● High Risk
12. You have documented and negotiated Service Level Agreements specifying uptime, incident response times, and penalties for breach aligned with your operational dependencies.
SLAs without remedies lack enforceability. Operational SLAs should define uptime guarantees such as 99.5 percent monthly, incident response targets, and credits or termination rights if breaches occur. Organisations with SLAs including financial remedies drive 31 percent improvement in vendor support responsiveness. SLAs should be technology-specific: hosted SaaS requires higher uptime guarantees than on-premise systems. Absence of enforceable SLAs leaves you dependent on vendor goodwill during outages and service degradation events.
● High Risk
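When evaluating an uptime guarantee, it helps to translate the percentage into permitted downtime. The sketch below does that conversion; the 30-day month is a simplifying assumption, and real SLAs define their own measurement windows and exclusions.

```python
# Translating a monthly uptime guarantee into permitted downtime.
# Assumes a 30-day month; real SLAs define their own windows.

def allowed_downtime_minutes(uptime_pct, days=30):
    """Minutes of outage a monthly uptime guarantee still permits."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - uptime_pct / 100)

print(allowed_downtime_minutes(99.5))  # roughly 216 minutes (~3.6 hours)
print(allowed_downtime_minutes(99.9))  # roughly 43 minutes
```

A 99.5 percent monthly guarantee therefore permits about three and a half hours of outage per month with no breach, which is why the guarantee level and the remedies attached to it both need to match your operational dependencies.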
13. You have negotiated data governance terms protecting data ownership, specifying encryption standards, data residency, backup and disaster recovery procedures, and data export and deletion rights on contract termination.
Organisations ceding data governance to vendors create exit lock-in and regulatory risk. Data governance terms should confirm your ownership of data, require encryption in transit and at rest, specify data centre location for GDPR and CCPA compliance, guarantee backup and disaster recovery, and include data export in standard formats upon termination. Vendors restricting data export or charging extraction fees create switching costs; strong data terms preserve optionality and are non-negotiable for regulated industries.
● High Risk
14. You have negotiated security, audit, and compliance provisions requiring vendor attestation through SOC 2 or ISO 27001 and right-to-audit clauses enabling independent security assessments.
Vendors resisting security scrutiny or refusing audit rights create compliance risk. Security provisions should require vendor attestation, periodic penetration testing, vulnerability management, and security incident notification. Right-to-audit clauses enable you to conduct independent security assessments. Organisations leveraging contractual audit rights identify security gaps 28 percent earlier than those relying on vendor certifications alone. Vendors with mature security practices welcome audit rights as proof of excellence; resistance signals immaturity.
● High Risk
15. You have documented implementation timeline, resource commitments, acceptance criteria, and payment milestones tied to delivery, establishing accountability for on-time and on-budget implementation.
Implementation contracts without defined milestones and payment discipline create scope creep and cost overruns. Implementation agreements should specify start date, milestone deliverables, resource commitments from both parties, and acceptance criteria. Payment should be milestone-based, not upfront; withhold final payment until acceptance criteria are met. Organisations tying payment to milestone acceptance report 26 percent on-time delivery improvement and 19 percent cost savings versus upfront payments. Penalty clauses for schedule delays align vendor incentives with timely delivery.
● High Risk

Section 4: Implementation and Post-Deployment Governance

Post-signature, governance shifts from vendor selection to implementation management and deployment success. Structured implementation oversight, user adoption tracking, and post-launch governance ensure the software delivers promised value and integrates sustainably into your operating environment.

16. You have established a cross-functional implementation governance structure with clear roles and weekly status reporting to track timeline, budget, and scope.
Organisations with formal implementation governance achieve on-time delivery 36 percent more often and budget adherence 28 percent better than those relying on informal vendor coordination. Governance structures should include an executive steering committee for monthly oversight, a project management office for weekly status, and working groups by functional area for configuration, integration, and training. Absence of governance allows vendor control of implementation timeline and scope; formalised oversight ensures accountability.
● High Risk
17. You have documented the integration architecture specifying system interfaces, data flows, and ETL requirements before configuration begins, preventing scope creep and integration rework.
Integration rework is the leading cause of implementation overruns, responsible for 37 percent of schedule delays and 26 percent of budget growth. Integration architecture documents should specify source and target systems, data volumes and refresh frequency, transformation rules, and exception handling. Early definition of integration scope prevents discovering gaps during UAT or go-live. Organisations defining integration architecture upfront achieve 31 percent faster time-to-value and 22 percent lower integration costs.
● High Risk
18. You have designed a phased rollout plan with pilot user groups, UAT gates, and go-live decision criteria, reducing deployment risk and enabling course correction before full deployment.
Big-bang deployments amplify implementation risk. Phased approaches with 10 to 15 percent pilot cohorts, UAT validation gates, and go-live criteria reduce go-live incidents by 48 percent. Pilot feedback identifies process and training gaps before full deployment; UAT gates force closure of critical issues. Organisations with phased rollouts achieve stabilisation 4 to 6 weeks post-deployment versus 8 to 12 weeks for big-bang approaches. Go-live decision criteria should be documented and enforced; skipping gates to meet arbitrary deadlines destabilises the system.
● High Risk
19. You have invested in structured training and change management to drive adoption and reduce post-launch support costs.
Inadequate training drives user frustration, slow adoption, and lingering productivity impacts lasting 24 to 36 months. Best-practice training includes role-specific content, multiple delivery formats, and post-launch refresher sessions. Change management should include communication plans, user support structures such as help desks and super-users, and feedback loops to address user concerns. Organisations investing in training and change management report 33 percent faster adoption and 26 percent higher user satisfaction. Training costs of 1 to 3 percent of implementation budget are recovered through faster productivity gains.
● High Risk
20. You have established a 90-day post-launch governance and support structure covering issue triage, vendor escalation, performance monitoring, and user feedback collection to stabilise the system.
Post-launch support is critical: 67 percent of implementation value is at risk in the 90-day stabilisation window. Post-launch governance should include daily issue triage, escalation procedures for vendor-side issues, system performance monitoring, and user feedback collection. Organisations with structured post-launch governance reduce time-to-stabilisation by 4 to 6 weeks and uncover improvement opportunities 40 percent faster. Post-launch planning should identify quick wins for the first 1 to 2 months and a multi-phase enhancement roadmap for 6 to 12 months beyond go-live.
● Medium Risk

Ready to optimise your AI contract and cost position?

Download our AI Platform Contract Negotiation Guide — covering all major vendors, pricing structures, and negotiation tactics.

Next Steps

Score your confirmed items against the benchmarks above. If you fall in the High Evaluation Risk or Partial Framework bands, prioritise the items flagged High Risk: these represent the most common sources of material overspend and are addressable within a single procurement or FinOps cycle.

Redress Compliance works exclusively on the buyer side, with no vendor affiliations. Our GenAI advisory practice has benchmarked AI costs, negotiated enterprise AI contracts, and built governance frameworks across 500+ enterprise engagements. Contact us for a confidential review of your AI cost and contract position.