How do partners decide which AI projects to fund?
Partners decide to fund AI projects based on three factors: clear financial return within one quarter, minimal disruption to current workflows, and an internal champion who will own the outcome. The decision is not about technology. It is about business risk, and the proposal that wins is the one with specific numbers on a specific workflow, a credible payback period, and a clear plan for what happens if it does not work.
Short answer: Partners fund AI with clear ROI in one quarter, minimal disruption, and an internal champion. Specific numbers on specific workflows win. Vague transformation pitches lose.
How partnership decisions actually work
Understanding the decision mechanics matters because they explain why good AI projects get rejected and mediocre ones get approved.
In most mid-market professional services firms, AI investment decisions follow one of three paths:
Path 1: Managing partner discretion. For investments under £10,000 to £15,000, the managing partner can typically approve without a partnership vote. This is why AI audits (£3,500 to £5,000) are a common starting point. They fall below the threshold that triggers a formal decision process.
Path 2: Management board or technology committee. For investments of £15,000 to £50,000, most firms route through a management board or, if one exists, a technology committee. These bodies meet monthly or quarterly, which creates timing constraints. Miss the meeting window and your proposal waits another month.
Path 3: Full partnership vote. For investments above £50,000 or those requiring firm-wide change, a partnership vote is usually required. This is the highest bar and the slowest process. It is also where most ambitious AI proposals go to die.
The practical implication: structure your AI investment to start below the partnership vote threshold. A £25,000 pilot approved by the managing partner or management board can prove the concept and build evidence for a larger partnership-approved programme later.
What partners actually care about
We have sat in dozens of partnership meetings and proposal reviews. The concerns are remarkably consistent across firms and jurisdictions.
Financial return and payback speed
Partners are not irrational about AI investment. They apply the same commercial logic they use for any business decision. The questions are: how much, what return, and how fast?
The threshold varies by firm, but a useful benchmark is 200 percent ROI within 12 months with break-even within 3 to 6 months. Our Calder and Reid case study (a £25,000 investment returning £78,000 per year) exceeds this comfortably. That is the kind of proposition that gets funded.
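The benchmark is simple arithmetic, and it is worth sanity-checking any proposal against it before it reaches a partner. A minimal sketch using the Calder and Reid figures quoted above (the function name and ROI definition, net gain over cost, are ours):

```python
def roi_and_payback(build_cost, annual_saving):
    """Return first-year net ROI (%) and break-even point in months."""
    roi_pct = (annual_saving - build_cost) / build_cost * 100
    breakeven_months = build_cost / (annual_saving / 12)
    return roi_pct, breakeven_months

# Calder and Reid example: £25,000 build, £78,000 per year saved
roi, months = roi_and_payback(25_000, 78_000)
print(f"ROI: {roi:.0f}% first year, break-even in {months:.1f} months")
# → ROI: 212% first year, break-even in 3.8 months
```

Both figures clear the stated thresholds (200 percent ROI, break-even inside 3 to 6 months), which is why that case reads as fundable on the numbers alone.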
What does not work: “AI will transform our firm over the next three years.” Partners discount long-term projections heavily because they have been burned by technology promises before. CRM systems that were going to revolutionise client development. Case management migrations that were going to save hundreds of hours. Document management overhauls that took twice as long and cost three times the estimate.
AI proposals succeed when they respect this history and offer something different: a specific return on a specific investment within a specific timeframe.
Disruption and risk
Every partner who has lived through a failed technology project asks: what goes wrong? The answer matters more than the best-case scenario.
Effective risk management in an AI proposal addresses:
Implementation risk. What if the build takes longer than planned? Answer: scope is fixed, and overruns are the consultancy’s problem, not yours. At Formulaic, our build contracts cap timeline and cost.
Adoption risk. What if staff do not use it? Answer: the system is designed around the existing workflow, not a new one. Staff do not need to change how they work, just where one step happens.
Quality risk. What if the AI gets things wrong? Answer: every output is reviewed by a qualified professional. The AI handles the first 80 percent. The human handles quality assurance. Error rates are tracked and reported.
Regulatory risk. What if the SRA or a state bar objects? Answer: the system complies with current guidance, uses enterprise-grade data protection, and maintains full audit trails.
Exit risk. What if we want to stop? Answer: you can switch off the system at any time. There is no lock-in beyond the initial build cost. Data remains yours.
The champion question
Every successful AI project we have delivered had an internal champion: a partner or senior associate who personally owned the outcome. Every failed proposal we have seen lacked one.
The champion does three things: they push the proposal through the decision process, they drive adoption within their team, and they provide honest feedback during implementation that makes the system better.
Without a champion, even a well-funded project drifts. Nobody chases the consultancy on timelines. Nobody addresses staff reluctance. Nobody escalates issues before they become failures.
If you cannot identify a champion, you are not ready to invest. Find the champion first, then build the proposal around their practice area.
The business case that works
Here is the structure that consistently gets approved in partnership settings.
One page summary
Partners read one page. If the summary does not convince them, the appendices are irrelevant.
The problem: “Our employment team receives 200 enquiries per month. Each requires 15 to 20 minutes of solicitor time for initial triage. That is 50 to 67 hours per month, costing £42,000 to £56,000 per year in solicitor time. Only 30 percent of enquiries become paying clients.”
The solution: “An AI intake system qualifies enquiries automatically, collecting structured information and routing qualified leads to solicitors. Similar systems reduce solicitor triage time by 60 to 80 percent.”
The investment: “£25,000 build cost. £4,000 per year running cost.”
The return: “£25,000 to £45,000 per year in solicitor time saved. Additional value from faster client response (improving conversion from 30 to 40+ percent). Break-even: 3 to 4 months.”
The champion: “Sarah Jenkins, head of employment, will lead adoption.”
The risk: “If the system does not deliver within 8 weeks, we stop. Maximum downside: £25,000 sunk cost. If it works, we consider expanding to family law and conveyancing.”
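The figures in this one-page summary follow directly from the stated inputs. A minimal sketch that reproduces them, assuming an implied solicitor cost of roughly £70 per hour (the rate at which 50 to 67 hours per month comes to £42,000 to £56,000 per year); note the sketch covers only the time-saving component, since the 3 to 4 month break-even also counts the conversion uplift:

```python
# Inputs taken from the example business case above
enquiries_per_month = 200
minutes_per_enquiry = (15, 20)   # triage time per enquiry, low/high
hourly_cost = 70                 # implied solicitor cost in £/hour (assumption)
reduction = (0.60, 0.80)         # triage time saved by the AI system, low/high
build_cost = 25_000

# Current cost of manual triage
hours_per_month = tuple(enquiries_per_month * m / 60 for m in minutes_per_enquiry)
annual_cost = tuple(h * 12 * hourly_cost for h in hours_per_month)

# Projected saving: low reduction on the low cost, high reduction on the high cost
annual_saving = (annual_cost[0] * reduction[0], annual_cost[1] * reduction[1])

print(f"Annual triage cost: £{annual_cost[0]:,.0f} to £{annual_cost[1]:,.0f}")
print(f"Annual saving: £{annual_saving[0]:,.0f} to £{annual_saving[1]:,.0f}")
# → Annual triage cost: £42,000 to £56,000
# → Annual saving: £25,200 to £44,800
```

A business case that lets partners recompute every number from two or three inputs is far harder to dismiss than one that asserts a bottom line.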
Supporting detail
For partners who want more depth, include a 3 to 5 page appendix with current workflow analysis, comparable case studies, technical overview (non-technical language), compliance assessment, and implementation timeline.
How to sequence AI investments for maximum partnership support
The mistake most firms make is proposing their most ambitious AI project first. The right strategy is sequential:
Phase 1: AI audit (£3,500 to £5,000). Below most approval thresholds. Identifies opportunities with data. Creates the evidence base for Phase 2. Takes 1 to 3 weeks.
Phase 2: Single-system pilot (£15,000 to £30,000). Targets the highest-ROI opportunity identified in the audit. One practice area, one workflow, clear metrics. Takes 4 to 10 weeks to build, 8 weeks to prove.
Phase 3: Expansion (£30,000 to £100,000+). Uses Phase 2 evidence to justify broader investment. By this point, the partnership has seen real results and the conversation shifts from “should we invest in AI?” to “which practice area is next?”
This sequence works because each phase de-risks the next. The audit justifies the pilot. The pilot justifies the expansion. Partners are not being asked to take a leap of faith. They are being asked to take a measured step based on evidence from the previous step.
Common objections and honest responses
“We tried technology projects before and they failed.” This is valid. Acknowledge it. Explain how AI projects differ: shorter timelines, measurable outcomes within weeks not months, and the ability to stop if results do not materialise. A 6-week pilot is not a 2-year ERP migration.
“Our clients would not want us using AI.” Some would not. Most do not care how you work, only that the work is good and timely. The firms that disclose AI use proactively report no client pushback and often receive positive feedback about efficiency.
“We should wait until the technology matures.” The technology is mature enough to deliver ROI now. Firms that deployed in 2024 and 2025 have 12 to 24 months of competitive advantage over those still waiting. The cost of waiting is measurable in lost efficiency.
“It is too expensive.” Compared to what? A £25,000 system that saves £78,000 per year is not expensive. It is the highest-return investment most firms can make. The expensive option is paying solicitors to do work that a machine can do faster and more consistently.
What we have learned from 30 deployments
The pattern is clear. Firms where a senior partner champions the first project see faster approval, higher adoption, and better outcomes. Firms where AI is a technology department initiative without partnership ownership struggle with funding, adoption, and sustainability.
The Meridian case is instructive. Their AI system generated over 1,000 times its cost in pipeline value. But it only got built because one partner was willing to stake their credibility on it. That partner’s early success changed the partnership’s entire attitude toward AI investment.
Start small. Start with a champion. Prove the numbers. Expand from evidence. That is how partners decide, and that is how you should structure your approach.
What is the typical approval process for AI investment in a law firm?
Most mid-market firms require a business case reviewed by the managing partner or a technology committee. Spend under £10,000 to £15,000 often needs only managing partner approval; £15,000 to £50,000 typically goes to a management board or technology committee; above £50,000 usually requires a partnership vote. The process takes 2 to 8 weeks.
What ROI threshold do partners look for in AI projects?
Partners typically want to see projected ROI of 200 percent or more within 12 months, with break-even within 3 to 6 months. The Calder and Reid example (a £25,000 investment returning £78,000 per year) exceeds most partnership thresholds comfortably.
How do you handle partner resistance to AI investment?
Start with the most receptive partner and the most obvious use case. A small pilot that proves value is more persuasive than any presentation. Once one practice area shows results, resistant partners face competitive pressure from within their own firm.
Should AI investment come from the technology budget or practice area budgets?
Practice area budgets where possible. When a practice area owns the investment, they own the outcome and are more invested in adoption. Central technology budgets work for firm-wide tools but create an accountability gap for practice-specific systems.
How do partners evaluate AI consultancy proposals?
Partners look at three things: specificity of the proposal (does it address our actual workflow?), credibility of the projected savings (are the numbers realistic?), and risk mitigation (what happens if it fails?). Vague proposals about AI transformation get rejected. Specific proposals with named workflows and defensible numbers get funded.
What kills AI proposals in partnership meetings?
Three things: lack of specific numbers, no identified champion to manage the project, and unclear risk management. “We should invest in AI” fails. “Our employment team should invest £25,000 in intake automation to save £78,000 per year, with Sarah leading adoption” succeeds.
How do firms prioritise between multiple AI opportunities?
Rank by ROI speed, not ROI size. A project that returns £30,000 in 3 months is better than one returning £100,000 in 18 months because it builds internal confidence and funds subsequent projects. Start with the quickest win.
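The ranking rule can be made concrete as return per month. A minimal sketch using the two example projects from this answer (the second project name is illustrative, not from a real engagement):

```python
# Rank opportunities by payback speed (return per month), not total return.
projects = [
    {"name": "intake automation", "return_gbp": 30_000, "months": 3},
    {"name": "document drafting", "return_gbp": 100_000, "months": 18},
]

ranked = sorted(projects, key=lambda p: p["return_gbp"] / p["months"], reverse=True)
# intake automation: £10,000/month; document drafting: ≈£5,600/month
print([p["name"] for p in ranked])
# → ['intake automation', 'document drafting']
```

The smaller project wins on this metric, which matches the logic above: the quick win funds and de-risks the larger one.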
Do partners need to understand AI technically to approve investment?
No. Partners need to understand the business case: cost, expected return, timeline, and risk. Technical details should be available for those who want them but should not dominate the proposal. The decision is a business decision, not a technology decision.
Founder, Formulaic. 12+ years building growth systems for professional services firms. Shipped 30 production AI systems across 6 clients.
Connect on LinkedIn →

Want personalised recommendations?
Take the AI Opportunity Scorecard for a benchmarked readiness score and three prioritised use cases specific to your firm. 3 minutes. Free.