How do you scope an AI project for a law firm?
You scope an AI project for a law firm by selecting one high-volume workflow, mapping it from end to end, defining measurable success criteria before any building begins, setting a fixed budget and timeline, and ensuring regulatory compliance requirements are embedded in the specification from day one. The scope is the single most important determinant of whether an AI project delivers ROI or becomes an expensive experiment. Most failed AI deployments in law firms fail at the scoping stage, not the building stage.
Short answer: Pick one workflow, map it end to end, define measurable success criteria, fix the budget and timeline, and build in regulatory compliance from the start. Scope determines ROI.
Why this question matters now
The wave of AI experimentation in law firms between 2023 and 2025 produced a pattern: firms that started with clear, narrow scope delivered production systems. Firms that started with broad ambitions delivered pilots that never reached production. The difference was not budget or technology. It was scope discipline.
In 2026, managing partners have moved past the question of whether to invest in AI. The question is how to invest wisely. Scoping is where wisdom meets execution. A £25,000 project with tight scope delivers a running system. A £100,000 project with loose scope delivers a report explaining why the system is not yet ready.
The regulatory environment adds urgency to scoping discipline. In the UK, the SRA expects firms to demonstrate oversight and understanding of AI systems they deploy. In the US, multiple state bar associations have issued ethics opinions requiring competence with AI tools and informed client consent for AI-assisted work. A project scoped without regulatory considerations may produce a technically functional system that the firm cannot use without compliance risk.
Getting the scope right is not a bureaucratic exercise. It is the most efficient thing a firm can do to protect its AI investment.
How do you select the right workflow to automate?
Not every workflow is a good candidate for AI. The best candidates share four characteristics:
High volume. The workflow runs frequently enough that time savings compound. A task performed 200 times per month delivers 10 times the savings of one performed 20 times per month, with similar build costs. Client intake, document generation, and email triage are high-volume candidates. Partner strategy sessions are not.
Repetitive structure. The steps are substantially the same each time. The inputs vary (different clients, different facts) but the process is consistent. If every instance of the task requires a novel approach, AI adds limited value. If 80% of instances follow the same pattern, AI handles the 80% and humans handle the 20%.
Structured data. The inputs and outputs are structured or can be structured without excessive effort. Form submissions, database records, document templates, and standardised emails are structured. Freeform partner meetings and complex negotiations are not.
Measurable outcomes. You can define what success looks like in numbers: time per task, error rate, conversion rate, cost per transaction. If you cannot measure the current state, you cannot measure improvement. And if you cannot measure improvement, you cannot demonstrate ROI.
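The compounding effect of volume can be made concrete with a quick calculation. This is an illustrative sketch: the 25-minute saving echoes the 35-to-10-minute intake example later in this article, but the £40/hour staff cost is an assumption, not a benchmark.

```python
# Illustrative arithmetic: time savings compound with volume.
MINUTES_SAVED_PER_RUN = 25      # e.g. a 35-minute task cut to 10 minutes
HOURLY_STAFF_COST = 40.0        # fully loaded cost in GBP/hour (assumption)

def annual_saving(runs_per_month: int) -> float:
    """Annual saving in GBP for a workflow run this many times per month."""
    hours_saved = runs_per_month * 12 * MINUTES_SAVED_PER_RUN / 60
    return hours_saved * HOURLY_STAFF_COST

print(annual_saving(200))   # high-volume workflow: £40,000/year
print(annual_saving(20))    # low-volume workflow, similar build cost: £4,000/year
```

Same build cost, ten times the return: this is why volume leads the selection criteria.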
The selection matrix:
| Workflow | Volume | Structure | Data quality | Measurability | Score |
|---|---|---|---|---|---|
| Client intake | High | High | Medium-High | High | Strong candidate |
| Document drafting | Medium-High | High | High | High | Strong candidate |
| Email triage | High | Medium | Medium | Medium | Good candidate |
| Legal research | Medium | Low | Medium | Low | Weak candidate |
| Client meetings | Low | Low | Low | Low | Not a candidate |
Rate each workflow on these four criteria. The workflows that score highest across all four are your scoping targets.
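The rating exercise can be reduced to a simple rubric. The numeric ratings and the decision to sum them below are assumptions for illustration; any consistent scheme works, provided it is applied to every candidate workflow.

```python
# A minimal sketch of the selection matrix as a scoring rubric.
# Numeric values per rating are an assumption, not a Formulaic standard.
RATING = {"Low": 1, "Medium": 2, "Medium-High": 2.5, "High": 3}

def score(volume, structure, data_quality, measurability):
    """Sum the four criterion ratings into a single comparable score."""
    return sum(RATING[r] for r in (volume, structure, data_quality, measurability))

workflows = {
    "Client intake":   ("High", "High", "Medium-High", "High"),
    "Legal research":  ("Medium", "Low", "Medium", "Low"),
    "Client meetings": ("Low", "Low", "Low", "Low"),
}

# Rank candidates, strongest first.
for name, ratings in sorted(workflows.items(), key=lambda kv: -score(*kv[1])):
    print(f"{name}: {score(*ratings)}")
```

The absolute numbers matter less than the ranking: the workflow that scores highest across all four criteria is the scoping target.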
What should the scope document include?
A good scope document is 3-5 pages. It should be specific enough that two people can read it independently and agree on what will be built. If ambiguity exists in the scope, it will appear in the build as misaligned expectations, change requests, and cost overruns.
Section 1: Workflow description
Document the current workflow in precise detail:
- What triggers the workflow (new enquiry, new instruction, deadline approaching)
- What steps are involved, in what order
- Who performs each step and how long it takes
- What systems are used at each step
- What data moves between steps
- Where errors occur and how often
- What the output is and who receives it
This is not a flowchart exercise. It is observation. Sit with the people who do the work and watch the process. The actual workflow often differs from the documented process.
Section 2: Requirements
Define what the AI system must do, stated as specific capabilities:
- “The system must extract client name, contact details, matter type, and urgency from web form submissions.”
- “The system must check the client against the firm’s conflict database and flag matches.”
- “The system must generate a templated acknowledgement email within 2 minutes of submission.”
- “The system must create a matter record in Clio/LEAP/Proclaim with all extracted data populated.”
Each requirement should be testable. If you cannot write a test for it, the requirement is too vague.
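One way to check testability is to literally write the test during scoping. A sketch for the first requirement above; `extract_intake_fields` is a hypothetical stand-in that the build would replace with the real extraction step:

```python
# Sketch: a scope requirement rewritten as an executable test.
# `extract_intake_fields` is a stub for illustration, not the real system.
REQUIRED_FIELDS = {"client_name", "contact_details", "matter_type", "urgency"}

def extract_intake_fields(form_submission: dict) -> dict:
    """Stub extraction step; the build replaces this with the AI system call."""
    return {k: form_submission.get(k) for k in REQUIRED_FIELDS}

def test_requirement_extraction():
    submission = {
        "client_name": "Acme Ltd",
        "contact_details": "ops@acme.example",
        "matter_type": "Commercial lease dispute",
        "urgency": "High",
    }
    extracted = extract_intake_fields(submission)
    missing = {k for k in REQUIRED_FIELDS if not extracted.get(k)}
    assert not missing, f"Requirement not met: missing {missing}"

test_requirement_extraction()
print("requirement is testable")
```

If a requirement cannot be expressed this way, it belongs back in the workflow-description section until it can.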
Section 3: Exclusions
Exclusions are as important as requirements. State what the project will not do:
- “The system will not handle complex multi-party matters. These will be flagged for manual processing.”
- “The system will not integrate with the firm’s billing platform in this phase.”
- “The system will not process enquiries in languages other than English.”
Exclusions prevent scope creep and set expectations. Every stakeholder should read and agree to the exclusions before the project starts.
Section 4: Success criteria
Measurable outcomes that define whether the project succeeded:
- “Reduce average intake processing time from 35 minutes to 10 minutes.”
- “Achieve 95% accuracy on data extraction within the first month.”
- “Process 80% of enquiries without human intervention.”
- “Reach payback (cumulative savings exceed build cost) within 16 weeks.”
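The payback criterion is simple arithmetic once the weekly saving is known. A sketch, with an assumed weekly saving of £1,600 chosen only to show the mechanics:

```python
# Illustrative payback check against the success criteria above.
# Build cost and weekly saving are assumptions for the example.
BUILD_COST = 25_000.0     # GBP, fixed at scoping
WEEKLY_SAVING = 1_600.0   # GBP saved per week once the system is live

def weeks_to_payback(build_cost: float, weekly_saving: float) -> int:
    """First week in which cumulative savings reach the build cost."""
    weeks, cumulative = 0, 0.0
    while cumulative < build_cost:
        weeks += 1
        cumulative += weekly_saving
    return weeks

print(weeks_to_payback(BUILD_COST, WEEKLY_SAVING))  # 16 weeks, on target
```

If the calculated payback exceeds the target, either the build cost or the scope is too large for the workflow being automated.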
Section 5: Budget and timeline
Fixed values, not ranges. “£25,000 build cost, 6-week delivery, £400/month running costs.” If a consultancy cannot give you a fixed price for a defined scope, either the scope is not defined well enough or the consultancy is not confident in their ability to deliver.
Section 6: Regulatory and compliance requirements
Specific regulatory obligations that the system must satisfy:
- Data protection requirements (UK GDPR, state data protection laws)
- Professional conduct rules (SRA, state bar)
- Client notification requirements (if applicable)
- Data residency requirements (where data is processed and stored)
- Supervision and oversight mechanisms (how humans review AI outputs)
- Audit trail requirements (what the system must log for compliance purposes)
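The audit-trail requirement is easiest to pin down by sketching the record the system must log. The field names below are illustrative assumptions, not a regulatory standard; the firm's compliance officer should confirm the final schema.

```python
# A minimal sketch of an audit-trail record for each AI-assisted action.
# Field names are illustrative, not mandated by the SRA or any state bar.
import json
from datetime import datetime, timezone
from typing import Optional

def audit_record(action: str, matter_id: str, model_version: str,
                 reviewed_by: Optional[str] = None) -> dict:
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,                # e.g. "generate_acknowledgement"
        "matter_id": matter_id,
        "model_version": model_version,  # which model produced the output
        "human_reviewer": reviewed_by,   # None until a supervisor signs off
    }

print(json.dumps(audit_record("generate_acknowledgement", "MAT-1042", "model-v3"), indent=2))
```

Logging the model version and the reviewing human on every record is what makes the supervision and oversight obligations demonstrable later.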
What are the most common scoping mistakes?
Scope too broad. “We want to automate our entire client journey.” That is five projects. Scope one: intake. Build it. Deploy it. Then scope the next one. Broad scope produces long timelines, budget overruns, and systems that are partially functional across many workflows rather than fully functional for one.
No baseline measurement. If you do not know how long intake takes today, you cannot measure whether AI improved it. Measure the current state before scoping the project. Spend 2-4 weeks collecting baseline data.
Regulatory afterthought. Compliance bolted on after the system is built is expensive and sometimes impossible. A data residency requirement discovered during testing can require re-architecting the entire system. Build compliance into the scope from the first page.
No exclusions. Without explicit exclusions, every stakeholder assumes their requirement is included. This leads to scope creep, missed deadlines, and budget overruns. The exclusions section is as important as the requirements section.
Technology-first scoping. “We want to use GPT-4 for our firm.” That is a technology choice, not a scope. Start with the workflow and the business problem. The technology selection follows from the requirements, not the other way around.
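The baseline measurement mentioned above need not be elaborate: a few weeks of observed timings and two summary statistics are enough to anchor the success criteria. The timings below are illustrative samples, not client data.

```python
# Sketch: 2-4 weeks of observed intake timings become the baseline.
from statistics import mean, median

intake_minutes = [32, 41, 28, 35, 39, 30, 44, 33]  # illustrative observations

print(f"mean:   {mean(intake_minutes):.2f} min")
print(f"median: {median(intake_minutes):.1f} min")
```

A baseline like this is what turns "reduce intake processing time" from an aspiration into a testable success criterion.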
What we’ve seen at Formulaic
The strongest predictor of AI project success in our experience is scope quality. Projects with clear, narrow scope and measurable success criteria have a 100% deployment rate across our portfolio. Projects that started with broad ambitions and were not narrowed during scoping have required mid-project rescoping in every case.
When we scope a project, the document is typically 4 pages: workflow description, 8-12 specific requirements, 5-8 exclusions, 4-5 success criteria, fixed budget, and regulatory notes. We produce this collaboratively with the firm over 1-2 weeks, including workflow observation sessions with the staff who do the work.
One project scoped this way, an intake system for a commercial litigation practice, delivered in 5 weeks at the scoped budget of £28,000. It met all 5 success criteria within the first month. The firm has since scoped and deployed two additional systems using the same methodology. Each subsequent scoping exercise was faster because the firm understood the process and the level of specificity required.
The lesson from 30 production deployments: spend more time on scope and less time on everything else. A well-scoped project practically builds itself. A poorly scoped project fights you at every stage.
How long should scoping take for an AI project?
1 to 2 weeks for a focused single-workflow project. If scoping takes longer than 3 weeks, the project is either too broad or the firm has not made a clear decision about what to build. Extended scoping is often a sign of scope creep or unclear sponsorship.
Should a law firm scope AI projects internally or hire a consultancy?
Hire a consultancy for the first project. They bring pattern recognition from previous deployments and can identify pitfalls you will not see. For subsequent projects, the internal team can co-scope with the consultancy based on lessons from the first deployment.
What is the biggest scoping mistake law firms make?
Trying to automate too many workflows at once. A project that targets client intake, document drafting, and matter management simultaneously is three projects, not one. Each has different data requirements, integrations, and success criteria. Scope one workflow at a time.
How do you set a budget for an AI project?
Base it on the expected annual savings. If the target workflow costs £60,000 per year in staff time and AI can reduce that by 60%, the annual saving is £36,000. A build cost of £20,000 to £30,000 delivers payback within 12 months. Do not spend more than the first year's expected savings.
What regulatory requirements affect AI project scope?
In the UK: SRA rules on data protection, supervision of AI outputs, and client notification. In the US: state bar ethics opinions on AI use, ABA guidance on competence with technology, and state data protection laws. Build compliance into the scope from day one, not as an afterthought.
How detailed should the project scope document be?
Detailed enough that two people can read it and agree on what will be built, how success will be measured, and what is explicitly excluded. A good scope document is 3 to 5 pages: workflow description, requirements, success criteria, exclusions, budget, timeline, and regulatory notes.
What happens if the scope needs to change mid-project?
Evaluate the change request against the original success criteria and budget. If the change improves ROI without extending the timeline significantly, include it. If it expands the project beyond the original budget or delays delivery by more than 2 weeks, defer it to a follow-up project.
Should the AI project scope include staff training?
Yes. Training and change management are part of the deployment, not separate activities. The scope should include training sessions, documentation, and a 2 to 4 week supported transition period. Systems without training plans have lower adoption rates and lower ROI.
Founder, Formulaic. 12+ years building growth systems for professional services firms. Shipped 30 production AI systems across 6 clients.
Connect on LinkedIn →

Want personalised recommendations?
Take the AI Opportunity Scorecard for a benchmarked readiness score and three prioritised use cases specific to your firm. 3 minutes. Free.