The Legal AI Gap: Why 2026 Is the Tipping Point
By Toni Dos Santos, Co-Founder, Spicy Advisory
The legal profession has an AI problem — and it's not the one you think. The problem isn't that lawyers aren't using AI. It's that they're using it without training, without governance, and without their firms knowing about it. A 2025 Law Society survey found that 78% of UK law firms have launched AI pilots, but fewer than 20% have structured training programmes. Meanwhile, 62% of junior lawyers report using AI tools for legal work at least weekly — most without formal guidance from their firm.
The UK legal sector is at an inflection point. Three forces are converging to make AI training non-optional for every law firm and in-house legal team.
Regulatory Expectations Are Hardening
The Solicitors Regulation Authority has been clear: technology competence is a professional obligation. The SRA's 2024 guidance on AI use in legal practice states that solicitors must understand the capabilities and limitations of AI tools they use, and must not delegate legal judgement to AI without appropriate oversight. The Bar Standards Board has issued similar guidance for barristers.
This isn't aspirational — it's enforceable. The SRA has the power to discipline solicitors who provide inadequate service due to inappropriate AI use. The Legal Services Board, which oversees all legal regulators, published a cross-sector statement in 2025 emphasising that all regulated legal professionals must maintain competence in the technology tools they use.
Clients Are Asking — and Expecting Lower Fees
Client pressure is intensifying. According to a 2025 Thomson Reuters survey, 71% of UK corporate counsel now ask their external law firms about AI capabilities during panel reviews. Clients aren't just curious — they're expecting AI to drive efficiency, and they're expecting that efficiency to be reflected in fees.
Firms that can't articulate their AI strategy, demonstrate how AI improves their service delivery, or show that their lawyers are trained to use AI effectively are losing pitches to firms that can. The competitive pressure is real and immediate.
Shadow AI Is Creating Professional Liability Risk
Here's what keeps managing partners up at night: associates and paralegals using consumer AI tools — ChatGPT, Claude, Gemini — for legal work without the firm's knowledge or approval. This creates multiple risks:
- Confidentiality breaches: Client data entered into consumer AI tools may be used for model training, violating confidentiality obligations and potentially waiving legal privilege
- Accuracy failures: AI hallucinations in legal research or drafting could lead to incorrect advice, missed deadlines, or flawed contract clauses
- Professional indemnity exposure: If AI-generated errors lead to client losses, insurers may dispute, restrict, or exclude cover where the firm lacked appropriate AI governance
- Regulatory sanctions: The SRA could take disciplinary action if inadequate AI governance leads to service failures
The only sustainable response is structured training that gives lawyers the skills to use AI effectively and the judgement to use it safely.
What Lawyers Actually Need to Learn
Generic AI training doesn't work for lawyers. The legal profession has unique requirements around confidentiality, professional ethics, evidential standards, and the nature of legal reasoning. Here's what each role needs:
Associates: The Frontline of Legal AI
Associates are the primary AI users in most firms, and they need practical skills:
- AI-assisted legal research: How to use AI to accelerate case law research, statute analysis, and regulatory monitoring — while verifying outputs against primary sources
- Drafting augmentation: Using AI to generate first drafts of contracts, letters, and memoranda — and the critical review skills to catch errors, inconsistencies, and hallucinated clauses
- Due diligence acceleration: AI-powered document review for M&A transactions, regulatory investigations, and disclosure exercises
- Hallucination detection: The specific skill of identifying when AI has fabricated case references, misquoted statutes, or invented legal principles. This is the single most important AI skill for any lawyer
- Prompt engineering for legal work: How to structure prompts that produce useful legal outputs — including jurisdiction specification, precedent constraints, and style matching
Partners: Strategy and Client Communication
Partners need a different skill set focused on strategy and client relationships:
- Client-facing AI strategy: How to communicate the firm's AI capabilities to clients — what AI does and doesn't do, how it improves service quality, and how it affects pricing
- Pricing implications: Understanding how AI efficiency affects the billable hour model and how to transition to value-based pricing where appropriate
- Supervision responsibilities: How to effectively supervise AI-augmented work product — what to check, what to trust, and how to maintain quality standards
- Business development: Using AI insights to identify cross-selling opportunities, anticipate client needs, and develop thought leadership
Legal Ops and IT: The Infrastructure Layer
Legal operations teams are responsible for the tools and workflows that enable AI adoption:
- Tool evaluation and procurement — assessing AI tools against security, confidentiality, and regulatory requirements
- Workflow design — integrating AI into existing practice management systems and matter workflows
- Data governance — ensuring training data, prompts, and outputs are handled in compliance with data protection and confidentiality requirements
- Usage monitoring — tracking AI adoption, identifying shadow AI use, and measuring efficiency gains
Paralegals: High-Volume AI Applications
Paralegals often gain the most from AI training because their work involves high-volume, pattern-based tasks:
- Document review and classification using AI-powered platforms
- Contract analysis — extracting key terms, identifying risk clauses, comparing against templates
- Bundling and disclosure automation
- Legal research support — preliminary case law searches, regulatory monitoring, precedent tracking
In-House Counsel: The Internal AI Governance Role
In-house legal teams have a dual role — they use AI for their own work and they govern AI use across the organisation:
- Reviewing and negotiating AI vendor contracts — data processing agreements, liability allocation, IP ownership
- Developing internal AI usage policies — what employees can and can't do with AI tools
- Advising the business on AI regulatory compliance — UK GDPR, sector-specific requirements, and (where relevant) EU AI Act obligations
- Managing AI-related disputes and liability
The Spicy Legal AI Training Pathway
Our three-phase programme takes legal teams from baseline literacy through tool proficiency to workflow transformation. Each phase builds on the previous one, and each includes assessment to ensure competency rather than just attendance.
Phase 1: AI Literacy (All Legal Staff, 3 Hours)
The foundation. Every lawyer, paralegal, and legal support professional needs this baseline:
- What AI can and can't do: Capabilities and limitations in a legal context — pattern matching vs. legal reasoning, statistical correlation vs. causal analysis
- Hallucination awareness: Practical exercises in identifying fabricated case citations, incorrect statute references, and misrepresented legal principles. We use real examples from reported cases where AI-generated legal submissions contained fabricated references
- Confidentiality protocols: What data can enter which AI tools, the distinction between enterprise and consumer AI services, and the firm's approved tool list
- Ethical obligations: SRA, BSB, and Law Society guidance on AI use. Professional responsibility for AI-generated work product
Phase 2: Tool Proficiency (Role-Specific, 1 Day)
Hands-on training with the firm's approved AI tools, tailored to each role:
- Prompt engineering for legal work: Structuring prompts that specify jurisdiction, cite format, precedent constraints, and desired output structure
- Output verification workflows: Systematic approaches to checking AI-generated work — cross-referencing citations, validating reasoning chains, checking for internal consistency
- Tool-specific training: Deep dives into the firm's selected AI platforms — whether that's Microsoft Copilot, Harvey, CoCounsel, Luminance, or other legal-specific AI tools
- Practical exercises: Real-world scenarios — draft a contract clause using AI, research a legal issue using AI, review a disclosure set using AI — with expert feedback on technique and output quality
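To make the prompt-engineering point concrete, here is an illustrative sketch of what a structured legal research prompt might look like. The field names, constraints, and example values are hypothetical illustrations, not a firm standard or Spicy Advisory template — the principle is simply that jurisdiction, citation format, and verification instructions are stated explicitly rather than left to the model.

```python
# Illustrative sketch of a structured legal-research prompt template.
# Field names and example values are hypothetical, not a firm standard.

LEGAL_RESEARCH_PROMPT = """\
Role: You are assisting a qualified solicitor with preliminary research.
Jurisdiction: {jurisdiction}
Task: {task}
Constraints:
- Cite only real, reported cases; give full citations in {citation_style} format.
- State explicitly if you are uncertain whether an authority exists.
- Do not provide legal advice; summarise authorities for solicitor review.
Output: a numbered list of authorities, each with a one-sentence relevance note.
"""

def build_prompt(jurisdiction: str, task: str, citation_style: str = "OSCOLA") -> str:
    """Fill the template; every output must still be verified against primary sources."""
    return LEGAL_RESEARCH_PROMPT.format(
        jurisdiction=jurisdiction, task=task, citation_style=citation_style
    )

prompt = build_prompt(
    jurisdiction="England and Wales",
    task="Summarise the leading authorities on penalty clauses in commercial contracts.",
)
print(prompt)
```

However the template is structured, the non-negotiable step is the same one taught in Phase 1: every citation the model returns is checked against primary sources before it goes anywhere near a client.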
Phase 3: Workflow Transformation (Team-Level, Ongoing)
The final phase embeds AI into daily practice:
- Matter workflow redesign: Identifying which steps in each matter type can be AI-augmented, which require human-only work, and how to structure handoffs
- Efficiency measurement: Tracking time savings per matter type, quality improvements, and client satisfaction changes
- Client communication: How to inform clients about AI use in their matters — transparency builds trust; secrecy destroys it
- Continuous learning: Monthly AI update sessions to cover new tools, new capabilities, and lessons learned from AI-augmented matters
The Billable Hour Question
Let's address the elephant in the room. If AI makes lawyers 30% more efficient, does that mean 30% less revenue under the billable hour model? This is the question that's making many firms hesitant about AI training — and it's the wrong question.
The firms that are successfully navigating this transition are doing three things:
- Shifting to value-based pricing for work where AI dramatically reduces time — fixed fees for contract reviews, capped fees for due diligence, success-based fees for litigation
- Increasing volume: AI efficiency allows lawyers to handle more matters, more clients, and more complex work. Revenue per lawyer can increase even if revenue per hour decreases
- Competing on quality: AI-trained lawyers produce better first drafts, more thorough research, and faster turnaround. Clients pay for quality, and they pay more for speed
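A quick back-of-envelope model makes the arithmetic concrete. The figures below are illustrative assumptions (1,500 billable hours a year at a £300 blended rate, half the practice being routine work), not data from any firm — the point is only that the outcome depends on what firms do with the freed hours:

```python
# Back-of-envelope model of the billable-hour question.
# All figures are illustrative assumptions, not real firm data.

hours_per_year = 1500
rate = 300  # £ per hour, blended
baseline_revenue = hours_per_year * rate  # £450,000 per lawyer

# Scenario 1: 30% efficiency gain, same workload, pure hourly billing —
# the feared outcome.
hourly_only = baseline_revenue * (1 - 0.30)

# Scenario 2: the freed 30% of hours are refilled with additional matters
# billed at the same rate — revenue recovers, with more clients served.
volume_recovered = hours_per_year * rate

# Scenario 3: routine work (assumed half the practice) moves to fixed fees
# priced at 85% of the old hourly cost but delivered in 70% of the time;
# the freed hours go to new, hourly-billed matters.
routine_share = 0.5
routine_revenue = baseline_revenue * routine_share * 0.85
freed_hours = hours_per_year * routine_share * 0.30
complex_revenue = baseline_revenue * (1 - routine_share) + freed_hours * rate
blended = routine_revenue + complex_revenue

print(f"Hourly only:   £{hourly_only:,.0f}")
print(f"Volume model:  £{volume_recovered:,.0f}")
print(f"Blended model: £{blended:,.0f}")
```

Under these assumptions the blended model ends up ahead of the baseline: the fixed-fee discount is more than offset by redeploying the freed hours, which is the pattern the successful firms describe.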
The Magic Circle firms — which have invested most heavily in AI training — report that AI has increased revenue per lawyer, not decreased it, by enabling their lawyers to take on higher-value work and serve more clients. Mid-tier firms that delay AI training risk being squeezed from both directions: Magic Circle firms taking high-value work more efficiently, and AI-enabled alternative legal service providers taking routine work at lower cost.
Ready to close the AI skills gap in your legal team? Spicy Advisory delivers AI training programmes specifically designed for UK law firms and in-house legal teams. Our Spicy Legal AI Training Pathway covers everything from hallucination awareness to workflow transformation — with practical exercises using real legal scenarios. Book a discovery call.
Frequently Asked Questions
Does the SRA require law firms to train staff on AI?
The SRA does not mandate a specific AI training programme, but it does require solicitors to maintain competence in the tools they use for legal work. The SRA's 2024 guidance on AI in legal practice makes clear that technology competence is an aspect of professional competence — solicitors who use AI tools without understanding their capabilities and limitations risk providing inadequate service. The SRA has the power to take disciplinary action where inadequate AI governance leads to service failures. In practical terms, any firm whose lawyers use AI tools should have structured training to demonstrate compliance with competence obligations.
Can solicitors use AI for legal research and drafting?
Yes, but with important caveats. The SRA and Law Society guidance permits the use of AI as an assistive tool for legal research and drafting, provided that solicitors exercise independent professional judgement over all AI-generated outputs. This means AI can generate first drafts, identify relevant case law, and suggest analytical frameworks — but the solicitor remains personally responsible for the accuracy, completeness, and appropriateness of the final work product. Solicitors must verify AI-generated case citations against primary sources, check the accuracy of legal analysis, and ensure that AI outputs are appropriate for the specific matter and jurisdiction. The key principle is that AI augments but does not replace professional legal judgement.
What are the professional indemnity insurance implications of using AI?
This is an evolving area that every firm should discuss with their PI insurer. Most UK PI insurance policies currently cover claims arising from AI-related errors, provided the firm can demonstrate reasonable governance and supervision of AI use. However, insurers are increasingly scrutinising firms' AI governance frameworks during renewal. Firms without structured AI training, approved tool lists, and documented governance processes may face higher premiums or coverage exclusions. The Law Society's practice note on AI recommends that firms notify their insurer of their AI usage, ensure their AI governance framework meets the insurer's expectations, and keep records of AI-related quality control processes.
How do Magic Circle firms approach AI training differently?
Magic Circle firms have invested significantly more in AI training than the broader UK legal market. Key differentiators include: dedicated AI training teams (typically 3-5 people within the innovation or knowledge management function), mandatory AI literacy training for all lawyers (not just volunteers), firm-specific AI tools with bespoke training programmes (Harvey, CoCounsel, and custom-built solutions rather than generic AI tools), integration of AI training into the trainee solicitor programme from day one, and ongoing measurement of AI adoption and efficiency gains at the matter level. These firms typically invest £2,000-£5,000 per lawyer annually in AI training and development — compared to £200-£500 at most mid-tier firms. The result is measurably higher AI adoption rates, better quality AI-augmented work product, and stronger competitive positioning with clients.