SQE · SQE1 · Legal Education · Solicitors · SRA · Prep4SQE

The £300 Million SQE Prep Industry Is Failing Students

Charles Peter · 14 April 2026

I've spent months looking at the numbers behind the Solicitors Qualifying Examination preparation market. What I found isn't just disappointing—it's a systematic failure hidden behind glossy marketing and institutional prestige.

In July 2025, only 41% of candidates passed the SQE1 exam.

That's a 59% failure rate. Students paid between £5,000 and £10,000 for preparation courses. They also paid the SRA £1,934 in exam fees. Many of them failed anyway.

The economics don't add up. Roughly £300 million flows through this market annually. Students invest everything—financially, emotionally, professionally. The return on that investment? A coin flip with worse odds.

What Students Actually Get for Their Money

The disconnect starts with what providers are selling versus what the exam actually tests.

Providers sell knowledge. The exam tests judgment.

Students receive extensive textbooks, video lectures, and notes focused on memorising legal principles. Contract formation. Tort liability. The foundational rules of law.

The SQE1 asks for the "single best answer" among several legally plausible options. It's not a memory test. It's a test of nuanced application under pressure.

Candidates report feeling blindsided by the exam's ambiguity. Their prep materials covered the topics but not the depth or style of actual questions. They learned the rules but not the specific logic required to differentiate between a "correct" answer and the "best" answer in 1 minute and 40 seconds.

The problem compounds because providers are guessing.

The SRA doesn't release past papers. Providers hire recent exam sitters to reconstruct questions from memory—a practice that produces wildly inconsistent results.

Students report that provider mock exams differ significantly from the real assessment. Some are too easy, creating false confidence. Others focus on the wrong details entirely.

Research shows students typically score 12% lower on the real exam than on official SRA sample questions. That gap represents the calibration failure at the heart of this industry.

The Knowledge Gaps Nobody Talks About

I've reviewed feedback from hundreds of students who paid premium prices for courses that systematically under-delivered on core content.

The gaps aren't about missing entire legal branches. They're about highly specific sub-topics and niche procedural rules that providers either skip for brevity or fail to explain properly.

Professional Conduct and Ethics is the most frequently cited gap. Candidates encounter complex scenarios involving anti-money-laundering nuances and conflict-of-interest situations that never appeared in their manuals.

BPP students have noted that ethics materials were "woefully short." Several exam questions covered scenarios simply not found in their £10,000 course.

Tax and Solicitors' Accounts receive high-level summaries rather than the deep, worked examples the exam demands. Students skip these topics because their course made them seem minor, only to find they account for a significant share of the marks.

Underlying Law gets systematically overlooked. Large providers like BPP focus heavily on Practice modules, assuming students already know Contract, Tort, and Land Law from their degrees.

Students report that BPP "does not cover underlying law whatsoever" in certain tracks. These topics appear just as frequently in SQE1 as the new practice material.

Up to 10% of exam questions contain material not explicitly covered in prep course manuals.

That's not a calibration issue. That's a fundamental failure to deliver what was promised.

The Business Model That Protects Failure

In any other industry, a 41% success rate for a premium product would trigger mass consumer revolt.

Legal education is different. Providers hold all the cards because of how the path to qualification is structured.

The regulatory shield works like this: Providers justify premium prices by pointing at the SRA's secrecy. Because the SRA refuses to release past papers, providers claim they're performing a high-value "detective" service. They sell reconstructed intelligence back to students at a markup.

The biggest providers don't actually sell primarily to individuals. They sell to law firms.

Firms pay for exclusive cohorts for their trainees. For a firm, £10,000 per student is a rounding error compared to the risk of a trainee failing and delaying their start date. Firms stick with big providers because they offer administrative ease and bespoke reporting—not because the pass rates are highest.

This creates a prestige pricing floor. If firms pay £12,000, individual students feel they must pay at least £6,000 to get anything "reputable."

The refund trap is elegant in its cruelty.

Most providers use a digital access model. The moment you log in to the portal or download the first PDF, you've "consumed" the intellectual property. Your right to a refund evaporates, even if the content proves inadequate.

Instead of refunds, providers offer "Pass Pro" guarantees: a free resit if you fail. This costs providers almost nothing beyond keeping a digital account active, and it means they never return any cash.

When students fail and complain, the provider's defense is always the same: "We gave you 3,000 pages of notes and 2,000 practice questions. Did you do all of them?"

Because the syllabus is vast, students can rarely say yes. The blame shifts from quality of materials to volume of effort.

Not sure where your SQE1 logic gaps are? Use the Prep4SQE AI Diagnostic Tool to map your weaknesses across Wills, Land, and Property Law in under 2 minutes. Try the Diagnostic Tool for free →

The Regulatory Failure Enabling This System

The Legal Services Board gave the SRA a red rating for operational delivery.

The specific failure? The SRA pledged to publish provider-level pass rate data in late 2023. They missed that deadline. Then they missed the autumn 2025 deadline set by the LSB.

The SRA now expects to publish "contextualized" data and a new SQE Course Comparison Tool sometime in 2026.

The delay creates a vacuum.

Without official, comparative data, students can't verify if a provider's "90% pass rate" claim is based on their entire student body or just a small, elite group of apprentices.

The SRA's justifications for the delay reveal the problem. They cite "data collection flaws" and the need for "contextualization." They argue that publishing raw data might not "support the development of a healthy market."

The regulator is admitting they're withholding consumer protection data to protect provider business models.

This isn't oversight. It's enablement.

The Apprentice Gap Exposes the Fundamental Flaw

Solicitor apprentices achieve a 71% pass rate for SQE1.

The general cohort achieves 53%.

That's an 18-point gap. It's the smoking gun that proves the traditional prep course model is fundamentally misaligned with how the exam tests competence.

The difference isn't just student quality.

Yes, firms use rigorous screening. But the real advantage comes from how apprentices encode legal knowledge.

The £10,000 course student learns Business Law and Practice through a 400-page manual. They memorize the procedure for a board meeting in isolation.

The apprentice has sat in on three real board meetings, drafted the minutes, and filed the forms at Companies House before they open the textbook.

When the exam asks about a procedural error, the apprentice doesn't recall a list. They recognize a situation. This context reduces cognitive load during the assessment.

Apprentices also avoid the volume overload trap. Instead of a six-month cram, their learning spreads over 30 months. They study in 20-minute windows between real work, forcing content to be punchy and exam-focused rather than academic.

The psychological advantage is massive.

Apprentices have zero debt and a guaranteed job. Individual students study under the crushing pressure of a £15,000 to £20,000 financial bet.

High-cortisol environments impair the complex reasoning required for single best answer questions. The apprentice's stability allows for better performance.

The data suggests the commercial model treats the SQE like an academic law degree. The apprenticeship model treats it like a professional qualification.

One approach works. The other generates £300 million in revenue while producing a 59% failure rate.

What an Honest Provider Would Look Like

An outcome-focused provider would have to dismantle the current business model entirely.

The curriculum would be a living document. After each exam sitting, it would update to include the niche topics students reported missing. If it's not in the SRA's Assessment Specification, it's not in the materials.

The mock system would be brutally transparent. Every practice question would be tagged with its success rate among students. If 90% get a question right, it's too easy and gets removed.

The provider would publish its conversion rate—the actual difference between what students score on mocks versus the real exam. An honest provider would say: "Our mocks are 10% harder than the real thing. If you hit 60% here, you're safe."

Lectures would be replaced with decision training. The focus would shift from explaining why the right answer is right to analyzing why the distractors are wrong. That's where most students fail.

Content would be delivered in 15-minute bursts via an app, mimicking the rapid-fire decision-making required in the exam.

Pricing would be tied to outcomes. A base fee covering technology and materials, plus a success fee payable only if the student passes. If a student completes 90% of the course and fails, they get a 50% cash refund—not just a free resit.

This forces the provider to care about student success as much as their own profit margin.

An honest provider would look less like a law school and more like a training laboratory. Smaller, leaner, obsessed with the gap between mock results and SRA reports.

Why This Won't Change

Market forces alone can't fix this.

The business model is too profitable. The regulatory shield is too strong. The captured market is too reliable.

Students have no leverage. They need the qualification. The SQE is the only path. As long as that remains true, most candidates will continue paying the entry tax.

Applications for SQE courses rose by over 50% in the 2024/25 cycle. Demand is increasing despite the failure rates.

The SRA's delayed transparency might shift spending toward providers with better track records. But it won't fundamentally change the incentive structure.

The legal education market will evolve, not collapse.

High-volume providers with poor outcomes will face pressure. Boutique providers with better results will gain market share. Technology platforms offering quality question banks at lower prices will force consolidation.

But the fundamental dynamic remains. Students will continue accepting inadequate preparation as the cost of qualification because they have no alternative.

I doubt it will ever truly change.

The system works perfectly—just not for the students paying for it.

The SQE isn't an exam you can out-study; it’s an exam you must out-logic.


Want to see your own logic gaps?

Take our 2-minute AI Diagnostic. Get a personalised score, identify your Wills Wall and other niche traps, and start studying smarter.

Free · No sign-up required · Personalised results in 2 minutes