
5 LCAP Mistakes That Get Plans Sent Back (And How to Fix Them)

By Nathan Critchett · March 5, 2024

It's a Thursday afternoon in late June. Dr. Morales pulls the envelope from her mailbox, already knowing what it is. The county office seal. She scans the letter and her stomach drops. Three sections flagged for revision. Two goals marked "insufficient." A request for "additional specificity" on expenditure justifications.

Her board presentation is in nine days.

She'll spend the next week pulling her assistant superintendent and two directors off their actual jobs to rework a document they'd already spent 300 hours building. The revision won't require new ideas. She knows exactly what the county wants. She just didn't have time to get it right on the first pass, because she was also running a school district.

This scene plays out across California every summer. After reviewing hundreds of LCAPs from districts of every size, we built software specifically to catch these issues before they reach a county reviewer's desk. Here are the five issues that come back most often.

1. Goals That Read Like Mission Statements

This is the single most common reason county reviewers push back. A district writes a goal that belongs on a lobby poster, not in a compliance document.

Here's what it looks like in the wild:

Before: "All students will have access to high-quality instruction that prepares them for college, career, and civic life in a supportive and inclusive learning environment."

That sentence could describe literally any school district in the state. It contains zero measurable claims. A county reviewer reads it and has one question: how would you know if you achieved this?

The fix is specific, measurable language tied to a data source your district actually tracks.

After: "Increase the percentage of students meeting or exceeding standards on the CAASPP ELA assessment from 38% to 45% by June 2026, with targeted growth of 8 percentage points for English learners and 6 percentage points for socioeconomically disadvantaged students."

The difference is obvious once you see it side by side. The first version describes a feeling. The second version describes a target you can point to on a dashboard in twelve months and say, "We hit it" or "We didn't."

County reviewers aren't hunting for perfect prose. They're hunting for accountability. They need to be able to verify whether you did what you said you'd do. A mission statement makes that impossible.

In Compliance Composer, every goal gets cross-checked against your district's actual data sources. If you write a goal that references a metric you don't collect, or set a target with no baseline, the system flags it before it leaves your screen.
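
If you're curious what that check looks like mechanically, here's a simplified sketch of the idea in Python. The field names and the metric list are made up for illustration; this is the shape of the logic, not our production code:

```python
# Illustrative sketch only -- not Compliance Composer's actual code.
# It enforces the two properties described above: the goal must name a
# metric the district actually tracks, and it must pair a baseline with
# a numeric target and a deadline.

TRACKED_METRICS = {"CAASPP ELA", "CAASPP Math", "Chronic Absenteeism"}  # hypothetical list

def validate_goal(goal: dict) -> list[str]:
    """Return a list of flags; an empty list means the goal passes."""
    flags = []
    if goal.get("metric") not in TRACKED_METRICS:
        flags.append(f"Metric '{goal.get('metric')}' is not a tracked data source.")
    if goal.get("baseline") is None:
        flags.append("Target has no baseline to measure growth against.")
    if goal.get("target") is None or goal.get("deadline") is None:
        flags.append("Goal is missing a numeric target or a deadline.")
    return flags

# The mission-statement version fails; the measurable version passes.
vague = {"metric": "high-quality instruction"}
measurable = {"metric": "CAASPP ELA", "baseline": 38, "target": 45, "deadline": "June 2026"}
print(validate_goal(vague))       # three flags
print(validate_goal(measurable))  # []
```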

2. Actions Disconnected From Metrics

This one is sneakier. The goals are fine. The actions are reasonable. But there's no visible thread connecting the two.

A district sets Goal 2 as improving math proficiency for unduplicated pupils. The action listed under that goal? "Provide ongoing professional development for certificated staff." The expenditure? $180,000 from Title II.

A county reviewer looks at that and asks: which staff? What kind of PD? How does this specific training connect to math proficiency for this specific student group? The logic chain is missing.

Before: "Action 2.3: Provide professional development opportunities for teachers to improve instructional practices. Funding: $180,000 (Title II)."

After: "Action 2.3: Train 42 elementary math teachers in Cognitively Guided Instruction (CGI) through a year-long coaching partnership with the county math office. CGI's emphasis on student reasoning has been shown to improve conceptual understanding in low-income populations (Carpenter et al., 2015). Progress will be measured by comparing CAASPP math scores for unduplicated pupils at CGI-trained sites against matched comparison schools. Funding: $180,000 (Title II, Object 5800)."

The first version spends $180,000 on something vague. The second version tells you who's getting trained, in what, why it should work for this population, and how you'll measure whether it did. That's the difference between an action and a wish.

We see this pattern constantly: districts allocate real money to real programs but describe them in language so generic that a reviewer can't tell if the spending is strategic or accidental. Compliance Composer pulls from your actual expenditure data and staffing records to draft action descriptions that are grounded in your district's specifics, including the object codes, the FTE allocations, and the research citations.
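
Here's the same logic chain expressed as a sketch. Everything in it is hypothetical (the field names, the goal-to-metric mapping), but it shows the three questions a reviewer asks: which metric, which people, which dollars:

```python
# Illustrative sketch with hypothetical field names and mapping -- not
# the product's code. Every action must cite a metric that belongs to
# its goal, name who is served or trained, and carry an object code.

GOAL_METRICS = {"Goal 2": {"CAASPP Math (unduplicated pupils)"}}  # hypothetical mapping

def check_action(action: dict) -> list[str]:
    flags = []
    if action.get("measured_by") not in GOAL_METRICS.get(action.get("goal"), set()):
        flags.append("Action does not cite a metric listed under its goal.")
    if not action.get("population"):
        flags.append("Action does not name who is served or trained.")
    if not action.get("object_code"):
        flags.append("Expenditure lacks an object code.")
    return flags

generic = {"goal": "Goal 2"}
specific = {"goal": "Goal 2", "measured_by": "CAASPP Math (unduplicated pupils)",
            "population": "42 elementary math teachers", "object_code": "5800"}
print(check_action(generic))   # three flags
print(check_action(specific))  # []
```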

3. Copy-Pasting Last Year's Plan With Updated Dates

Everyone does this. And everyone knows it's a problem.

The annual update section is supposed to demonstrate that your district learned something. That you looked at what worked, what didn't, and adjusted accordingly. When a county reviewer opens your annual update and sees the same narrative structure from 2023 with "2024" find-and-replaced in, they know exactly what happened. Nobody reflected. Nobody analyzed. Somebody just needed to get the document filed.

Here's the tell: look at your "Changes in Actions" column. If every entry reads something like "Action will be continued and modified as needed based on data analysis," you've got a copy-paste plan. That sentence says nothing. It's a placeholder dressed up as a response.

Before (Annual Update, Goal 1): "This action was partially implemented due to staffing challenges. The district will continue this action in the coming year with modifications as appropriate based on ongoing data analysis."

After: "Tutoring services under Action 1.4 reached 312 of the targeted 500 unduplicated pupils (62% participation rate). Exit survey data showed students who attended 10+ sessions gained an average of 14 scale score points on the interim CAASPP, compared to 3 points for non-participants. Low participation was driven by transportation barriers at three rural sites. For 2025-26, we're shifting $45,000 from after-school to embedded intervention blocks during the school day to remove the transportation barrier, targeting 90% participation."

The second version tells a story. Something happened. Here's what the data showed. Here's what we learned. Here's what we're changing and why. That's the entire point of the annual update section.

Compliance Composer keeps your previous year's plan as live context. When you start the annual update, it surfaces the gaps automatically: "You committed to serving 500 students, but your participation data shows 312. Here's a draft explanation with three possible modification strategies." You edit and refine from there.
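
A simplified sketch of that gap check, using made-up field names and a made-up tolerance, just to show the shape of the comparison:

```python
# Illustrative sketch, not the actual feature: field names and the
# tolerance are hypothetical. It compares last year's commitments
# against actuals and surfaces the gap before a reviewer finds it.

def surface_gaps(commitments: dict, actuals: dict, tolerance: float = 0.10) -> list[str]:
    """Flag any commitment met at less than (1 - tolerance) of its target."""
    gaps = []
    for item, target in commitments.items():
        actual = actuals.get(item, 0)
        if actual < target * (1 - tolerance):
            gaps.append(f"{item}: committed to {target}, reached {actual} "
                        f"({actual / target:.0%}). Draft an explanation and a modification.")
    return gaps

commitments = {"Action 1.4 tutoring participants": 500}
actuals = {"Action 1.4 tutoring participants": 312}
print(surface_gaps(commitments, actuals))
# ['Action 1.4 tutoring participants: committed to 500, reached 312 (62%). ...']
```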

4. Burying Unduplicated Pupil Data in Appendices

California's LCAP exists primarily to ensure that districts are directing resources toward the students who need them most: foster youth, English learners, and students from low-income households. These are your unduplicated pupils, and the increased/improved services section is where you prove that your spending on these students goes above and beyond what all students receive.

The mistake districts make is treating this section like a formality. They'll reference "see Appendix D for disaggregated data" and move on. Or they'll describe their unduplicated pupil services in broad strokes without showing the proportionality calculation that justifies the spending.

County reviewers zero in on this. It's the section where they spend the most time, because it's the section most likely to be underdeveloped.

Before: "The district will use supplemental and concentration grant funds to provide increased and improved services for unduplicated pupils, including additional counseling, intervention programs, and family engagement activities. See Appendix C for detailed expenditure breakdown."

After: "The district's unduplicated pupil count is 6,842 (72.3% of total enrollment). Supplemental and concentration grant funding totals $4.2M. Of this, $2.8M funds positions and programs that serve only unduplicated pupils (2 additional counselors at Title I sites, 4 bilingual family liaisons, targeted reading intervention for EL students scoring Below Standard). The remaining $1.4M funds districtwide improvements principally directed toward unduplicated pupils, including extended learning time at the 6 schools with UPP above 85%. The proportionality percentage is 8.1%, exceeding the required minimum of 6.9%."

The first version waves at the data from a distance. The second version walks the reviewer through the math, names the programs, and shows the proportionality calculation in the narrative itself. The reviewer doesn't have to hunt through an appendix to verify the claim.

Compliance Checker runs your unduplicated pupil data against your expenditure allocations and flags when your proportionality calculation is missing, when a service you've described as "increased" is actually districtwide, or when your supplemental/concentration spending doesn't match the numbers in your budget tables.
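
The arithmetic at the heart of that check is simple once the inputs sit in one place. Here's an illustrative sketch using the figures from the example above. The real LCFF calculation has more components, and the base funding figure here is assumed purely so the numbers match:

```python
# Illustrative sketch only. The official LCFF proportionality calculation
# has more moving parts; this shows its simplified shape. The base figure
# is assumed so the arithmetic matches the example above.

def proportionality_check(sc_funds: float, base_funds: float, required_pct: float) -> str:
    pct = sc_funds / base_funds * 100
    status = "meets" if pct >= required_pct else "FALLS SHORT OF"
    return f"Proportionality is {pct:.1f}%, which {status} the required {required_pct}% minimum."

sc_funds = 4_200_000      # supplemental + concentration grants, from the example
base_funds = 51_900_000   # assumed LCFF base, for illustration only
print(proportionality_check(sc_funds, base_funds, 6.9))
# Proportionality is 8.1%, which meets the required 6.9% minimum.
```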

5. Missing the "Why" Behind Expenditure Changes

This one trips up experienced districts, not just first-timers. Year over year, your spending changes. Programs get cut. New initiatives get funded. Budgets shift between goals. The LCAP requires you to explain those changes.

Most districts describe what changed. Almost none explain why.

Before: "Expenditures for Goal 3 decreased from $1.2M to $890,000 due to adjustments in program funding."

A county reviewer reads that and immediately wonders: which programs? Why the decrease? Did something fail? Did priorities shift? Was there a funding cut? "Adjustments in program funding" is the compliance equivalent of saying "things happened."

After: "Goal 3 expenditures decreased by $310,000 (from $1.2M to $890,000) for two reasons. First, the district's 3-year MTSS implementation grant from the county office expired ($180,000). The core MTSS coordinator position has been absorbed into the general fund under Goal 1, Action 1.2. Second, the after-school credit recovery program at Lincoln High ($130,000) was discontinued after two years of data showed minimal impact on graduation rates for the target population (unduplicated pupils graduating within 4 years increased by only 0.8 percentage points, below the 3-point target). Those funds have been redirected to a dual enrollment partnership with the local community college under Goal 3, Action 3.5, which early data from a pilot cohort suggests produces stronger outcomes for this group."

The second version explains the reasoning. A grant expired, so you moved the position. A program didn't work, so you pulled the funding and redirected it somewhere with better evidence. That's exactly what the state wants to see: a district that pays attention to results and makes changes based on what the data actually shows.

Compliance Composer tracks your expenditures across years. When it detects a significant change (anything above 10% in either direction), it prompts you with a draft explanation that references the specific funding sources, the outcome data, and the reallocation target. You're editing a first draft that already knows your numbers, instead of staring at a blank cell trying to remember why you moved $130,000 eighteen months ago.
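
The detection step itself is easy to picture. Here's an illustrative sketch of the 10% threshold using the figures from the example above; the structure and names are hypothetical:

```python
# Illustrative sketch: flag any goal whose year-over-year spending moved
# more than 10% in either direction so it gets a written rationale.
# The figures match the example above; the structure is hypothetical.

def flag_changes(prior: dict, current: dict, threshold: float = 0.10) -> list[str]:
    flags = []
    for goal, before in prior.items():
        after = current.get(goal, 0)
        change = (after - before) / before
        if abs(change) > threshold:
            flags.append(f"{goal}: {change:+.1%} (${before:,.0f} -> ${after:,.0f}). "
                         "Explain which programs changed and why.")
    return flags

prior = {"Goal 3": 1_200_000}
current = {"Goal 3": 890_000}
print(flag_changes(prior, current))
# ['Goal 3: -25.8% ($1,200,000 -> $890,000). Explain which programs changed and why.']
```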

The Pattern Behind the Pattern

Look at all five mistakes together. Vague goals. Disconnected actions. Copy-paste updates. Buried data. Missing rationale. They share a root cause: time.

Every superintendent we've talked to knows what a strong LCAP looks like. They've read the exemplars. They've sat through the trainings. They can spot weak goal language in someone else's plan from across a conference table. The problem is that knowing what "good" looks like and having the hundreds of hours it takes to execute it are two completely different things.

These five mistakes aren't competence problems. They're capacity problems. When your best leaders are also your compliance writers, and they're also managing labor negotiations, and they're also responding to a facilities emergency, and they're also preparing for a board meeting, the LCAP gets whatever cognitive energy is left over. Usually that's a Sunday afternoon in May, running on coffee, trying to get the document to "good enough" before the deadline hits.

We built Compliance Composer and Compliance Checker because "good enough" shouldn't require 300 hours and a prayer. The district leaders doing this work are brilliant, capable people. They just need the mechanical labor stripped away so they can focus on the parts that actually require a human brain: the strategy, the judgment calls, the honest assessment of what worked and what didn't.

That's the work that makes a plan worth reading. The formatting should take care of itself.
