Scafidi’s Hasty Conclusions about FRC Report on the Education Scholarship Tax Credit
In his December 12 GPPF blog post with Georgia GOAL’s Jim Kelly, and in his more detailed comments linked there, Ben Scafidi accuses us of “outright errors, implausible assumptions and a lack of balance” in our recent report on the fiscal effects of Georgia’s Qualified Education Tax Credit (QETC) program. In his critique, however, Scafidi blatantly misrepresents our analysis, misconstruing it as being critical of the QETC program, and makes several other egregious errors.
The blog post, under the attention-grabbing headline “Keep Crony Capitalism Out of Georgia’s School Choice Movement,” does get a few things right. Yes, under certain assumptions it is possible that the QETC program could actually save Georgia taxpayers money in educating scholarship recipients. Kelly and Scafidi select a set of assumptions that produces a rather large cost saving for state and local taxpayers, but this particular set of assumptions is not realistic. More reasonable assumptions still result in savings, or at least a smaller cost than what past fiscal assessments of QETC have assumed. Our report shows the assumptions that must hold for there to be a positive net fiscal effect.
Official estimates of the fiscal cost of QETC focus on the revenue effects alone, as is the mandate for revenue fiscal notes prepared for the legislature. The cost in terms of lost tax revenue is simply the amount of tax credits taken in a given year, so if the $58 million of tax credits allowed in a given year are all taken, the revenue loss is $58 million. But as our report makes clear, that is not the true cost. At the state budget level, for every student who leaves a public school for a private one, the state saves the amount of QBE funding it would have granted the local school system had the student stayed in the public school. At the local level, the school system loses that QBE funding, but it saves all of the variable costs associated with educating that student.
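The state and local accounting described above can be sketched as a simple calculation. This is only an illustration of the logic; the dollar figures below are hypothetical placeholders, not estimates from our report:

```python
# Sketch of the per-student fiscal accounting described above.
# Dollar figures are hypothetical placeholders, not the report's estimates.

def state_effect(qbe_grant_per_student):
    # When a student switches to private school, the state saves the
    # enrollment-driven QBE grant it would otherwise have paid out.
    return qbe_grant_per_student

def local_effect(variable_cost_per_student, qbe_grant_per_student):
    # The local system loses the QBE grant but avoids the variable
    # costs of educating the student.
    return variable_cost_per_student - qbe_grant_per_student

qbe = 4000.0       # placeholder: enrollment-based QBE grant per student
variable = 5000.0  # placeholder: variable cost avoided per student

combined_savings = state_effect(qbe) + local_effect(variable, qbe)
print(combined_savings)  # 5000.0
```

Note that the QBE grant cancels out when the state and local effects are combined: the total taxpayer savings per switching student equal the variable costs the local system avoids.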
There are many variables that determine the actual net cost or net savings to taxpayers from QETC, and unfortunately, the values of some of these variables are unknown and can’t be reliably estimated from the publicly available program information. Nevertheless, the true cost is something less than the $58 million of tax credits currently allowed, perhaps a lot less. As we explain in the report, the program is more likely to save Georgia taxpayers money when you consider the effects on both the state and local budgets. That is the point of the report: to show how those savings can arise and to lay out a range of plausible assumptions about the factors that determine the net cost or savings. Let’s consider Scafidi’s main objections one at a time:
“By making $1,503 in taxpayer spending on schools disappear, the report dramatically understates the fiscal benefits …”
Scafidi concludes that we made a $1,503 per student error because our QBE grant per student and local revenue per student averages don’t add up to the total per student spending figure. Well, they shouldn’t add up.
First, the $9,046 total Scafidi cites from page 4 of our report is the scholarship cap for calendar year 2013 as published by the Department of Education (DOE). It is not clear precisely how DOE calculates this number, but it differs significantly from the average total per student operating expenditures of the state’s school districts. The actual statewide average of per student operating expenditures for fiscal year 2013, the period of analysis in our report, was $8,336 according to DOE (see https://app.doe.k12.ga.us/ows-bin/owa/fin_pack_revenue.entry_form).
Second, in FY2013 about 8.3 percent of funding for Georgia’s public school systems, $690 per student, came from the federal government, not from state or local funding. Net of these federal dollars, expenditures from state and local funding sources would be $7,646 per student.
Third, the $4,066 average per student QBE grant we cite in the report is the portion of QBE funding that is tied to enrollment. That is, it represents the expected reduction in state spending when one student leaves a public school for a private school. But it does not, and is not intended to, represent all QBE funding; about 8.5 percent of total QBE funding in FY2013 was composed of grants that are not a function of enrollment and thus would not be reduced by a student switching from public to private school.
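Putting the three points above together shows why Scafidi’s expectation that the figures should sum is mistaken. The numbers here are the ones cited above from the report and the DOE data:

```python
# Per-student figures discussed above (FY2013, from the report and DOE data).
total_operating = 8336  # statewide avg per-student operating expenditures (DOE)
federal_share = 690     # per-student federal funding (about 8.3% of the total)

state_local = total_operating - federal_share
print(state_local)      # 7646: spending from state and local sources only

# The $4,066 QBE figure is only the enrollment-tied portion of state funding,
# and the $9,046 DOE scholarship cap is neither total operating expenditures
# nor state-plus-local spending. These numbers measure different things, so
# QBE grant plus local revenue per student need not sum to $9,046.
```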
Finally, Scafidi takes exception to our estimate of the variable costs that school districts can shed when a student switches from public school to private school. He cites a study by Bifulco and Rebeck as supporting a higher number than ours. That study uses data from two New York state school districts, but given that New York schools’ local funding per student is over twice that of Georgia, it is hardly a reasonable basis for comparison.
“The report relies on a study of Arizona’s tax credit scholarship program …”
Scafidi claims that we erred in considering estimates of public-to-private switching rates from a study of Arizona’s tax credit scholarship program on the grounds that Arizona’s program is materially different from Georgia’s because it has no “requirement that students be enrolled in a public school prior to receiving a scholarship.” Although he is correct that the Arizona program analyzed in the Cato Institute report we cite did not have a prior public school enrollment requirement, that report cites an estimate that 70 to 80 percent of scholarships awarded under the program were to low-income families, who are generally assumed by most proponents of tax credit scholarship programs to be switchers. Yet the authors of the Cato report conclude that the switching rate for this Arizona program was only 15 to 30 percent.
Nevertheless, we do not rely on the Cato study’s switching rate conclusion, nor on that of a school choice study focused on Chicago that predicted about a 27 percent switching rate in response to a $1,000 (in 1990 dollars) voucher. In fact, we draw no conclusions about the actual switching rate under QETC. We do explain why it is certainly less than the 100 percent figure assumed in many analyses, but we conclude that publicly available information about scholarship recipients does not enable us to estimate the QETC switching rate. That is, it is not presently knowable. The only place the 30 percent top-end figure from the Cato study enters our analysis is as the worst case, the bottom of the range that we use in our analysis presented in Table 5.
Instead of pretending to estimate the actual net cost or net savings based on little more than a guess about the switching rate, we show what the cost or savings would be over a range of switching rates, from the 30 percent worst case to switching rates as high as 100 percent. The point was to show how sensitive the costs or savings are to the switching rate assumption that one makes and what switching rate is necessary, given the other relevant factors, for the program to break even—for the state and local savings to offset the revenue loss from the tax credit.
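The break-even logic described above can be sketched as follows. Again, the dollar values are hypothetical placeholders chosen for illustration, not figures from the report:

```python
# Sketch of the break-even switching rate logic described above.
# All dollar figures are hypothetical placeholders, not the report's estimates.

def net_fiscal_effect(switch_rate, scholarships, avg_credit_cost,
                      savings_per_switcher):
    """State and local savings minus the revenue lost to tax credits."""
    revenue_loss = scholarships * avg_credit_cost            # credits claimed
    savings = scholarships * switch_rate * savings_per_switcher
    return savings - revenue_loss

avg_cost = 3500.0      # placeholder: credit dollars per scholarship
savings_each = 7000.0  # placeholder: state+local savings per switcher

# Break-even: the switching rate at which savings exactly offset the credits.
break_even = avg_cost / savings_each
print(break_even)      # 0.5: half of recipients must be switchers to break even

# Sensitivity of the net effect over a range of switching rates.
for rate in (0.3, 0.5, 0.8, 1.0):
    print(rate, net_fiscal_effect(rate, 10_000, avg_cost, savings_each))
```

Under these placeholder values the program loses money at the 30 percent worst-case rate and saves money at higher rates, which is exactly the kind of sensitivity the report’s Table 5 lays out.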
“The report relies on a conjecture by a critic [Southern Education Foundation] of the program that lots of scholarship students may reside in four metro Atlanta counties.”
We rely on no such thing. Arguments about Southern Education Foundation aside, we use the average per student QBE grant for the four metro county systems to show the sensitivity of state budget effects to differences in QBE grants across the state. The facts are that QBE grants vary by district and by program, QBE grants are generally lower in the metro Atlanta counties, and state level savings from a student switching would thus be lower in those counties. Using a range of possible values of QBE grants in a sensitivity analysis is just that: a test of the sensitivity of costs or savings to the average amount of QBE grant associated with scholarship students.
We also point out that the lack of transparency in the program prevents us from knowing the overall geographic distribution of scholarship recipients. Even if Scafidi’s anecdotal evidence from one scholarship organization is true, it tells us little about the overall geographic distribution of scholarships. Nor does it demonstrate any error in our sensitivity analysis.
That we “[ignore] the evidence on school choice …”
We state that broader issues about school choice, such as evidence of effects on student outcomes, are “beyond the scope” of our report because, well, they are. While we note the arguments presented by proponents and opponents of school choice, this report never claims to focus on anything other than the near-term fiscal effects of QETC; the long-term effects of school choice would be the subject of a much larger debate.
Charges of “crony capitalism.”
Finally, the blog post and Scafidi’s longer comments go on to make some misguided or disingenuous arguments against the proposed BEST scholarship program, which we also address in the report but neither endorse nor oppose. Scafidi suggests that adopting BEST would somehow mean depriving some Georgians of an opportunity to benefit from taking a tax credit and instead handing the tax credit benefits to big corporations as a form of “crony capitalism” payoff. Such charged phrases distract from the fiscal analysis of the two programs, and in any case, whether the donor to a scholarship organization is an individual or a corporation, the tax benefits to the contributor are the same.
As currently drafted, BEST would be an addition to the current QETC program, not a replacement, and we are simply estimating the fiscal impact of each program. Regardless of who donates and takes the tax credits, the principal difference between QETC and BEST is that the latter is targeted toward low-income families, and as we explain in the report, this is where Georgia’s taxpayers stand to save the most because these families are the least likely to send their kids to private schools in the absence of a scholarship.