This is Part 2 in a 6-part series on the state audit of JCPS, conducted in 2013-14 and published in May 2014.
The state audit said the school district must regularly compare itself with a “static” group of school districts with regard to finances and operations, but the simple comparisons used do not take into account differences in states’ education systems, and the auditor doesn’t follow his own rule in backing up a significant finding.
The report identifies those peer districts as Austin TX, Baltimore County MD, Pinellas County FL, Cobb County GA, and Charlotte-Mecklenburg NC.
The financial information for benchmarking came from the U.S. Education Department’s National Center for Education Statistics (NCES), using its online comparison tool.
The NCES provides good general information about revenues and expenditures of school districts in key categories, but: (1) it is years out of date (see Part 1), and (2) it lacks the detail and the consideration of significant state funding differences needed to support the claim made at the May 2014 press conference that JCPS was a “bloated bureaucracy.”
More up-to-date budget information was available on each of these school districts’ websites; a good researcher could have gotten it. When you are making such major public claims, real-time data is critical.
In fact, JCPS benchmarks constantly and focuses on achievement data and best practices research. It is particularly helpful to use districts that are comparable using National Assessment of Educational Progress (NAEP) achievement information, also available through the NCES. Within NAEP, JCPS is part of the Trial Urban District Assessment (TUDA), made up of districts that agree to test enough students using the same assessments that peer-to-peer comparisons are possible. Only two of the five districts chosen by the auditor are TUDA districts.
Furthermore, three of the auditor-recommended districts are suburban, while JCPS is classified as urban. The NCES tool will automatically generate a list of districts comparable to yours, and most of the districts the auditor chose are not among those it recommends. The PDK audit (see Part 1) had already used the five districts chosen by the state auditor, but added up to 15 more drawn from the 100 largest districts. JCPS said it would use the state audit’s recommended list but has added five more it believes are comparable.
Warehouse Recommendation: Finding With No Foundation
The state audit’s justification behind Finding #16 on warehousing is a narrative that shows the contrived nature of many of the audit’s findings.
In his report and at last month’s anniversary event, Auditor Edelen claimed that the elimination of warehouses was his most important recommendation. However, he doesn’t back up this claim with solid information and uses only one comparison school district, and it’s not on the list he insists the district use for benchmarking.
The first issue is that JCPS is already doing what the audit suggests. The 2014 audit recommends that JCPS “consider eliminating the central warehouse and delivery system currently in place and transition to a just-in-time delivery system.”
JCPS implemented just-in-time purchasing agreements in 2009.
Next, the auditor misrepresents the number of JCPS warehouses. He claims that the “network of six JCPS warehouses that store, and deliver a large number of the supplies used by JCPS’ schools and administrative departments is based on an outdated model that is not necessary or cost effective due to the just-in-time delivery capabilities of outside vendors.”
You have to read deep into the narrative to find that JCPS has just two separate warehouses for instructional, custodial and maintenance supplies. The other “warehouses” are sections of three vehicle maintenance garages where parts used by the maintenance workers are stored. They are not separate warehouses. In fact, providing these items where they are used could eliminate any delivery need. I count two warehouses, and three garages with supply areas.
On top of that, the audit’s own recommended peer districts use warehouses. It took only a quick Internet tour through the websites of the comparison districts to find that four of the five have warehousing and central purchasing systems (the fifth might have them; I just couldn’t find the information).
Austin’s website says, “The AISD warehouse stocks various classroom and maintenance supplies for convenience and cost savings.”
Finally, while the state audit’s Finding #1 called for using a “static” group of districts for benchmarking, the auditor used a different district for comparing warehouse use. The state audit says, “we contacted Dallas Independent Schools because it was a school district similar in size to JCPS.” Why? Dallas is not on the auditor’s benchmarking list. If benchmarking against a “static” group is so important, why compare JCPS with Dallas, and not with the five districts the audit says JCPS must use?
According to the audit, Dallas evidently has warehouses…it just doesn’t stock items used infrequently.
To recap: In his report, the auditor criticized the district for not having a “static” group of districts to compare with, but within the same report he used a different district for comparison on what he has termed one of his most important findings.
Seriously, with regard to warehouses and central purchasing, the most important action is that the district analyze whether it can achieve savings by changing warehousing or central purchasing practices. There are many state regulations JCPS must follow regarding purchasing, and change must be managed carefully.
JCPS is committed to continuous improvement in this area, and its actions prove it. Here’s part of the response from JCPS to this audit finding:
We reviewed the warehouse system in 2009 and switched to an outsourced JIT (just-in-time) process for office/instructional supplies. We concur that a new review of the remaining warehouse system is appropriate to conduct realizing that some of our current bulk purchasing may still provide better purchasing power than JIT vendors.
If this is the state audit’s most impactful finding, it is not impressive.
(Links to the state audit report and the other benchmarking reports are in Part 1.)