The Council of Chief State School Officers and National Governors Association:

Whom do they serve?



 

 

 

Introduction

The Council of Chief State School Officers (CCSSO) and the National Governors Association (NGA) are member associations headquartered in Washington, DC. They are also co-owners of the Common Core Standards—the controversial educational content standards that most US states have incorporated, in whole or in part, into their K–12 education programs.[1]

 

Yet, despite what their names might suggest, they are not government entities, even though most of their members are elected or appointed state government officials. Peter Wood explains:[2]

 

The standards were developed by the National Governors Association (NGA) in collaboration with the Council of Chief State School Officers (CCSSO). These are private, non-governmental bodies—in effect, education trade organizations. The National Governors Association, despite its name, isn’t just a group of sitting governors. It includes many ex-governors and current or former gubernatorial staff members. The deliberations of the NGA and the CCSSO are not open to the public and the work that these bodies did to develop the Common Core State Standards remains for the most part unavailable to outsiders. Neither body, being private, is subject to Freedom of Information requests. The standards themselves are copyrighted by the NGA and the CCSSO.

 

The impact these organizations have on US schools via the Common Core Initiative deserves our attention and scrutiny. Also important to consider, however, is the impact their intimate association with the Common Core Initiative has had on them. Do these organizations any longer serve their members’ needs on education issues? Do governors and state superintendents receive unbiased information and a full range of evidence and policy options from the association staff they pay with their member dues?

 

A Note on Data Sources

This report lifts most of its facts from CCSSO and NGA filings with the Internal Revenue Service (IRS)[3] and the CCSSO and NGA websites. Other sources are referenced as appropriate.

 

Financial accounts appeal as a source of organizational information, in part, because one expects them to be accurate and complete. They are typically filed by professionals who both have legally binding fiduciary responsibilities and are desirous of preserving their reputations (and staying out of jail). Moreover, they are subject to audit by the IRS, an agency with considerable legal authority.

 

That doesn’t mean that IRS filings are always as informative as they could be. One will not find, for example, any record in the CCSSO’s filings of the contributions it has received. Since fiscal year 2004, the CCSSO has folded all its contribution (i.e., grant and donation) revenue into the “program service revenue” category, perhaps inappropriately. Program service revenue comprises fees, dues, and direct payments for services.[4]

 

In 2014, the Bill and Melinda Gates Foundation awarded CCSSO $6,148,749. Granted, the money was awarded contingent upon the CCSSO using it for certain purposes. But, the Gates Foundation did not receive the services and materials the grant paid for. If it had, it could not have legally classified the expense as a charitable contribution.

 

CCSSO’s own auditor reports, available from its website (with some digging), itemize grants received from the federal government, but not those from anywhere else.[5]

 

Suffice it to say that, even though the CCSSO may, perhaps, wish to obscure the origins of the grants it receives, those contributions are substantial and now dwarf the revenues received from membership dues and meeting registration fees.

 

As for the National Governors Association (NGA), financial data for its education activities remain well hidden. The Center for Best Practices—NGA’s research and policy analysis division—is just one of several within the NGA. Likewise, the education group is just one of several groups within the Center for Best Practices, and one of the smaller ones at that. Publicly available financial data break down only to the level of the Center as a whole. Contributions from outside organizations, such as the Bill and Melinda Gates Foundation for NGA Common Core work, are not itemized in NGA public documents.

 

A Short Aside

I am a proud, dues-paying member of the American Psychological Association. The APA is the oldest, the largest, the best-known, etc., member organization of formally trained psychologists in North America. As I understand it, practicing psychologists (i.e., therapists) comprise about half the membership and, likewise, university professors comprise about half. The tiny percentage left over is populated by folk like me, who fit into neither of the two large groups.

 

The first stream of email notices I received from the APA upon joining concerned elections. Nominations had already been solicited, but candidates still needed to be chosen, and then voter participation was requested. The effort involved not only the several offices responsible for the organization as a whole; the APA also hosts 54 special-interest divisions, such as School Psychology and Clinical Psychology, as well as state and provincial chapters. Most, if not all, of the hundreds of APA subdivisions hold elections, too. All told, the election process seemed to consume several months and to comprise the bulk of APA activity.

 

At first I thought simply that APA was a very large group of professionals with many and varied interests, and the democratic aspect seemed refreshing. But, then, only a few weeks after election season had finally ended, I received the broadcast request for nominations for the next round of elections. APA’s leaders hold office for just one year.

 

APA’s election proliferation makes sense if the purpose is to involve as many members as possible with responsibilities (or to help pad resumes). It makes little sense if the purpose is to run the organization efficiently. It takes months before new leaders can understand the mechanics of running such a large, amorphous association. And, long-term planning might seem futile given that the next office holder will take over in just several months and may hold a very different agenda.

 

As with many other Washington, DC based national associations, APA’s long-term institutional memory and day-to-day management know-how reside in the permanent, salaried, non-elected administrative staff.

 

In 2015, APA released the report of an independent investigation it had commissioned into the “collusion” of APA administrators with Department of Defense and Central Intelligence Agency officials on torture policies and procedures over a decade earlier, in the aftermath of the 9/11 disaster and the Iraq invasion.[6]

 

For a decade, the nominal leaders of the APA were unaware of the extent and character of aid and cooperation its administrators had granted to “curry favor” with US military and espionage agencies. When bits and pieces of the story leaked out in the media in the ensuing years, APA’s nominal leaders responded by reiterating APA’s strong and seemingly unambiguous policy against torture.

 

The APA website hosts a timeline of the events with its last entry from early 2017.[7] In summary: for several years, APA’s nominal leaders were unaware that APA administrators colluded with US agencies responsible for torture contrary to APA’s written policy opposing it and banning any psychologist from participating in it. Then, for several more years, as pieces of the story leaked in the media, APA’s nominal leaders denied any involvement and pointed to the written policy. Finally, after APA’s nominal leaders turned their full attention to the matter, it took several more years to clean up the mess.

 

With all the talk these days about the “deep state”—“a body of people, typically influential members of government agencies or the military, believed to be involved in the secret manipulation or control of government policy”[8]—perhaps it is time to explore the possible existence of a “deep nonprofit sector”.

 

Back to CCSSO and NGA

It is unlikely that the Council of Chief State School Officers (CCSSO) and the National Governors Association (NGA) have ever cooperated with spy agencies on torture protocols. The point of the previous aside was to illustrate how separate the actual administration of Washington, DC-based national nonprofits can be from their nominal administration.

 

Here are some organizational characteristics that CCSSO and NGA share with the American Psychological Association:

 

•      the cast of nominal members and leaders changes frequently, with every election turnover, or resignation and replacement, while salaried staff tend to remain in place for longer durations;

 

•      all but a few of the nominal members and leaders live and work outside of Washington, some of them thousands of miles away, and travel to Washington only occasionally; and

 

•      permanent staff live in the Washington, DC area and most of their personal and professional relationships and their career paths are focused there.

 

Many complain about lifetime professional politicians—those who stay in Washington, DC long after their peak years. But, there also exists a parallel group of lifetime political staff. Unlike Cincinnatus, the legendary Roman leader who returned to his farm once he felt he had accomplished what he had been summoned to Rome to do, some Beltway staffers choose not to return home when their bosses retire, change jobs, or lose an election. Rather, they form or join a think tank or association, making use of their contacts and experience working DC’s hallways and lobbies.[9]

 

Given Washington, DC’s magnetic pull on members of this group, it should come as no surprise to hear them so often suggest federal solutions for education problems. The 10th Amendment to the US Constitution may be of interest to them only for its circumvention.[10] Federal involvement in education means more work for them in Washington—debating, lobbying, researching, and writing talking points.

 

CCSSO

Traditionally, membership associations survive on member dues and smaller amounts from meeting registrations and publication sales. Some may also provide professional development, consulting, or program evaluation services for a fee. The CCSSO still does those things a traditional membership association does. But, far more of its revenue is now derived from grants. And, that requires doing things that please donors. That which serves the membership and that which pleases donors are not always the same.

 

Figure 1 compares the amounts of revenue derived from membership dues, meeting registration fees, and just one of the many donors to CCSSO programs, the Bill and Melinda Gates Foundation, over the time period 2003 to 2017.

 

Figure 1. CCSSO Revenue from Membership and Registration Fees or Gates Foundation Grants in $millions, 2003–2017[11]

 

CCSSO received over $2.5 million in member dues in tax year 2014. However, “contracts, grants, & sponsorships” income exceeded $31 million, twelve times the amount from dues and meetings. CCSSO in its current form could easily survive a loss of member dues payments; it could not survive intact the loss of its contracts and grants (currently, these are predominantly payments for promoting the Common Core). The tail wags the dog.

 

In fact, contracts and grants have long comprised the largest revenue source for the CCSSO, but since 2001 they have doubled in size (see Figure 2).

 

Figure 2. CCSSO Revenue from Contracts & Grants or Membership Fees in $millions, 2001–2015

 

 

Also in tax year 2014, 26 CCSSO staffers received annual salaries in excess of $100,000, and at least six of them took home more than $200,000. The CEO, Chris Minnich, was paid more than a quarter million dollars. Gene Wilhoit, who along with David Coleman originally convinced Bill Gates to help fund the Common Core Initiative, took home $358,114 in his final year (2013) as CCSSO’s CEO.

 

From 2009 to 2015, the number of employees receiving in excess of $100,000 a year increased from 11 to 26, despite no change in the total number of employees.

 

CCSSO also spent over $8 million on travel in 2014, more than on salaries and wages.

 

CCSSO also claimed over half a million dollars for “lobbying to influence a legislative body (direct lobbying)”, but $0 for “lobbying to influence public opinion (grass roots lobbying).” Yet, elsewhere in its IRS form, a “grassroots nontaxable amount” of $250,000 is declared.

 

So much money flows through CCSSO that it earned almost a quarter million dollars from investments alone in 2014.

 

CCSSO conceals where most of its donor money comes from. But, according to Citizen Audit, the following organizations are among those it tracks that gave CCSSO money in just the most recent tax year: Sandler Foundation, Pearson Charitable Foundation, National Education Association, Knowledgeworks Foundation, Educational Testing Service, College Entrance Examination Board, Birth To Five Policy Alliance, and American Institutes For Research in The Behavioral Sciences.[12]

 

Where does all the donor money go? Surely, most of it goes to the intended target programs, salaries, administration, and travel. But, apparently, some proportion also is squirreled away in the CCSSO’s growing portfolio of investments (see Figure 3).

 

Figure 3. CCSSO Membership and Registration Fee Revenue and Unrestricted Net Assets in $millions, 2002–2015

 

CCSSO’s election/selection process for its board of directors also appears to be something of a mystery. As described in its IRS filing:[13]

 

The internal operations committee shall nominate to the membership one candidate for each office to be filled at the annual policy forum selecting from those members who express interest in serving. … Upon receipt of the report of the internal operations committee at the meeting, the presiding officer shall give the opportunity for additional nominations to be made from the floor…. Upon close of nominations by motion from the floor, the election of each officer shall proceed by secret ballot, and the candidate receiving the plurality of votes cast for each office shall be declared elected.

 

So, state superintendents who attend the annual meeting get to vote. The mysterious part is the nominating panel, the “internal operations committee.” I could find nothing about it on the CCSSO website. Furthermore, I requested—twice—a copy of the by-laws from the CCSSO’s communications office, without success.

 

The CCSSO’s election process concerns us all because CCSSO co-owns the Common Core Standards, and the Common Core Standards touch most of our students. Can they be altered? Perhaps not, if the “internal operations committee” happens to like them.

 

NGA Center for Best Practices

After a flurry of Common Core-focused writing around the time of its introduction (2009–2013), NGA staff has had little to say about it. Yet, NGA still co-owns the standards, and money to support its Common Core advocacy continues to roll in.[14]

 

A Common Core-focused NGA publication reveals much about staff preferences. “Trends in State Implementation of the Common Core State Standards: Making the Shift to Better Tests” hard-sells the Common Core aligned consortium tests from the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium (SBAC).[15] Moreover, it strongly recommends “eliminating” current tests that “are not adequately measuring student learning or that will be duplicative of the tests offered by PARCC and Smarter Balanced.”

 

The PARCC and SBAC tests were described as “more rigorous and relevant,” “more rigorous and educationally useful,” and “better aligned to the more rigorous and relevant CCSS and … more useful to educators.” All this before they had even been created. Footnoted references all support the Common Core and its tests. It was claimed that “…the estimated cost of the new tests will be no greater than those of the tests currently administered… ,” “benefits will be greater and costs lower in the long run if the assessments are administered online with a computer or tablet,” and “The CCSS are evidence and research-based informed by the most effective models from states and countries across the globe, include rigorous content, and demand the mastery of that content….”

 

Time would reveal all of the above to be just wishful thinking. Much is summarized in Table 1 from the report.

 

 

Some of the statements contained therein represent still more wishful thinking (e.g., “results [will be] reported within approximately 2–4 weeks of testing” and “highly informative and actionable reports [will be] provided to teachers/parents”) that time would repudiate. Other statements were made without supportive evidence (e.g., “provides information about readiness for entry-level, credit-bearing courses” and “anchored in measuring readiness for college or career-training level work, will have greater rigor than current state tests”).

 

High quality; low veracity

It is often said that scientific writing is dull and boring to read. Writers choose words carefully, meaning for them to be interpreted precisely, and so employ vocabulary that may be precise but is often obscure. Judgmental terms—particularly the many adjectives and adverbs that imply goodness and badness, or better and worse—are avoided. Scientific text is expected to present a neutral communication background against which the evidence itself, and not the words used to describe the evidence, can be evaluated on its own merits.

 

By contrast, according to some advocates, Common Core, PARCC, and SBAC are “high-quality”, “deeper”, “richer”, “rigorous”, “challenging”, “stimulating”, “sophisticated”, and assess “higher-order” and “critical” thinking, “problem solving”, “deeper analysis”, “21st-Century skills”, and so on, ad infinitum.

 

Conversely, alternatives to Common Core and Common Core consortia assessments may be described as “simple”, “superficial”, “low-quality”, and “dull” artifacts of a “19th-Century” “factory model of education” that relies on “drill and kill”, “plug and chug”, “rote memorization”, “rote recall”, and other “rotes”.

 

Our stuff good. Their stuff bad. No discussion needed.

 

This is not the language of science, but of advertising. Given the gargantuan resources Common Core, PARCC, and SBAC advocates have had at their disposal to saturate the media and lobby policymakers with their point of view, that opponents could muster any hearing at all is remarkable.[16]

 

Their version of “high-quality” testing minimizes the importance of test reliability (i.e., consistency and comparability of results), an objective and precisely measurable trait, and maximizes the importance of test validity, an imprecise and highly subjective trait, as they choose to define it.[17] “High-quality”, in Common Core advocates’ view, comprises test formats and item types that match a progressive, constructivist view of education.[18] “High-quality” means more subjective, and less objective, testing. “High-quality” means tests built the way they like them.
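
Reliability, unlike “high quality”, can be pinned down with a formula. The following is a minimal sketch (in Python, using made-up scores rather than data from any actual test) of Cronbach’s alpha, one common internal-consistency reliability coefficient; it illustrates why reliability counts as an objective and precisely measurable trait:

    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for an examinees-by-items matrix of item scores."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                         # number of items
        item_vars = scores.var(axis=0, ddof=1)      # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of examinees' total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical 0/1 item scores: five examinees, four items
    demo = [[1, 1, 1, 0],
            [1, 0, 1, 1],
            [0, 0, 1, 0],
            [1, 1, 1, 1],
            [0, 1, 0, 0]]
    print(round(cronbach_alpha(demo), 2))   # prints 0.52 for these made-up scores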

 

“High quality” tests are also more expensive, take much longer to administer, and unfairly disadvantage already disadvantaged children, who are less likely to be familiar with complex test formats and computer-based assessment tools.[19]

 

Read, for example, the CCSSO report Criteria for high-quality assessment, written by Linda Darling-Hammond’s group at Stanford’s education school, people at the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), and several other sympathizers.[20] These are groups with long histories of selective referencing and dismissive reviews.[21]

 

Unlike a typical scientific study write-up, Criteria for high-quality assessment brims with adjectival and adverbial praise for its favored assessment characteristics. In only 14 pages of text the reader confronts “high-quality” 24 times; “higher” 18 times; “high-fidelity” seven times; “higher-level” four times; “deep”, “deeply”, or “deeper” 14 times; “critical” or “critically” 17 times; and “valuable” nine times.[22]
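
Tallies like these are straightforward to reproduce. Here is a minimal sketch (in Python; the filename is hypothetical, and the report PDF would first have to be converted to plain text) that counts the promotional vocabulary:

    import re
    from collections import Counter

    # Hypothetical filename: the CCSSO report converted from PDF to plain text
    with open("ccsso_criteria_for_high_quality_assessment.txt", encoding="utf-8") as f:
        text = f.read().lower()

    terms = ["high-quality", "higher", "high-fidelity", "higher-level",
             "deep", "deeply", "deeper", "critical", "critically", "valuable"]

    counts = Counter()
    for term in terms:
        # Word boundaries so that, e.g., "deep" does not also count "deeper"
        counts[term] = len(re.findall(r"\b" + re.escape(term) + r"\b", text))

    for term, n in counts.most_common():
        print(f"{term}: {n}")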

 

Uncommon Core

Along with the US Education Department and the Bill and Melinda Gates Foundation, CCSSO and NGA represent the most important institutions supporting the Common Core Initiative.[23] Both have tied their reputations to the success of the Common Core Initiative. So, it is fair to ask: has the Common Core Initiative been successful?

 

An enormous quantity of resources has been expended to promote the Common Core Standards and aligned assessments. Unfortunately, pronouncements of the alleged certainty of their success preceded, in some respects by several years, any possible objective evaluation of outcomes. As time passed and some of the promised positive results failed to appear, liberties were taken with the evaluation methods to artificially induce positive outcomes from otherwise poor results.[24]

 

The various techniques of altering definitions, manipulating data, cherry-picking references, hiring only sympathetic evaluators, etc., continue apace with ample funding. Arguably, a large majority of the available pundits, researchers, and advocacy organizations in US education policy have at one time or another been paid to promote Common Core.[25]

 

Yet, even with so many influencers paid to convince us that Common Core is wonderful, many remain unconvinced. All the money, time, and effort have produced, at best, a stalemate. States continue to “leave” the Common Core Initiative and then either retain the standards under another name, alter them, or replace them.[26]

 

Proponents argue that the Common Core Standards largely remain, even if under a different name or tweaked here and there. But, remember, the most fundamental and persuasive argument for the Common Core Initiative has always been that it would produce a common set of standards across the country, so that student performance across states could be validly compared.[27]

 

Despite what proponents say and foundation directors choose to believe, that goal is no longer possible. It became impossible the moment states began to review and change the standards, frameworks, blueprints, and test items to match their own preferences. Indeed, the goal was never possible so long as states retained any power to make changes anywhere along the test development continuum of standards-frameworks-blueprints-tests.[28]

 

In 2015, Florida debuted a new Common Core Standards-aligned statewide test, the Florida Standards Assessment. Florida purchased test development services from a company that had just completed the administration of a new Common Core-aligned test in Utah. This company—American Institutes for Research—could offer the most competitive bid for the Florida contract because it would reuse the Utah item bank. Indeed, the state of Utah was paid for the “rental” of its test items.

 

Few, if any, states would be willing to administer another state’s test items without first reviewing those items, however, and Florida is no exception. Utah’s test items ran through the standard gauntlet of content and bias reviews by state educators, and some items were changed. Meanwhile, though Florida was willing to borrow Utah’s test items, it drafted its own test frameworks and blueprints—the standard two intermediate steps between test item writing and test form completion.

 

After the first operational administration of the test, standard analysis of the results revealed that students at all levels found some of the items opaque. An independent review found that, at some grade levels, as many as one out of three items was not aligned to the Florida standards.[29]
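
The arithmetic behind such an alignment rate is simple once reviewers have judged each item. A minimal sketch (in Python, with entirely hypothetical item IDs and standard codes, not actual Florida or Utah data):

    # Hypothetical reviewer judgments: each item mapped to the standard it was
    # judged to measure (None = judged to measure no Florida standard).
    florida_standards = {"FL.3.OA.1", "FL.3.OA.2", "FL.3.NF.1"}

    item_alignments = {
        "item_01": "FL.3.OA.1",
        "item_02": "FL.3.OA.2",
        "item_03": None,          # measures no Florida standard
        "item_04": "FL.3.NF.1",
        "item_05": "UT.3.MD.2",   # aligned only to a Utah-specific standard
        "item_06": "FL.3.OA.1",
    }

    misaligned = [item for item, std in item_alignments.items()
                  if std not in florida_standards]
    rate = len(misaligned) / len(item_alignments)
    print(f"{len(misaligned)} of {len(item_alignments)} items misaligned ({rate:.0%})")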

 

Granted, Florida had tweaked its standards slightly from the original Common Core Standards, just as Utah had. But, these two are among the states Common Core proponents argue still retain so much of the original Common Core that their results are validly comparable.[30]

 

But, apparently, the slight differences in standards between the two states were not the primary source of the large misalignment. Rather, it appears to have been the simple fact that Utah and Florida educators each drafted their own test frameworks and blueprints, which interpreted and prioritized differently how the same standards were to be tested.[31]

 

Conclusion

The decisions to promote Common Core at both CCSSO and NGA were made initially by their nominal leaders—governors and state superintendents. But, they were made based on limited, skewed, and sometimes-false information. Moreover, judging from recent Common Core-related policy documents written by staff, association members still do not receive anything close to full briefings—with the full range of evidence and points of view covered. Instead, they receive promotional sheets from their staff—at best, talking points for those governors and superintendents who might still be interested in selling Common Core to their constituents.

 

Given the predominance of contract and grant funding at both organizations, one wonders if they merit being classified as member associations any longer. If their staff does not provide their members an objectively neutral range of policy options and evidence, what are they doing? Perhaps, in lieu of helping their members serve their constituents, they busy themselves writing grant proposals to fetch the far larger funding amounts available from foundations and the federal government. And, if that is the case, CCSSO and NGA do not primarily serve their members. Rather, they primarily serve the federal government or wealthy foundations. They have been co-opted.[32]

 

Do CCSSO and NGA need to be as large and as wealthy as they have become … if not to serve their members?

 



Access this resource in .pdf format


Citation: Phelps, R.P. (2018). The Council of Chief State School Officers and National Governors Association: Whom do they serve? Nonpartisan Education Review / Articles. Retrieved [date] from http://nonpartisaneducation.org/Review/Articles/v14n4.pdf




Endnotes

[1] See, for example, Common Core State Standards Initiative, “Branding Guidelines.” http://www.corestandards.org/about-the-standards/branding-guidelines/

 

[2] Wood, P. (September 2015). Drilling Through the Core: Why Common Core is Bad for American Education. Boston: Pioneer Institute, p. 17. http://pioneerinstitute.org/drilling-through-the-core/

 

[3] Council of Chief State School Officers (2002–2015). Form 990: Return of Organization Exempt From Income Tax; National Governors Association (2002–2015). Documents obtained through Citizen Audit, https://www.citizenaudit.org/; the National Center for Charitable Statistics, http://nccs.urban.org/.

CCSSO Form 990s may be found here: 2001 2002 2003 2004 2005 2006 2007 2008 2009 2010 2011 2012 2012b 2013 2013b 2014 2014b

NGA Form 990s may be found here: 2002 2003 2004 2005 2006 2007 2008 2009 2010 2011 2012 2013a 2013b 2014 2015

 

[4] See Internal Revenue Service. (2017). 2016 Instructions for Form 990, p.38. https://www.irs.gov/pub/irs-pdf/i990.pdf

“Program service revenue includes income earned by the organization for providing a government agency with a service, facility, or product that benefited that government agency directly rather than benefiting the public as a whole. Program service revenue also includes tuition received by a school, revenue from admissions to a concert or other performing arts event or to a museum; royalties received as author of an educational publication distributed by a commercial publisher; interest income on loans a credit union makes to its members; payments received by a section 501(c)(9) organization from participants or employers of participants for health and welfare benefits coverage; insurance premiums received by a fraternal beneficiary society; and registration fees received in connection with a meeting or convention.”

 

[5] See, for example, Dixon Hughes Goodman. (November 10, 2015). Financial Statements as of and for the Years Ended June 30 2015 and 2014, and Independent Auditor’s Report. Tysons, VA: Author.    http://www.ccsso.org/Documents/2015/2015auditedCCSSOfs.pdf

 

[6] Hoffman, D.H., Carter, D.J., Viglucci, C.R., Benzmiller, H.L., Guo, A.X., Latifi, S.Y., & Craig, D.C. (July 2, 2015). Report to the Special Committee of the Board of Directors of the American Psychological Association: Independent Review Relating to APA Ethics Guidelines, National Security Interrogations, and Torture. Chicago: Sidley Austin LLP. http://www.apa.org/independent-review/APA-FINAL-Report-7.2.15.pdf; http://www.apa.org/independent-review/final-report-message.aspx

 

[7] American Psychological Association. (n.d.) Timeline of APA Policies & Actions Related to Detainee Welfare and Professional Ethics in the Context of Interrogation and National Security. Washington, DC: Author.

http://www.apa.org/news/press/statements/interrogations.aspx

 

[8] From Google dictionary. https://www.google.com/search?q=what+is+the+deep+state&ie=utf-8&oe=utf-8

 

[9] See also:  Lofgren, M. (2015, November 2). The “anti-knowledge” of the elites. Moyers & Company. http://billmoyers.com/2015/11/02/the-anti-knowledge-of-the-elites/#.VjgS4F7oAJU.twitter or

https://consortiumnews.com/2015/10/31/the-anti-knowledge-of-the-elites/; Savage, G.C. (2016). Think tanks, education and the elite policy actors. Australian Educational Researcher, 43, 35–53.

 

[10] See, for example, Weiss, J. (Fall 2015). “Competing Principles: Race to the Top, a $4 billion US education reform effort, produced valuable lessons on designing a competition-based program,” Stanford Social Innovation Review. https://ssir.org/articles/entry/competing_principles

 

[11] CCSSO membership fee totals for 2016 and 2017 are estimated based on the average increase from 2013 to 2015. Totals for 2003–2015 come from CCSSO IRS filings.

 

[12] https://www.citizenaudit.org/organization/530198090/COUNCIL%20OF%20CHIEF%20STATE%20SCHOOL%20OFFICERS/

 

[13] FY2014, CCSSO Form 990, p.36

 

[14] Or, at least it did until 2015, as far as I can tell. Contributions to the NGA are not itemized in its annual IRS filings, or in any document downloadable from its website.

 

[15] National Governors Association. (2013). “Trends in State Implementation of the Common Core State Standards: Making the Shift to Better Tests,” NGA Paper. Washington, DC: Author. https://eric.ed.gov/?q=Common+core+standards&ff1=locTennessee&id=ED583243

 

 

[16] For example, from the federal government alone, PARCC received $185,862,832 on August 13, 2013. https://www2.ed.gov/programs/racetothetop-assessment/parcc-budget-summary-tables.pdf; SBAC received $175,849,539 to cover expenses to September 30, 2014. https://www2.ed.gov/programs/racetothetop-assessment/sbac-budget-summary-tables.pdf. A complete accounting, of course, would include vast sums from the Bill and Melinda Gates Foundation, other foundations, the CCSSO, NGA, Achieve, and state governments.

 

[17] For more on this issue, see Moss, P.A. (March 1994). “Can There Be Validity without Reliability?” Educational Researcher, 23(2), pp. 5–12; Ebel, R.L. (1961). “Must All Tests Be Valid?” American Psychologist, 16, pp. 640–647; Tristán López, A., & Pedraza Corpus, N.Y. (2017). “La Objetividad en las Pruebas Estandarizadas,” Revista Iberoamericana de Evaluación Educativa, 10(1). https://revistas.uam.es/index.php/riee/article/view/7592

 

[18] “Constructivism is basically a theory -- based on observation and scientific study -- about how people learn. It says that people construct their own understanding and knowledge of the world, through experiencing things and reflecting on those experiences.” Here are two descriptions of constructivism: one supportive, http://www.thirteen.org/edonline/concept2class/constructivism/ and one critical, http://epaa.asu.edu/ojs/article/view/631

 

[19] Phelps, R.P. (2008/2009). Educational achievement testing: Critiques and rebuttals. In R.P. Phelps (Ed.), Correcting fallacies about educational and psychological testing. Washington, DC: American Psychological Association; Phelps, R.P. (2003). Kill the Messenger: The War on Standardized Testing. New Brunswick, NJ: Transaction Books; McQuillan, M., Phelps, R.P., & Stotsky, S. (2015, October). How PARCC’s false rigor stunts the growth of all students. Boston: Pioneer Institute. http://pioneerinstitute.org/news/testing-the-tests-why-mcas-is-better-than-parcc/

 

[20] http://www.ccsso.org/Documents/2014/CCSSO Criteria for High Quality Assessments 03242014.pdf/

 

[21] See, for example, Phelps, R.P. (2012). The rot festers: Another National Research Council report on testing. New Educational Foundations, 1. http://www.newfoundations.com/NEFpubs/NEFv1n1.pdf; Phelps, R.P. (2015, July). The Gauntlet: Think tanks and federally funded centers misrepresent and suppress other education research. New Educational Foundations, 4. http://www.newfoundations.com/NEFpubs/NEF4Announce.html

 

[22] For an extended critique of the CCSSO Criteria, see “Appendix A. Critique of Criteria for Evaluating Common Core-Aligned Assessments” in Mark McQuillan, Richard P. Phelps, & Sandra Stotsky. (2015, October). How PARCC’s false rigor stunts the academic growth of all students. Boston: Pioneer Institute, pp. 62-68. http://pioneerinstitute.org/news/testing-the-tests-why-mcas-is-better-than-parcc/

 

[23] Note that three of these four organizations are private and unaccountable to the general public and to the public schools they affect so profoundly.

 

[24] See, for example, McQuillan, M., Phelps, R. P., & Stotsky, S. (2015, October). How PARCC’s false rigor stunts the growth of all students. Boston: Pioneer Institute. http://pioneerinstitute.org/news/testing-the-tests-why-mcas-is-better-than-parcc/; Phelps, R.P. (2016, February 16). Fordham Institute’s pretend research. Boston: Pioneer Institute. https://pioneerinstitute.org/featured/fordhams-parcc-mcas-report-falls-short/; Phelps, R.P. (2015, November 11). Fordham report predictable, conflicted. Boston: Pioneer Institute. http://pioneerinstitute.org/blog/fordham-report-predictable-conflicted/; Phelps, R.P. (2015, November 10). Setting academic performance standards: MCAS vs. PARCC. Boston: Pioneer Institute. http://pioneerinstitute.org/featured/study-poor-performance-of-other-states-in-parcc-consortium-would-translate-to-lower-standards-for-mass/

 

[25] See, for example, Pullmann, J. (2017). The Education Invasion: How Common Core Fights Parents for Control of America’s Kids. New York: Encounter Books; Tompkins-Stange, M.E. (2016). Policy Patrons: Philanthropy, Education Reform, and the Politics of Influence. Cambridge, MA: Harvard Education Press; Schneider, M.K. (2014). A Chronicle of Echoes: Who’s Who in the Implosion of American Public Education. Charlotte, NC: Information Age Publishing; Phelps, R. P. (2016, May 21). ‘One size fits all’ national tests not deeper or more rigorous. Education News. http://www.educationnews.org/education-policy-and-politics/one-size-fits-all-national-tests-not-deeper-or-more-rigorous/

 

[26] National Conference of State Legislatures. (n.d.) “Common Core Status Map.” http://www.ccrslegislation.info/CCR-State-Policy-Resources/common-core-status-map/

 

[27] See, for example, Weiss, J. (Fall 2015). “Competing Principles: Race to the Top, a $4 billion US education reform effort, produced valuable lessons on designing a competition-based program,” Stanford Social Innovation Review. https://ssir.org/articles/entry/competing_principles

 

[28] O’Connor, J. (September 1, 2015). “Test Review Raises Questions About Florida Standards Assessments Results,” StateImpact Florida. https://stateimpact.npr.org/florida/2015/09/01/test-review-raises-questions-about-florida-standards-assessments-results/; Postal, L. (September 1, 2015). “FSA is valid, but new state test had problematic debut, study finds,” Orlando Sentinel. http://www.orlandosentinel.com/features/education/school-zone/os-fsa-valid-study-test-florida-post.html

 

[29] Wiley, A., Hembry, T.R., Buckendahl, C.W., Forte, E., Towles, E., Nebelsick-Gullett, L. (August 31, 2015). Independent Verification of the Psychometric Validity for the Florida Standards Assessment. Alpine Testing Solutions and EdCount, LLC. http://www.fldoe.org/core/fileparse.php/5306/urlt/FSA-Final-Report_08312015.pdf

 

[30] For an excellent general discussion of the issues of comparability between differing tests, see Wainer, H. (2011) Uneducated Guesses: Using Evidence to Uncover Misguided Education Policies. Princeton, NJ: Princeton University Press, chapters 5–9.

 

[31] Personal communication with Chad Buckendahl, July 1, 2017.

 

[32] Domhoff, G.W. (1967). Who Rules America? New York: Prentice-Hall.