Equitable Grading

I think most of these policies are not good for math students.

I am OK with test retakes (one per test) if a student scored below 75% on a test, and the maximum possible retake score is 75%. It encourages students to do their best the first time rather than try to play the system (which is what they do), and it allows for some recovery for the good student who bombs one test.

I am very OK with zeros, particularly on homework that is not attempted. In my experience, completing homework is a very valuable part of learning math. Each day, I cruised the room and looked at everyone’s homework. Sometimes I was spot checking, but mostly I was looking for problems that students were struggling with. In precalculus honors, for example, there were usually three or four problems that gave many students trouble, and I always addressed all questions. Occasionally, I assigned a problem that only a handful of students could or would do, and that was OK. It challenged the top students, and I always made it clear that such problems wouldn’t be on any test.

I am also OK with zeros for tests or quizzes that were never taken.

One policy I didn’t see listed was the one about deadlines. The last year that I taught, we couldn’t impose deadlines on anything, so that, for example, a student could turn in a large stack of homework on the last day of the trimester and we were expected to accept it without penalty. I am very opposed to this, because the point of homework is to facilitate learning as we move toward a test. Once we took the test over a chapter, that chapter’s homework counted for zero points.

Math is different from other content areas. You can fail chemistry one year and get an A in physics the next, because one doesn’t lead to the other. It is ludicrous to think that a student who can’t pass algebra 1 will be successful in algebra 2. Math learning is sequential, as is learning a world language, and it’s pretty difficult to fill in gaping holes when a student is taking a class for which prerequisite learning was incomplete or missing.

Posted in Curriculum & Instruction, Ethics, Joye Walker, K-12, Mathematics, teachers

Comparing states by only looking at overall NAEP average scores can give an incomplete picture of performance

One of the more notable problems with much that is written about the National Assessment of Educational Progress (NAEP) regarding relative state performance is that far too often, only overall average scores are compared. Whether the authors are college professors, state education agencies, local educators, or members of the press, important parts of the real story are routinely missed when the analysis stops there.

This isn’t a new problem. The National Center for Education Statistics (NCES) has cautioned about overly simplistic analysis that only looks at overall average scores for many years. NCES even included special comments on the topic in the 2009 NAEP Science Report Card (http://nces.ed.gov/nationsreportcard/pdf/main2009/2011451.pdf).

Below is a partial extract from Page 32 in that report card that highlights some examples of how the picture can be VERY different once more thorough analysis of NAEP is conducted.

The first example used by NCES is Kentucky’s performance in the 2009 Grade 8 NAEP Science Assessment. When you only look at overall average scores, Kentucky scores statistically significantly higher than the national public school average. However, when you consider scores for White students alone, Kentucky’s score is statistically significantly lower than the national average for White students. Once you learn that Kentucky’s NAEP student sample for this assessment was 85% White, the importance of this additional information becomes far more apparent.

As you can see in the next example from the Page 32 extract, things can work the opposite way, as well. When you only consider overall average scores, Florida’s 8th Graders scored statistically significantly below the national public school average in the 2009 NAEP Science Assessment. But Florida’s Hispanic 8th Graders scored statistically significantly higher than the national public school average for all Hispanics.

In both cases, the picture presented only by the overall average scores is incomplete and might be rather misleading. Very simply, good analysis with NAEP requires more. Those who fail to provide it are not presenting strong arguments for whatever case they are trying to make.

Another example: Mississippi’s Grade 8 Reading improvement

The first comment in this thread deals with general examples from NCES about how comparing state NAEP results by only looking at overall average student scores can paint a very incomplete picture of the true relative performance of those states. In this section, I provide data from the 2013 and 2024 NAEP Grade 8 Reading Assessments to show that a claim still being made by some educators, that Mississippi has made no headway in moving its notable 4th grade performance up to the 8th grade, is no longer accurate.

It is true, as some engaged in these discussions have written, that when you look at only overall average NAEP Grade 8 Reading scores, Mississippi still ranks towards the bottom of the stack. But what such pundits aren’t telling us is that once you break the data out by race, the improvement in Mississippi’s Grade 8 NAEP Reading performance compared to other states is hard to miss.

The tables below show separate breakouts of scores for White students in the top half and Black students in the bottom half. Some states did not get score reports for Black students as the samples were inadequate in some way, most likely due to low enrollment numbers of Black students in those states. The states are ranked by their reported NAEP Scale Score for Grade 8 Reading in the two listed years.

As you can see below, back in 2013, the year Mississippi started enacting some key education reform legislation, both its White and Black students performed very poorly compared to other states. However, as shown in the right side of the graphic, both the state’s White and Black students’ rankings have moved up since 2013 and in the latest NAEP Grade 8 Reading Assessment are notably higher. But that isn’t the complete story.

One issue with NAEP rankings by scores is that this assessment only tests a sample of students in each state, so all the scores have statistical sampling errors. Due to the presence of sampling errors, it is possible that the true relative performance of two states fairly close in the table might actually be reversed if NAEP had tested all the students in those two states. The presence of this statistical sampling issue is why the tables below, which come from the NAEP Data Explorer web tool (https://nationsreportcard.gov/ndecore/landing), also show information about the statistical significance in the scores. One column of particular interest shows how many states scored statistically “significantly higher” than the state listed in each row. Let’s see how this statistical analysis works out for Mississippi.
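The significance test NAEP applies when comparing two jurisdictions amounts roughly to checking whether the score gap exceeds about 1.96 standard errors of the difference between the two sampled means. Here is a minimal sketch of that check; the scores and standard errors used in the example calls are illustrative placeholders, not actual NAEP values:

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Return True when the gap between two sampled state means is
    statistically significant at roughly the 95% confidence level.

    For independent samples, the standard error of the difference is
    the square root of the sum of the squared standard errors; the
    gap is significant when it exceeds z_crit times that quantity."""
    se_diff = math.sqrt(se_a**2 + se_b**2)
    return abs(mean_a - mean_b) > z_crit * se_diff

# Illustrative numbers: a 5-point gap with ~1-point standard errors
# is significant...
print(significantly_different(268.0, 1.1, 263.0, 1.2))  # True
# ...but a 2-point gap with the same sampling errors is not, so the
# two states' true ranking could plausibly be reversed.
print(significantly_different(265.0, 1.1, 263.0, 1.2))  # False
```

This is why two states a point or two apart in a ranking table should be treated as statistically tied rather than ranked.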

For White students, back in 2013 a total of 43 states outscored Mississippi by a statistically significant amount in NAEP Grade 8 Reading. By 2024 that number had been cut remarkably: just 7 states outscored Mississippi by a statistically significant amount. Can any reasonable person think that isn’t notable progress?

For Black students, back in 2013, those in 27 other states statistically significantly outscored Mississippi’s. Flash forward to 2024, and now only 2 states can make that claim – just 2.

The message about Mississippi in the tables below also reiterates that you simply cannot get an accurate picture of what is happening in education by only looking at overall NAEP scores. As NCES pointed out in its 2009 NAEP Science Report Card, and as the Mississippi examples show, you have to look deeper. Those who don’t are not doing adequate research and may just be cherry picking information to try to support an incorrect case. Don’t fall for that.

Posted in Curriculum & Instruction, K-12, Reading & Writing, Richard Innes, Testing/Assessment

What does NAEP say about the Teachers College Reading and Writing Project?

There is a lot of discussion of late about using Lucy Calkins’ Teachers College Reading and Writing Project (TCRWP) to teach reading. It got me thinking.

Back in 2003, then New York City schools chancellor Joel Klein really pushed TCRWP as THE program to be used to teach reading in the Big Apple.

Rather conveniently, the National Assessment of Educational Progress (NAEP) started what it calls its Trial Urban District Assessments (TUDA) around the same time. Basically, the same NAEP tests used for state-level reporting would also be administered in some of the nation’s largest school districts. New York City has participated in TUDA since the program’s early years.

TCRWP is on the way out in New York City, but a Chalkbeat study indicated about 48% (maybe more) of the city’s schools were still using TCRWP as of 2019 (https://tinyurl.com/3tesmv48).

With the above information in hand, I decided to look at how the city did on NAEP between 2003 and 2019. The table below has the results (note: years are listed in reverse order with 2019 at the top).

If you look at the percentages of each race testing “below Basic” over time, there doesn’t seem to be much change in the entire time period covered. The same unremarkable lack of change is apparent when you look at the percentages testing “at or above Proficient.”

Equally concerning are some of the reported percentages. Roughly half of the Black and Hispanic students in the city consistently scored in the “below Basic” achievement level. NAEP documentation indicates that these students had not attained even partial mastery of reading.

Except for 2011, fewer than half of the White students scored “at or above Proficient” throughout the time period, as well.

NAEP certainly doesn’t provide very comforting information about reading in New York City during the TCRWP era. Apparently, New Yorkers, the current school chancellor included, have figured this out, too.

Posted in Curriculum & Instruction, K-12, Reading & Writing, Richard Innes, Testing/Assessment

Reading performance in the US is a serious problem

Whether we use NAEP or state assessments, reading performance in the US is a serious problem, and trying to excuse this away just doesn’t work.

There’s been a lot of discussion from some teachers and Ed school professors about how the National Assessment of Educational Progress (NAEP) standard for reading proficiency is simply set too high. Some of that discussion centers on a NAEP study that develops equivalent NAEP scores for each state assessment’s proficiency standard (accessible here: https://tinyurl.com/4hspr6y5).

The results of that study for 2022 Grade 4 NAEP Reading are shown in the graphic below. You can see that state standards vary widely. Virginia, for example, set a proficiency standard below even the threshold score required to be rated NAEP “Basic.” Massachusetts, at the other end of the scale, actually set a standard slightly above the threshold NAEP uses to declare a student proficient in reading.

About a third of the way up the standards graphic from the least demanding state, you will see that Kentucky, highlighted with a blue arrow, set a proficiency standard about in the middle of the scoring range NAEP considers only Basic level reading.

Given Kentucky’s easy standard, those who want us to believe there is no crisis in reading would surely expect the state to report far better results than NAEP did.

However, as you can see in the insert below, Kentucky’s own state assessment didn’t return much rosier statistics for Grade 4 Reading in 2022 (Kentucky School Report Card Available here: https://reportcard.kyschools.us. NAEP Data Explorer online here: https://tinyurl.com/2hdcmrnr).

On Kentucky’s assessment, despite the state’s notably lower proficiency standard than NAEP’s, fewer than half, just 46%, of the state’s public school fourth graders scored Proficient or better. That is not all that far from NAEP’s proficiency rate for the Bluegrass State of 31%. Even using its own, rather undemanding standards, Kentucky must report that fewer than half of its public school students are reading adequately.

How about the lowest performers, those whom NAEP grades “Below Basic” and whom Kentucky reports as “Novice” readers? The gap here is even smaller than in the proficiency comparison: just 9 percentage points. So, even using Kentucky’s watered-down standards, almost 1 in 3 of the state’s students land in this very bottom performance category.

Label this what you will, this sort of performance is far below what is needed. Making excuses for a system that continues to provide such statistics isn’t acceptable.

Still unanswered is a serious question: which standard better reflects the performance students really need in reading – NAEP’s or Kentucky’s? I’ll have more on that shortly.

Score insert below by Innes


Posted in Curriculum & Instruction, K-12, Reading & Writing, Richard Innes, Testing/Assessment

Grade 4 Reading – Is NAEP’s standard for proficiency set too high?

There’s been a lot of discussion from some teachers and Ed school professors about how the National Assessment of Educational Progress (NAEP) standard for reading proficiency is simply set too high. These naysayers claim this creates a false sense of crisis when things are actually pretty much OK. But are the attacks on NAEP valid? Or do the cautionary tales NAEP is telling us need to be taken very seriously?

I took a look at that question some time back by comparing the message from Kentucky’s NAEP scores to data for the state from two different tests from ACT, Inc. Those ACT tests provided statistics on the proportion of students with reasonable chances of earning either a “C” (75% chance) or a “B” (50% chance) in related college freshman courses. ACT calls these Readiness Benchmark scores and has reported them for a number of years.

Kentucky offered a unique opportunity to conduct this study because for many years the state tested all public school students with both ACT’s Explore test and the ACT college entrance exam.

So, how did that turn out? You can read this report for the full story: https://tinyurl.com/76uwaee8, but here is one example of what you’ll find. In these figures, the percentages of Kentucky public school students scoring NAEP Proficient or above are compared with the percentages of the same cohorts of students reaching ACT’s Readiness Benchmark scores.

As you can easily see, the agreements are remarkably close.

If you read the full report, you will see even more evidence that NAEP, not most watered-down state standards, relates better to what our students need for solid success. Very simply, if we want students ready for living-wage careers or college, we need the sort of performance NAEP scores as Proficient or better.

Posted in Curriculum & Instruction, Reading & Writing, Richard Innes, Testing/Assessment

Math Anxiety

I dealt with test anxiety among my honors students when I was teaching. From my perspective, it mostly arose from one of the following situations:

  1. The need to “get an A” all the time so that a student can please demanding parents, get into a prestigious university, and feel worthy of being in a class with other top-tier math students.
  2. The lack of confidence that seems to quickly develop when students who are accustomed to getting A grades suddenly find math difficult and the A grades begin to seem out of reach.
  3. The hitting of the proverbial wall when a subject area that always was easy no longer makes sense. Students who memorize how to do many kinds of problems but do not fully understand why are especially prone to develop test anxiety, in my experience.
  4. Awareness that other students submit tests much earlier, leading a slower working student to feel inept.
  5. Lack of test taking strategy, such as having the fortitude to leave a problem unsolved and move on with the rest of the test, making more efficient use of time. Sometimes problems later in a test can provide a spark of insight needed to return to a problem previously unsolved, but now with renewed confidence.
  6. Test anxiety can keep a student awake the night before the test, and even produce physical illness — headaches and nausea.

I saw many students over the years deal with these sorts of issues. The first strategy I would try was removing time constraints and changing the test venue for the student, so that there was no opportunity to see how quickly someone else finished and no worry about running out of time. I often found that students given this modification not only finished their tests on time but improved their scores. After a couple of tests taken this way, they tended to regain their confidence and return to taking tests in the classroom with the other students.

I think that teachers also should teach their students how to recall or recapture information they previously knew but somehow forgot during the test, such as the factoring of a sum or difference of cubes. If one has to factor x^3-1, it is clear that x=1 is a zero of the polynomial, so x-1 is a factor. A quick polynomial division recoups the other factor. My students demonstrated this a few times on tests. Anxiety can hinder memory of procedures and general information, yet having the confidence and skill to derive what is needed goes a long way toward reducing that anxiety.
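The recovery described above takes only a couple of lines; here is the derivation sketched out with the same example:

```latex
% x = 1 is a zero of x^3 - 1, so (x - 1) is a factor.
% A quick polynomial division recoups the remaining factor:
\[
  \frac{x^3 - 1}{x - 1} = x^2 + x + 1,
  \qquad\text{so}\qquad
  x^3 - 1 = (x - 1)\left(x^2 + x + 1\right).
\]
```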

I am hardly a researcher, but having taught honors pre-calculus for 22 straight years to my town’s best and brightest, I did notice a few things.

Posted in College prep, Curriculum & Instruction, Joye Walker, K-12, math, Mathematics, Testing/Assessment

New article: Fact-checking Research Claims about Math Education in Manitoba

https://nonpartisaneducation.org/Review/Resources/Fact-checking_research_claims_about_math_education_in_Manitoba.pdf

EXECUTIVE SUMMARY

In a Winnipeg Free Press article, Mathematics education of Manitoba teachers should be based on research (November 13, 2024), Dr. Martha Koch, an Associate Professor in the Faculty of Education at the University of Manitoba, made several claims about recent amendments to the Teaching Certificates and Qualifications Regulation under The Education Administration Act. These amendments significantly reduced the subject-area expertise required for teacher certification. Koch used the phrase “research shows” 15 times in her article. Some key claims put forth in the article include:

  • 1. “The recent changes mean that Manitoba’s teacher certification requirements are better aligned with current research in mathematics education.”
  • 2. “Notably, research shows that early and middle years teachers (grades K-8) who have taken more undergraduate university courses in mathematics are not more effective teachers of mathematics. That is, their students do not have better outcomes in mathematics.”
  • 3. “In fact, some studies have shown that K-8 students actually have lower achievement in mathematics if their teachers have more undergraduate courses in mathematics.”

Since Koch’s statements seemed dubious, she was asked to provide supporting evidence. She responded by circulating an eight-page research synopsis referencing 22 articles and books. After reviewing all 22 references, we found that none credibly support the above claims, and some even contradict them.

Additionally, Koch made statements about research on “mathematics knowledge for teaching” (MKT) in her Winnipeg Free Press article. The references she provided contain repeated, unambiguous statements emphasizing mathematical subject content knowledge as a necessary component of MKT—an important detail omitted by Koch.

The potential consequences of relying on claims that appear to lack evidence are significant, particularly given their possible influence on public policy affecting Manitoba children.

Our main findings

1. Faulty premises and conclusions not aligned with evidence

  • Koch implied that pre-service K-8 teachers are being required to take standard undergraduate math courses—similar to those designed for physicists, mathematicians, and engineers—even though all Manitoba math departments offer specialized courses tailored for K-8 teachers.
  • In several cases, Koch appears to draw conclusions that are not supported by the articles.

2. Lack of supporting evidence

  • Not one article provided by Koch concludes that “K-8 students achieve lower outcomes when their teachers have more undergraduate math courses.”
  • Many of the articles appear to contradict Koch’s claims, have been applied out of context, or are irrelevant to the discussion.
  • Based on our analysis, the articles do not provide support for the idea that K-8 pre-service teachers should avoid math courses provided by university math departments.

3. Serious methodological issues

  • Several studies clearly lacked proper design, or reported results that lacked statistical significance, making causal inferences impossible.

4. Disregard of contradictory evidence

  • Several studies omitted by Koch indicate a positive correlation between teachers’ math content coursework and their students’ achievement in mathematics.
  • Several of the articles Koch cited emphasize the need for stronger math content preparation for prospective teachers. One even referred to a recommendation for a minimum of six credit hours in math as an admission requirement for K-8 pre-service teachers—contradicting Koch’s conclusions.

5. Impact of misinformation

  • In the Manitoba Legislative Assembly on November 22, 2024, Tracy Schmidt, the Acting Minister of Education and Early Childhood Learning, stated that the amendments to The Education Administration Act “were based on research on math education, not on opinion.” This raises concerns about the role of Koch’s claims in shaping these policy changes, particularly given concerns about the lack of supporting evidence.

Conclusions and recommendations

Our detailed review discusses each of the cited papers, demonstrating that none appear to substantiate Koch’s claims.

Given the serious implications of Koch’s statements, and their potential impact on public policy, we make the following recommendations:

  • 1) Retraction: Dr. Martha Koch should retract her Winnipeg Free Press article, as it gives readers the misleading impression that her claims are supported by research.
  • 2) Policy Caution: The Manitoba government should consult more broadly, and exercise greater caution when relying on education research, to inform policy decisions.

Posted in Curriculum & Instruction, Education policy, Governance, Higher Education, information suppression, K-12, licensure, Mathematics, Uncategorized

New article: Qianruo Shen’s “Integrated Curriculum Reform and its Impact on Science Education — Why is the West Falling Behind East Asia in PISA and TIMSS?”

Abstract: This paper investigates why western nations underperform East Asia in PISA and TIMSS and explores the root causes of the decline in science education. By analyzing TIMSS data and comparing the science curricula and teacher qualifications in North America and East Asia, the article reveals how the integrated curriculum reform promoted by UNESCO has led to a severe and protracted decline, particularly in physics and chemistry.

The excessive amount of non-basic content in the integrated curricula, together with their interdisciplinary approach, has diluted the focus on physics and chemistry, which are the foundation stones of science. Moreover, science teachers with general science degrees rather than subject-specific credentials lack knowledge in the physical sciences, leaving students with poor mastery of fundamental knowledge and skills and underprepared for advanced scientific study and STEM careers. Consequently, the U.S. and some other western nations face a crisis in cultivating STEM talent and a qualified labor force.

It is critical for the West to take bold steps immediately and rebuild the foundation of their science education. Some suggestions are provided.

Posted in constructivism, Curriculum & Instruction, Education policy, International Tests, K-12, Mathematics, Qianruo Shen, science, STEM

Comments on Zearn’s “Myth of the Math Kid”

https://time.com/7008332/math-kid-myth-essay/


I really find these kinds of articles tiresome, because the accusations, or “myths,” as expressed by Shalinee Sharma are assumptions made by people who generalize about all math teaching in order to peddle their programs.

The first myth — “Math is only about speed” is such a simplistic point. No one that I know believes that math is only about speed. However, fluency with basic facts should include speed. By the time students are in algebra 1 and are trying to factor a trinomial such as x^2-2x-24, there is simply not enough time to draw pictures or arrays to figure out what are factors of 24. Students shouldn’t have to mentally apply the distributive property to figure out what 6 times 4 might be (e.g. (5+1)(4) would be 20+4, or 24.) In my mind, fluency would mean that a student could answer nearly immediately that 6 times 4 is 24. On the other hand, managing products of larger numbers might indeed include some properties, such as (13)(12), which could be thought of as 13×10 + 13×2.

Myth #2 says that “Math is a series of tricks.” It can certainly look that way if teachers do not carefully develop and derive algorithms. No doubt, many K-8 teachers do not understand math well enough to develop and provide clear explanations of standard algorithms and procedures. I am still puzzled by the almost criminal status the word “borrow” has in teaching and performing subtraction. The politically correct term is “rename,” which is certainly fine, except that “borrow” makes plenty of sense, too. The idea that one could borrow 10 ones from the tens column while subtracting something like 32-19 is simple to explain: instead of 30+2 minus 10+9, we have 20+12 minus 10+9, giving 10+3, or 13. (It looks better when presented vertically, which is also a convenient way to present subtraction of polynomials.)

I once asked my algebra 2 class why cross multiplication works, and no one could answer me. I was asking in the context of solving rational equations. I first showed them something like x/2+1/3=3/4. We cleared the fractions by multiplying both sides of the equation by 12, the least common denominator, producing the much simpler equation 6x+4=9. Then if we have 12/x=x/3 and multiply both sides by 3x, the least common denominator, we get the same result as cross multiplication — which is why it works! They seemed stunned that it was that simple. I have looked in middle school textbooks, and in presenting cross multiplication they pretty much just state that if you have a/b = c/d, then ad=bc, followed by some examples of how to use the technique, with absolutely no development of why it works.
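In general form, the argument the class saw is nothing more than clearing fractions by the common denominator (assuming b and d are nonzero):

```latex
% Multiply both sides of a/b = c/d by the common denominator bd:
\[
  \frac{a}{b} = \frac{c}{d}
  \quad\Longrightarrow\quad
  bd\cdot\frac{a}{b} = bd\cdot\frac{c}{d}
  \quad\Longrightarrow\quad
  ad = bc.
\]
```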

Myth #3 says that there is “only one way to do math.” All math teachers that I have ever met know that this is not true; in fact, the best attribute a strong math student can have is flexibility. I once had a student who felt that he had to use the quadratic formula for all quadratic equations, and he refused to use anything else. I asked him if he knew where the quadratic formula came from, and he said he didn’t care; he just wanted to use it. Of course, the quadratic formula is derived by completing the square, another technique for solving quadratic equations. If one has to solve something like x^2=9, the quadratic formula is quite inefficient. If one is trying to find the x-intercepts of y=x^2-x-12, the quadratic formula is much less efficient than factoring. If one wants to put f(x)=x^2-x-12 into its vertex form, then completing the square is handy.
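Worked out, the three situations mentioned above look like this:

```latex
% Extracting roots beats the quadratic formula here:
\[ x^2 = 9 \;\Longrightarrow\; x = \pm 3. \]
% Factoring finds the x-intercepts of y = x^2 - x - 12 quickly:
\[ x^2 - x - 12 = (x-4)(x+3) = 0 \;\Longrightarrow\; x = 4 \text{ or } x = -3. \]
% Completing the square yields the vertex form directly:
\[ f(x) = x^2 - x - 12 = \left(x - \tfrac{1}{2}\right)^2 - \tfrac{49}{4}. \]
```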

I don’t like the idea of using keywords to solve math problems, such as the problem about 6 bags of marbles costing $18 leading a student to use multiplication because of the word “of.” I think the keyword approach to solving problems is ineffective because it takes focus away from visualizing and making sense of a problem and tries to impose a “recipe” approach. Drawing pictures, acting out problems, and using physical objects can be helpful in getting young students to figure out how to solve their math problems.

The bottom line for me is that articles like this perpetuate the very myths the author invokes to promote her “Zearn” program. That is, the accusation is made that kids struggle in math because teachers abide by the “myths,” and that online programs like “Zearn” are much better than teachers in a classroom. Sorry, but I don’t accept that. I tried to get some specific examples of this program but was always asked to “sign in.” I’m not sold that easily…

Posted in Curriculum & Instruction, Education journalism, Joye Walker, K-12, math, Mathematics, STEM

Texas School Districts Violated a Law Intended to Add Transparency to Local Elections

ProPublica and The Texas Tribune analyzed 35 Texas school districts that held trustee elections last fall and found none that posted all of the required campaign finance records.

by Lexi Churchill and Jessica Priest

ProPublica is a Pulitzer Prize-winning investigative newsroom.

Last year, in an effort to bring greater transparency to local elections, the Texas Legislature mandated that school districts, municipalities and other jurisdictions post campaign finance reports online rather than stow them away in filing cabinets.

But many agencies appear to be violating the law that took effect in September.

ProPublica and The Texas Tribune examined 35 school districts that held trustee elections in November and found none that had posted all of the required disclosures online that show candidates’ fundraising and spending. (Two of the districts did not respond to questions that would allow us to determine whether they were missing these reports.) And the agency tasked with enforcing the rules for thousands of local jurisdictions does not have any staff dedicated to checking their websites for compliance.

“The public not having access to those records because they’re not turned in or not posted in a timely fashion means that the public can’t make an informed decision based on where that candidate’s financial support is coming from,” said Erin Zwiener, a Democratic state representative from Driftwood who has pushed for campaign finance reform.

The interest in more transparency in local elections is bipartisan. “The local level has an amazing amount of funding and activity going through their respective districts, whether it be a school district, the city councils and the counties,” said Republican Carl Tepper, the state representative from Lubbock who authored the bill.

Of all the local government offices now required to upload campaign finance information online, the newsrooms focused on school boards because of the growing push by hard-line conservatives to reshape the elected bodies and advance vouchers as an alternative to public schools. Over the past several years, school boards across the country have shifted from traditionally nonpartisan bodies to increasingly polarized ones grappling with politically charged issues like mask mandates, book bans and bathroom policies for transgender people.

“If candidates are being pushed and funded to fight a proxy culture war in our school districts, I hope that that information can at least be public and easily available and that we can know how frequently that’s happening in Texas,” Zwiener said in an interview.

ProPublica and the Tribune contacted each of the school districts to ask about the missing documents. Some districts said they were aware of the mandate but still had not complied. Among their explanations: They did not receive enough instructions about the implementation and their websites were undergoing changes. A spokesperson for Lago Vista Independent School District, outside Austin, said simply, “Unfortunately, with the multitude of legislative mandates following the 88th session, this one got by us.”

Most often, school leaders said they had not known about the new law and uploaded the reports after being contacted. The vast majority of districts, however, were still missing filings on their websites because they had never received, or had lost, required reports from at least one candidate, failures that violate other parts of the state’s election law.

The newsrooms also found a handful of instances in which candidates or school districts hid donor names and parts of addresses, even though the law doesn’t allow for those redactions.

Had the late filings been submitted in one of Texas’ statewide races, they would have been flagged by the Texas Ethics Commission, the agency tasked with enforcing the state election laws, and the campaigns would have been automatically fined. For each of the 5,000 elected officials and candidates running for state office each year, the agency sends notices about upcoming filing deadlines, penalizes late filers and then considers their subsequent requests to reduce those fees. The commission also compiles all of their campaign finance reports into one searchable online database going back decades.

The agency does not follow any of these steps for local candidates. Instead, it investigates only when it receives a complaint.

None of the districts that responded to our questions sent a complaint to the commission. (The Texas Ethics Commission does not require them to do so.)

Matthew Wilson, an associate professor of political science at Southern Methodist University in Dallas, said it is reasonable to cut districts some slack for now because it’s a new requirement. But over time, without effective enforcement, local agencies won’t feel any pressure to comply with the new law.

“It’s one thing to have a law, but if it’s a law for the violation of which no one ever gets punished, you’re going to have a low level of compliance,” he said. “The ball is really in the court of TEC to decide whether this law is going to have teeth.”

The new law applies to elected officials and candidates seeking local positions across the state’s 254 counties, more than 1,000 school districts and roughly 1,200 cities and towns. In the past, their campaign finance details were kept on handwritten forms that offices were required to keep on file for two years before destroying them. They now have to be maintained online for five years.

In the districts that uploaded their records after being contacted by ProPublica and the Tribune, most candidates had raised a few thousand dollars or less, though the newsrooms found a few who had raised at least $10,000 or had the support of political action committees. Voters did not have easy access to this information at the time of the elections, even though providing it was precisely the law’s intent.

One candidate in West Texas, Joshua Guinn, raised more than $30,000 in his run for Midland ISD school board. During a public forum in October, a few weeks before the election, Guinn said his large fundraising haul was attributable to “family, friends, just people that believe in me.” His filings showed that he spent more than $20,000 on advertising and consulting services provided by CAZ Consulting, a firm that the Texas Observer has connected to a widespread effort to support far-right candidates. Guinn ultimately lost his race to the former board president.

A spokesperson for Midland ISD said the district aims to be compliant with all legislative requirements but that it did not receive a specific notification from TEC or state education regulators about the new law. Christopher Zook Jr., president of CAZ Consulting, said in an email, “All campaign finance reports should be easily accessible to the public. Publicly available finance reports allow for greater transparency in the political process for everyone.”

In a Houston-area school district, Aldine ISD, campaign finance reports were not posted online for seven of the 10 candidates seeking a position on the board. Once the newsrooms reached out, the district uploaded a report from incumbent William Randolph Bates Jr. It showed that he raised more than $30,000, including $4,000 from two PACs. But the school district said Bates and six other candidates did not turn in their mandated filings before the election. Bates won reelection.

Neither Guinn nor Bates responded to interview requests.

And until we asked, Princeton ISD, about 40 miles north of Dallas, did not post the campaign finance reports for any of the four candidates seeking two at-large positions on the school board in November. This made it more difficult for voters to know who was behind a mailer sent by the Collin Conservatives United PAC. The two-sided pamphlet contrasted incumbent school board President Cyndi Darland, whom it said “we can trust,” against another candidate, Starla Sharpe, whom it claimed will encourage a “woke agenda,” won’t stop critical race theory and “won’t get rid of sexually explicit materials that harm our children.”

Sharpe said in an interview with the news organizations that the mailer contained false statements about her and that Darland told her she had nothing to do with the mailer. But when the district posted Darland’s report following our inquiries, it revealed that she contributed to the PAC behind the mailer.

“I absolutely think this would have been important for voters to be aware of and to see the caliber of the individuals that you are voting for and the integrity they have,” Sharpe said.

Darland declined a phone interview and did not answer questions by email because she said she had been in a car wreck and was in pain and on medication. Laura Dawley, treasurer of the Collin Conservatives United PAC, declined to comment. Darland and Sharpe won the two open seats.

Political activity within local races like school boards has not been a major concern until the last few election cycles, according to Brendan Glavin, deputy research director at OpenSecrets, a nonprofit that collects state and federal campaign finance data. Glavin said it is somewhat common for states to have local candidates’ filings remain at the local level, given those races historically do not generate a lot of money and were not considered overtly political.

“This is an area where the disclosure law is lagging behind what is becoming the political reality,” Glavin said, as these races become higher profile and attract money from outside the community.

Tepper, the Lubbock representative, began last year’s legislative session with a far more ambitious proposal to create a searchable database for all filings. But he quickly abandoned the idea once TEC officials told him it would cost around $20 million to maintain — a fraction of the cost of the state’s leading priorities like its $148 million program to bus newly arriving migrants out of state. Tepper told the newsrooms he thought the estimate was “a little outlandish” but decided to take “the path of least resistance” with his online posting idea instead.

Later that session, Zwiener alternatively proposed to require all local candidates and officeholders who raise or spend more than $25,000 to send their reports to TEC, but the Legislature did not move forward with that idea either.

TEC Executive Director J.R. Johnson said Tepper’s initial proposal would have increased the agency’s workload from 5,000 filers currently to nearly 50,000 filers each year if just two candidates ran for every local office.

Johnson would not comment on whether the agency has enough funding to keep up with its current tasks but instead referred the news organizations to the commission’s reports to the Legislature, which detail its rapidly increasing workload, “persistent staffing shortages” and practically stagnant budget.

The commission wrote that campaign finance reports have been “growing dramatically,” with statewide candidates’ average contributions more than quadrupling from $5.6 million in 2018 to $25.7 million in 2022. The resulting reports are lengthy — one surpassed 100,000 pages — and “have been testing the limits of the TEC’s server hardware for years,” the agency wrote. Yet when the commission requested funding in 2022 to help the system run more smoothly, lawmakers denied the request. Shortly after, the servers failed.

All other regulatory agencies in the state receive more funding than TEC, the office wrote in a report to the Legislature, including the Texas Racing Commission, which oversees horse and greyhound races. “We were unable to find any state that invested less in its ethics agency on a per capita basis,” the report said.

The Legislature did increase the agency’s budget by about $1.2 million last year, which Johnson said has helped prevent turnover.

Johnson said the commission has made “significant efforts” to ensure that local authorities know about the new law, such as sending notices and presenting at the annual secretary of state conference for local jurisdictions, but that it can take time for entities to become educated about an updated requirement.

Tepper said he hopes the lack of compliance was due to the districts not knowing about the updated requirement and not flouting the law. He said in an interview that he appreciated the newsrooms “calling around and putting some spotlight on this so maybe they’ll be informed now and can comply with the state law.”

Methodology

The newsrooms aimed to examine compliance among all of the districts with November 2023 trustee elections, the first races since the new law went into effect in September. We reached out to more than a dozen statewide election and education agencies and associations to locate a calendar with the dates of all school board races, but none could provide one. In the absence of an official source, the Tribune and ProPublica pieced together our own list of November races through media clips and contacted 35 school districts.

Of those, we did not find any that were in full compliance with the state’s election laws. Two districts did not respond to questions that would allow us to determine whether they followed the rules. They are Spring ISD in north Houston and Pleasant Grove ISD in East Texas.

Of the 33 districts we found out of compliance with state election laws, 21 had at least some reports on file but had not uploaded them, which broke the new regulation established by House Bill 2626. At least 16 of those districts were also missing at least one report, and typically several, that they never received from candidates. Most of these districts have since uploaded their missing reports, though two districts have still not done so: New Caney and Shepherd ISDs, north of Houston.

The 12 other districts said they either never got any filings from candidates or they lost the records that should have been posted online. The ethics commission told the newsrooms this is not technically a violation of HB 2626, but it breaks other election laws that require candidates to file certain reports and mandate that districts keep them on file.

Posted in Ethics, information suppression, K-12

The Malfunction of US Education Policy: Elite Misinformation, Disinformation and Selfishness [book review]

“Many who work in America’s public schools, teacher preparation programs, school district offices, and other such places often marvel at how out-of-touch education policy seems and wonder why it ignores the basic problems facing those in the trenches. In a masterful work, Phelps suggests this disconnect stems from the misinformation, disinformation, and selfishness of policy makers. The book’s eight chapters address the view of education policy from 2001, the triumph of strategic scholarship, the education establishment cartel, linchpins of the cartel alliance, the education reform cartel, the dense web of Common Core confederates, the permanent education press, and the view from 2023. While many of these topics have been examined before, Phelps brings a fresh, piercing, and astute outlook. This book would be a superb complement for a class using Fullan’s Leading in a Culture of Change (Jossey-Bass 2020), Sizer’s Horace’s Compromise (Harper, 2004), or Duke’s Leadership for Low-Performing Schools (Rowman & Littlefield, 2015). While essential for those interested in school leadership and change, the work will also be of interest to those interested in public policy, ethics, or the political process. Essential. Advanced undergraduates through faculty; professionals; general readers.”

— Choice Reviews

Posted in Uncategorized

The Malfunction of US Education Policy: Elite Misinformation, Disinformation and Selfishness [book review]

Rowman & Littlefield Publishers, April 2023, 196 pages, ISBN 9781475869941

With scholarly precision, Phelps details the collection of actors that have driven and continue to propel U.S. education policy and preferred narratives. In doing so, he has laid out a web of collusion of an inter-connected “echo chamber” or, “mutual admiration society,” composed of education “non-profits,” various repetitively used sources, and education research “think tanks,” as well as funding sources for those policies and narratives that often lead back to just a handful of foundations, namely the Bill and Melinda Gates Foundation. Phelps also draws a clear line between those funding sources and journalists, as well as the education reporting outlets they represent which promulgate the education fad du jour pushed by the aforementioned think tanks. This book is a must-read for anyone taking education reporting at face value – especially policymakers and elected officials.


– A.P. Dillon, Education Reporter, North State Journal.

Posted in Common Core, Education Fraud, Education journalism, Education policy, Education Reform, information suppression, K-12, partisanship, research ethics, Richard P. Phelps, US Education Department

Mississippi: Progress Commanding Attention or Outright Miracle?

Due to comments from others about Mississippi, I thought it would be useful to post a short message with some of the data I have been looking at recently. It tells me that while Mississippi’s educational improvements are not in the miracle category, they are really notable and command attention. Let me provide some evidence.

The first attached jpg shows how Mississippi stacked up against other states in NAEP Grade 4 Reading in 2013, the year its reform legislation was enacted, and the latest 2022 results. This and the following jpgs were prepared using the NAEP Data Explorer, by the way.

Looking at the first jpg, there is no other way to consider this than a remarkable improvement for Mississippi.

First, note that I have separately analyzed scores for white and Black students. If you only look at overall average scores, you won’t see what is really happening, because you wind up comparing a lot of kids of color in Mississippi to white students in a number of other states. Even the NAEP 2009 Science Report Card discusses this issue; you can check Page 32 in that report card if you want more on this topic.

Getting back to the first jpg, note that between 2013 and 2022, Mississippi has really jumped up in relative ranking for both white and Black students.

If we honor the statistical sampling errors present in all NAEP scores, Mississippi’s progress in NAEP Grade 4 Reading is still remarkable.

For example, in 2013, white students in 40 of the 50 states outscored MS’ whites. By 2022, only 2 states could claim they performed better after the NAEP sampling errors were considered.

Despite the general downward trend thanks to COVID, MS’ white scale score went from 222 to 230 between 2013 and 2022 as well. Consider what happened to 2013’s top-placed Maryland for comparison. In 2013, Maryland’s whites scored 244. By 2022, Maryland scored only 232, which was statistically tied with Mississippi.
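These “statistically tied” and “statistically significantly outperformed” comparisons boil down to a standard two-sample test: a gap counts as real only when it is large relative to the combined sampling errors. Here is a minimal sketch; the standard errors below are invented purely for illustration (actual SEs come from the NAEP Data Explorer, and NAEP’s published comparisons include additional adjustments not shown here):

```python
import math

def scores_differ(score_a, se_a, score_b, se_b, z_crit=1.96):
    """Simple two-sample z-test: is the gap between two scale scores
    large relative to the combined sampling error (~95% level)?"""
    z = (score_a - score_b) / math.sqrt(se_a**2 + se_b**2)
    return abs(z) > z_crit

# Hypothetical standard errors of 1.0 points, for illustration only:
print(scores_differ(232, 1.0, 230, 1.0))  # a 2-point gap: False (a tie)
print(scores_differ(244, 1.0, 230, 1.0))  # a 14-point gap: True
```

This is why a 2-point gap like Maryland’s 232 versus Mississippi’s 230 reads as a tie, while the 2013-era double-digit gaps did not.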

For Black students, in 2013, out of the 39 states that had Black scores in both years, 18 statistically significantly outperformed MS’ Black students. In 2022, no state in the nation statistically significantly outperformed MS’ Blacks. MS’ Black student scale score also rose from 197 to 204. Maryland was again the top performer in 2013, with a Black scale score of 214. By 2022, almost certainly thanks in part to COVID, Maryland had decayed to 202, which was not statistically significantly different from MS’ 204, but certainly wasn’t higher anymore.

A miracle? No. Darn attention-getting — YOU BET!

But, what about the claim that it was all just due to MS retaining more students than any other state? Well, Fordham Institute actually put out a “Flypaper” claiming that back in 2019 (https://tinyurl.com/5dy47vam). But, if you look at it today, there is a disclaimer at the beginning of the article saying no, retention doesn’t explain away MS’ improvement. Fordham looked at some demographic data available in the NAEP Data Explorer, which showed that MS always had high retention rates and the NAEP samples were always similarly impacted, so the change in performance could not be due to that. You can see the data Fordham talks about in the second jpg. It shows the percentages of students in the NAEP tested samples that were below, at, or above the modal age for Grade 4 NAEP, which is 9 years old. As you can see, things haven’t changed much all the way back to 1992, the first year State NAEP was given in Grade 4 reading.

Another factor could inflate NAEP scores — large exclusion rates of students. However, as various sections of Table 18 in the attached Excel spreadsheet show, from 2013 on MS has only excluded 1% of the raw sample NAEP wanted to test, the lowest rate for any participating state. So, that doesn’t explain away MS’ improvement either.

So, bottom line for me at this point is MS’ reading improvement in Grade 4 is real, and significant. Those who want to disregard MS are not conversant with all the data and/or are playing adult politics for other reasons (maintaining legacy, lucrative contracts, etc.).

But, what about the claim that the improvements in Grade 4 never showed up in Grade 8 NAEP?

Well, to be honest, since the improvements only really showed up in Grade 4 in 2019, it didn’t seem like there had been enough time for things to start improving in Grade 8. But, surprise! Take a look at the third jpg.

As of 2022, improvements in MS’ reading performance are starting to show up in Grade 8 NAEP, too! In 2013, whites in 43 states statistically significantly outperformed whites in MS. By 2022, only 5 states could make the same claim.

For Black students, out of 42 states with scores in 2013, Blacks in 27 states outscored MS’ Blacks. Flash forward to 2022, and Black students in only 1 state can make that claim!

For both whites and Black students, MS’ NAEP Grade 8 Reading Scale Score only increased 1 point, but in general scores declined elsewhere. For example, top-scoring Massachusetts scored 285 in 2013 but lost 10 points in the 2022 Grade 8 NAEP Reading results for whites. For Black students, top-scoring New Jersey in 2013 lost 11 points by 2022.

So, even in Grade 8, the Mississippi reform looks like it is starting to show notable progress.

Oh, the last jpg shows that retention doesn’t explain the Grade 8 results, either.

Those trying to deny this just don’t know the data or have other motives that are not in the best interests of students.

Posted in Curriculum & Instruction, Education journalism, K-12, math, Mathematics, Richard Innes, US Education Department

The High Price of the Education Writers Association’s News

EWA’s Form 990 tax filings to the IRS for the five tax years 2015 to 2019 reveal the following:

Tax Year | Membership Dues ($000s) | Contributions (gifts, grants, etc.) ($000s)
2015 | 19.2 | 2,797.8
2016 | 20.6 | 3,419.6
2017 | 22.0 | 2,414.9
2018 | 21.4 | 3,088.9
2019 | 17.6 | 2,567.2

EWA’s income from contributions dwarfs that from membership dues by a ratio of about 150 to one (Internal Revenue Service, 2015–2019). Its contributors overwhelmingly supported Common Core.
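That ratio can be sanity-checked directly from the table above; pooling the five years gives a ratio on the order of 140 to one. A minimal sketch:

```python
# Figures in thousands of dollars, copied from the Form 990 table above
# (tax years 2015-2019).
dues = [19.2, 20.6, 22.0, 21.4, 17.6]
contributions = [2797.8, 3419.6, 2414.9, 3088.9, 2567.2]

# Pooled ratio of total contributions to total membership dues:
ratio = sum(contributions) / sum(dues)
print(f"contributions-to-dues ratio: roughly {ratio:.0f} to 1")
```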

As of 2019, EWA’s five “Officers, Directors, Trustees, Key Employees, and Highest Compensated Employees” all enjoyed six-figure salaries.

Current Sustaining Funders:
Bill & Melinda Gates Foundation, Carnegie Corporation of New York, Chan Zuckerberg Initiative, Foundation for Child Development, Funders for Adolescent Science Translation, The Joyce Foundation, The Kresge Foundation, Lumina Foundation, The Spencer Foundation, The Wallace Foundation, The Walton Family Foundation, William and Flora Hewlett Foundation

National Seminar Sponsors (2022):
ECMC Foundation, Chan Zuckerberg Initiative, SXSW EDU, EGF Accelerator, Arnold Ventures, IBM, American Institutes for Research, GreatMinds, Lumina Foundation, National Alliance for Public Charter Schools, Collaborative for Student Success, Network for Teaching Entrepreneurship, Flyover Zone, SAGA Education, American Federation of Teachers, National Education Association, University of California Riverside School of Education

Sponsorship Opportunities:
Website Messaging
Purchase announcement space on EWA’s website for four weeks.
• Run of site – $5,000
• Blogs – $2,000
• Jobs – $2,000
• Events – $1,200

Podcast Sponsorship
“EWA Radio produces a weekly podcast focused on journalism and the education beat. The EWA public editor hosts engaging interviews with journalists about education and its coverage in the media.”
• $3,000
Sponsorship Details
• Sponsorship of four EWA Radio podcast episodes
• Acknowledgment of sponsorship on promotional emails and materials
• Verbal acknowledgement of sponsorship by EWA representative during the podcast episode
• Acknowledgement of sponsorship on EWA website

Exclusive Newsletter Messaging
• $2,500
Details
• Four-week purchase
• Exclusive sponsorship of EWA e-newsletter sent on Monday, Wednesday, and Friday

Newsletter Messaging
• $1,000
Details
• Four-week purchase
• Space in EWA e-newsletter sent on Monday, Wednesday, and Friday

Print Messaging
• $1,000
Details
• Full-page, color announcement in printed program at topical/regional journalist-only seminar
• Available for any topic-based seminar. Previous topics include: higher education, Latino education, student safety and well-being, teacher training and evaluations, adolescent learning, student-centered learning, charters and school choice, assessments and testing, early education, and STEM education.

The registration fee for the Education Writers Association 2023 National Seminar in Atlanta:
$650/person (late fee $800)

References

Education Writers Association. (2023, March 27). EWA Today. Washington, DC: Author.

Internal Revenue Service. (2015). Form 990 (Return of Organization Exempt From Income Tax) for Education Writers Association. https://nonpartisaneducation.org/Malfunction/form990-237439790-education-writers-association-2016-09.pdf

Internal Revenue Service. (2016). Form 990 (Return of Organization Exempt From Income Tax) for Education Writers Association. https://nonpartisaneducation.org/Malfunction/form990-237439790-education-writers-association-2017-09.pdf

Internal Revenue Service. (2017). Form 990 (Return of Organization Exempt From Income Tax) for Education Writers Association. https://nonpartisaneducation.org/Malfunction/form990-237439790-education-writers-association-2018-09.pdf

Internal Revenue Service. (2018). Form 990 (Return of Organization Exempt From Income Tax) for Education Writers Association. https://nonpartisaneducation.org/Malfunction/form990-237439790-education-writers-association-2019-09.pdf

Internal Revenue Service. (2019). Form 990 (Return of Organization Exempt From Income Tax) for Education Writers Association. https://nonpartisaneducation.org/Malfunction/form990-237439790-education-writers-association-2020-09.pdf

Posted in Common Core, Education journalism, Education policy, Education Writers Association, K-12

The Malfunction of US Education Policy: Elite Misinformation, Disinformation, and Selfishness

Looks like ebook/kindle version is now available. “Look Inside” feature on Amazon shows Preface and Intro.

https://www.barnesandnoble.com/w/the-malfunction-of-us-education-policy-richard-p-phelps-founder-of-nonpartisan-education-review-author-and-editor-of-correcting-fallacies-about-educa/1142557816

https://www.amazon.com/Malfunction-Education-Policy-Misinformation-Disinformation/dp/1475869940/

Posted in Censorship, Common Core, Education journalism, Education policy, Education Reform, Education Writers Association, Richard P. Phelps

This Private Equity Firm Is Amassing Companies That Collect Data on America’s Children

Vista Equity Partners has been buying up software used in schools. Parents want to know what the companies do with kids’ data.

By Todd Feathers

Originally published on themarkup.org
Over the past six years, a little-known private equity firm, Vista Equity Partners, has built an educational software empire that wields unseen influence over the educational journeys of tens of millions of children. Along the way, The Markup found, the companies the firm controls have scooped up a massive amount of very personal data on kids, which they use to fuel a suite of predictive analytics products that push the boundaries of technology’s role in education and, in some cases, raise discrimination concerns.



One district we examined uses risk-scoring algorithms from a company in the group, PowerSchool, that incorporate indicators of family wealth to predict a student’s future success, a controversial practice that parents don’t know about and that raises troubling questions.



“I did not even realize there was anybody in this space still doing that [using free and reduced lunch status] in a model being used on real kids,” said Ryan Baker, the director of the University of Pennsylvania’s Center for Learning Analytics. “I am surprised and really appalled.” 



Vista Equity Partners, which declined to comment for this story, has acquired controlling ownership stakes in some of the leading names in educational technology, including EAB, which sells a suite of college counseling and recruitment products, and PowerSchool, which dominates the market for K-12 data warehousing and analytics. PowerSchool alone claims to hold data on more than 45 million children, including 75 percent of North American K-12 students. Ellucian, a recent Vista acquisition, says it serves 26 million students. And EAB’s products are used by thousands of colleges and universities. But parents of those students say they’ve largely been left in the dark about what data the companies collect and how they use it. 



“We are paying these vendors and they are making money on our kids’ data,” said Ellen Zavian, whose son was required to use Naviance, college preparation software recently acquired by PowerSchool, at Montgomery Blair High School in Silver Spring, Md.



After growing concerned about the questions her son was being asked to answer on Naviance-administered surveys, Zavian and other members of a local student privacy group requested access in 2019 to the data the company holds on their children from the district under the Federal Educational Rights and Privacy Act (FERPA). But to date, she has received back only usernames and passwords.



“Parents know very little about this process,” she said. 



The ed tech companies in Vista’s portfolio appear to operate largely independently, but they have entered into a number of partnerships that deepen the ties of shared ownership. PowerSchool and EAB, for example, have a data integration partnership aimed at “delivering data movement solutions that drive value and save time for Districts.”  The two companies also signed another deal last year that made EAB the exclusive reseller of some PowerSchool products. 



EAB did not respond to requests for comment.



To piece together the extent of the companies’ data collection, The Markup reviewed thousands of pages of contracts, user manuals, data sharing agreements, and survey questions obtained through public records requests. 



We found that the companies, collectively, gather everything from basic demographic information—entered automatically when a student enrolls in school—to data about students’ citizenship status, religious affiliation, school disciplinary records, medical diagnoses, what speed they read and type at, the full text of answers they give on tests, the pictures they draw for assignments, whether they live in a two-parent household, whether they’ve used drugs, been the victim of a crime, or expressed interest in LGBTQ+ groups, among hundreds of other data points. Not every Vista-owned company necessarily holds all of the data points listed here.



Some of those data fields were recorded in the traffic between students’ computers and PowerSchool servers when students used their accounts. The Markup reviewed the accounts with students’ permission. Other data fields were listed in districts’ data privacy agreements with PowerSchool and the data library—a list of all available data fields—for one district’s PowerSchool database. Our review offers a more detailed picture of the company’s data operations than PowerSchool publicly discloses, but it is likely an incomplete portrait.



According to its contracts with school districts, PowerSchool has the right to de-identify the data it holds on their behalf—by removing fields such as names and social security numbers—and use it in any way it sees fit to improve and build its own products. 



In some districts, such as Miami-Dade County Public Schools, recent PowerSchool contracts have exceeded $2.5 million for a single year, according to copies of the deals obtained through public records requests.



“It’s hard for me to understand how PowerSchool would not be paying for the privilege” of extracting so much student data, said Alex Bowers, a professor of educational leadership at Columbia University’s Teachers College. “You don’t pay the oil company to come pump oil off your land; it’s the other way around.”



PowerSchool declined to answer specific questions about the data it collects and how it uses that information.



“At PowerSchool, ensuring student equity, privacy, and access to good quality education is our top priority and is foundational to everything we do,” Darron Flagg, the company’s chief compliance and privacy officer, wrote in a brief statement to The Markup. “PowerSchool strictly and proactively follows legal, regulatory, and voluntary requirements for protecting student privacy including the Family Educational Rights and Privacy Act (FERPA), state regulations, and the Student Privacy Pledge. PowerSchool customers own their student and school data. We do not sell student or school data; we do not collect, maintain, use, or share student personal information beyond what is authorized by the district, parent, or student.”



A Cautionary Tale: Elgin, Illinois



Many of PowerSchool’s newer product lines, including its predictive analytics tools and personalized learning platform, require troves of student data to train the underlying algorithms. But experts who reviewed The Markup’s findings said that some of the data being used for those purposes is bound to lead to discriminatory outcomes.



Consider School District U-46 in Elgin, Ill., which was the only district—out of 27 we submitted public records requests to—that provided a complete list of the data PowerSchool warehouses on its behalf. The district also provided documents detailing how PowerSchool’s predictive analytics algorithms draw on some of that data to influence students’ educational journeys.



U-46’s PowerSchool database contains nearly 7,000 data fields about Elgin students, parents, and staff, according to a copy of the data library The Markup obtained.



As early as first grade, algorithms from the company’s Unified Insights product line start generating predictions about whether students are at low, moderate, or high risk of not graduating high school on time, not meeting certain standards on the SATs, or not completing two years of college, among other outcomes. The district’s documents describe dozens of different predictive models available via PowerSchool, although U-46 says it does not use most of them.



The district displays student on-time graduation risk scores to teachers and administrators beginning in seventh grade, according to Matt Raimondi, Elgin’s assessment and accountability coordinator.



Free and reduced lunch status—a proxy for family wealth—and student gender are among the most important factors in determining that risk score, according to the documents. At one point, Elgin’s models—developed by a company called Hoonuit that was acquired by PowerSchool in 2020 and rebranded as Unified Insights—also incorporated student race as a heavily weighted variable. 



Flagg, from PowerSchool, said race was removed from the models in 2017 before the company acquired Hoonuit.



The predictive models also draw on data points like attendance, disciplinary history, and test scores.



Learning analytics experts told The Markup that the use of demographic data like gender and free and reduced lunch status—attributes that students and school officials can’t change—to predict student outcomes is bound to encode discrimination into the predictive models.



“I think that having [free and reduced lunch status] as a predictor in the model is indefensible in 2021,” said Baker of the University of Pennsylvania’s Center for Learning Analytics. Baker has consulted with BrightBytes, a competitor of PowerSchool in the K-12 predictive analytics space.



“Unified Insights does provide the option for school districts to include free and reduced lunch status to enable districts to reduce dropout risk associated with economic hardship and identify additional social service supports that may be available to impacted students,” Flagg, from PowerSchool, wrote in an email. 



“Including these things that are not within the control of the family or the school is highly problematic,” said Bowers, from Columbia University Teachers College, because even the best-intentioned school cannot change all the systemic gender and wealth disparities that affect a particular student. Basing the risk scores so heavily on those factors therefore obscures the impact of other factors a school may be able to influence, he said.



Raimondi said U-46 has chosen not to use many of the predictive models PowerSchool makes available because of their reliance on immutable student characteristics.



“Especially down at the early grades, we don’t even make it visible to any users besides myself and a programmer,” he said. “The models at the lower grades, they’re not that accurate and they rely a lot more heavily on demographic-type data.”



Each year, Elgin’s dropout risk model fails to flag about 90 students per grade level, out of roughly 3,000 students per grade, who do not go on to graduate on time, according to a presentation prepared by a PowerSchool data scientist and obtained by The Markup.



“We have no comment on the sensitivity/specificity of the models,” U-46 spokesperson Karla Jiménez wrote in an email.



The Markup has previously reported on a similar dropout prediction tool EAB sells to colleges and universities. Some of those schools incorporated race as a “high impact predictor” of success, and their algorithms labeled Black students “high risk” at as much as four times the rate of their White peers, effectively steering students of color away from certain majors. After our reporting, Texas A&M University dropped the use of race as a predictive variable. 



The Data Empire Is Growing



Vista Equity Partners has been expanding its reach in the educational software industry for years. Along with that expansion, it’s put together a portfolio of companies that amass data and effectively track kids throughout their educational journeys. 



Since 2015, when Vista first purchased PowerSchool from Pearson for $350 million, Vista has been on a spending spree, acquiring other ed tech companies that collect different kinds of student data.



In 2017, PowerSchool bought SunGard K-12, which provided human resources and payroll software for schools. In 2019, it purchased Schoology, a widely used learning management system that served as the digital backbone for many schools’ curriculum and lesson plans. It acquired Hoonuit, which provides the predictive risk scoring used by districts like Elgin, in 2020. 



Last March, it completed the purchase of the college preparatory software Naviance, and in November it purchased Kickboard, a company that collects data about students’ behavior and social-emotional skills. In presentations to investors, PowerSchool officials have said more acquisitions are a key part of the company’s growth plan.



EAB has been on a similar purchasing spree, acquiring companies like Wisr, YouVisit, Cappex, and Starfish that are used for college recruitment, advertising, and tracking students on campus. It also announced the creation of Edify, a “next-generation data warehouse and analytics hub” designed to “break down data silos.”



Last June, Vista also acquired a co-ownership stake in Ellucian, which sells a variety of educational technology products. The company claims to serve more than 26 million students across 2,700 institutions.



That consolidation of data and power has triggered a backlash from privacy-minded parents, some of whom have been trying, unsuccessfully, to find out what the deals mean for their children’s sensitive data.



Piercing the veil of secrecy can be difficult, even when parents turn to privacy laws designed to increase transparency.



Illinois, for example, has a state law that requires school districts to post specific information about the ed tech vendors they use, including all written agreements with vendors and lists of the data elements shared with those vendors.



Despite that, districts like Chicago Public Schools have yet to post any of the required material pertaining to PowerSchool and Naviance. CPS has, however, posted data use disclosures for other vendors. Across Illinois, 5,800 schools use PowerSchool software, according to the company.



FERPA has also proven of little use for some parents.



Cheri Kiesecker, a Colorado parent of two, said that she requested her children’s records under the law from PowerSchool earlier this year after it completed the Naviance deal. 



“Each school district owns and controls access to its students’ data,” Flagg, from PowerSchool, wrote in an email to The Markup. “Any requests from parents for access to their children’s data must be managed through their respective school districts.”



PowerSchool instructed Kiesecker to request the records through the school, which she did. When PowerSchool did not comply with the school’s subsequent request by the statutory 45-day deadline, her school’s attorneys sent a legal demand to the company, which The Markup reviewed. To date, Kiesecker said, she has still not received her children’s complete records, although PowerSchool has provided partial documentation.



Deborah Simmons, a Texas parent, said she began looking into the Vista-owned companies after discovering that her school had automatically uploaded her child’s data into Naviance. She filed public records requests and grievances with her school but still doesn’t know the full extent of the data the companies hold or who else it’s been shared with.



“These tech companies want to eliminate the data silos and merge and streamline all of this stuff, but no, our children aren’t products,” Simmons said. “That’s what they do, they treat our children like products. They’re human beings and they deserve privacy and freedom.”


This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

Posted in Ethics, FERPA, K-12, privacy

Iowa Academic Standards Hold Teachers Hostage

By Joye Walker

I retired more than a year ago, giving me many months to process the discomfort I felt in my last few years of teaching. It was a difficult time for many reasons, but one big reason stands out: a problematic curriculum that holds teachers hostage.

The Iowa Academic Standards is a set of “Clear and rigorous learning standards educators use to ensure all students are college, career, and future ready.” They are “required for all students by state law.” (https://educateiowa.gov/iowa-academic-standards)

While the intentions of the Standards are admirable, the practical task of delivering a curriculum that satisfies them is fraught with problems. Teachers are required by state law to deliver a curriculum whose topics are strictly outlined by grade level in the Iowa Academic Standards. Teaching math is an art that requires great flexibility on the part of teachers, yet most administrators do not understand what is involved in teaching mathematics or how mathematics should be taught. Within a single classroom, students demonstrate a wide range of abilities and degrees of mastery of previously taught content. Teachers are faced with the monumental task of figuring out how to present content in order to bring each student forward in learning new concepts.

The Iowa Academic Standards consist of some major content domains, and within each domain are found specific standards. This is a simplistic view of school mathematics. It implies that mathematics can be reduced to a finite list of topics to be taught at each grade level. Realistically, mathematics is a complex intertwining of all math and other studies learned since elementary school, including reading, science, and social studies. Mathematics builds upon previously learned skills, including reading skills, language skills, computational skills, and logic skills. At any given grade level, the Iowa Academic Standards are written with the assumption that all students have some mastery of previously taught math standards. The reality is that no two students are at the same place in terms of concept mastery, in any given classroom. The skill of teaching is to bring students along, weaving previously learned skills and concepts with new ones.

Mathematics learning is a continuum. It makes no sense to have a finite list of standards to be taught one at a time when students encounter many skills and concepts that appear within a single problem. A teacher needs to determine where students are deficient in their skills and figure out how to address such deficiencies, which vary greatly within a given classroom. It should be the teacher’s call when this is accomplished and when students are ready to go on. School administrators frown upon reteaching concepts and skills for which students have incomplete learning, with the argument that such skills are below grade level and have already been taught. A teacher knows that not all students learn at the same pace and not all students master all topics. In fact, sometimes students struggle with a first exposure to a concept, but come to understand it much better after several more encounters with that same concept. Then, mastery can and will follow. The fact that students learn at different rates should not be a problem in school math classes, but teachers are discouraged from providing necessary remediation. Teachers also have to hurry students on to new concepts when they have had woefully insufficient practice, most particularly with basic computation involving fractions and decimals.

To just say that students should be learning grade level standards while ignoring the fact that many students are not prepared to do this is not going to help. Administrators believe that teachers should be focused on grade level standards and, if necessary, choose only those that are most important. To most math teachers, it makes no sense to try to select those topics that are most important because they ALL are important! If we must do this, then we must not pretend that students who only studied a few topics are getting a full course in algebra or geometry or precalculus and we must not pretend that they are prepared for college math or entry into a STEM field.

Administrators also discourage the use of textbooks, encouraging teachers to use online or other sources. A good textbook is written in a sequence that develops new concepts by leading students from what was previously learned to new and related concepts. Development is carefully done in coherent textbooks, and also happens with good teaching, so that students can move forward in their learning. If, instead, teachers merely look at the list of standards relevant to their course and select materials about this topic or that one from various sources, there is no guarantee of a logical and sequential progression. Instead, a choppy, seemingly unrelated hodgepodge of topics ensues in the absence of extremely careful, time-consuming, and technical consideration by the teacher. Students are left confused and are often unable to make connections among seemingly random topics.

The Iowa Academic Standards is not a set of performance standards. In other words, it does not spell out the level of mastery that students must demonstrate in the form of concrete examples. Take, for example, standard A-REI.B.3, which states:

“Solve linear equations and inequalities in one variable, including equations with coefficients represented by letters (For educators, mathematics DOK 1)”
(https://educateiowa.gov/standard/mathematics/algebra)

(DOK is an acronym for Depth of Knowledge, which has levels 1-4, with 1 being the lowest. It is a measure often used in test development by organizations such as ACT.)

If the Iowa Core gave examples of what is meant by a standard or its DOK designation, teachers might find it easier to use. For example, it could offer something like “Students should be able to solve a linear equation that contains variables on one or both sides, including numbers that may be fractional or decimal, such as −3x − 5 = 7x + 3 or ax + b = c or 0.4x − 6(3x + 0.1) = 7 − 1/2 x.” However, the Iowa Academic Standards are not presented this way, so it can be a mystery to determine what exactly is required of a student to demonstrate mastery of a given standard.

Instead, parents (and of course, teachers) who read the description above as stated in the Iowa Academic Standards, in an effort to determine whether their child is being taught this particular standard, must first understand what linear means and what the word coefficient means.

Next, it is necessary to determine what is meant by DOK 1. Here is the description linked from the standard A-REI.B.3:

“Math Level 1 (Recall) includes the recall of information such as a fact, definition, term, or a simple procedure, as well as performing a simple algorithm or applying a formula. That is, in mathematics, a one-step, well defined, and straight algorithmic procedure should be included at this lowest level.”
(https://educate.iowa.gov/depth-knowledge-levels-descriptions-mathematics)

Not clear, is it?

Here are a few equations that are linear in the variable x:

x + 3 = 5

2x + 3 = 5

2x + 3 = 5x + 7

2x + 3 − 8x = 4(3x − 2) + 5

2/3 (4x − 9) = 1/4 (5 − 7x)

4ax − 3bx = cx + d

Which of these equations are DOK 1, according to the description provided above?

The first equation sets a pretty low bar. The second one isn’t much tougher. Where does this list cease to offer equations of DOK 1? I have been in rooms with seasoned educators who cannot agree on what constitutes DOK 1. And therein lies the problem. It is not clear to what expectations students are (or should be) held. I would expect my students to manage all of these equations in a high school algebra class. As you can see, the fifth equation requires fraction manipulation, as well as calculation with negative numbers. The student who did not master fraction computation or rules for combining negative numbers in previous grade levels is going to struggle at this point. Anyone who has taught algebra has seen this time and time again, yet such under-prepared students continue to be placed in algebra. However, if it is deemed that the first two equations are sufficient to satisfy the standard for algebra students in high school, then I do not hold out much hope for their success in post-secondary education math or other quantitative courses and most certainly, no hope for STEM field entry.
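Since the fifth equation carries the weight of that argument, it may help to see the fraction manipulation it demands spelled out. Here is one standard path (my own worked sketch, clearing denominators first, not anything prescribed by the Standards):

```latex
\begin{align*}
\tfrac{2}{3}(4x - 9) &= \tfrac{1}{4}(5 - 7x) \\
12 \cdot \tfrac{2}{3}(4x - 9) &= 12 \cdot \tfrac{1}{4}(5 - 7x)
  && \text{multiply both sides by } 12 \\
8(4x - 9) &= 3(5 - 7x) \\
32x - 72 &= 15 - 21x \\
53x &= 87 \\
x &= \tfrac{87}{53}
\end{align*}
```

Every step after clearing denominators requires fluency with distribution and signed numbers; a student missing either skill stalls immediately.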

Even if all students in one school district are held to common expectations, there is absolutely no guarantee that all students in the next school district will be held to the same ones, due to the nebulous descriptions offered in the Iowa Academic Standards. One problem that teachers all over the nation face is dealing with the movement of students from one school district to another. Part of the art of teaching is figuring out how to catch students up if they enter a school with higher performance expectations. Requiring teachers to use Standards, then, does not ensure an equivalent educational experience from school district to school district or even from building to building within a school district. It requires great skill on the part of the math teacher to properly place and take care of incoming students. School administrators want to place students with their age group, regardless of deficiencies that would inhibit success in any given course.

My last point is about honors mathematics classes, which are falling out of favor in many circles. The idea is that honors classes are not clearly defined and because of this, should not be offered. It is acceptable to have vague curriculum descriptions, but somehow, great precision is required in describing honors level classes. “Elitist” and “biased” are words used to describe honors classes. I wonder how it is that coaches are allowed to select the starting team without being accused of having implicit bias, but teachers are not deemed professional enough to select the students who can handle a much higher level of study taken at a faster pace.

Today, a great many disparate levels of capability exist in our math classes. For teachers, it is very difficult to work with so many levels in one classroom. Teachers need to keep their most able students learning and progressing at high levels while simultaneously addressing sometimes profound deficiencies of students in the same class. Factor in the Iowa Academic Standards, and it becomes a study in frustration.

Mathematics has been my passion for decades, and it was an honor and a privilege to have the opportunity to teach students of all levels for twenty-three years. My experiences both in the classroom and in life have given me many perspectives on the application of the math that I so enjoy. I was once told by a high-ranking school administrator in Iowa that veteran math teachers should not be trusted to teach math right, and that they should all teach from scripts. If you haven’t been bothered by anything else I have written here, this should bother you.

Teaching has been reduced to a robotic kind of job that does not involve creativity, decision making, or professionalism. It is a micromanaged kind of work that stifles passionate teachers and takes away their opportunities to provide students with curricula that make sense. Teachers are kept from holding all students to high standards, academically and behaviorally. If we are to educate generations of people who must tackle increasingly difficult problems, then we should be providing our students with tools – the highest level of education that we can possibly offer. High level education includes opportunities to learn vocational and technical skills that are so valued in our workforce. Such skills can be infused into our daily classes. However, the Iowa Academic Standards hold teachers hostage as they prescribe a curriculum, which may not be the best one for everyone. High quality education should also include the expectation of adherence to deadlines, regularity of attendance, respectful behavior, and clarity of requirements for earning various grades, including failing grades. Teachers need to be able to have expectations in place, backed by administrative support for those expectations. It’s time to rethink what we are doing to our children, and start expecting the very best education that we can offer.

Posted in Common Core, Curriculum & Instruction, Joye Walker, K-12, math, Mathematics, STEM

The absolute worst “real world” problem I have ever encountered

by Joye Walker

It was in the UCSMP Algebra 2 book and I encountered it during my first year of teaching. Here was the opening linear programming example.

***

Stuart Dent decided to investigate one of his typical meals, fried chicken and corn on the cob. He compiled the data in the following table.

                 Vitamin A   Potassium (mg)   Iron (mg)   Calories
Fried chicken       100            0             1.2         122
Corn                310          151             1.0          70


Stu let f = the number of pieces of chicken and e = the number of ears of corn. After deciding the minimum amount of each nutrient needed from this meal, he wrote the system:

f>=0
e>=0
100f+310e>=1000 (at least 1000 units of vitamin A)
151e>=200 (at least 200 mg potassium)
1.2f+e>=6 (at least 6 mg iron)
122f+70e>=600 (at least 600 Calories)

***

The point was to find the cheapest diet for a healthy life. One could argue that we are not talking about healthy foods here, but let’s not get bogged down in that. The objective function is C = 0.90f + 0.75e, where a piece of chicken costs $0.90 and an ear of corn costs $0.75. Let’s also not get bogged down over whether those prices were reasonable, even back when UCSMP Algebra was written, probably in the early 1990s. The vertices of the feasible region, rounded to the nearest hundredth when necessary, are (0, 60/7), (3.76, 2.01), and (5.89, 1.32).
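For anyone who wants to verify that arithmetic without graph paper, a short Python script (my own throwaway check, not part of the UCSMP lesson) can intersect every pair of boundary lines, keep only the intersections that satisfy all six constraints, and evaluate the cost at each surviving corner:

```python
from itertools import combinations

# Each constraint is written as a*f + b*e >= c, where f = pieces of
# chicken and e = ears of corn, matching Stu's system above.
constraints = [
    (1, 0, 0),          # f >= 0
    (0, 1, 0),          # e >= 0
    (100, 310, 1000),   # at least 1000 units of vitamin A
    (0, 151, 200),      # at least 200 mg potassium
    (1.2, 1, 6),        # at least 6 mg iron
    (122, 70, 600),     # at least 600 Calories
]

def cost(f, e):
    """Objective: $0.90 per piece of chicken, $0.75 per ear of corn."""
    return 0.90 * f + 0.75 * e

# Intersect every pair of boundary lines (Cramer's rule); keep only the
# points that satisfy every constraint, i.e. the feasible-region vertices.
vertices = []
for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:        # parallel boundaries never intersect
        continue
    f = (c1 * b2 - c2 * b1) / det
    e = (a1 * c2 - a2 * c1) / det
    if all(a * f + b * e >= c - 1e-9 for a, b, c in constraints):
        vertices.append((f, e))

best_f, best_e = min(vertices, key=lambda v: cost(*v))
print(sorted({(round(f, 2), round(e, 2)) for f, e in vertices}))
# [(0.0, 8.57), (3.76, 2.01), (5.89, 1.32)]
print(round(best_f, 2), round(best_e, 2), round(cost(best_f, best_e), 2))
# 3.76 2.01 4.9
```

It confirms the three vertices and finds the cheapest feasible “meal” at (3.76, 2.01), costing about $4.90, which, of course, no one would actually eat.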

1. No one eats 5.89 pieces of chicken and 1.32 ears of corn. Instruction is needed (but not provided in the example) to help students find the lattice points nearest the vertices of the feasible region, but that are contained in the feasible region. Recall that this is the opening example of linear programming.

2. I sketched the feasible region on graph paper, taking great pains to use a ruler and be accurate. The inequalities were not pleasant to graph. I used the two-intercept method to graph each line, but when one of the boundaries is e = 200/151, it took a bit of hand waving to make a good graph.

3. If a student is inclined to work through the examples in the book (I was always that student, and over the years of my career, I taught many such students), it is extremely tedious to get it graphed with such nasty coefficients in the inequalities, and it takes some pretty good precision to graph such that identified intersections are indeed vertices of the feasible region.

4. Opening examples should not contain nasty numbers. Students need to learn the concepts first. The tedious calculations can come later. We don’t start teaching students to multiply 1.96 times 6.7789. We start with multiplying 2 times 7 first. We get more sophisticated when they can handle it.

5. Students who are not good at following worked examples, encountering linear programming for the first time, are not going to stay with this one. They will tune out and not have a clue how to tackle their homework.

6. I completely rewrote that section of the textbook (my first year of teaching) and shared my lessons and problems with colleagues. I did use “real-world” problems in the sense that, for example, a farmer had a certain number of acres to plant in corn and beans, and I used information that gave reasonable vertices and objective functions. I completely avoided the nasty numbers because, while they can appear in higher-level exposures to linear programming, they shouldn’t appear in the very first examples, when students have to translate the given problem into a system of inequalities.

I absolutely refused to ask my students to work problems that were unnecessarily cumbersome or whose numbers took away from the concepts being explored. I’m sure they were thrilled when I retired last year.

Posted in Curriculum & Instruction, Joye Walker, K-12, math, Mathematics

Do We Still Need Public Schools?

Sandra Stotsky, April 2022

Do we still want a chief policy maker in the Department of Education with little classroom teaching experience beyond grade 5 who has never administered a middle or high school? No particular ethnicity, race, or gender seems to have worked. We’ve tried using all these sociocultural criteria for selecting top education administrators, especially in our major cities. But no sociocultural criterion has led to an effective policy maker.

Are recent nation-wide riots, looting, and arson all expressions of frustration with seemingly failed or ineffective educational institutions? We haven’t tried yet to make other institutions for public health or safety responsible for educating the nation’s children. There are several questions we should ask to try to understand the basis for the many waves of rioting in our major cities.

1. Why haven’t our educational institutions found effective remedial strategies for low achievers by now, over 50 years after the first federal grants to low-income schools and communities in 1965 or so under the Elementary and Secondary Education Act (ESEA)?
2. Do children of low-income parents in other countries perform similarly on the TIMSS, PIRLS, and PISA tests? These have been the chief international tests available for our states to participate in.

Maybe education researchers have not asked the right questions, such as:

1. How much reading or other homework do teachers assign their students in K-12?
2. How many parents check how much their children read or practice every day?
3. Why have pre-schools on average, or after-school programs extending school teaching hours, failed to create equity among demographic groups in the K-12 school population in this country?
4. Why has the use of literary texts and curriculum-aligned textbooks whose subject matter and vocabulary have been reduced in difficulty (such as recent Afrocentric curricula like Nikole Hannah-Jones’ 1619 Project, published by the New York Times) failed to boost scores of children deemed marginalized?
5. What untried but new educational policies would their parents support?

Perhaps all parents would agree that an effective policy maker in the U.S. Department of Education knows well at least one of the subjects typically taught in K-12 and has read a lot and writes well. All parents might also agree that it would be useful to have a policy maker in Education who is familiar with beginning reading and arithmetic research as well as with the features of successful high schools like the old Dunbar High School in Washington, D.C.

Why hasn’t a regularly increasing amount of federal and state money over more than fifty years helped low-income students in education? Why hasn’t Congress targeted the areas of influence on school achievement noted in the 1966 Coleman Report and the 1965 Moynihan Report, the two most comprehensive reports on differences in academic achievement in this country? They both found social factors more important than educational interventions. The Coleman Report also noted that the teachers of non-black students had greater knowledge and verbal skills than did the teachers of black children. Wouldn’t all students, not just low-achieving students, benefit from academically stronger teachers? Recent information on the benefits of academically strong teachers can be found at https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2020144.

Unfortunately, whatever our public schools have done since WWII in the name of equity hasn’t increased general achievement in low achievers. Some scholars have even argued that no increase in achievement was ever intended. https://www.jamesgmartin.center/wp-content/uploads/2019/02/The-Politicization-of-University-Schools-of-Education.pdf

In recent years, many educators have promoted school choice, especially via charter schools, as ways to strengthen low achievers. But school choice is useful only if curriculum choices and the portability of funds for individual students are allowed. As a Harvard economist found when he had an opportunity to design his own intervention program for thousands of Houston, Texas, students, trying to implement the features of effective charter schools doesn’t necessarily lead to much higher academic achievement.

Schools with chiefly low-income students or low achievers are considered high-performing if their test results are higher than expected. One of their characteristics, we are told, is “excellence in teaching and leadership.” According to a report on “strategies to improve low-performing schools” issued by the Center for American Progress in 2016, the phrase has been used by Roland Fryer, a prominent economist known for his attempt to inject seemingly successful charter school practices into “traditional” schools. According to the Center’s report, the vast school-improvement program he helped to design in 2010 for Houston “implemented the following best practices of high-performing charters” based on Fryer’s research on effective schooling models: (1) data-driven instruction; (2) excellence in teaching and leadership; (3) culture of high expectations; (4) frequent and intensive tutoring, or so-called high-dosage tutoring; and (5) extended school day and year.

The long-term results of Houston’s massive Apollo program, which Fryer designed, have been described as “statistically significant” gains in mathematics but “negligible” gains in reading. Moreover, “high-dosage tutoring” seems to be the source of the mathematics gains. For Fryer’s account of the Houston program and its results, see his 2011 or 2014 article. Houston’s results left policy makers with a conundrum. Low achievers seemed to respond to intensive math tutorials (all Houston students had regular math classes; only some had tutorials, too). On the other hand, it wasn’t clear that targeted and intensive tutoring could achieve more than immediate higher test results. In other words, tutoring didn’t seem to lead to lasting gains in both reading and math.

There is another problem that Houston educators needed to consider. Rice University’s evaluation report recommended not only more math tutorials but also tutorials in reading for the future. What could the statistical effectiveness of math tutorials in Houston tell teacher-preparation programs and professional developers to focus on? In this study, statistical significance likely reflects the large number of students in the Apollo program. And teacher-preparation programs and professional development do not typically show teachers how to do tutorials in any subject. A master’s degree program in remedial reading might show teachers how to do one-on-one clinical work in reading, but that is not the same thing as a tutorial in reading.

But school choice may be the best strategy now, as Thomas Sowell noted in his recent book titled Charter Schools and Their Enemies. Letting public money be used in every state for children in schools their parents want them to attend (whether private or secular schools), without mandates to use particular standards, tests, textbooks, and teachers, may finally enable school choice to be the motivational mechanism its supporters envisioned.

To ensure civic equity, however, we need to nationalize the one subject where it would make sense to ensure that all students share common historical and contemporary knowledge, such as the basic political principles embedded in the United States Constitution.

Some educators have strongly supported the use of some of the questions on our naturalization quiz as the basis for a high school graduation test. That is one way to ensure similar knowledge in diverse groups of graduating high school seniors. To ensure diverse voices in history and geography at the classroom level (in addition to what is taught about the Constitutional Period), teachers should invite the parents of students in their elementary or middle school classes to recommend or provide good ethnic stories/poems to discuss in class, with close relatives invited to attend and participate.

The road to effective education is paved with local financial control and parent choice. Not all students want to go to college. High schools could establish several sets of standards rather than a single set of academic standards and let students take course sequences that appeal to them. For a discussion of effective standards and K-12 curricula and tests, listen to Ingrid Centurion's interview with Sandra Stotsky on education: https://youtu.be/14yBwwWNPwU. Centurion was a candidate for public office in South Carolina and doesn't want public schools closed down. Stotsky was the chief administrator in the Massachusetts Department of Elementary and Secondary Education in 2000 and was considered responsible for the state's new or revised standards and licensure regulations in 2000/3 that led to the "Massachusetts Education Miracle." With parent-supported reforms, schools of choice can give all students the schools they want: https://www.americanthinker.com/articles/2021/03/the_case_for_closing_public_schools_indefinitely_.html

Posted in College prep, Curriculum & Instruction, Education Reform, K-12, math, reading, Sandra Stotsky

In Praise of Memorization

by Pearl Leff

I once worked at a small company of insanely productive engineers. They were geniuses by any account. They knew the software stack from top to bottom, from hardware to operating systems to JavaScript, and could pull together in days what would take teams at other companies months or years. Between them they were more productive than any division I've ever been in, including at FAANG tech companies. In fact, they had written the top-of-the-line specialized compiler in their industry — as a side project. (Their customers believed that they had buildings full of engineers laboring on their product, while in reality they numbered fewer than 10.)

I was early in my career at the time and stunned by the sheer productivity and brilliance of these engineers. Finally, when I got a moment alone with one of them, I asked him how they had gotten to where they were.

He explained that they had been software engineers together in the intelligence units of their country's military. Their military intelligence computers hadn't been connected to the internet, and if they wanted to look something up online, they had to walk to a different building across campus. Looking something up on StackOverflow was a major operation. So they ended up reading reference manuals and writing down or memorizing the answers to their questions, because they couldn't look up information very easily. Over time, the knowledge accumulated.

Memorization means purposely learning something so that you remember it with muscle memory; that is, you know the information without needing to look it up.

Every educator knows that memorization is passé in today's day and age. Facts are so effortlessly accessible with modern technology and the internet that it's understanding how to analyze them that's important. Names, places, dates, and other kinds of trivia don't matter so much as the ability to reason logically about them. Today anything can be easily looked up.

But as I’ve gotten older I’ve started to understand that memorization is important, much more than we give it credit for. Knowledge is at our fingertips and we can look anything up, but it’s knowing what knowledge is available and how to integrate it into our existing knowledge base that’s important.

You Can’t Reason Accurately Without Knowledge

You know a lot of things.

A lot of life involves reasoning: taking the information you have and making hypotheses that connect different pieces in a way that provides a deeper understanding of them.

The more information you hold in muscle memory, the more material you have to reason with.

But you can't draw connections between things you don't know exist, or don't have a good "feel" for.

The problem with not memorizing is that you're limited by the lack of data points, or nodes, between which you can make connections. In short, you're limited by your lack of understanding of what to look up.

Here’s a small illustration.

Many would argue that there is no point in kids memorizing the world map today. But if you know basic geography, you will notice all kinds of political analysis that only works because the person arguing it has no idea where anything is on a map. This is the problem with not making schoolchildren learn basic geography. You can look up any country on Google, but if you've never had to memorize approximately where countries are, whether voluntarily or in school, you'll never get a sense of why things are the way they are.

Here are some examples that show how that works.

Why does Oman have so much power in today’s Middle East – enough power that it can stay neutral in the various regional conflicts and still be a dominant political player?

This is why:

Oman controls the Strait of Hormuz, the only water-based entry to the Persian Gulf. Any country that messes with Oman risks being denied access to the Persian Gulf.

A second example: at the time of the writing of this article, Russia was a month into an invasion of Ukraine. This is not even the first time in the past decade that Russia has tried to take over its neighbor: in 2014 Russia illegally invaded and annexed Crimea, which is still controlled by Russia today. Why does Russia want to control Crimea so badly? If it's a power play, why not threaten Belarus or Latvia, which also border Russia and would be easy to take over?

This is why:

Crimea's Port of Sevastopol is a highly desired prize for Russia: it gives Russia control over the Black Sea and trade access to the Mediterranean Sea. Russia has only two warm-water ports that don't freeze during the winter: Vladivostok, which opens to the Pacific, and St. Petersburg, which opens to the Baltic. There are other factors as well, obviously, but Russia's pursuit of warm-water ports is a frequently recurring theme in its history.

This may seem basic, but many people have never thought to look up these places on a map. If you tried to reason out the answers without that knowledge, you could easily miss them entirely. But if you memorized the map at some point, you knew where those places are and probably could have worked out the answer.

Or try a basic historical example: if you know that the printing press was invented in 1450, you can make the connection between that and the Protestant Reformation in 1517.

The point is that memorizing data gives you a bank of material to run through when forming and testing a hypothesis. When you rely solely on analysis as a form of knowledge-synthesis, you’ll often reach the wrong conclusions simply because you do not have good data to base your deductions on. Of course you can and should research, but you’ll be much more accurate much more quickly when you’ve got the information in your head at hand.

Chances are, you won’t naturally remember all these facts, and that’s where the memorization comes in.

To paraphrase a saying that LessWrong readers will recognize, your map is not the territory. Your job is to add as many features to your map as you can to make it resemble the territory as closely as possible. The more detailed the features on your map, the closer you will be to having an accurate idea of the territory.

Memorizing Organizes Your Knowledge

You know that feeling when you’ve got a lot of information about something, but it’s all jumbled and confusing and fragmented? You might feel this way about car parts, or historical events. Did the Babylonians come before or after the Persians? Did Frederick William I of Prussia come before or after Frederick I? Or William I?

When you look up every fact you want to know independently of its context, you risk it ending up jumbled, vague, and fuzzy in your head. For example, if you heard that Daylight Saving Time started in 1916, you'd likely forget the date quickly.

But if you have key checkpoints of information memorized, new data has a solid place to lodge itself in your mind. If you know that World War I started in 1914 and ended in 1918, and someone mentions that Daylight Saving Time started in 1916, you'll quickly deduce that the two are related. You'll also remember the approximate date Daylight Saving Time began: sometime during World War I.

Imagine you're an engineering manager. Who would you rather hire: the person who knows exactly which features are available in PHP 7 and which are only available in PHP 8, or the one who will figure it out by trial and error while writing each application and seeing what fails? Of course, the second engineer may well produce quality work. But the first unquestionably has a comprehensively organized framework of the tools at his disposal.

Memorizing information gives you a concrete organizational scaffolding and context in which to put new information. Memorize an organized set of facts, and new information can be inserted in an orderly way, slotted in between or reinforcing the facts already in that framework.

It Stays With You

My high school completely eschewed memorization as a way of learning. Because of that, students were outraged when, in tenth grade, an older teacher tried to require the class to memorize the equivalent of about four sentences of poetry for a test. All hell broke loose. Being asked to memorize forty words was apparently only slightly less outrageous than being asked to memorize the collected works of William Shakespeare. The students brought in articles they had found online as proof that memorization isn't a good way to learn, that it would doom us all to a life of lifeless, brain-dead chanting of facts, that it would cause all our neurons to flop over and die from the effort. If I remember correctly, they even tried to get parents involved.

But the school stood behind the teacher, and the teacher stood firm, and ultimately we had to be able to repeat back the lines of the poem via a fill-in-the-blank section on the test.

Over the years those lines have come back to me many times, and I understand them on a much deeper level. I would never have thought to look them up on my own, but having them accessible has made my life immeasurably richer.

Subconsciously, when you learn a piece by heart, its message penetrates deep inside you. It lies at your fingertips, ready for you to make use of it. Many cultures have long understood this. In Islam, people who memorize the entire Koran are given the special title of hafiz, or guardian. In a secular equivalent, I know people who have memorized Rudyard Kipling’s poem If— to give them a moral helping hand at times of crisis.

Even if you don’t really understand it the first time, memorizing information and literature gives you the opportunity to come back to it. In the words of a college professor of mine, the point of a liberal arts education is to give you what to think about. Having literature, poetry, or even just quotations at the tip of your fingers makes for a more vivid, vibrant, and resonant life.

*Originally published at Pearl Leff’s blog: http://www.pearlleff.com/in-praise-of-memorization

Posted in constructivism, Curriculum & Instruction