Dismissive Reviews in Education Policy Research
  No.  Author  Co-author(s)  Quote  Type  Title  Source  Funders  Link1  Link2  Note
1 Jennifer L. Jennings Douglas Lee Lauen "Despite the ongoing public debate about the meaning of state test score gains, no study has examined the impact of accountability pressure from NCLB on multiple tests taken by the same students." 1stness Accountability, Inequality, and Achievement: The Effects of the No Child Left Behind Act on Multiple Measures of Student Learning, p.222 The Russell Sage Foundation Journal of the Social Sciences, Volume 2, Number 5, September 2016, pp. 220-241   https://muse.jhu.edu/article/633744  
2 Jennifer L. Jennings Douglas Lee Lauen "Still, little is known about the effects of accountability pressure across demographic groups on multiple measures of student learning; addressing this gap is one goal of our study." Dismissive Accountability, Inequality, and Achievement: The Effects of the No Child Left Behind Act on Multiple Measures of Student Learning, p. 223 The Russell Sage Foundation Journal of the Social Sciences, Volume 2, Number 5, September 2016, pp. 220-241   https://muse.jhu.edu/article/633744  
3 Jennifer L. Jennings Douglas Lee Lauen "In sum, all of the studies described here establish positive average effects of NCLB beyond state tests but do not assess the generalizability of state test gains to other measures of achievement. Our study…" 1stness Accountability, Inequality, and Achievement: The Effects of the No Child Left Behind Act on Multiple Measures of Student Learning, p. 223 The Russell Sage Foundation Journal of the Social Sciences, Volume 2, Number 5, September 2016, pp. 220-241   https://muse.jhu.edu/article/633744  
4 Jennifer L. Jennings Douglas Lee Lauen "Our study contributes to a small but growing literature examining the relationship between school-based responses to accountability pressure and student performance on multiple measures of learning, which requires student-level data and test scores from multiple exams." Dismissive Accountability, Inequality, and Achievement: The Effects of the No Child Left Behind Act on Multiple Measures of Student Learning, p. 223 The Russell Sage Foundation Journal of the Social Sciences, Volume 2, Number 5, September 2016, pp. 220-241   https://muse.jhu.edu/article/633744  
5 Jennifer L. Jennings Douglas Lee Lauen "Only one study has examined the effect of accountability pressure on multiple tests, but this study is from the pre-NCLB era. Jacob (2005) used item-level data to better understand the mechanisms underlying differential gains across tests." Dismissive Accountability, Inequality, and Achievement: The Effects of the No Child Left Behind Act on Multiple Measures of Student Learning, p. 223 The Russell Sage Foundation Journal of the Social Sciences, Volume 2, Number 5, September 2016, pp. 220-241   https://muse.jhu.edu/article/633744  
6 Jennifer L. Jennings Douglas Lee Lauen "While the studies reviewed here have established the effects of accountability systems on outcomes, they have devoted less attention to studying heterogeneity in how educators perceive external pressures and react to them. Because the lever for change in accountability systems is educational improvement in response to external pressure, this is an important oversight." Denigrating Accountability, Inequality, and Achievement: The Effects of the No Child Left Behind Act on Multiple Measures of Student Learning, p. 224 The Russell Sage Foundation Journal of the Social Sciences, Volume 2, Number 5, September 2016, pp. 220-241   https://muse.jhu.edu/article/633744  
7 Jennifer L. Jennings Douglas Lee Lauen "A unique feature of this study is the availability of multiple test scores for each student— both the Texas Assessment of Knowledge and Skills (TAKS) and the Stanford Achievement Test battery." 1stness Accountability, Inequality, and Achievement: The Effects of the No Child Left Behind Act on Multiple Measures of Student Learning, p. 225 The Russell Sage Foundation Journal of the Social Sciences, Volume 2, Number 5, September 2016, pp. 220-241   https://muse.jhu.edu/article/633744  
8 Jennifer L. Jennings Douglas Lee Lauen "Whether focusing on predictable content is a desirable practice depends on the relevance of each standard to the inference one wants to make from state test scores. State policymakers may believe that some standards are more important than others and explicitly build such guidance into their instructions to test designers. However, we are aware of no states that provided guidance to test firms at the individual standard level during the NCLB era; ultimately, testing contractors have made these decisions." Dismissive Accountability, Inequality, and Achievement: The Effects of the No Child Left Behind Act on Multiple Measures of Student Learning The Russell Sage Foundation Journal of the Social Sciences, Volume 2, Number 5, September 2016, pp. 220-241   https://muse.jhu.edu/article/633744  
9 Jennifer L. Jennings Douglas Lee Lauen "We believe that our study provides the best available evidence about the effects of accountability pressure on multiple tests in the NCLB era, ..." p.238 Denigrating Accountability, Inequality, and Achievement: The Effects of the No Child Left Behind Act on Multiple Measures of Student Learning The Russell Sage Foundation Journal of the Social Sciences, Volume 2, Number 5, September 2016, pp. 220-241   https://muse.jhu.edu/article/633744  
10 David J. Deming Sarah Cohodes, Jennifer Jennings, Christopher Jencks "In fact, we know very little about the impact of test-based accountability on students’ later success." Dismissive When does accountability work? Education Next, WINTER 2016 / VOL. 16, NO. 1 Harvard Kennedy School; Thomas B. Fordham Foundation & Institute http://educationnext.org/when-does-accountability-work-texas-system/  
11 David J. Deming Sarah Cohodes, Jennifer Jennings, Christopher Jencks "In this study, we present the first evidence of how accountability pressure on schools influences students’ long-term outcomes." 1stness When does accountability work? Education Next, WINTER 2016 / VOL. 16, NO. 1 Harvard Kennedy School; Thomas B. Fordham Foundation & Institute http://educationnext.org/when-does-accountability-work-texas-system/  
12 David J. Deming Sarah Cohodes, Jennifer Jennings, Christopher Jencks "What we don’t know is: Do these improvements on high-stakes tests represent real learning gains?" Dismissive When does accountability work? Education Next, WINTER 2016 / VOL. 16, NO. 1 Harvard Kennedy School; Thomas B. Fordham Foundation & Institute http://educationnext.org/when-does-accountability-work-texas-system/  
13 David J. Deming Sarah Cohodes, Jennifer Jennings, Christopher Jencks "Our study overcomes the limits of short-term analysis by asking: when schools face accountability pressure, do their efforts to raise test scores generate improvements in higher education attainment, earnings, and other long-term outcomes?" Denigrating When does accountability work? Education Next, WINTER 2016 / VOL. 16, NO. 1 Harvard Kennedy School; Thomas B. Fordham Foundation & Institute http://educationnext.org/when-does-accountability-work-texas-system/  
14 Sean P. Corcoran Jennifer L. Jennings "Second, we have limited evidence on the extent to which teachers' short-run effects on achievement correspond to long-term impacts on achievement, attainment, and well-being (Chetty, Rockoff, and Friedman, 2011)." Dismissive Teacher effectiveness on high- and low-stakes tests  NYU Steinhardt School of Culture, Education, and Human Development, New York "We  would  like  to  thank  ... the IES pre-doctoral training program for providing research support.  Jennings received additional support for  this  project  from  IES-AERA  and  Spencer  Foundation  dissertation  fellowships." https://www.nyu.edu/projects/corcoran/papers/Corcoran_Jennings_Houston_Teacher_Effects.pdf  
15 Sean P. Corcoran Jennifer L. Jennings "Comparatively less attention has been given to the outcome measure itself. While some studies have examined the role test scaling plays in value-added, (e.g., Ballou, 2009; Briggs and Weeks, 2009; Koedel and Betts, 2009), fewer have validated teacher effects against other short- or long-run outcomes of interest." Dismissive Teacher effectiveness on high- and low-stakes tests  NYU Steinhardt School of Culture, Education, and Human Development, New York "We  would  like  to  thank  ... the IES pre-doctoral training program for providing research support.  Jennings received additional support for  this  project  from  IES-AERA  and  Spencer  Foundation  dissertation  fellowships." https://www.nyu.edu/projects/corcoran/papers/Corcoran_Jennings_Houston_Teacher_Effects.pdf  
16 Jennifer L. Jennings Heeju Sohn "Our study is the first to bring together these two issues and isolate the relevance of proficiency standard difficulty for inequality in academic achievement on both high- and low-stakes tests." 1stness Measure for Measure: How Proficiency-based Accountability Systems Affect Inequality in Academic Achievement Sociology of Education, 87(2) 125–141 "The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305AII0420, and by the Spencer Foundation, through Grants 201100075 and 201200071, to the President and Fellows of Harvard College. Sohn’s work was supported by the National Institute of Child Health and Human Development, NIH through Grant 5T32 HD-007242."    
17 Jennifer L. Jennings Jonathan Marc Bearak "Despite the ongoing public debate about the meaning of state test score gains under NCLB, no study has attempted to quantify the extent to which NCLB-era state tests had features that enabled teaching to the test." p.381 Dismissive “Teaching to the Test” in the NCLB Era: How Test Predictability Affects Our Understanding of Student Performance Educational Researcher, 43(8), 381–389 "Funding for this study was provided by the Spencer Foundation (Grant/Award Nos. 201100075 and 201200071) and the Institute for Education Sciences (Grant/Award No. R305AII0420)." https://files.eric.ed.gov/fulltext/EJ1044311.pdf  
18 Jennifer L. Jennings Jonathan Marc Bearak "Nor have previous papers attempted to clarify the concept of teaching to the test," p.381 Denigrating “Teaching to the Test” in the NCLB Era: How Test Predictability Affects Our Understanding of Student Performance Educational Researcher, 43(8), 381–389 "Funding for this study was provided by the Spencer Foundation (Grant/Award Nos. 201100075 and 201200071) and the Institute for Education Sciences (Grant/Award No. R305AII0420)." https://files.eric.ed.gov/fulltext/EJ1044311.pdf  
19 Jennifer L. Jennings Jonathan Marc Bearak "Our study is one of the first to empirically test for a specific opportunity for teaching to the test in NCLB-era tests—predictability—and to estimate whether predictability is associated with improved performance on these items." p.381 1stness “Teaching to the Test” in the NCLB Era: How Test Predictability Affects Our Understanding of Student Performance Educational Researcher, 43(8), 381–389 "Funding for this study was provided by the Spencer Foundation (Grant/Award Nos. 201100075 and 201200071) and the Institute for Education Sciences (Grant/Award No. R305AII0420)." https://files.eric.ed.gov/fulltext/EJ1044311.pdf  
20 Jennifer L. Jennings Jonathan Marc Bearak "Our study is the only one of which we are aware that identifies and tests for a specific mechanism of teaching to the test in multiple states during the NCLB era." p.386 1stness “Teaching to the Test” in the NCLB Era: How Test Predictability Affects Our Understanding of Student Performance Educational Researcher, 43(8), 381–389 "Funding for this study was provided by the Spencer Foundation (Grant/Award Nos. 201100075 and 201200071) and the Institute for Education Sciences (Grant/Award No. R305AII0420)." https://files.eric.ed.gov/fulltext/EJ1044311.pdf  
21 Daniel M. Koretz Holcombe, Jennings “To date, few studies have attempted to understand the sources of variation in score inflation across testing programs.” p. 3 Dismissive The roots of score inflation, an examination of opportunities in two states’ tests  Prepublication draft “to appear in Sunderman (Ed.), Charting reform: achieving equity in a diverse nation”   http://dash.harvard.edu/bitstream/handle/1/10880587/roots%20of%20score%20inflation.pdf?sequence=1  
22 Daniel M. Koretz Jennifer L. Jennings “We find that research on the use of test score data is limited, and research investigating the understanding of tests and score data is meager.” p. 1 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities/ http://www.spencer.org/resources/content/3/3/8/documents/Koretz--Jennings-paper.pdf
23 Daniel M. Koretz Jennifer L. Jennings “Because of the sparse research literature, we rely on experience and anecdote in parts of this paper, with the premise that these conclusions should be supplanted over time by findings from systematic research.” p. 1 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/sites/default/files/pdfs/Koretz-Jennings-paper.pdf
24 Daniel M. Koretz Jennifer L. Jennings "...the relative performance of schools is difficult to interpret in the presence of score inflation. At this point, we know very little about the factors that may predict higher levels of inflation —for example, characteristics of tests, accountability systems, students, or schools." p.4 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/sites/default/files/pdfs/Koretz-Jennings-paper.pdf
25 Daniel M. Koretz Jennifer L. Jennings “We focus on three issues that are especially relevant to test-based data and about which research is currently sparse:
  How do the types of data made available for use affect policymakers’ and educators’ understanding of data?
  What are the common errors made by policymakers and educators in interpreting test score data?
  How do high-stakes testing and the availability of test-based data affect administrator and teacher practice?” p. 5 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
26 Daniel M. Koretz Jennifer L. Jennings “Systematic research exploring educators’ understanding of both the principles of testing and appropriate interpretation of test-based data is meager.” p.5 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
27 Daniel M. Koretz Jennifer L. Jennings "Although current, systematic information is lacking, our experience is that that the level of understanding of test data among both educators and education policymakers is in many cases abysmally low.", p.6 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
28 Daniel M. Koretz Jennifer L. Jennings "There has been a considerably (sic) amount of research exploring problems with standards-based reporting, but less on the use and interpretation of standards-based data by important stakeholders." p.12 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
29 Daniel M. Koretz Jennifer L. Jennings "We have heard former teachers discuss this frequently, saying that new teachers in many schools are inculcated with the notion that raising scores in tested subjects is in itself the appropriate goal of instruction. However, we lack systematic data about this..." p.14 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
30 Daniel M. Koretz Jennifer L. Jennings "Research on score inflation is not abundant, largely for the reason discussed above: policymakers for the most part feel no obligation to allow the relevant research, which is not in their self-interest even when it is in the interests of students in schools. However, at this time, the evidence is both abundant enough and sufficiently often discussed that that the existence of the general issue of score inflation appears to be increasingly widely recognized by the media, policymakers, and educators." p.17 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
31 Daniel M. Koretz Jennifer L. Jennings "The issue of score inflation is both poorly understood and widely ignored in the research community as well." p.18 Denigrating The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
32 Daniel M. Koretz Jennifer L. Jennings "Research on coaching is very limited." p.21 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
33 Daniel M. Koretz Jennifer L. Jennings "How is test-based information used by educators? … The types of research done to date on this topic, while useful, are insufficient." p.26 Denigrating The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
34 Daniel M. Koretz Jennifer L. Jennings "… We need to design ways of measuring coaching, which has been almost entirely unstudied." p.26 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
35 Daniel M. Koretz Jennifer L. Jennings “We have few systematic studies of variations in educators’ responses. …” p. 26 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
36 Daniel M. Koretz Jennifer L. Jennings "Ultimately, our concern is the impact of educators’ understanding and use of test data on student learning. However, at this point, we have very little comparative information about the validity of gains, ....  The comparative information that is beginning to emerge suggests..." p.26 Dismissive The Misunderstanding and Use of Data from Educational Tests  Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010 Spencer Foundation http://www.spencer.org/data-use-and-educational-improvement-initiative-activities http://www.spencer.org/resources/content/3/3/8/documents/Koretz-Jennings-paper.pdf
                   
  IRONIES:                
  Daniel M. Koretz Jennifer L. Jennings "Unfortunately, it is often exceedingly difficult to obtain the permission and access needed to carry out testing-related research in the public education sector. This is particularly so if the research holds out the possibility of politically inconvenient findings, which virtually all evaluations in this area do. In our experience, very few state or district superintendents or commissioners consider it an obligation to provide the public or the field with open and impartial research. Data are considered proprietary—a position that the restrictions imposed by the federal Family Educational Rights and Privacy Act (FERPA) have made easier to maintain publicly. Access is usually provided only for research which is not seen as unduly threatening to the leaders’ immediate political agendas. The fact that this last consideration is often openly discussed underscores the lack of a culture of public accountability."   The Misunderstanding and Use of Data from Educational Tests, pp.4-5 Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010   http://www.spencer.org/data-use-and-educational-improvement-initiative-activities/ http://www.spencer.org/resources/content/3/3/8/documents/Koretz--Jennings-paper.pdf
  Daniel M. Koretz Jennifer L. Jennings "This unwillingness to countenance honest but potentially threatening research garners very little discussion, but in this respect, education is an anomaly. In many areas of public policy, such as drug safety or vehicle safety, there is an expectation that the public is owed honest and impartial evaluation and research. For example, imagine what would have happened if the CEO of Merck had responded to reports of side-effects from Vioxx by saying that allowing access to data was “not our priority at present,” which is a not infrequent response to data requests made to districts or states. In public education, there is no expectation that the public has a right to honest evaluation, and data are seen as the policymakers’ proprietary sandbox, to which they can grant access when it happens to serve their political needs."   The Misunderstanding and Use of Data from Educational Tests, p.5 Prepared for Spencer Foundation meetings, Chicago, IL, February 11, 2010. Revised November 21, 2010   http://www.spencer.org/data-use-and-educational-improvement-initiative-activities/ http://www.spencer.org/resources/content/3/3/8/documents/Koretz--Jennings-paper.pdf
                   
      Cite themselves or colleagues in the group, but dismiss or denigrate all other work.
      Falsely claim that research has only recently been done on the topic.