The '''National Assessment of Educational Progress''' ('''NAEP''') is the largest continuing and nationally representative assessment of what U.S. students know and can do in various subjects. NAEP is a congressionally mandated project administered by the [[National Center for Education Statistics|National Center for Education Statistics (NCES)]], within the [[Institute of Education Sciences|Institute of Education Sciences (IES)]] of the [[United States Department of Education]]. The first national administration of NAEP occurred in 1969.<ref>{{Cite web |title=History and Innovation - What is the Nation's Report Card {{!}} NAEP |url=https://nces.ed.gov/nationsreportcard/about/timeline.aspx |access-date=2022-04-05 |website=nces.ed.gov |language=EN}}</ref> The National Assessment Governing Board (NAGB) is an independent, bipartisan board that sets policy for NAEP and is responsible for developing the framework and test specifications. The Governing Board, whose members are appointed by the [[U.S. Secretary of Education]], includes governors, state legislators, local and state school officials, educators, business representatives, and members of the general public. Congress created the 26-member Governing Board in 1988. NAEP results are designed to provide group-level data on student achievement in various subjects and are released as The Nation's Report Card.<ref>{{cite web|url=https://nces.ed.gov/nationsreportcard/?src=ft|title=Nation's report card}}</ref> There are no results for individual students, classrooms, or schools. NAEP reports results for different demographic groups, including gender, socioeconomic status, and race/ethnicity. Assessments are given most frequently in [[mathematics]], [[reading (process)|reading]], [[science]], and [[writing]]. Other subjects, such as [[arts|the arts]], [[civics]], [[economics]], [[geography]], technology and engineering literacy (TEL), and [[us history|U.S. history]], are assessed periodically.
In addition to assessing student achievement in various subjects, NAEP also surveys students, teachers, and school administrators to help provide contextual information. Questions asking about participants' race or ethnicity, school attendance, and academic expectations help policy makers, researchers, and the general public better understand the assessment results. Teachers, principals, parents, policymakers, and researchers all use NAEP results to assess student progress across the country and develop ways to improve education in the United States. NAEP has been providing data on student performance since 1969.<ref>{{Cite report |url=https://www.air.org/sites/default/files/downloads/report/50-Years-of-NAEP-Use-June-2019.pdf |title=White Paper on 50 Years of NAEP Use: Where NAEP Has Been and Where It Should Go Next |last=Mullis|first=Ina V. S. |date=2019 |publisher=American Institutes for Research|access-date=2021-03-12}}</ref><ref>{{Cite journal |last=Jones |first=Lyle V. |date=1996 |title=A history of the National Assessment of Educational Progress and some questions about its future |url=https://www.jstor.org/stable/1176519 |journal=Educational Researcher |volume=25 |issue=7 |pages=15–22 |doi=10.3102/0013189X025007015 |jstor=1176519 |s2cid=145442224 |via=JSTOR |access-date=2021-03-12|url-access=subscription }}</ref> NAEP uses a [[sampling (statistics)|sampling]] procedure that allows the assessment to be representative of the geographical, racial, ethnic, and socioeconomic diversity of the schools and students in the United States.{{citation needed|date=March 2021}} Data is also provided on students with disabilities and English language learners. 
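NAEP's operational design is a complex multistage, stratified sample; the following is only a minimal sketch of the underlying idea of proportional (stratified) allocation, which keeps a sample's subgroup shares in line with the population. All population figures are invented for illustration and are not actual NAEP strata or counts:

```python
# Hypothetical school population by region (invented figures,
# not actual NAEP strata or counts).
population = {
    "Northeast": 2000,
    "Midwest": 2500,
    "South": 4000,
    "West": 3000,
}

def stratified_sample_sizes(population, total_sample):
    """Allocate a fixed sample proportionally across strata, so the
    sample mirrors the population's regional makeup."""
    grand_total = sum(population.values())
    return {
        stratum: round(total_sample * count / grand_total)
        for stratum, count in population.items()
    }

sizes = stratified_sample_sizes(population, total_sample=1000)
print(sizes)  # each region contributes in proportion to its size
```

Under this allocation a region holding 4,000 of the 11,500 schools receives roughly 4000/11500 of the 1,000 sampled schools; the real program additionally oversamples some groups and weights results accordingly.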
NAEP assessments are administered to participating students using the same test booklets and procedures, except for accommodations for students with disabilities,<ref name="ga">{{Cite web |date=2018-11-05 |title=Revised, November 2018 G ALLOWABLE ACCOMMODATIONS FOR STUDENTS WITH DISABILITIES |url=https://www.gadoe.org/Curriculum-Instruction-and-Assessment/Assessment/Documents/General%20Presentations/Accommodations_SW-EL_18-19-Nov_18.pdf |website=gadoe.org}}</ref><ref name="freed">{{Cite web |last=Freedman |first=Miriam |date=2009-03-04 |title=NAEP and Testing Students with Disabilities and English Language Learners |url=https://www.nagb.gov/content/dam/nagb/en/documents/who-we-are/20-anniversary/freedman-NAEP-Accommodations.pdf |website=NAGB}}</ref> so NAEP results can be used to compare the states and urban districts that participate in the assessment. There are two NAEP websites: the [http://nces.ed.gov/nationsreportcard NCES NAEP website] and [http://www.nationsreportcard.gov The Nation's Report Card website]. The first site details the NAEP program as a whole, while the second focuses primarily on individual releases of data.

== History ==
NAEP began in 1964, with a grant from the [[Carnegie Corporation of New York|Carnegie Corporation]] to set up the Exploratory Committee for the Assessment of Progress in Education (ECAPE). The first national assessments were held in 1969. Voluntary assessments for the states began in 1990 on a trial basis and in 1996 were made a permanent feature of NAEP, to be administered every two years.
In 2002, selected urban districts participated in the state-level assessments on a trial basis, and they continue to participate as the Trial Urban District Assessment (TUDA).{{citation needed}} The development of a successful NAEP program has involved many people, including researchers, state education officials, contractors, policymakers, students, and teachers.<ref>{{cite web |url=http://nces.ed.gov/nationsreportcard/about/naephistory.asp |title=Measuring Student Progress Since 1964 |publisher=National Center for Education Statistics|access-date=2011-09-29}}</ref>

== Assessments ==
There are two types of NAEP assessments, [http://nces.ed.gov/nationsreportcard/subjectareas.asp main NAEP] and [http://nces.ed.gov/nationsreportcard/ltt/ long-term trend NAEP]. This separation makes it possible to meet two objectives:
#As educational priorities change, develop new assessment instruments that reflect current educational content and assessment methodology.
#Measure student progress over time.

=== Main ===
Main NAEP assessments are conducted in a range of subjects with fourth-, eighth-, and twelfth-graders across the country. Assessments are given most frequently in mathematics, reading, science, and writing. Other subjects, such as the arts, civics, economics, geography, technology and engineering literacy (TEL), and U.S. history, are assessed periodically. These assessments follow subject-area frameworks developed by the NAGB and use the latest advances in assessment methodology.<ref>{{cite web |url=http://www.nagb.org/publications/frameworks.htm |title=Frameworks and Specifications |publisher=National Center for Education Statistics|access-date=2011-09-29}}</ref> Under main NAEP, results are reported at the national level and, in some cases, the state and district levels.
==== National ====
National NAEP reports statistical information about student performance and factors related to educational performance for the nation and for specific demographic groups in the population (e.g., race/ethnicity, gender). It includes students from both public and nonpublic (private) schools and, depending on the subject, reports results for grades 4, 8, and 12.

==== State ====
State NAEP results are available in some subjects for grades 4 and 8. This allows participating states to monitor their own progress over time in mathematics, reading, science, and writing. They can then compare the knowledge and skills of their students with those of students in other states and in the nation as a whole. The assessments given in the states are exactly the same as those given nationally. Traditionally, state NAEP was assessed only at grades 4 and 8. However, a 2009 pilot program<ref>{{cite web |url=http://nces.ed.gov/nationsreportcard/pubs/main2009/2011455.asp#section3 |title=Results for public school students in 11 states available for the first time |publisher=National Center for Education Statistics|access-date=2011-09-29}}</ref> allowed 11 states (Arkansas, Connecticut, Florida, Idaho, Illinois, Iowa, Massachusetts, New Hampshire, New Jersey, South Dakota, and West Virginia) to receive scores at the twelfth-grade level. Through 1988, NAEP reported only on the academic achievement of the nation as a whole and of demographic groups within the population. Congress passed legislation in 1988 authorizing a voluntary Trial State Assessment. Separate representative samples of students were selected from each state or jurisdiction that agreed to participate in state NAEP. Trial state assessments were conducted in 1990, 1992, and 1994.
Beginning with the 1996 assessment, the authorizing statute no longer considered the state component a "trial." A significant change to state NAEP occurred in 2001 with the reauthorization of the [[Elementary and Secondary Education Act]], also referred to as the [[No Child Left Behind Act of 2001|"No Child Left Behind"]] legislation. This legislation requires states that receive Title I funding to participate in state NAEP assessments in mathematics and reading at grades 4 and 8 every two years. State participation in other subjects assessed by state NAEP (science and writing) remains voluntary. Like all NAEP assessments, state NAEP does not provide individual scores for the students or schools assessed.

==== Trial Urban District Assessment ====
The '''Trial Urban District Assessment''' ('''TUDA''') is a project developed to determine the feasibility of using NAEP to report on the performance of public school students at the district level. As authorized by Congress, NAEP has administered the mathematics, reading, science, and writing assessments to samples of students in selected urban districts. [http://nces.ed.gov/nationsreportcard/about/district.asp TUDA] began with six urban districts in 2002 and has since expanded to 27 districts for the 2017 assessment cycle.

{| border="1" cellpadding="5" cellspacing="0" align="center"
|-
! scope="col" style="background:#efefef;" | District
! scope="col" style="background:#efefef;" | 2002
! scope="col" style="background:#efefef;" | 2003
! scope="col" style="background:#efefef;" | 2005
! scope="col" style="background:#efefef;" | 2007
! scope="col" style="background:#efefef;" | 2009
! scope="col" style="background:#efefef;" | 2011
! scope="col" style="background:#efefef;" | 2013
! scope="col" style="background:#efefef;" | 2015
! scope="col" style="background:#efefef;" | 2017
|-
|[[Albuquerque Public Schools]] || || || || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Atlanta Public Schools]] || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Austin Independent School District]] || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Baltimore City Public Schools]] || || || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Boston Public Schools]] || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Charlotte-Mecklenburg Schools]] || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Chicago Public Schools]] || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Clark County School District|Clark County (NV) School District]] || || || || || || || || || align="center" | x
|-
|[[Cleveland Metropolitan School District]] || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Dallas Independent School District]] || || || || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Denver Public Schools]] || || || || || || || || || align="center" | x
|-
|[[Detroit Public Schools]] || || || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[District of Columbia Public Schools]] || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Duval County Public Schools|Duval County (FL) Public Schools]] || || || || || || || || align="center" | x || align="center" | x
|-
|[[Fort Worth Independent School District|Fort Worth (TX) Independent School District]] || || || || || || || || || align="center" | x
|-
|[[Fresno Unified School District]] || || || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Guilford County Schools|Guilford County (NC) Schools]] || || || || || || || || || align="center" | x
|-
|[[Hillsborough County Public Schools|Hillsborough County (FL) Public Schools]] || || || || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Houston Independent School District]] || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Jefferson County Public Schools (Kentucky)|Jefferson County (KY) Public Schools]] || || || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Los Angeles Unified School District]] || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Miami-Dade County Public Schools]] || || || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Milwaukee Public Schools]] || || || || || align="center" | x || align="center" | x || align="center" | x || || align="center" | x
|-
|[[New York City Department of Education]] || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[School District of Philadelphia]] || || || || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[San Diego Unified School District]] || || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x || align="center" | x
|-
|[[Shelby County Schools (Tennessee)|Shelby County (TN) Schools]] || || || || || || || || || align="center" | x
|}

=== Long-term trend ===
Long-term trend NAEP is administered to 9-, 13-, and 17-year-olds periodically at the national level. Long-term trend assessments measure student performance in mathematics and reading and allow the performance of today's students to be compared with that of students since the early 1970s. Although long-term trend and main NAEP both assess mathematics and reading, there are several differences between them. In particular, the assessments differ in the content assessed, how often the assessment is administered, and how the results are reported. These and other differences mean that results from long-term trend and main NAEP cannot be compared directly.<ref>{{cite web |url=http://nces.ed.gov/nationsreportcard/about/ltt_main_diff.asp |title=What are the main differences between Long-Term Trend NAEP and Main NAEP? |publisher=National Center for Education Statistics|access-date=2011-09-29}}</ref> Although NAEP has been administered since the 1970s, in 2021 U.S. Department of Education officials decided to postpone the assessment in mathematics and reading due to the COVID-19 pandemic.
The reasons for postponing included the possibility of skewed student samples and results due to differing distance-learning arrangements, as well as safety concerns for proctors and students.<ref>{{Cite web|title=Commissioner's Remarks-Due to COVID Pandemic, NCES to delay National Assessment of Education Progress (NAEP) assessment-November 25, 2020|url=https://nces.ed.gov/whatsnew/commissioner/remarks2020/11_25_2020.asp|access-date=2021-01-28|website=nces.ed.gov}}</ref>

=== Assessment schedule ===
NAGB sets the calendar for NAEP assessments. The complete [http://nces.ed.gov/nationsreportcard/about/assessmentsched.asp assessment schedule] lists all NAEP assessments since 1968 and those planned through 2017. Main NAEP assessments are typically administered over approximately six weeks between the end of January and the beginning of March every year. Long-term trend assessments are typically administered every four years by age group between October and May. All of the assessments are administered by NAEP-contracted field staff across the country.

=== NAEP State Coordinators (NSC) ===
NAEP is conducted in partnership with states. The NAEP program provides funding for a full-time NSC in each state, who serves as the liaison between NAEP, the state's education agency, and the schools selected to participate. NSCs provide many important services for the NAEP program and are responsible for:
*coordinating the NAEP administration in the state,
*assisting with the analysis and reporting of NAEP data, and
*promoting public understanding of NAEP and its resources.

== New digitally-based assessments (DBA) ==
While most NAEP assessments are administered in a paper-and-pencil format, NAEP is evolving to address the changing educational landscape through its transition to digitally-based assessments.
NAEP is using the latest technology available to deliver assessments to students, and as technology evolves, so will the delivery of the DBAs. The goal is for all NAEP assessments to be paperless by the end of the decade. The 2011 writing assessment was the first to be fully computer-based.

=== Interactive Computer Tasks (ICTs) ===
In 2009, ICTs were administered alongside the paper-and-pencil science assessment. Computer delivery affords measurement of science knowledge, processes, and skills that cannot be assessed in other modes. Tasks included investigations involving observation of phenomena that would otherwise take a long time, modeling of phenomena on a very large scale or invisible to the naked eye, and research of extensive resource documents.

=== Mathematics Computer-Based Study ===
This special study in multi-stage testing, implemented in 2011, investigated the use of adaptive testing principles in the NAEP context. A sample of students was given an online mathematics assessment that adapted to their ability level. All of the items in the study were existing NAEP items.

=== Technology and Engineering Literacy (TEL) Assessment ===
The TEL assessment framework describes technology and engineering literacy as the capacity to use, understand, and evaluate technology, as well as to understand technological principles and strategies needed to develop solutions and achieve goals. The three areas of the assessment are:
*'''Technology and society''' – deals with the effects that technology has on society and on the natural world, and with the sorts of ethical questions that arise from those effects.
*'''Design and systems''' – covers the nature of technology; the engineering design process by which technologies are developed; and basic principles of dealing with everyday technologies, including maintenance and troubleshooting.
*'''Information and communication technology''' – includes computers and software learning tools; networking systems and protocols; hand-held digital devices; and other technologies for accessing, creating, and communicating information and for facilitating creative expression.

Eighth-grade students throughout the nation took the assessment in the winter of 2014. Results from this assessment were released in May 2016.

=== Writing Computer-Based Assessment ===
In 2011, NAEP transitioned its writing assessment (at grades 8 and 12) from paper and pencil to computer-based administration in order to measure students' ability to write using a computer. The assessment takes advantage of many features of current digital technology, and the tasks are delivered in multimedia formats, such as short videos and audio. Additionally, in an effort to include as many students as possible, the writing computer-based assessment system has embedded within it several [[universal design]] features, such as text-to-speech, adjustable font size, and electronic spell check. In 2012, NAEP piloted the computer-based assessment for students at grade 4.

== Studies using NAEP data ==
In addition to the assessments, NAEP coordinates a number of related special studies that often involve special data collection processes, secondary analyses of NAEP results, and evaluations of technical procedures.

=== Achievement gaps ===
[[Achievement gap in the United States|Achievement gaps]] occur when one group of students outperforms another group and the difference in average scores for the two groups is statistically significant (that is, larger than the margin of error). In its initial report releases, NAEP highlights achievement gaps across student groups. NAEP has also released a number of reports and data summaries that highlight achievement gaps.
Some examples include the School Composition and the Black-White Achievement Gap and the Hispanic-White and the Black-White Achievement Gap Performance reports.<ref>[http://nces.ed.gov/nationsreportcard/studies/gaps/ Achievement Gaps] NAEP (homesite), retrieved 13 April 2013</ref> These publications use NAEP scores in mathematics and/or reading for these groups either to provide data summaries or to illuminate patterns and changes in these gaps over time. Research reports, like the School Composition and Black-White Achievement Gap study, also include caveats and cautions for interpreting the data.

=== High School Transcript Study (HSTS) ===
{{unreferenced section|date = February 2024}}
The [http://nces.ed.gov/nationsreportcard/hsts/ HSTS] explores the relationship between grade 12 NAEP achievement and high school academic careers by surveying the curricula followed in the nation's high schools and the course-taking patterns of high school students through a collection of transcripts. Recent studies have placed an emphasis on [[Science, Technology, Engineering, and Math|STEM]] education and how it correlates with student achievement on the NAEP mathematics and science assessments.{{cn|date = February 2024}}

=== NAEP-TIMSS Linking Study ===
The [[TIMSS|Trends in International Mathematics and Science Study (TIMSS)]] is an international assessment by the International Association for the Evaluation of Educational Achievement (IEA) that measures student learning in mathematics and science. NCES initiated the NAEP-TIMSS linking study so that states and selected districts can compare their own students' performance against international benchmarks. The linking study was conducted in 2011 at grade 8 in mathematics and science. NCES will "project" state- and district-level scores on TIMSS in both subjects using data from NAEP.
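The actual NAEP-TIMSS link rests on far more elaborate psychometric methods, but the core idea of projecting scores from one scale onto another can be sketched as a simple mean/standard-deviation (linear) linking. All scores below are invented for illustration and are not real NAEP or TIMSS data:

```python
from statistics import mean, stdev

# Invented score samples for illustration only; the real study uses full
# NAEP and TIMSS distributions and more sophisticated linking methods.
naep_scores = [255, 270, 282, 290, 301, 315]
timss_scores = [470, 490, 505, 515, 530, 550]  # reference distribution

def linear_link(score, source, target):
    """Map a score from the source scale to the target scale by matching
    the two distributions' means and standard deviations."""
    z = (score - mean(source)) / stdev(source)
    return mean(target) + z * stdev(target)

# Project a (hypothetical) NAEP score of 285 onto the TIMSS-like scale.
projected = linear_link(285, naep_scores, timss_scores)
print(round(projected, 1))
```

A score at the source mean maps exactly to the target mean, and scores one standard deviation above the source mean map to one standard deviation above the target mean, which is what makes cross-scale comparisons interpretable.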
=== National Indian Education Study (NIES) ===
The [http://nces.ed.gov/nationsreportcard/nies/ NIES] is a two-part study designed to describe the condition of education for American Indian/Alaska Native students in the United States. The first part of the study consists of assessment results in mathematics and reading at grades 4 and 8. The second part presents the results of a survey given to American Indian/Alaska Native students, their teachers, and their school administrators. The surveys focus on the students' cultural experiences in and out of school.

=== Mapping State Proficiency Standards ===
Under the 2001 reauthorization of the Elementary and Secondary Education Act (ESEA) of 1965, states develop their own assessments and set their own proficiency standards to measure student achievement. Each state controls its own assessment programs, including developing its own standards, resulting in great variation among the states in statewide student assessment practices. This variation creates a challenge in understanding the achievement levels of students across the United States. Since 2003, NCES has supported research that compares the proficiency standards of NAEP with those of individual states. State assessments are placed onto a common scale defined by NAEP scores, which allows states' proficiency standards to be compared not only to NAEP, but also to each other. NCES has released the Mapping State Proficiency Standards report using state data for mathematics and reading in 2003, 2005, 2007, 2009, and most recently 2013.<ref>[http://nces.ed.gov/nationsreportcard/studies/statemapping/ Mapping State Proficiency Standards] [[National Center for Education Statistics]], retrieved 13 April 2013</ref>

=== Past studies ===
Over the years, NCES has conducted a number of other studies related to different aspects of the NAEP program.
A few studies from the recent past are listed below:
*The [http://nces.ed.gov/nationsreportcard/studies/ors/ Oral Reading Study] was undertaken to discover how well the nation's fourth-graders can read aloud a typical grade 4 story. The assessment provided information about students' fluency in reading aloud and examined the relationship among oral reading accuracy, rate, fluency, and reading comprehension.
*[http://nces.ed.gov/nationsreportcard/studies/charter/ America's Charter Schools] was a pilot study conducted as part of the 2003 NAEP assessments in mathematics and reading at the fourth-grade level. While [[charter school]]s are similar to other public schools in many respects, they differ in several important ways, including the makeup of the student population and their location.
*[http://nces.ed.gov/nationsreportcard/studies/privateschools/ Private Schools] educate about 10 percent of the nation's students. In the first report, assessment results for all private schools and for the largest private school categories—Catholic, Lutheran, and Conservative Christian—were compared with those for public schools (when applicable). The second report examined differences between public and private schools in 2003 NAEP mean mathematics and reading scores when selected characteristics of students and/or schools were taken into account.
*The [http://nces.ed.gov/nationsreportcard/studies/tbaproject.asp Technology-Based Assessment project] was designed to explore the use of technology, especially the computer, as a tool to enhance the quality and efficiency of educational assessments.

== Criticism ==
NAEP's heavy use of [[statistical hypothesis testing]] has drawn some criticism related to the interpretation of results.
For example, the Nation's Report Card reported "Males Outperform Females at all Three Grades in 2005" based on science test scores of 100,000 students in each grade.<ref>{{cite web|title=Male and Female Students Make Gains Since 2000 at Grade 4; Males Outperform Females at all Three Grades in 2005|url=http://nationsreportcard.gov/science_2005/s0110.asp|work=The Nation's Report Card|publisher=U.S. Department of Education|access-date=16 September 2012}}</ref> Hyde and Linn criticized this claim because the mean difference was only 4 points out of 300, implying a small [[effect size]] and heavily overlapping distributions. They argue that "small differences in performance in the NAEP and other studies receive extensive publicity, reinforcing subtle, persistent, biases."<ref>{{cite journal|last=Hyde|first=Janet Shibley|author2=Marcia C. Linn|title=Gender similarities in mathematics and science|journal=Science|date=27 October 2006|volume=314|issue=5799|pages=599–600|doi=10.1126/science.1132154|url=https://www.science.org/doi/abs/10.1126/science.1132154|access-date=16 September 2012|pmid=17068246|s2cid=34045261}}</ref> NAEP's choice of which answers to mark right or wrong has also been criticized, a problem that occurs in other countries as well.<ref name="aus">{{Cite news |last=Cassidy |first=Caitlin |date=2023-11-18 |title=An urgent overhaul of VCE exams is needed after multiple errors, experts say. But how did this happen? |language=en-GB |work=The Guardian |url=https://www.theguardian.com/australia-news/2023/nov/19/victoria-vce-exams-errors-overhaul-vcaa-review |access-date=2023-11-20 |issn=0261-3077}}</ref> For example, a history question asked about the 1954 ''[[Brown v. Board of Education]]'' ruling, and explicitly referred to the 1954 decision, which identified the problem, not the 1955 decision, which ordered desegregation. NAEP asked students to "describe the conditions that this 1954 decision was designed to correct."
Students who mentioned segregation without also mentioning desegregation were marked wrong. In fact, the question asked only about existing conditions, not remedies, and in any case the 1954 decision did not order desegregation.<ref name="lang">{{Cite web |last=Liberman |first=Mark |date=2011-06-22 |title=Language Log » A reading comprehension test |url=https://languagelog.ldc.upenn.edu/nll/?p=3213 |access-date=2020-09-07 |website=U of Pennsylvania}}</ref><ref name="lati">{{Cite news |last=Wineburg |first=Sam |author-link=Sam Wineburg |date=2011-10-24 |title=Testing students' knowledge of the civil rights movement |language=en-US |work=Los Angeles Times |url=https://www.latimes.com/opinion/la-xpm-2011-oct-24-la-oe-wineburg-civil-rights-education-20111024-story.html |access-date=2020-09-07}}</ref> The country waited until the 1955 ''[[Brown v. Board of Education#Brown II|Brown II]]'' decision to hear about "all deliberate speed." Another history question marked wrong students who knew that the US fought Russians as well as Chinese and North Koreans in the [[Korean War#Aerial warfare|Korean War]]. Other released questions in math and writing have drawn similar criticism. Math answers have penalized students who understand [[Square root#Square roots of positive integers|negative square roots]], interest on loans, and errors in [[Extrapolation#Quality|extrapolating]] a graph beyond the data.<ref name="hnn">{{Cite web |last=Burke |first=Paul |title=Wrong "Correct" Answers: The Scourge of the NAEP |url=http://hnn.us/articles/140129.html |access-date=2020-09-07 |website=History News Network - George Washington University}}</ref><ref name="wapos">{{Cite news |last=Burke |first=Paul |date=1990-08-28 |title=U.S.
STUDENTS THE MYTH OF MASSIVE FAILURE |language=en-US |work=Washington Post |url=https://www.washingtonpost.com/archive/opinions/1990/08/28/us-students-the-myth-of-massive-failure/cfd377c9-33f6-4986-a82f-f36055e704c4/ |access-date=2020-09-07 |issn=0190-8286}}</ref> NAEP's claim to measure critical thinking has also been criticized. UCLA researchers found that students could choose the correct answers without critical thinking.<ref name="wapo17">{{Cite news |last=Wineburg |first=Sam, Mark Smith and Joel Breakstone |date=2017-09-19 |title=The 'nation's report card' says it assesses critical thinking in history — but NAEP gets an F on that score |language=en-US |work=Washington Post |url=https://www.washingtonpost.com/news/answer-sheet/wp/2017/09/19/the-nations-report-card-says-it-assesses-critical-thinking-in-history-but-naep-gets-an-f-on-that-score/ |access-date=2020-09-07 |issn=0190-8286}}</ref> NAEP scores each test by a statistical method, sets cutoffs for "basic" and "proficient" standards, and gives examples of what students at each level accomplished on the test.
The process used to design the tests and standards has been criticized by [[Western Michigan University]] (1991), the [[National Academy of Education]] (1993), the [[Government Accountability Office]] (1993), the [[National Academy of Sciences]] (1999),<ref name="wap11">{{Cite news |last=Harvey |first=James |date=2011-11-04 |title=NAEP: A flawed benchmark producing the same old story |language=en-US |work=Washington Post |url=https://www.washingtonpost.com/blogs/answer-sheet/post/naep-a-flawed-benchmark-producing-the-same-old-story/2011/11/03/gIQAbnonmM_blog.html |access-date=2020-09-07}}</ref><ref name="fairtest">{{Cite web |title=NAEP Levels Found To Be Flawed |url=https://www.fairtest.org/naep-levels-found-be-flawed |access-date=2020-09-07 |website=www.fairtest.org}}</ref> the [[American Institutes for Research]] and [[RTI International]] (2007),<ref name=ascd/> the [[Brookings Institution]] (2007<ref name="bkg"/> and 2016<ref name=ascd/>), the [[Buros Center for Testing]] (2009),<ref name="wap11"/> and the [[National Academies of Sciences, Engineering, and Medicine]] (2016).<ref name=ascd/> Interpreting NAEP results has proved difficult: on the fourth-grade reading test, NAEP's "proficient" category describes students who do well on the test and perform at a seventh-grade level,<ref name="ascd">{{Cite journal |last=Harvey |first=James |date=February 2018 |title=The Problem with "Proficient" |url=http://www.ascd.org/publications/educational-leadership/feb18/vol75/num05/The-Problem-with-%C2%A3Proficient%C2%A3.aspx |journal=Educational Leadership}}</ref> and on the eighth-grade math test, "proficient" describes students who do well on the test and perform at a twelfth-grade level.<ref name="bkg">{{Cite web |last=Loveless |first=Tom |date=2016-06-13 |title=The NAEP proficiency myth |url=https://www.brookings.edu/blog/brown-center-chalkboard/2016/06/13/the-naep-proficiency-myth/ |access-date=2020-09-07 |website=Brookings |language=en-US}}</ref> The
fact that few eighth graders are proficient by this standard, performing at a twelfth-grade level, has been misinterpreted as showing that few eighth graders perform even at an eighth-grade level.<ref name="wap16">{{Cite news |last=Strauss |first=Valerie |date=2016-05-23 |title=Why a social media fight between Campbell Brown and her critics matters |language=en-US |work=Washington Post |url=https://www.washingtonpost.com/news/answer-sheet/wp/2016/05/23/why-a-social-media-fight-between-campbell-brown-and-her-critics-matters/ |access-date=2020-09-07 |issn=0190-8286}}</ref> NAEP itself notes, "Students who may be proficient in a subject, given the common usage of the term, might not satisfy the requirements for performance at the NAEP achievement level."<ref name=ascd/> James Harvey, principal author of ''[[A Nation at Risk]]'', says, "It's hard to avoid concluding that the word was consciously chosen to confuse policymakers and the public."<ref name=ascd/>

==See also==
* [[Reading#Reading achievement: national and international reports|NAEP - Reading achievement]]

==References==
{{Reflist|30em}}

==Further reading==
* {{citation |title=Grading the Nation's Report Card: Evaluating NAEP and Transforming the Assessment of Educational Progress |year=1999 |editor1=James W. Pellegrino |editor2=Lee R. Jones |editor3=Karen J. Mitchell |doi=10.17226/6296 |isbn=978-0-309-06285-5 |url-access=registration |url=https://archive.org/details/gradingnationsre0000unse }}

==External links==
* {{official website|http://nces.ed.gov/nationsreportcard/}}
* [http://nces.ed.gov/nationsreportcard/nde/ NAEP Data Explorer]
* [http://nces.ed.gov/nationsreportcard/itmrls/ NAEP Questions Tool]
* [http://nationsreportcard.gov NAEP assessment reports since 2005]
* [http://www.doe.mass.edu/mcas/naep/ Massachusetts NAEP Web page]
* [http://www.ericdigests.org/pre-9218/naep.htm The National Assessment of Educational Progress (NAEP)] - From the [[Education Resources Information Center]] Clearinghouse on Tests, Measurement, and Evaluation.
* [http://www.nagb.org/ National Assessment Governing Board]

{{ED agencies}}
{{Authority control}}

{{DEFAULTSORT:National Assessment Of Educational Progress}}
[[Category:United States Department of Education]]
[[Category:Student assessment and evaluation]]