Questions and Answers on School Performance in Sumner County
The following are questions we received from residents of Sumner County as a result of mailing them information on the performance of local schools (letter and brochure here in PDF format). Click on the question that interests you and you'll be forwarded to the response from Dr. J.E. Stone, president of ECF.
Question 1: Why are gains in Sumner and Smith Counties so different?
The 2006 to 2007 school system averages shown in our recent letter to Sumner residents were included mainly to show how the recently released (November 2, 2007) value-added numbers have changed from 2006 to 2007 in Sumner and surrounding counties. We will be updating our charts to the 2007 data in the coming weeks, and we wanted to give readers a sense of whether to expect significant changes.
We don't encourage parents and others to compare school system averages because children attend a school, not a system. A school system with a low average might have some great schools and a system with a high average may have one that performs poorly. For example, in Smith County, it looks like there are substantial differences between Gordonsville and Carthage within the P-6 group.
From a parent's standpoint, I would compare school systems on the basis of how many opportunities they offer for my child to attend an effective school. A businessman thinking about locating a business in an area might weigh the same consideration. In Smith County, it appears that there are four opportunities, i.e., Gordonsville (K-4), New Middleton, Forks River, and Defeated. Defeated is one of the highest value-added-gaining schools in the state. By comparison, Sumner County is a much larger school system (roughly 25,000 students versus 3,500), so Sumner, not surprisingly, offers more opportunities.
Average value-added gains within a school system are undoubtedly influenced by factors such as the system director's emphasis on the production of school-level outcomes, but the factors that most directly influence achievement gains are at the school and classroom level. So, to answer your question as to why the county averages are so different, I would say that the primary answer lies in the mix of schools and teachers within the two school systems. There are differences in the way that children are taught by different teachers and there are differences among principals in what they support and emphasize within their schools.
More than anything, value-added achievement gains reflect the effectiveness of how schools and teachers go about getting the best out of their students.
Question 2: What happens to bright kids at low-gaining schools?
One of the large but hidden problems in many schools is that the top students are not being stretched to reach their potential. The schools they are attending may have average or above average TCAP achievement scores but their TVAAS gain scores show that students are simply being permitted to coast along. In fact, in many cases, the high scoring children are dropping closer to the national average every year they stay in school.
Part of the problem is that the data can be misleading. The brochure that we sent to Sumner residents shows percentages of students reaching the "proficient or advanced" level of achievement. Most schools are in the nineties--a number that would lead anyone to assume that schools are doing a great job. The facts are otherwise.
1. Tennessee has weak standards for the various levels of proficiency, i.e., "below basic," "basic," "proficient," and "advanced." The National Assessment of Educational Progress shows Tennessee performing much lower than the state reports itself: http://www.nytimes.com/2005/11/26/education/26tests.html?pagewanted=print
2. Most schools report the percentage of students who have reached either the proficient or advanced levels. For schools with talented and capable students, many students who should be reaching the advanced level are only reaching the proficient standard.
3. In some instances, schools suggest to parents (and school boards) that the school's achievement gains (i.e., their TVAAS performance) are low because their TCAP achievement test scores are so high. In other words, they are suggesting that their students who score at the top of the test have hit a ceiling imposed by the test, so the school cannot have exemplary annual gains.
The argument is misleading. The TCAP tests taken by children in grades 3-8 are composed of items that range across several years of the curriculum. For example, a fifth grader's test may contain items that range from grade 3 to grade 7 precisely so that the test does not impose an artificial ceiling (or floor) on student performance. The results of the test are reported not in terms of the percentage of items answered correctly but in terms of percentiles (technically, NCE scores)--a number that describes the child's performance relative to other 5th graders. Even a youngster who scores at the 99th percentile has probably not answered all of the questions correctly, much less reached the limits of demonstrated achievement gain afforded by the test.
Bottom line: TCAP scores are like a mile marker, and TVAAS scores are more like a speedometer. The average mile marker reached by students at a given school is to a very great extent governed by their school readiness when they reached the entering grade. It is the school's job to do all it can to move each child along from that beginning point within the years available. The rate at which a given school is causing its students to progress is shown by its TVAAS scores; and plainly, some schools are doing a far better job than others. The best schools are the ones that consistently maximize TVAAS gains for the mix of students they have enrolled at any given time.
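The mile-marker versus speedometer distinction can be sketched with a few lines of arithmetic. The school names and NCE numbers below are invented purely for illustration; actual TVAAS estimates come from a far more sophisticated statistical model than a simple year-over-year difference.

```python
# Toy illustration with invented numbers: two schools can have similar TCAP
# levels (mile markers) yet very different year-over-year gains (speed).
# NCE scores run roughly 1-99, with 50 representing the national norm.
school_a = {"grade_4_nce": 48, "grade_5_nce": 55}  # modest level, strong gain
school_b = {"grade_4_nce": 62, "grade_5_nce": 60}  # high level, losing ground

for name, scores in [("School A", school_a), ("School B", school_b)]:
    gain = scores["grade_5_nce"] - scores["grade_4_nce"]
    print(f"{name}: level {scores['grade_5_nce']}, gain {gain:+d}")
```

School B "looks better" on level alone, but School A is the one actually accelerating its students.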
Question 3: How do state and system policies influence value-added gains?
At the school system level, it is fairly obvious (in Tennessee, at least) that some school directors and school boards are paying far more attention to whether their schools are producing value-added achievement gains than are others. Knoxville, for example, is an urban system that has improved its value-added outcomes over the past 10 years or so, and the results are evident http://www.education-consumers.org/tnproject/spc.htm. Many districts, however, appear content to simply live with some number of schools that are low value-added performers.
The Education Consumers Foundation believes that the root of this problem is a lack of public awareness. The most common problem is that local officials and the public are led to believe that TCAP averages or the percentage of students reaching "proficient" are all that really matters. Of course, that's like trying to declare the winner of a race by looking only at the horses' pedigrees.
Schools with the highest TCAP scores are generally the ones whose students begin school with high TCAP scores, i.e., the students who have had the greatest head start. Schools with high value-added scores, however, are the ones that are the most effective in bringing about the educational growth of their students, regardless of their starting level.
In addition to the overemphasis on TCAP and AYP by schools and in the media, ECF believes that much can be done simply to make the value-added and other school report card data understandable to the public. Tons of data are available, but it would take an individual hours, even days, to draw definitive conclusions about which schools within a district are best. Of course, that's where we hope our school performance charts will prove useful. Parents can at least get a bird's-eye view of which schools are doing the best job of carrying out a school's most important tasks.
As to the question of what it is that states and school districts can do if there is greater public demand for improved results, the full answer is more than I want to get into in this message. The short version, however, is that states and school systems can get low-performing schools to imitate a key aspect of what the high-performing schools seem to do: they persist in working with each student in each class until mastery is reached--especially in fundamentals like reading and math. With these building blocks in place, many of schooling's other problems can be avoided.
Not incidentally, we interviewed the six principals who were repeat winners of the ECF's Value-Added Achievement Award http://www.education-consumers.org/tnproject/vaaa.htm and from their responses compiled a list of the key ingredients to their success. Here is our just-released report: http://www.education-consumers.org/tnproject/EffectiveSchools_CommonPractices_ECF.pdf . If states and districts want to improve, this report might be a good place to start.
Question 4: Could the curriculum be a problem with Sumner County's math scores?
In the early nineties, schools nationwide adopted math textbooks based on standards proposed by the National Council of Teachers of Mathematics (NCTM). Like so many educational fads, they were introduced without adequate testing. Eventually they were revised by NCTM ( http://www.nctm.org/news/content.aspx?id=588), yet many districts continue to use them.
There are several published versions:
For primary grades, Everyday Math and the TERC Investigations (sometimes merely called "TERC");
For the middle schools, CMP (the Connected Math Program); and
For high school, Core-Plus and IMP (the Interactive Math Program).
Critics call these curricula "fuzzy math."
I do not know whether these materials are now being used in the Sumner County schools; but if they are, you will find these YouTube videos very instructive.
I can tell you that the demonstrations of the teaching methodologies presented in the videos are accurate and that the commentators provide a competent assessment.
More about "fuzzy" or "new-new" math can be found here: http://www.math.rochester.edu/people/faculty/rarm/debate_appendix.html
Question 5: Do high TCAP averages preclude high TVAAS gains?
Earlier, I received a message from a parent who assumed that his school had little chance of making high TVAAS gains because it already had a 98% TCAP average. The following is an edited version of my reply.
Forgive me, but I must tell you that your view reflects a common misunderstanding of TVAAS data and of the school performance information represented in our School Performance Charts. The percentages shown on our brochure do not refer to the percentage of test questions answered correctly by the students attending each school.
The brochure that we mailed to you shows charts of Sumner County's elementary and middle schools ranked according to the gains in student achievement made by their students over the last 3 years. Although the charts do not display a numerical scale for the vertical axis, the bars are based on the Normal Curve Equivalent (NCE) scores that are reported by the Tennessee Department of Education.
What I am afraid misled you were the two tables listing the percentages of students who have met or exceeded the level of academic achievement that the TN Department of Education defines as "proficient" on the TCAP/Math and TCAP/Reading tests. This rather low benchmark is nowhere near the top score that can be earned on the TCAP test. In fact, Tennessee has recently been criticized for setting such a low benchmark: http://www.commercialappeal.com/news/2007/nov/23/study-student-gains-not-so/
It was, perhaps, a bad choice on our part to include such information in our brochure, but our aim was to show parents that there is little relationship between a school's effectiveness in raising student achievement and the kind of information about school performance that is typically reported to the public, i.e., the percentages of students reaching the "proficient" minimum.
We have some explanatory information on our website about NCE scores and how our charts were developed, but the important point to understand in our discussion is that the "y" axis is not based on percentages. Rather, the NCE scores represented by our chart range from a low of approximately -9 to a high of +11. They represent very substantial differences among schools in how much students are gaining in achievement from one year to the next, and there is nothing inherent in TVAAS or NCE scores that artificially limits the achievement gains obtainable from year to year.
Given the way that the Department of Education reports its data, an NCE gain score of zero represents a level of student achievement gain equal to the Tennessee statewide average in 1998. On our chart, zero is the horizontal midline from which the bars deflect up or down. Students attending schools whose bar deflects upward are gaining more in achievement per year than students did in 1998. Students attending schools with a bar deflecting downward are learning less per year than students did in 1998.
The state average achievement gain in 1998 was designated as (approximately) a letter grade of "C." Because Tennessee's schools are doing a better job of producing achievement gain, the state average is now represented by the second horizontal line that you see on the chart. It is slightly above the minimum needed for a letter grade of "A."
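The interpretation above can be condensed into a small sketch. The gain values below are illustrative, and the plain-language wording is ours; the actual letter-grade cutoffs are set by the state and are not reproduced here.

```python
# Illustrative sketch of reading a bar on our chart: zero is the 1998
# statewide average gain, and the bars span roughly -9 to +11 NCE points.
BASELINE_1998 = 0.0  # the Department of Education's fixed reference point

def describe_bar(nce_gain: float) -> str:
    """Plain-language reading of a school's NCE gain score."""
    if nce_gain > BASELINE_1998:
        return f"gaining {nce_gain:g} NCE points per year more than the 1998 average"
    if nce_gain < BASELINE_1998:
        return f"gaining {-nce_gain:g} NCE points per year less than the 1998 average"
    return "gaining at the 1998 statewide average rate"

print(describe_bar(5))   # bar deflects upward from the midline
print(describe_bar(-3))  # bar deflects downward from the midline
```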
Again, thank you for raising this question. Our mission is to translate the sophisticated performance data used by the state into something that is more understandable. Our effort is a work in progress, and questions like yours are invaluable in helping us see where improvement is needed.
For more information about TCAP scores and their relationship to TVAAS see:
TCAP Parent Guide : http://www.state.tn.us/education/assessment/doc/Form_R_Parent.pdf
ECF Primer for Parents: http://www.education-consumers.org/tnproject/tnprimer.htm
TCAP Educator Guide: http://www.state.tn.us/education/assessment/doc/Form_R_Educ.pdf
Scatterplot of value-added gains versus achievement for Tennessee elementary and middle schools: http://www.education-consumers.com/Birdshot6.pdf