EdVisions Comprehensive Assessment Program
Today on the blog we are sharing findings from the ECAP – EdVisions Comprehensive Assessment Program – which was completed over the 2017-2018 school year in coordination with eight charter schools. ECAP was led by Dr. Ron Newell of EdVisions; Scott Wurdinger, PhD, professor of experiential learning and leadership studies at Minnesota State University in Mankato, Minnesota; and En Sun Kim, Experiential Education graduate student, MNSU, Mankato, Minnesota.
The ECAP was presented at the beginning of the 2017-2018 school year to schools in the EdVisions network. Eight schools were chosen and agreed to provide data for an assessment of four items, three of which are not typically assessed in middle or high schools. As the EdVisions network schools are charters with a distinctive program model, the alternative assessment program was intended to showcase what personalized, project-based, teacher-powered schools can do for the students in their care.
The eight schools consist of six charter schools in Minnesota and two in Wisconsin. All eight have operated successfully for more than 15 years. They all started by utilizing the Ed Essentials: creating small communities and full-time advisories; personalized, project-based learning; authentic assessments; and teacher ownership. As teachers in charters, they were in control of their own environment and accountable for their results.
The data from the six Minnesota schools compared to state of Minnesota data in the following manner: the EdVisions Schools and the state averaged approximately the same proportion of white to non-white students, with white students numbering 68% in EdVisions Schools and 67% statewide; they were also very close in free and reduced-price lunch populations – 40.6% to 37.2%. There is divergence in special populations, however – EdVisions schools averaged 35.8% special education students to the state's 15.7%, and EdVisions Schools spent on average 25.2% of their ADM on special education, whereas schools across the state spent 18.7%. Despite the large number of special education students, the ACT averages were quite similar – 20.73 to 21.77. College enrollees from EdVisions Schools made up 59% of graduates versus 75% for the state; but the schools were again similar in the percentage of students who persisted into a second year and accrued credits, 70% to 79%.
What do we make of the above data? The fact that so many special education students are flocking to charters ought to indicate the need for different types of schools. Small-community schools built around full-time advisories and personalized, hands-on, authentic projects appear to meet the needs of students who learn differently, do not fit in, have been labeled, or have been harassed.
As charter school enrollees come to the schools with more deficits than previously, the school personnel are taxed to their maximum capacity to achieve academic goals; yet they do so quite well. The reading proficiency of students in EdVisions Schools was 63%, and in Minnesota 60% in 2016. Not only do they educate the difficult to educate, but they also meet other goals, goals not addressed or assessed by the states.
Attention paid to the social-emotional state of students has gained credence within the educational community in the past five years. Our assessment program looked at both the social-emotional climate of the schools and the life skills students are developing. Those two elements are rarely, if ever, attended to in middle and high schools.
The social-emotional assessment used by EdVisions is the Hope Survey, which measures hope, engagement, autonomy, belongingness, goal orientations, and academic press. The surveys are designed to determine the level of hope and the level of engagement of the students. The rest of the surveys are intended to show a school whether their institutional climate is conducive to raising hope, and capable of reversing the national trend of a lowering of student engagement each year from 6th through 12th grades.
We chose to utilize NWEA's Measures of Academic Progress (MAP) for academic growth because all but one school gave the assessment twice a year. The RIT score from that assessment is a sliding-scale number, as are the hope and engagement scales. RIT scores can be followed each year to see how students are progressing. The scores from 2017 to 2018 could be spring to spring, or fall to spring, depending on how the schools administered the assessments.
For life skills, we found that the eight Partner Schools had many forms of assessment of student growth. We chose to look at several rubrics used by the schools, pull out some basic elements from each, and utilize nationally known life skill assessments.
In the end, we decided to minimize the number of traits and settle on two major concepts of life skills all the schools developed in students: Self-Directedness and Collaboration/Interaction. These two traits were witnessed in schools with high levels of emotional engagement, autonomy, and mastery goal orientations. The project-based model, with full-time advisors who oversee all factors of a student's development, generally produces a student who is an independent thinker, sets goals, uses resources well, manages themselves, is internally motivated, is reflective and able to self-evaluate, and is adaptable. Those traits were woven into a rubric.
Also, observers and case studies of students from these schools constantly referred to how well students could interact with each other and with adults, noting how mature and respectful students were. As several local rubrics included some aspects of these traits, we decided to include them as a second general area. The traits of a collaborative and interactive student include: communication skills, developed social skills, responsibility and tolerance of differences, mature and respectful interaction with peers and adults, good presentation skills, and skill as an organizer and leader. These skills were also woven into a rubric.
The scoring on the rubric was on an eight-point scale:

- Level 1: a novice, with little awareness of how to manage the personalized, project-based system.
- Level 2: some awareness of the needs and expectations, and the student has attempted to meet some of them.
- Level 3: an advancing novice who demonstrates some items on the rubric but still needs prodding and coaxing.
- Level 4: an advanced novice who has exhibited most traits, though inconsistently.
- Level 5: a student becoming a strategic learner, demonstrating most traits with encouragement.
- Level 6: a learner who has demonstrated enough of the traits at a high level to be ready for graduation.
- Level 7: an emerging expert who needs little support.
- Level 8: a student who can function at a high level on their own and is confident of their ability to succeed.
The rubrics were presented to each school via an online instrument called Qualtrics. The online system worked well, as we had 646 responses. Most assessors were comfortable with those concepts, and could utilize their own rubric assessments in scoring the rubric. We believe this system worked well enough in this study to not only continue to use it, but to broaden the assessments to more schools in the future.
Findings from the Surveys
The first item assessed was engagement, measured by a self-perception survey embedded in the Hope Survey. Engagement is measured on a scale of -10 to +10, in two categories: behavioral and emotional. Any score under 0 is considered very low; 1.00 to 1.49 low; 1.50 to 2.99 moderate; 3.00 to 4.49 high; and above 4.50 very high.
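As a concrete illustration, the banding above can be sketched as a small lookup function. This is our own sketch, not part of the Hope Survey instrument; note the survey write-up leaves scores between 0 and 1.00 unclassified, so we fold them into "low" here as an assumption.

```python
def engagement_band(score):
    """Map an engagement score (-10 to +10) to the bands named above.

    Assumption: scores from 0 up to 1.00 are not explicitly classified
    in the write-up; we treat them as "low" here.
    """
    if score < 0:
        return "very low"
    if score < 1.50:
        return "low"
    if score < 3.00:
        return "moderate"
    if score < 4.50:
        return "high"
    return "very high"

print(engagement_band(3.59))  # the network's average behavioral engagement -> high
```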
These surveys indicated that the eight EdVisions Partner Schools were able to increase engagement in new students who enrolled in the fall of 2017 by 1.13 points in behavioral engagement and 3.78 points in emotional engagement. Over the past two years, the schools maintained high engagement (on average) of 3.59 points in behavioral engagement and 3.74 points in emotional engagement. Both of those numbers fall in the high range, and most of these schools maintain this average across the grade levels, so the model program using the Ed Essentials has proven to engage students more fully than is expected in most schools.
Engagement has been correlated to hope. “Hope” reflects an individual’s perceptions regarding their ability to clearly conceptualize their goals, develop the specific strategies to reach those goals (i.e., pathways thinking), and initiate and sustain activity based upon those strategies (i.e., agency thinking). According to hope theory, a goal can be anything that an individual may desire to experience, create, obtain, accomplish, or become.
Higher hope has been linked to student behavior, attendance, academic achievement, and increased confidence as achievers. Higher hope students set more challenging goals for themselves, and perceive they will be successful in achieving their goals. Higher hope students will have a greater chance of success in college and beyond.
It was the intention of the EdVisions Assessment Program to measure hope in students over the 2017-2018 school year to see if the eight schools indeed raised hope and engagement. In a past study in 2007-2008, hope was found to have a high correlation to achievement in math and reading. This study intended to see if that was still true, and to include not only hope but also a life skills assessment.
For this assessment, we wanted matched student-by-student scores. In the eight schools, 604 students took both a 2017 survey and a 2018 survey. The Hope scale runs from 8 to 64, and 48 is considered an average score across the nation. The average level of hope for these students increased from 47.87 to 49.03, a rise of 1.15 points, which is considered significant.
Many students come to EdVisions Schools with hope levels below 40. We also find that low-hope students' scores fluctuate more wildly than those of higher-hope students. If a school has a significant number of low-hope students (the average incoming student across the eight schools now scores approximately 44), then more students have difficulty raising their hope levels. Moods affect them negatively, and at times we see large drops in scores. We also see large gains, so the scores do even out. But we must remember: it is in raising hope that these young people will have a chance to be successful in their futures.
The gain of 1.15 in hope was encouraging. If students can gain that much a year, it is possible for one scoring 44 upon enrolling to raise hope to 48 in four or fewer years. One discouraging note is that only 56% of the students increased their hope levels. Schools need to have a positive influence on more students than that in the future. This leads us to think that we must strengthen network schools by placing more emphasis on raising hope than on raising math and reading scores.
Historically, strengthening hope and developing higher hope in students is done by paying attention to the other variables in the Hope Survey: engagement, autonomy, belongingness, mastery goal orientation, and academic press. We know that the Ed Essentials put into operation by skilled and caring advisors can raise engagement and hope. We also believe that higher hope in students will make a difference in their achievement, whatever a school sees as achievement.
The goal of this study was to see if the Partner Schools indeed had high engagement, could raise hope, had good achievement results, and could assess life skills as part of that achievement. We have shown that engagement can be raised, and that hope was raised in matched students.
The data from all assessments was downloaded and shared with Dr. Scott Wurdinger of the Experiential Education Department at Minnesota State University – Mankato, and a graduate assistant researcher from the same university, Ms. En-Sun Kim. Together we crunched the numbers and analyzed the data.
The following chart indicates what we found:
| N | Assessment | Score 17 | Score 18 | Change |
| --- | --- | --- | --- | --- |
The scores held some surprises, but generally the data indicates what we expected to see – a rise in hope, positive movement along the RIT scale, and an increase in life skills. One surprise was how well the schools did in raising math skills. This had been difficult years ago when these schools were previously studied. Because many students come to these schools behind, especially in math, the schools had to develop the necessary resources, time, personnel, and methods to increase math skills. They are doing well in that area.
There was an expectation that reading skills would grow at a greater rate than math, as that was the case in the 2008-2009 study. Sustained silent reading, with students reading what they wish to read according to a personal reading plan, generally helps students gain in skills assessed. This RIT gain was a bit less than expected, but still respectable.
We had no expectations concerning the life skill rubric, but the numbers indicate that the average student (if there is ever such a thing) sits in the middle of the rubric. The rise of over one point on the rubric for both skills assessed is, we believe, the change advisors would expect to see in their charges. As this was the first time such a scoring rubric was used, it will take more time to see trends and form expectations.
The Correlation Study
We did a correlation study to determine if hope was correlated to other achievements, and to see if there were correlations of life skills to reading and math. To determine a correlation between elements, a group of students who were assessed on all variables had to be found; a correlation study cannot be done with varying populations. Amongst all the students, we found only 252 who had all ten variables assessed.
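The complete-case requirement can be illustrated with a short sketch (the records below are hypothetical, not the study's data): only students with every variable present can enter the correlation study.

```python
# Hypothetical student records -- illustrative only, not the study's data.
students = [
    {"id": 1, "hope_17": 46, "hope_18": 48, "math_17": 220, "math_18": 224},
    {"id": 2, "hope_17": 50, "hope_18": 51, "math_17": 225},   # missing math_18
    {"id": 3, "hope_17": 44, "math_17": 218, "math_18": 221},  # missing hope_18
]

# A correlation study needs the same students measured on every variable,
# so records missing any assessment must be dropped.
required = ["hope_17", "hope_18", "math_17", "math_18"]
complete = [s for s in students if all(k in s for k in required)]

print(len(complete))  # only 1 of the 3 hypothetical students qualifies
```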
This is less than half of those assessed in other areas, and is unfortunate. Because we only have these 252 responses on all items, we do not have a strong picture of how these items correlate. As will be seen when the data is studied, there are some differences in the changes from the chart above. There are differences in the averages in hope, for example.
Why were there so few in the correlation study? We found that there were many students who missed one assessment or another; a pretest or posttest, a survey taken in one year, but not the other, etc. Schools find it difficult in a highly mobile and transitory population to have all students assessed on every item.
The Findings from the Correlation Study
The following chart indicates the numbers in the variables, with pretest and posttest changes:
| N = 252 | Score 17 | Score 18 | Change |
| --- | --- | --- | --- |
As can be seen by comparing the previous chart to this one, Hope was overall lower and rose less, and RIT scores went up more. Obviously, in this group of students, there is no correlation of Hope to RIT scores – and the computed coefficients showed just that. This is different from a study done on EdVisions Schools 10 years ago, where strong correlations were found.
The numbers presented below are correlations for the 252 matching students from the 2018 data. The 2017 data showed very little difference. Take into account that a weak correlation is any number from .10 to .30, a moderate correlation from .30 to .50, and a strong correlation anything over .50. So what did we find?
There was a weak correlation of Hope to Collaboration/Interaction (.23); and Hope to Math (.12). There was a moderate correlation between Hope and Self-Direction (.31); between Math and Self-Direction (.37); between Math and Collaboration/Interaction (.31); and between Reading and Self-Direction (.31). There were two strong correlations: between Math and Reading (.70), and between Self-Direction and Collaboration/Interaction (.86). Both of these are obvious and totally expected.
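For readers who want to reproduce this kind of analysis on their own data, a minimal Pearson correlation plus the strength bands used above might look like the sketch below. The paired scores are made up for illustration, and the boundary cases at exactly .30 and .50 are our own choice, since the bands above do not pin them down.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def strength(r):
    """Label a coefficient with the bands used in this study."""
    r = abs(r)
    if r > 0.50:
        return "strong"
    if r >= 0.30:
        return "moderate"
    if r >= 0.10:
        return "weak"
    return "negligible"

# Made-up paired scores, just to show the mechanics.
hope = [44, 47, 50, 52, 55]
self_direction = [3, 4, 4, 5, 6]
r = pearson(hope, self_direction)
print(strength(r))
```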
So what can we say concerning the correlations? We were disappointed that there were not strong correlations between Hope and anything else studied. But there is a positive in that Hope is moderately correlated to Self-Directedness, and has a small correlation to Collaboration/Interaction. Creating environments that encourage self-directedness, collaboration, and interaction skills is necessary for students' well-being and future success. Growing those skills and hope are worthy goals in and of themselves, never mind the lack of correlation to math and reading.
In fact, taking into consideration the study done 10 years ago showing a strong correlation of Hope to Math and Reading, it could be said that paying more attention to academic needs may have had an adverse effect on the growth of Hope! Having to pay so much attention to Math means less time on interest-driven projects, which may be more valuable to most students in that they build life skills. Something to ponder.
The lack of correlation of Hope to academic achievement in this study is no reason to believe Hope does not matter. Hope is, in and of itself, a goal well worth attention. We also compared students who rose in hope with those who showed a loss of hope on the survey. Of the students who completed all assessments, 56% raised their Hope and 44% did not. Does it matter if students are scoring lower on the Hope Survey?
The changes are listed below:
| Assessment | Higher Hope | Lower Hope |
| --- | --- | --- |
Obviously the larger differential is in Hope. But the point is that if Hope is not attended to, students who lose Hope will not do as well on other assessments. The fact that the assessments show such gains, even for those with lower hope, is a credit to the schools and advisors – and, of course, the students. But it is also obvious that you want to affect students' concept of themselves as goal setters, positively seeing themselves as achievers. It would be interesting to see what would happen to other assessment results if the percentage of students who raise Hope were over 60%, for example.
A correlation study was done on these two groups. We will refer to the following chart:
|Correlation||Hope Increased||Hope Decreased|
|Weak (.1 to .3)||Hope to Self-Direction;
Hope to Collaboration/Interaction
|Hope to Math;
Reading to Self-Direction;
Reading to Collaboration/Interaction
|Moderate (.3 to .5)||Math to Self-Direction;
Math to Collaboration/Interaction;
Reading to Self-Direction;
Reading to Collaboration/Interaction
|Hope to Self-Direction;
Hope to Collaboration/Interaction;
Math to Self-Direction;
Math to Collaboration/Interaction
|Strong||Math to Reading;
Self-Direction to Collaboration/Interaction
|Math to Reading;
Self-Direction to Collaboration/Interaction
Hope appears only as a weak correlation to the Life Skills among the hope-increasing group. Apparently, as Hope rises, so do Life Skills to some degree – or, as Life Skills grow, Hope may also be positively affected. But stronger correlations exist between Reading and Math and the Life Skills. For those whose Hope is rising, Life Skills apparently are tied to students' skills in Math and Reading. This appears logical, as does a small correlation to Hope. It is also apparent that Math and Reading skills went up to a greater extent among this group than among those who lost some measure of Hope.
For those who did lose some measure of Hope, Hope correlated more strongly with Life Skills than it did for those whose Hope rose. In other words, as their Hope decreased, their Life Skills rose at lower rates. Reading and Math are more linked to Life Skills amongst those whose Hope increased; Math and Hope are more linked to Life Skills amongst those whose Hope decreased.
Raising math scores takes a great deal of time and effort. Might it be that increasing Math scores had a detrimental effect on Hope and Life Skills (might they have risen at a greater rate if fewer students had lost Hope)? What did Life Skills have to do with Hope decreasing? The data will probably not answer those questions, as those hypotheses were not really measured.
Summing all this up is difficult, but we have some measure of what is possible in schools that adhere to the Ed Essentials. First, Hope for over 600 students increased 1.15 points on the Hope scale, from just below average, to slightly above average, an increase of 1.8%. The Math RIT scores rose 3.46 points, an increase of 1.2% for the 560 students who had pretest and posttest data. The Reading RIT increased 2.01 points, an increase of .7%. The Life Skill of Self-Direction rose 1.02 points, for an increase of 12.9%; and Collaboration/Interaction rose 1.01 points for an increase of 12.7%.
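The percentage figures above appear to express each point gain relative to the top of its scale: 1.15 points on the 8-to-64 Hope scale works out to about 1.8%, and roughly 1 point on the 8-point rubric to about 12.8%. That denominator is our inference, not something the study states (and it does not neatly explain the RIT percentages), but under that reading the arithmetic is simply:

```python
def pct_of_scale(gain, scale_max):
    """Express a point gain as a percentage of the scale's maximum.

    Assumption: the denominator (scale maximum) is inferred from the
    Hope and rubric percentages above, not stated by the study.
    """
    return 100 * gain / scale_max

print(round(pct_of_scale(1.15, 64), 1))  # Hope gain -> 1.8
print(round(pct_of_scale(1.02, 8), 1))   # Self-Direction gain -> about 12.8
```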
RIT scores can be compared to national average gains – the Math and Reading increases are commensurate with the national average growth for 7th and 8th graders and well above the norm for grades 9-11. We know that a Hope gain of 1.15 is quite significant, as we can compare it to gains in past years. The Life Skills assessment was used for the first time, and it will take a number of further administrations to establish norms.
Not having a strong correlation of Hope to other measurements is not terribly concerning – weak and moderate correlations were found. But we reiterate the point that Hope is not just a corollary to learning – it is a core outcome, and paying attention to growth in Hope will benefit students in many more ways. The fact that schools can in fact raise Hope, especially when so many students come to our schools with lower-than-average Hope, is gratifying; yet it is daunting as well. We have to do more for students than ever – raising Hope for the future ought to be a goal, a mission. We must pay attention to the needs of children with little Hope.
We need to continue this kind of study – to see trends, gather data on more students and more schools, and to see if different types of schools with different programs yield different data. EdVisions would be interested in any school community that would like to account for the social-emotional well-being of their students, and to see if that affects basic skills and life skills. Join us in assessing what really matters in schools: raising hope and developing life skills, while raising achievement.