ECAP Year Two

2017-2018 report

Ed°Visions Comprehensive Assessment Program Report 2019

Below is a summary of Dr. Scott Werdinger, Dr. Ron Newell, and En Sun Kim's published findings in Improving Schools and The Charter Schools Resource Journal.

Year Two: 2017-2019

The Schools

The 2017-2018 study included eight schools: six charter schools from Minnesota and two from Wisconsin. These schools operated under our Ed°Essentials model and had been in operation for 15 or more years. For year two, 2017-2019, we attempted to obtain data from the same schools, plus add four new schools, younger schools that had used many elements of the Ed°Essentials. We were able to contract with three more schools, all of which were newer to the Ed°Essentials and had had less support over the past three years. The rationale was to see, for the returning schools, whether certain growth patterns persisted and whether correlations were consistent. For the newer schools, we wished to compare what happens in schools that are earlier in the process with what happens in schools that have been operating under the system for some time.

Of the first eight schools, seven responded with the same type of data for 2018-2019 as they did for the previous year. One of the 2017-2018 schools did not administer the hope surveys or the life skills surveys in 2019; consequently, we had fewer students we could track across the two years. Of the newer schools, we obtained only three positive responses on assessments. Of those three, one school provided hope survey scores only, while the other two provided both hope survey and life skills scores. One of the latter schools sent us only one set of RIT scores, so no growth could be calculated. Not the response we wanted, but enough to make some assumptions.

The demographic makeup of the schools was similar to that of the first, one-year study. The demographics include the three new schools. The schools averaged 141 students, of whom 51.2% were male, 69.2% were white, 40.3% were of Free/Reduced-lunch status, and 39.0% were on IEPs. In the Minnesota Ed°Visions schools, an average of 28.5% of the budgets was spent on Special Education students, whereas the percentage spent by the average school in Minnesota is 18.4%. Graduation rates are difficult to report because these schools are highly personalized: students graduate when they complete their required number of standards and project hours, regardless of which year they started or how long they have been in the school.


Regardless of circumstance, the results for the eight schools in the two-year study were very good. Growth on each of the assessments was statistically significant, as shown by t-tests on pre-test and post-test results. For students who were assessed in 2017, 2018, and 2019, the results are:

Assessment       N      Score '17   Score '19   + / –    p value
Hope             327    47.95       49.21       +1.26    p < 0.005
Math RIT         247    228.85      233.94      +5.09    p < 0.001
Reading RIT      233    223.65      228.48      +4.83    p < 0.001
Self-directed    266    3.52        5.12        +1.60    p < 0.001
Collaborative    266    3.76        5.26        +1.50    p < 0.001
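The significance tests above compare each student's pre-test and post-test scores. As a minimal sketch of how such a paired t statistic is computed (using invented scores for five hypothetical students, not the study's actual data):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic: mean of the per-student differences
    divided by the standard error of those differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

# Hypothetical hope scores for five students, 2017 vs. 2019
# (illustrative values only).
pre_scores  = [46.0, 48.5, 47.0, 49.0, 45.5]
post_scores = [48.0, 49.5, 48.5, 50.0, 47.5]

t_stat = paired_t(pre_scores, post_scores)
print(round(t_stat, 2))  # prints 6.71
```

A larger t statistic means the average gain is large relative to how much the gains vary from student to student, which is what drives the small p values in the table.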

A graph of the results in hope, Math, and Reading gives a very good illustration of what this growth looks like:

The growth in hope over the two-year period was significant, although only slightly higher than the one-year hope growth reported in 2018. The growth in Reading and Math is not only significant; the scores illustrate what a project-based, personalized school can accomplish in basic skills. The overall Reading score for the seven schools with pre- and post-test scores was, averaged across all students, above the national norm (223.7). The Math score was not above the national norm, but the growth toward that mark (238.3) was phenomenal, especially when you consider that most students were behind by more than one grade level upon enrollment, and 39% of students are on IEPs!

Not only did these schools fulfill their responsibilities in basic skills, but they also raised student hope levels, as a whole, from slightly below average (48) to above average (49.21). Raising hope, as was discussed in last year's report, is as important as content and basic skills. Hope is what leads young people to believe in themselves, overcome obstacles, and find ways to become successful, rather than give in to circumstances. Raising hope is often the precursor to students becoming successful at academic endeavors.

The growth in the Life Skills categories of Self-Direction and Collaboration/Interaction was also statistically significant. These scores, like the gains in hope, increased at a slower pace over the two-year period, but the growth is still statistically significant. In the first year, students gained about 1 point on the Life Skills scale, and about 0.5 or 0.6 since. This is because students in these schools mature over time and come to a point where they have already maxed out some of the categories. We expected to run into this "ceiling effect" when looking at the same students over time.

Correlations of Variables

As we did last year, we ran a correlation test to determine whether these variables are related. Again, correlations do not determine cause and effect; they show that as one variable rises for particular students, another variable is, or is not, likely to follow suit. A high correlation (e.g., .90 between Self-Direction and Collaboration/Interaction) means there is a strong relationship between the two variables. A number near zero (e.g., .07) means a connection so small that you can have no confidence in it, while a negative number means the two variables move in opposite directions. The r value needed for significance is influenced by the number of participants, so with a large number of participants an r value as small as .14 can be significant (see chart below).
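As an illustration of how an r value of this kind is produced, here is a hand-computed Pearson correlation on hypothetical student scores (the variable names and values are invented for the example; they are not the study's data):

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Hypothetical Self-Direction and Collaboration scores for six students:
# students who score high on one tend to score high on the other.
self_direction = [3.1, 4.0, 4.4, 5.0, 3.6, 4.8]
collaboration  = [3.3, 4.2, 4.5, 5.2, 3.9, 4.9]

r = pearson_r(self_direction, collaboration)
```

With these invented scores, r comes out close to 1, the same kind of strong positive relationship the .90 Self-Direction/Collaboration correlation in the chart describes.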

A p value is a probability factor: it gives the probability that a result this large would occur by chance. A p value of .05 means there is only a 5% probability that the result arose by chance, and thus roughly 95% confidence that a repeated experiment would produce a similar result. So the lower the p value (.01 = 99% confidence that the result is not due to chance), the more confidence you can have that the variables are connected.
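For sample sizes like those in this study (hundreds of students), the t distribution is close to the normal curve, so a rough two-tailed p value can be sketched from a t statistic as below. This is a normal approximation for illustration only, not necessarily the exact formula used in the published analysis:

```python
import math

def two_tailed_p(t_stat):
    """Approximate two-tailed p value, treating the t statistic as
    normally distributed (a reasonable shortcut for large samples)."""
    phi = 0.5 * (1.0 + math.erf(abs(t_stat) / math.sqrt(2)))  # normal CDF
    return 2.0 * (1.0 - phi)

# A t statistic of about 1.96 corresponds to p of about .05;
# about 2.58 corresponds to p of about .01.
p = two_tailed_p(1.96)
```

This makes the p-value interpretation concrete: the farther the t statistic falls into the tails of the distribution, the smaller the probability that chance alone produced the result.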

The chart for 2017-2019 appears very similar to the one from last year. Not only are the same correlations significant (or not significant) as last year, but the r values are almost the same.

Significance with a p value of < 0.01 (all scores are from 2019 data)

Variables                        r value   Significance
Hope / Math                      0.14      p < 0.01
Hope / Reading                   0.07      Not significant
Hope / Self-Direction            0.28      p < 0.01
Hope / Collaboration             0.27      p < 0.01
Math / Reading                   0.68      p < 0.01
Math / Self-Direction            0.35      p < 0.01
Math / Collaboration             0.35      p < 0.01
Reading / Self-Direction         0.39      p < 0.01
Reading / Collaboration          0.36      p < 0.01
Self-Direction / Collaboration   0.90      p < 0.01

We can therefore say that Reading and Math are highly correlated, and that Self-Direction and Collaboration/Interaction are intertwined: students good at one are more than likely to be good at the other. Why hope correlates with every other variable except Reading is difficult to determine. If you look at the graph above, the growth curves look very similar. It appears that Reading is not dependent upon hope, so raising a student's hope will not, by itself, raise Reading scores. However, it is difficult to say firmly that high hope is unnecessary for learning to read. Perhaps it is not. But raising one's hope would still appear necessary for success, as it is correlated with every other item assessed, and hope has been shown to correlate with many other measures of success (GPAs, college entrance exams, graduation from college, etc. [Snyder, et al.]).

We have been attempting to make the case, however, that hope is correlated with Life Skills, and with our two-year study we find that hope does correlate, again, with the two categories of Life Skills we assess. And we find that Life Skills correlate even more highly with Math and Reading than with hope! I believe we can make a case that schools building hope are also building Life Skills, and those building up Life Skills are building up hope. And those same schools are, at the same time, doing wonderfully at building up basic skills as well! They appear to be intertwined. Perhaps it would be premature to say that the Hope Surveys can measure Life Skills, but I believe there are some very direct ties between them. We will have to take that up another time.

The One-Year Study

One of our reasons for including newer schools was to determine whether growth in hope preceded growth in basic skills and/or life skills. It has been our experience in the past that increases in engagement, driven by autonomy, positive relationships and support, and positive goal orientations, exhibit themselves first in rising hope, before growth appears on other assessments. However, we received only spotty information, so we will have difficulty making that point.

Assessment       N      Score '18   Score '19   + / –
Hope             178    42.81       46.46       +3.65
Math             28     227.04      –           –
Reading          31     218.90      –           –
Self-Directed    111    3.64        4.26        +0.62
Collaborative    111    3.80        4.41        +0.61

We know from experience and study that utilizing the Ed°Essentials in their entirety is essential to growing hope over time. The schools with the best growth, most years if not all, do follow the original design and incorporate the full project-based model. And, again, raising hope is the way to gain positive growth on other assessments, as well as to prepare young people for their next steps in life.

The fact that hope is continually growing in schools of this sort, and that staff members of certain schools have determined just what methods and tools it takes to grow it, means Ed°Visions has a model well worth a second look. Not all students, nor all schools, exhibit the kind of growth we see continually in schools that adopt the Ed°Essentials. Those of you who have done so, congratulations on choosing what works! And for those who have not, what are you waiting for?