
Science Adoption Updates


This post was written by former Board President Sue Peters as a letter to the Board, in response to pushback against her analysis of outcomes at schools using Amplify Science. It is a lengthy post but worth reading if you care about this issue.

Dear Directors,

Thank you for considering the information I have shared with you about the science curriculum adoption process, including concerns about the efficacy of AmplifyScience and evidence of bias in favor of this one product over others. In response to MaryMargaret Welch’s April 3, 2019 letter to the Board challenging the findings of my data analysis, please consider these corrections to Ms. Welch’s claims along with further observations.

(Attached please find a copy of this letter in PDF form, which might be easier to read, as well as a copy of the science waiver applications submitted by SPS schools in 2017.)

First, please know that the analysis I provided was created by a professional data analyst who compiled the data set and aggregated the results, with my assistance. Our sources were OSPI and the district’s own curriculum waiver applications from SPS schools that used AmplifyScience rather than existing district materials.

Second, curriculum adoptions are among the most impactful and long-lasting decisions the Board can make, and, as I’m sure you know, they must be grounded in data, per policy and common sense. So I believe whatever information is available should be shared with the Board and should inform its decision.

To that end, it is my sincere goal to help the Board make an informed decision by providing information, as a longtime SPS parent myself and as a former director who empathizes with the Board on the challenges of such decisions.

Summary of inferences from the PowerPoint data

In sum, the information I have provided shows that, based on available data, the SPS schools using Amplify Science fared more poorly on the new science test aligned to the Next Generation Science Standards than the SPS schools not using Amplify Science.

Between the two science tests taken by SPS 8th graders in 2017 and 2018 (the MSP and the WCAS), pass rates fell across the board in SPS schools (with the exception of some HCC schools), but they fell the most in the schools using Amplify Science.

Also, the students whose pass rates declined the most were low-income students using Amplify Science.

Ms. Welch and other staff have stated that AmplifyScience curricular materials are better aligned to the new Next Generation Science Standards (NGSS), and would therefore better prepare students for the standards as demonstrated on the NGSS assessment, first administered in 2018. That was their whole rationale for using mass waivers in 2017 to deploy AmplifyScience in as many as 19 schools. But, quite simply, the data does not support that claim. And there is no evidence that AmplifyScience closes the achievement gap.

Validity of the data and analysis

Ms. Welch claims that the basis for my analysis was flawed, and goes to great lengths to question the validity of the data and my methods, indicating she consulted with SPS and OSPI staff for their opinions. To this I offer the following counterpoints:

1. In the analysis, we clearly acknowledge the limitations in the data, and address them in our notes. I acknowledge two different tests were used. But we are not comparing the two tests to each other – or “apples to oranges” – as Ms. Welch would lead you to believe.
 
We are simply comparing how well each school's students passed the state standardized science test, whatever form it took, from one year to the next, based on OSPI-reported pass rates. This was a comparison of the schools to themselves – apples to apples, if you will. We also acknowledge that pass rates typically drop whenever a new test is administered. Indeed, that is what happened across SPS in 2018 with the new NGSS-aligned science test (except in HCC schools). And, of course, it is always better and more accurate to have multiple data points and control groups.

2. But what I used was the only data available. And there are valid comparisons and inferences that can still be made from this data.
3. The most salient fact the Board should know, and the great irony of Ms. Welch’s objection to my analysis, is that it uses the exact same measures and methodology that SPS staff themselves said they would use to measure the efficacy of AmplifyScience. This is stated in the approved waiver application documents (both below and attached), as you can see for yourselves:
“Data and test score to be used in evaluation:

Data generated from the Washington State Science Learning Standards Next Generation Science assessment for grade 8 to begin Spring 2018. Comparisons will be made between schools using the Amplify Science online platform compared to schools using our current kit based program.

Other Pertinent Data Collection: A robust evaluation system will be implemented to assess the effectiveness of this web-based program that will include, but not be limited to, pre and post students and teacher attitudinal surveys. The SPS Science Dept. will also conduct focus group interviews. The University of Washington School of Education will serve as our partners in this data collection.” 
My analysis follows the framework described. So why is Ms. Welch now questioning the validity of the very measures that she and staff promised to use themselves? It is irrational.

If such comparisons are indeed “apples to oranges” and “inappropriate,” then please ask Ms. Welch why such measurements were acceptable for every school that applied for a waiver to use Amplify.

And if my method and data are inaccurate, why did Ms. Welch choose to use them to make her own point and graph about the three middle schools she (incorrectly) claimed were miscategorized in our analysis? Her reasoning is inconsistent.

The fact is, comparing an individual school’s pass rates from one year to the next, with all variables acknowledged, is valid. Comparing trends across the district is also valid. And despite Ms. Welch’s intimations and oversimplifications of the demographics of the north and south ends of the district, these facts hold true even when you disaggregate by income and race. For example, even low-income students at north-end Eckstein Middle School, which was not using Amplify, passed the new NGSS test at a higher rate than low-income students at south-end Mercer Middle School, which did use Amplify.

Even in lateral comparisons (apples to apples, if you will), north-end schools not using Amplify fared better than north-end schools that were using Amplify. The same is true for south-end schools.
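
To illustrate the mechanics of that comparison, here is a minimal sketch in Python. The school names and pass rates below are hypothetical placeholders, not the actual OSPI figures; they are shown only to make the method concrete: each school's 2018 WCAS result is compared only with its own 2017 MSP result, and the change is then labeled by whether the school used Amplify.

```python
# Illustrative sketch of the school-to-itself comparison.
# All names and pass rates below are hypothetical placeholders, NOT actual OSPI data.
schools = {
    # name: (2017 MSP pass rate %, 2018 WCAS pass rate %, used Amplify?)
    "School A": (65, 55, True),
    "School B": (70, 62, False),
    "School C": (50, 36, True),
    "School D": (62, 53, False),
}

for name, (msp_2017, wcas_2018, used_amplify) in schools.items():
    change = wcas_2018 - msp_2017          # each school is compared only to itself
    group = "Amplify" if used_amplify else "non-Amplify"
    print(f"{name} ({group}): {change:+d} percentage points")
```

No score from one assessment is ever scored against the other assessment directly; what is compared is the size of each school's own drop, and then how those drops differ between the two groups.
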
Conclusion: The numbers tell a story that the Board needs to hear

Nothing Ms. Welch or R&E staff have said changes the fact that SPS schools that used AmplifyScience in 2017-18 saw a greater drop in their pass rates on the new NGSS test (the WCAS) in 2018, compared to their pass rates on the previous MSP science test, than schools that did not use AmplifyScience. And the students who fared the worst were low-income students using AmplifyScience.

So there is no evidence that AmplifyScience closes the achievement gap.

Further points and corrections:
Ms. Welch claims that I “should have looked at the waivers more carefully.” Please rest assured that I did. That was my very reason for making a public records request for them.

Contrary to Ms. Welch’s claims that three middle schools in our analysis were misclassified and did not use Amplify for 8th grade, the waiver applications show Hazel Wolf and Madison Middle School did in fact use Amplify for 8th grade. Only Blaine did not.

Ms. Welch claims that three schools were erroneously included in the Amplify group in our analysis, and that this significantly changes the results. That is incorrect. A review of the waiver applications shows that, while it is true for Catharine Blaine, which only used Amplify in 6th and 7th grades (an oversight I acknowledge), Ms. Welch is incorrect about Hazel Wolf and Madison. Both their waivers clearly state they will use Amplify for grades 6-8. (See attached.)

We have subsequently re-run the results, moving Blaine to the non-Amplify group. This changes that group's 2017-to-2018 change in pass rate from -7% to -8%, a statistically insignificant difference. Meanwhile, the overall decline in pass rates among schools remains unchanged: -12% for the schools not using Amplify and -18% for the schools that used Amplify, when HCC (outlier) schools are removed from the equation.

In other words, even when we move Blaine to the non-Amplify category, the story remains the same.
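
Purely for illustration, the group averages described above come from a calculation along these lines; the per-school numbers below are placeholders, not the actual OSPI results (those are in the attached analysis).

```python
# Illustrative sketch of the group-average recalculation.
# Per-school changes below are placeholders, NOT the actual OSPI results.
amplify_changes = [-18, -22, -15, -17]      # pass-rate change per Amplify school
non_amplify_changes = [-12, -10, -14, -8]   # pass-rate change per non-Amplify school
hcc_outlier_changes = [3, 0]                # HCC schools, excluded from both groups

def mean(values):
    return sum(values) / len(values)

print(f"Amplify group average change: {mean(amplify_changes):+.1f} points")
print(f"Non-Amplify group average change: {mean(non_amplify_changes):+.1f} points")
print(f"HCC schools excluded as outliers: {hcc_outlier_changes}")

# Moving a single school (such as Blaine) from one list to the other shifts
# these averages only slightly, which is why the overall story does not change.
```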

Mercer Middle School has been using Amplify since 2016 – not just “7 months” as Ms. Welch claimed. Its students had two years of Amplify instruction in preparation for the new NGSS test – yet saw some of the biggest declines in pass rates.
Contrary to Ms. Welch’s claims that all the Amplify schools had only been using the materials for 7 months, Mercer Middle School, the first and one of the largest schools in the AmplifyScience group, and thus the best case study, has in fact been using those materials and instruction since 2016, as the waiver applications and district purchase records attest. That means the Mercer 8th graders who took the new NGSS test in 2018 had been using AmplifyScience for two years – in 7th and 8th grade.

The results? Mercer saw one of the largest drops in test score pass rates from the old test to the new NGSS test – despite having used the allegedly NGSS-aligned AmplifyScience materials for longer than any other school.

Mercer also saw one of the largest drops in pass rates in the district for low-income students. This is alarming, especially if the District is committed to closing gaps and addressing inequities.

Ms. Welch calls for a cohort study – look to Mercer.

Ms. Welch criticizes my analysis for not following a cohort of students from one year to the next. That data simply is not available -- except for Mercer, the only school that has a cohort of students who used AmplifyScience for two years before taking the NGSS test in 2018. The results? Again, Mercer’s students saw some of the biggest drops in pass rates in the district.

Where is Ms. Welch’s counterpoint data that refutes our findings?

Throughout her refutation of our data, Ms. Welch fails to provide the Board with data to counter our findings. Please request that information. If it does not exist, or does not make a clear case for AmplifyScience, how can the Board approve a recommendation for this product? Policy prohibits it.

Where is the “robust evaluation” data?

The waivers also claim that the schools would not only provide a comparison between Amplify and non-Amplify schools on the 2018 NGSS test, they would also ensure that a “robust evaluation system will be implemented to assess the effectiveness of this web-based program that will include, but not be limited to, pre and post students and teacher attitudinal surveys. The SPS Science Dept. will also conduct focus group interviews.”

Please ask Ms. Welch and Drs. Kinoshita and Anderson for the data from the promised “robust evaluation system.” Was this system implemented?
The Board is being asked to take a leap of faith: without supporting data, a long-term investment in AmplifyScience would be a risky, fiscally and academically irresponsible choice for SPS students.

In her letter, Ms. Welch essentially asks the Board to take a leap of faith. She asks you to simply trust her and R&E staff who promise to evaluate the effectiveness of AmplifyScience after you adopt it. That is backwards. Common sense and policy mandate that facts and data must precede and inform the adoption committee recommendation and Board vote, not come afterward.

Ms. Welch said: “R&E has made a commitment to partner with Science to evaluate the implementation and effectiveness of new instructional materials. Working with Research and Evaluation is a planned part of the science adoption. This is how we demonstrate an accountability to our public for entrusting us with new instructional resources.”

These promises of accountability are all in the future. It would be fiscally and academically irresponsible for the Board to accept such terms. It would also violate Board policy, which mandates that data drive adoption decisions.

Staff is asking the Board to commit many years, many dollars and the education of countless SPS students to a curricular product they had abundant opportunity to test, but for which they are providing you no proof of efficacy. Meanwhile, Ms. Welch and staff are now attempting to discredit the very data that is available, the same data they themselves said they would use. But this data is informative, and it does not support the use of AmplifyScience.
 
Other observations from the waivers:

Ms. Welch invited me to scrutinize the waiver documents more closely. I have, along with others in the community, and what we have found is very troubling, beginning with the fact that the Board was not informed of the waivers and “pilot” until Nov. 2017, seven months after nearly all of the waiver applications were submitted and the effort was already underway.

Within the applications, irregularities emerge. A number of schools failed to provide any evidence that the school decided as a community to seek a waiver. In multiple cases there is evidence this was a top-down decision made by a few that did not come from the community, in violation of waiver policy (2020). One school, for example, said it would inform the community of the school administration's decision sometime in the fall, during the school barbecue, with a flier in multiple languages – five months or so after the fact and with no option for input. That is not community engagement.

Two schools submitted waivers after the May 1 due date, rendering them void; one was as late as Sept. 2017. Some schools did not submit the required signatures, invalidating them. Some are simply incomplete. And the waiver applications are 90 percent identical, demonstrating that they were not an organic school-based decision, as the waiver policy intended, but an orchestrated, centralized effort.
Why does all of this matter?

This matters because our students and teachers deserve the very best curricular materials: materials that have been carefully vetted and evaluated based on facts, not bias, and whose recommendation was reached in full compliance with district policy and law, and with the utmost integrity.

How can the Board or the greater SPS community have confidence in this process given the lack of data and the shifting explanations from staff? On the surface it appears that some members of staff decided back in 2016 to invest and commit to one untested product, AmplifyScience, orchestrated a mass use of waivers and called it a “pilot,” but failed to follow a scientific testing method that a true pilot would prescribe.

And despite the absence of convincing data, staff are now advocating to the Board for what appears to be a preordained outcome, and asking it to commit to a long-term contract with this same vendor. Yet the existing data suggests that this product does not serve our students well, especially low-income students – those who are “furthest from educational justice.”

Recommendations and Board decisions must be based on facts and data and a process conducted with objectivity and integrity. There is troubling evidence that all of these factors are lacking from this science curriculum adoption process.

Thank you again for your consideration of all the facts. Our children are depending on you to make the right decisions, with full confidence in the processes that led to them.
Sincerely,

Sue Peters
SPS parent since 2005 

SPS Board Director 2013-17


 End of letter

I am still investigating a possible source for the Amplify materials.  I know the head of Science, Mary Margaret Welch, gave her impassioned explanation to the Board at the last Work Session on this topic, but I found it all a little less than believable.  I now have another lead and am following it; it could explain a lot.

Here's a link to all the science waivers – 20 in all, which is quite unusual.  Note what Board Policy 2020 on waivers requires: taking all relevant district and state assessments, and, on average over the three-year waiver period, meeting or exceeding the gains demonstrated by peer schools that are using the district-adopted materials, for all segments of their population, in order to continue using the alternative basic instructional materials.  So why is there pushback on comparing those outcomes?

There are some real oddities in there like:

- How come the majority of them have the same template language? 
- There are these notations:
  • Student subscriptions for the online Amplify Science platform and all corresponding materials required for implementation will be provided by the District Science budget for kit-based instructional materials.  What? The district paid for the subscriptions?
  • Laptop computers required for the implementation of these instructional materials will be provided through a mini-grant offering 1 laptop for every 2 students.  Again, what? Who gave this "mini-grant?"  Are they saying that for science in these schools, they have 1 laptop for every 2 students?  That seems like a fairly big deal.
  • However, there is also this from Catharine Blaine K-8's application: All corresponding materials required for implementation will be paid for by the PTSA.  Cascadia's application doesn't mention computers at all.  Neither does Decatur's.
  • Cedar Park's application does not mention computers and has this notation: Cedar Park staff have not been hired at the time of this waiver application.  So this was purely a principal decision.
  • A robust evaluation system will be implemented to assess the effectiveness of this web-based program that will include, but not be limited to, pre-and-post student and teacher attitudinal surveys.  The SPS Science Department will also conduct focus group interviews.  The University of Washington School of Education will serve as our partners in this data collection.  Well, I'll certainly give Ms. Welch's department a ring for this data.
  • The challenge presented by the Amplify Science program that it (sic) required a one-to-one laptop program.  However, after some significant deliberations with the developers of Amplify Science, we have the opportunity to field test a "tech-light" version in 2017-2018. Partnering with DoTS will allow us an opportunity to implement this web-based platform in a segment of our middle schools and assess its effectiveness.  So what does this mean if SPS adopts Amplify?
  • The "community involvement" varied wildly from none to having PTSAs endorse it.
Also to note:

- There were only 4 pilot teachers utilized in the process of piloting and vetting the curriculum for the entire district. Of those 4 pilot teachers, 3 were at previous Amplify schools.

- These pilot teachers were likely able to pick which units to compare for each company and may have picked the strongest ones from Amplify - based on their experience with those units. 

- One of the teachers, along with Mary Margaret Welch, has been quoted in Amplify white papers and advertising.  So you have the head of Science for the district, plus a teacher who is making decisions for the district as a whole, both being quoted by a company up for a science adoption.

- I am told that Amplify's structure lacks flexibility for any kind of modification.

Schedule for final adoption:

April 23 - Curriculum & Instruction Meeting final review (Jill Geary is the Chair, with Rick Burke and Scott Pinkham as members)

May 1 - To Board for Introduction

May 7 - Some kind of meeting with middle and high school principals

May 15 - To Board for Action (I note that the timeline from the Science Department says (hopefully) "approval." That is not a given.)


