England has become one of the world’s biggest education laboratories
A third of its schools have taken part in randomised controlled trials. The struggle is getting teachers to pay attention to the evidence
ASH GROVE ACADEMY, a state primary which sits in Moss Rose, a poor suburb on the outskirts of Macclesfield, is an excellent school. Recently, its team won a local debating tournament, besting fancier rivals; its pupils are exposed to William Shakespeare and Oscar Wilde; lessons are demanding and there are catch-up sessions for those who fall behind. Most important, teaching is based on up-to-date research into what works in the classroom. It is the sort of school that ministers dream of replicating across the country.
But how to do so? When the Conservative-Liberal Democrat coalition came to power in 2010, it set about freeing schools from local-authority control. International studies have suggested that such freedom improves results. But giving teachers autonomy doesn’t automatically mean that all will make good decisions. So in 2011 the government provided a grant of £135m ($218m) to establish the Education Endowment Foundation (EEF), a laboratory for education research which would provide teachers with the information to make smart choices.
In the seven years since its foundation, the EEF reckons it has commissioned 10% of all randomised controlled trials ever carried out in education research. In doing so it has turned the English school system into a giant test-bed, with a third of all state schools involved in at least one of its trials. Its work has been used in other parts of the world, like Australia and Latin America, and other countries are considering copying England’s example.
But at home, its efforts have raised difficult questions. Does providing teachers with evidence of what works change their behaviour? And if not, what next?
Where the evidence leads
The EEF was given two main jobs. First, it dished out cash to researchers with interesting ideas, becoming, on its creation, by far the biggest funder of schools research in the country. Educationalists are inclined to small-scale research projects—the sort of studies, says Stephen Gorard of Durham University, where “academics would write up three interviews with head teachers and call it research.” The EEF has prodded them in a more rigorous direction.
Some of its results have been influential. On March 19th the government set aside £26m to fund breakfast clubs, after an EEF study found that they boosted attainment. Just as significant, studies have debunked numerous teaching methods, which is important in a field where fads are common. One recent study found that a programme in which 13- and 14-year-olds assisted 11- and 12-year-olds with their reading did not help the youngsters improve.
Its second job is to disseminate existing research. Its online “teaching and learning toolkit” summarises the findings of more than 13,000 trials from around the world, rating initiatives on the basis of their cost, the strength of the evidence behind them, and their impact, which is measured in the number of months by which they advance children’s learning. Getting a pupil to repeat a year, for example, is expensive and there is adequate evidence to suggest that it sets them back by the equivalent of four months. The EEF also provides broader evidence summaries on areas of interest for schools, such as written marking and digital technology.
Teachers claim to pay attention. A report by the National Audit Office, an official spending watchdog, found that two-thirds of head teachers say they turn to EEF evidence for guidance. But the EEF has come to the realisation that the “passive presentation of evidence is not enough,” says Sir Kevan Collins, its boss. Naturally, it reached this conclusion by testing its own approach. Results published last year found that providing schools with high-quality evidence about teaching led to no improvement in pupils’ performance. The study did not investigate why this was the case. One possibility is that teachers did not take up the ideas. Another is that successful strategies are hard to replicate.
Thus the EEF is increasingly focused on working out how to change behaviour. “One thing we know”, says Sir Kevan, “is that teachers really trust other teachers.” The EEF has joined with officials who work with groups of schools, either in academy chains, local authorities or charities, to spread the evidence-based gospel. It has also increased its meetings with head teachers and has provided extra funding for trials of promising schemes in poorer parts of the country. As ever, all approaches will be scrutinised to see if they work.
The most ambitious shift is the recruitment of 23 “research schools”, of which Ash Grove is one. As a research school, it gets money to help around 150 other local schools, by putting on events to spread the latest research, training teachers and helping them to evaluate the effectiveness of classroom innovations. Jo Ashcroft, the director of education at Ash Grove’s group of academies, notes that the schools “don’t have endless amounts of money”, so every penny has to make a difference.
It is too soon to judge whether such an approach will work. Most educationalists agree that teachers have become more focused on research in recent years. A hard-core minority spend their weekends at conferences debating the merits of star scholars such as John Hattie and Carol Dweck. The challenge for research schools will be reaching beyond these enthusiasts.
It will not be easy. Tellingly, one of the most popular briefs published by the EEF found there was little evidence to support most marking schemes employed by schools, which often infuriate teachers with their pernicketiness. Teachers “like proof they are right”, says Becky Francis of the UCL Institute of Education; it is more difficult to change behaviour when they are wrong. The EEF hopes that evidence will be more compelling when it comes from a friendly face.
SOURCE
Colleges cut ties with acclaimed Boston organist amid sex allegations
Colleges in Massachusetts and Ohio have abruptly cut ties with a Boston-area concert organist of international acclaim amid allegations of sexual misconduct dating back decades.
James David Christie, widely regarded as one of the greatest organists of his generation, has resigned his post as distinguished artist-in-residence at the College of the Holy Cross in Worcester, the school said Thursday. He has also left Oberlin College and Conservatory, where he was a professor of organ and chair of the organ department. Christie has played with the Boston Symphony Orchestra since at least 1980, and he served as Wellesley College organist for years.
A group of former students wrote Holy Cross president Rev. Philip L. Boroughs earlier this month, charging that Christie “is an imminent danger to students on your campus.”
“Several of us were sexually abused by Prof. Christie while we were Holy Cross students,” the group wrote in its Aug. 3 letter. “Holy Cross has enabled Prof. Christie’s misconduct, and has a responsibility now to respond to our coming forward as quickly and decisively as possible.”
In a statement to the Globe on Thursday, Holy Cross said the college “was informed of allegations of serious misconduct and immediately placed Mr. Christie on administrative leave, in accordance with college policy. Mr. Christie had submitted a letter of resignation, and he will not be returning to the College.”
An Oberlin spokesperson referred the Globe to a general statement dated Thursday on the school’s website regarding recent allegations of sexual misconduct against unnamed faculty members. In a Thursday e-mail to Oberlin students, staff, and faculty obtained by the Globe, conservatory dean Andrea Kalyn said the school’s Title IX officer recently received reports that Christie had allegedly violated Oberlin’s sexual misconduct policy.
“Professor Christie was informed of these allegations and was placed on administrative leave pending an investigation,” wrote Kalyn. “He has resigned, and no longer teaches at Oberlin.”
A Wellesley spokeswoman said the college has “no record of any complaints” against Christie, adding that in 2016 he was classified as an independent contractor there.
Christie, 66, did not respond to multiple telephone and e-mail messages seeking comment.
Christie, who was named International Performer of the Year for 2017 by the American Guild of Organists’ New York City Chapter, has performed with many of the world’s great orchestras during his long career. He has appeared on numerous recordings and has strong ties to the Boston area, where his work with the BSO has been singled out by reviewers in recent years.
In a statement, a spokeswoman for the BSO said the orchestra had been unaware of the allegations against Christie, who performed with the orchestra as a freelance musician and has no formal title with the BSO.
“The Boston Symphony Orchestra has never had complaints against Mr. Christie,” the statement read. “Mr. Christie is not on the schedule to perform with the BSO this upcoming season, and there are no plans to engage him for future performances with the orchestra.”
In multiple interviews with the Globe, former students at Holy Cross and Oberlin described a consistent pattern of sexual harassment by Christie. Some said the organist used his considerable artistic standing to manipulate and cajole students, dangling before them entrance to some of classical music’s most rarefied circles. Former students also described a sexually charged environment that included lewd comments, large amounts of alcohol, and unwanted touching between 1994 and 2017.
SOURCE
Australia: Release of education testing 2018 information summary
I am not convinced that the improvement over base year is real. Having "experts" say it is, is a laugh. What about a proper validation test?
The NAPLAN summary results issued today include combined data for online and paper student cohorts.
“Overall, the NAPLAN results for 2018 show that since 2008 there have been statistically significant gains in a number of domains and year levels, particularly at the primary level,” ACARA CEO, Robert Randall, said.
The national summary preliminary NAPLAN results for 2018 show:
Compared with the base year:
The performance of Australian students in Years 5 and 9 numeracy, Years 3 and 5 reading, Years 3 and 5 spelling, and Years 3 and 7 grammar was significantly above the NAPLAN 2008 average.
The writing test results in Years 5, 7 and 9 were below those observed in the base year (2011).
Compared with 2017:
Results were stable, with no statistically significant changes compared with last year in any of the NAPLAN domains. “This was the first year in which some students took NAPLAN online and the transition was smooth, with feedback from schools at the time of testing stating that students found the online assessment engaging,” said Mr Randall.
“The NAPLAN Online platform performed well and 99.8 per cent of students were able to complete the assessment online.”
Prior to release, NAPLAN results are reviewed and endorsed by independent measurement advisory experts.
These measurement experts have confirmed that the results for online and paper NAPLAN have assessed the same content and can be placed on the same NAPLAN assessment scale. While NAPLAN results can be compared between assessment modes and years, individual student experiences for any single test may differ due to a range of factors, including the mode of delivery or a student’s performance on the day.
For example, this year’s results for Year 9 students who completed the writing test online were, on average, higher than the results of students who completed the writing test on paper. The independent experts have confirmed the results are comparable; however, this difference appears to be a result of the test mode. The difference may be due to students at this year level having greater confidence writing online than on paper, as well as students’ ability to readily review and edit their work online in a way that is not possible with a paper test. This reinforces the benefit of moving to NAPLAN Online, which will give teachers, students and parents more information about what students know and can do, and where additional support is needed.
NAPLAN assesses the fundamental skills of literacy and numeracy, with the data provided used by families, schools and education systems to ensure Australian students are supported in their learning. As always, NAPLAN provides a point-in-time snapshot of a child’s achievement, and individual student results should be considered together with school-based assessments.
Via email: media.contact@acara.edu.au