
Is homework a necessary evil?

After decades of debate, researchers are still sorting out the truth about homework’s pros and cons. One point they can agree on: Quality assignments matter.

By Kirsten Weir

March 2016, Vol 47, No. 3

Print version: page 36


Homework battles have raged for decades. For as long as kids have been whining about doing their homework, parents and education reformers have complained that homework's benefits are dubious. Meanwhile many teachers argue that take-home lessons are key to helping students learn. Now, as schools are shifting to the new (and hotly debated) Common Core curriculum standards, educators, administrators and researchers are turning a fresh eye toward the question of homework's value.

But when it comes to deciphering the research literature on the subject, homework is anything but an open book.

The 10-minute rule

In many ways, homework seems like common sense. Spend more time practicing multiplication or studying Spanish vocabulary and you should get better at math or Spanish. But it may not be that simple.

Homework can indeed produce academic benefits, such as increased understanding and retention of the material, says Duke University social psychologist Harris Cooper, PhD, one of the nation's leading homework researchers. But not all students benefit. In a review of studies published from 1987 to 2003, Cooper and his colleagues found that homework was linked to better test scores in high school and, to a lesser degree, in middle school. Yet they found only faint evidence that homework provided academic benefit in elementary school (Review of Educational Research, 2006).

Then again, test scores aren't everything. Homework proponents also cite the nonacademic advantages it might confer, such as the development of personal responsibility, good study habits and time-management skills. But as to hard evidence of those benefits, "the jury is still out," says Mollie Galloway, PhD, associate professor of educational leadership at Lewis & Clark College in Portland, Oregon. "I think there's a focus on assigning homework because [teachers] think it has these positive outcomes for study skills and habits. But we don't know for sure that's the case."

Even when homework is helpful, there can be too much of a good thing. "There is a limit to how much kids can benefit from home study," Cooper says. He agrees with an oft-cited rule of thumb that students should do no more than 10 minutes a night per grade level — from about 10 minutes in first grade up to a maximum of about two hours in high school. Both the National Education Association and National Parent Teacher Association support that limit.
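
Expressed as a quick back-of-the-envelope calculation, the rule of thumb described above might be sketched like this (an illustration only, not an official formula):

```python
def recommended_homework_minutes(grade: int) -> int:
    """Rule-of-thumb nightly homework load: roughly 10 minutes per grade level,
    capped at about two hours (120 minutes) for high school students."""
    return min(10 * grade, 120)

# e.g. grade 1 -> 10 minutes, grade 6 -> 60 minutes, grade 12 -> 120 minutes
```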

Beyond that point, kids don't absorb much useful information, Cooper says. In fact, too much homework can do more harm than good. Researchers have cited drawbacks, including boredom and burnout toward academic material, less time for family and extracurricular activities, lack of sleep and increased stress.

In a recent study of Spanish students, Rubén Fernández-Alonso, PhD, and colleagues found that students who were regularly assigned math and science homework scored higher on standardized tests. But when kids reported having more than 90 to 100 minutes of homework per day, scores declined (Journal of Educational Psychology, 2015).

"At all grade levels, doing other things after school can have positive effects," Cooper says. "To the extent that homework denies access to other leisure and community activities, it's not serving the child's best interest."

Children of all ages need down time in order to thrive, says Denise Pope, PhD, a professor of education at Stanford University and a co-founder of Challenge Success, a program that partners with secondary schools to implement policies that improve students' academic engagement and well-being.

"Little kids and big kids need unstructured time for play each day," she says. Certainly, time for physical activity is important for kids' health and well-being. But even time spent on social media can help give busy kids' brains a break, she says.

All over the map

But are teachers sticking to the 10-minute rule? Studies attempting to quantify time spent on homework are all over the map, in part because of wide variations in methodology, Pope says.

A 2014 report by the Brookings Institution examined the question of homework, comparing data from a variety of sources. That report cited findings from a 2012 survey of first-year college students in which 38.4 percent reported spending six hours or more per week on homework during their last year of high school. That was down from 49.5 percent in 1986 (The Brown Center Report on American Education, 2014).

The Brookings report also explored survey data from the National Assessment of Educational Progress, which asked 9-, 13- and 17-year-old students how much homework they'd done the previous night. They found that between 1984 and 2012, there was a slight increase in homework for 9-year-olds, but homework amounts for 13- and 17-year-olds stayed roughly the same, or even decreased slightly.

Yet other evidence suggests that some kids might be taking home much more work than they can handle. Robert Pressman, PhD, and colleagues recently investigated the 10-minute rule among more than 1,100 students, and found that elementary-school kids were receiving up to three times as much homework as recommended. As homework load increased, so did family stress, the researchers found (American Journal of Family Therapy, 2015).

Many high school students also seem to be exceeding the recommended amounts of homework. Pope and Galloway recently surveyed more than 4,300 students from 10 high-achieving high schools. Students reported bringing home an average of just over three hours of homework nightly (Journal of Experimental Education, 2013).

On the positive side, students who spent more time on homework in that study did report being more behaviorally engaged in school — for instance, giving more effort and paying more attention in class, Galloway says. But they were not more invested in the homework itself. They also reported greater academic stress and less time to balance family, friends and extracurricular activities. They experienced more physical health problems as well, such as headaches, stomach troubles and sleep deprivation. "Three hours per night is too much," Galloway says.

In the high-achieving schools Pope and Galloway studied, more than 90 percent of the students go on to college. There's often intense pressure to succeed academically, from both parents and peers. On top of that, kids in these communities are often overloaded with extracurricular activities, including sports and clubs. "They're very busy," Pope says. "Some kids have up to 40 hours a week — a full-time job's worth — of extracurricular activities." And homework is yet one more commitment on top of all the others.

"Homework has perennially acted as a source of stress for students, so that piece of it is not new," Galloway says. "But especially in upper-middle-class communities, where the focus is on getting ahead, I think the pressure on students has been ratcheted up."

Yet homework can be a problem at the other end of the socioeconomic spectrum as well. Kids from wealthier homes are more likely to have resources such as computers, Internet connections, dedicated areas to do schoolwork and parents who tend to be more educated and more available to help them with tricky assignments. Kids from disadvantaged homes are more likely to work at afterschool jobs, or to be home without supervision in the evenings while their parents work multiple jobs, says Lea Theodore, PhD, a professor of school psychology at the College of William and Mary in Williamsburg, Virginia. They are less likely to have computers or a quiet place to do homework in peace.

"Homework can highlight those inequities," she says.

Quantity vs. quality

One point researchers agree on is that for all students, homework quality matters. But too many kids are feeling a lack of engagement with their take-home assignments, many experts say. In Pope and Galloway's research, only 20 percent to 30 percent of students said they felt their homework was useful or meaningful.

"Students are assigned a lot of busywork. They're naming it as a primary stressor, but they don't feel it's supporting their learning," Galloway says.

"Homework that's busywork is not good for anyone," Cooper agrees. Still, he says, different subjects call for different kinds of assignments. "Things like vocabulary and spelling are learned through practice. Other kinds of courses require more integration of material and drawing on different skills."

But critics say those skills can be developed with many fewer hours of homework each week. Why assign 50 math problems, Pope asks, when 10 would be just as constructive? One Advanced Placement biology teacher she worked with through Challenge Success experimented with cutting his homework assignments by a third, and then by half. "Test scores didn't go down," she says. "You can have a rigorous course and not have a crazy homework load."

Still, changing the culture of homework won't be easy. Teachers-to-be get little instruction in homework during their training, Pope says. And despite some vocal parents arguing that kids bring home too much homework, many others get nervous if they think their child doesn't have enough. "Teachers feel pressured to give homework because parents expect it to come home," says Galloway. "When it doesn't, there's this idea that the school might not be doing its job."

Galloway argues teachers and school administrators need to set clear goals when it comes to homework — and parents and students should be in on the discussion, too. "It should be a broader conversation within the community, asking what's the purpose of homework? Why are we giving it? Who is it serving? Who is it not serving?"

Until schools and communities agree to take a hard look at those questions, those backpacks full of take-home assignments will probably keep stirring up more feelings than facts.

Further reading

  • Cooper, H., Robinson, J. C., & Patall, E. A. (2006). Does homework improve academic achievement? A synthesis of research, 1987–2003. Review of Educational Research, 76(1), 1–62. doi: 10.3102/00346543076001001
  • Galloway, M., Conner, J., & Pope, D. (2013). Nonacademic effects of homework in privileged, high-performing high schools. The Journal of Experimental Education, 81(4), 490–510. doi: 10.1080/00220973.2012.745469
  • Pope, D., Brown, M., & Miles, S. (2015). Overloaded and underprepared: Strategies for stronger schools and healthy, successful kids. San Francisco, CA: Jossey-Bass.

More than two hours of homework may be counterproductive, research suggests.

Education scholar Denise Pope has found that too much homework has negative impacts on student well-being and behavioral engagement (Shutterstock)

A Stanford education researcher found that too much homework can negatively affect kids, especially their lives away from school, where family, friends and activities matter.

"Our findings on the effects of homework challenge the traditional assumption that homework is inherently good," wrote Denise Pope, a senior lecturer at the Stanford Graduate School of Education and a co-author of a study published in the Journal of Experimental Education.

The researchers used survey data to examine perceptions about homework, student well-being and behavioral engagement in a sample of 4,317 students from 10 high-performing high schools in upper-middle-class California communities. Along with the survey data, Pope and her colleagues used open-ended answers to explore the students' views on homework.

Median household income exceeded $90,000 in these communities, and 93 percent of the students went on to college, either two-year or four-year.

Students in these schools average about 3.1 hours of homework each night.

"The findings address how current homework practices in privileged, high-performing schools sustain students' advantage in competitive climates yet hinder learning, full engagement and well-being," Pope wrote.

Pope and her colleagues found that too much homework can diminish its effectiveness and even be counterproductive. They cite prior research indicating that homework benefits plateau at about two hours per night, and that 90 minutes to two and a half hours is optimal for high school.

Their study found that too much homework is associated with:

  • Greater stress: 56 percent of the students considered homework a primary source of stress, according to the survey data. Forty-three percent viewed tests as a primary stressor, while 33 percent put the pressure to get good grades in that category. Less than 1 percent of the students said homework was not a stressor.
  • Reductions in health: In their open-ended answers, many students said their homework load led to sleep deprivation and other health problems. The researchers asked students whether they experienced health issues such as headaches, exhaustion, sleep deprivation, weight loss and stomach problems.
  • Less time for friends, family and extracurricular pursuits: Both the survey data and student responses indicate that spending too much time on homework meant that students were "not meeting their developmental needs or cultivating other critical life skills," according to the researchers. Students were more likely to drop activities, not see friends or family, and not pursue hobbies they enjoy.

A balancing act

The results offer empirical evidence that many students struggle to find balance between homework, extracurricular activities and social time, the researchers said. Many students felt forced or obligated to choose homework over developing other talents or skills.

Also, there was no relationship between the time spent on homework and how much the student enjoyed it. The research quoted students as saying they often do homework they see as "pointless" or "mindless" in order to keep their grades up.

"This kind of busy work, by its very nature, discourages learning and instead promotes doing homework simply to get points," said Pope, who is also a co-founder of Challenge Success, a nonprofit organization affiliated with the GSE that conducts research and works with schools and parents to improve students' educational experiences.

Pope said the research calls into question the value of assigning large amounts of homework in high-performing schools. Homework should not be simply assigned as a routine practice, she said.

"Rather, any homework assigned should have a purpose and benefit, and it should be designed to cultivate learning and development," wrote Pope.

High-performing paradox

In places where students attend high-performing schools, too much homework can reduce their time to foster skills in the area of personal responsibility, the researchers concluded. "Young people are spending more time alone," they wrote, "which means less time for family and fewer opportunities to engage in their communities."

Student perspectives

The researchers say that while their open-ended or "self-reporting" methodology to gauge student concerns about homework may have limitations – some might regard it as an opportunity for "typical adolescent complaining" – it was important to learn firsthand what the students believe.

The paper was co-authored by Mollie Galloway from Lewis and Clark College and Jerusha Conner from Villanova University.

Clifton B. Parker is a writer at the Stanford News Service.


teacherhead

Zest for learning… into the rainforest of teaching.


Homework: What does the Hattie research actually say?


John Hattie's Visible Learning is an excellent book. It is an attempt to distil the key messages from the vast array of studies that have been undertaken across the world into all the different factors that lead to educational achievement. As you would hope and expect, the book contains details of the statistical methodology underpinning a meta-analysis and the whole notion of ‘effect size’ that drives the thinking in the book. There is a discussion about what is measurable and how effect size can be interpreted in different ways. The key outcomes are interesting, suggesting a number of key factors that are likely to make the greatest impact in classrooms and more widely in the lives of learners.

My main interest here is to explore what Hattie says about homework. This stems from a difficulty I have when I hear or read, fairly often, that ‘research shows that homework makes no difference’. It is cited as a hard fact in articles such as this one by Tim Lott in the Guardian: Why do we torment kids with homework? Even though Tim is talking about his 6-year-old, and cites research that refers to ‘younger kids’, too often the sweeping generalisation is applied to all homework for all students. It bugs me and I think it is wrong.


I have written about my views on homework under the heading ‘Homework Matters: Great Teachers set Great Homework’. I’ve said that all my instincts as a teacher (and a parent) tell me that homework is a vital element in the learning process: reinforcing the interaction between teacher and student and between home and school, and paving the way to students becoming independent, autonomous learners. Am I biased? Yes. Is this based on hunches and personal experience? Of course. Is it backed up by research…? Well, that is the question.

So, what does Hattie say about homework?

Helpfully he uses homework studies as an example of the overall process of meta-analyses, so there is plenty of material. In a key example, he describes a study of five meta-analyses that capture 161 separate studies involving over 100,000 students as having an effect size of d = 0.29. What does this mean? This is the best typical effect size across all the studies, suggesting:

  • improving the rate of learning by 15% – or advancing children’s learning by about a year
  • 65% of effects were positive
  • 35% of effects were negative
  • average achievement exceeded 62% of the levels of students not given homework.

However, there are other approaches, such as the ‘common language effect’ (CLE), that compare effects from different distributions. For homework, a d = 0.29 effect translates into a 21% chance that homework will make a positive difference. Or, from two classes, 21 times out of 100, using homework will be more effective. Hattie then says that terms such as ‘small, medium and large’ need to be used with caution in respect of effect size. He is ambitious and won’t accept comparison with 0.0 as a sign of a good strategy. He cites Cohen as suggesting, with reason, that 0.2 is small, 0.4 is medium and 0.6 is large, and later argues himself that we need a hinge-point: d > 0.4 for an effect to be above average and d > 0.6 to be considered excellent.
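
To make the arithmetic concrete, here is a minimal sketch in Python using the standard textbook formulas. It follows the conventional definitions (the normal-distribution percentile reading of d, and McGraw and Wong's common language effect size), so its output will not necessarily match Hattie's own CLE figures, which one of the comments below disputes:

```python
import math

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

d = 0.29  # aggregate effect size for homework quoted above

# Conventional percentile reading: the average student given homework sits at
# roughly this percentile of the comparison (no-homework) distribution.
percentile = normal_cdf(d)            # ~0.61, close to the "62%" figure above

# McGraw and Wong's common language effect size: the probability that a randomly
# chosen student from the homework group outscores a randomly chosen student
# from the comparison group, assuming normal distributions.
cle = normal_cdf(d / math.sqrt(2))    # ~0.58 for d = 0.29

print(f"d = {d}: percentile = {percentile:.2f}, CLE = {cle:.2f}")
```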

OK. So what is this all saying? Homework, taken as an aggregated whole, shows an effect size of d = 0.29 that is between small and medium? Oh… but wait… here comes an important detail. Turn the page: the studies show that the effect size at primary age is d = 0.15 and for secondary students it is d = 0.64! Well, now we are starting to make some sense. On this basis, homework for secondary students has an ‘excellent’ effect. I am left thinking that, with a difference so marked, surely it is pure nonsense to aggregate these measures in the first place?

Hattie goes on to report that other factors make a difference to the results: eg when what is measured is very precise (eg improving addition or phonics), a bigger effect is seen compared to when the outcome is more ephemeral. So, we need to be clear: what is measured has an impact on the scale of the effect. This means that we have to throw in all kinds of caveats about the validity of the process. There will be some forms of homework more likely to show an effect than others; it is not really sensible to lump all work that might be done in between lessons into the catch-all ‘homework’ and then to talk about an absolute measure of impact. Hattie is at pains to point out that there will be great variations across the different studies that simply average out to the effect size on his barometers. Again, in truth, each study really needs to be looked at in detail. What kind of homework? What measure of attainment? What type of students? And so on… so many variables that aggregating them together is more or less meaningless? Well, I’d say so.

Nevertheless, d = 0.64! That matches my predisposed bias so I should be happy. q.e.d. Case closed. I’m right and all the nay-sayers are wrong. Maybe, but the detail, as always, is worth looking at. Hattie suggests that the reason for the difference between d = 0.15 at primary level and d = 0.64 at secondary is that younger students can’t undertake unsupported study as well: they can’t filter out irrelevant information or avoid environmental distractions, and if they struggle, the overall effect can be negative.

At secondary level he suggests there is no evidence that prescribing homework develops time-management skills, and that the highest effects in secondary are associated with rote learning, practice or rehearsal of subject matter; more task-orientated homework has higher effects than deep learning and problem solving. Overall, the more complex, open-ended and unstructured tasks are, the lower the effect sizes. Short, frequent homework closely monitored by teachers has more impact than its converse forms, and effects are higher for higher-ability students than lower-ability students, and higher for older rather than younger students. Finally, the evidence is that teacher involvement in homework is key to its success.

So, what Hattie actually says about homework is complex.  There is no meaningful sense in which it could be stated that “the research says X about homework” in a simple soundbite.  There are some lessons to learn:

The more specific and precise the task is, the more likely it is to make an impact for all learners.  Homework that is more open, more complex is more appropriate for able and older students. Teacher monitoring and involvement is key – so putting students in a position where their learning is too complex, extended or unstructured to be done unsupervised is not healthy.  This is more likely for young children, hence the very low effect size for primary age students.

All of this makes sense to me and none of it challenges my predisposition to be a massive advocate for homework.  The key is to think about the micro- level issues, not to lose all of that in a ridiculous averaging process.  Even at primary level, students are not all the same.  Older, more able students in Year 5/6 may well benefit from homework where kids in Year 2 may not.  Let’s not lose the trees for the wood!  Also, what Hattie shows is that educational inputs, processes and outcomes are all highly subjective human interactions.  Expecting these things to be reduced sensibly into scientifically absolute measured truths is absurd.  Ultimately, education is about values and attitudes and we need to see all research in that context.

PS. If you are reading this from Sweden: thank you for reading. Let me know your thoughts on this question.

Update: Note that Hattie himself has commented on this blog post: https://teacherhead.com/2012/10/21/homework-what-does-the-hattie-research-actually-say/comment-page-1/#comment-536

(Slides from a Teach First session on homework are here: Teach First Homework )

See also  Setting Great Homework: The Mode A:Mode B approach.


87 comments.


“The biggest mistake Hattie makes is with the CLE statistic that he uses throughout the book. In ‘Visible Learning’, Hattie only uses two statistics, the ‘Effect Size’ and the CLE (neither of which Mathematicians use).

The CLE is meant to be a probability, yet Hattie has it at values between -49% and 219%. Now a probability can’t be negative or more than 100% as any Year 7 will tell you.”

https://ollieorange2.wordpress.com/2014/08/25/people-who-think-probabilities-can-be-negative-shouldnt-write-books-on-statistics/


Nice post… though the homework discussed here targets students who are a bit older. For pupils at elementary level, or younger than four years, assignments may not be something to take seriously.


I’m reading this from Sweden (although I am originally from the UK and trained to teach there). Seven years ago I did a piece of small action research as part of my masters degree looking at how I used homework. I was in the UK at the time. My findings, although much smaller scale, would support Hattie. Homework has an impact but you must design it properly was my basic conclusion. My problem is having to deal with the huge number of parent conversations and the societal attitude towards homework in Sweden (which is generally negative), and towards school in general really, fuelled by the media and its anti-homework stance. My sense of that attitude is that academic learning should happen only in school. Even getting parents to do something as simple as reading with their child can cause endless arguments. No, it’s not all parents, and it does depend a lot on your location. But that’s just my experience of the schools I have worked in.

Hi LUNATIKSCIENCE,

I am currently in the process of looking into a whole school home learning policy and I would be really interested to read the work you did. I have been trying to read as much research into home learning as possible, but getting some actual data would be great.

Would you be able to share any additional information in regards to your findings?

Many thanks Alasdair

Hi, sorry just noticed this comment. I can send you my paper that I wrote if you are still interested.

Yes please. That would be so useful.

[email protected]

Thanks very much.

Hi, I would also be really interested if you were happy to share your research. I am DH working in a Prep school that is in the midst of analysing our approach to homework. Thanks


But you make the assumption that educational achievement is, per se, the only thing affected. Easy to see from the teacher/school perspective. But from the parent/home perspective, there may be many more valuable activities going on that are much more important than homework to the growth of the human being. So these things need to be taken into account too. Where homework detracts from the time spent on these, it could be good from a school education point of view but bad from a more all-round education point of view.


This is an excellent summary of Hattie’s work and gives us all food for thought about what could be meaningful and helpful and what to avoid when considering homework.



ORIGINAL RESEARCH article

The Effects of Teachers' Homework Follow-Up Practices on Students' EFL Performance: A Randomized-Group Design

Pedro Rosário*

  • 1 Departamento de Psicologia Aplicada, Escola de Psicologia, Universidade do Minho, Braga, Portugal
  • 2 Departamento de Psicologia, Universidad de Oviedo, Oviedo, Spain
  • 3 Vicerrectoría Académica, Universidad Central de Chile, Santiago de Chile, Chile
  • 4 Facultad de Educación, Universidad Autónoma de Chile, Santiago de Chile, Chile

This study analyzed the effects of five types of homework follow-up practices (i.e., checking homework completion; answering questions about homework; checking homework orally; checking homework on the board; and collecting and grading homework) used in class by 26 teachers of English as a Foreign Language (EFL) using a randomized-group design. Once a week, for 6 weeks, the EFL teachers used a particular type of homework follow-up practice they had previously been assigned to. At the end of the 6 weeks students completed an EFL exam as an outcome measure. The results showed that three types of homework follow-up practices (i.e., checking homework orally; checking homework on the board; and collecting and grading homework) had a positive impact on students' performance, thus highlighting the role of EFL teachers in the homework process. The effect of EFL teachers' homework follow-up practices on students' performance was affected by students' prior knowledge, but not by the number of homework follow-up sessions.

Introduction

Homework is defined as a set of school tasks assigned by teachers to be completed by students out of school (Cooper, 2001). Several studies have shown the positive impact of this instructional tool in enhancing students' school performance and developing study skills, self-regulation, school engagement, discipline, and responsibility (e.g., Cooper et al., 2006; Rosário et al., 2009, 2011; Buijs and Admiraal, 2013; Hagger et al., 2015).

In the homework process teachers have two major tasks: designing and setting activities (Epstein and Van Voorhis, 2001, 2012; Trautwein et al., 2009a), and checking and/or providing homework feedback to students (Trautwein et al., 2006b; Núñez et al., 2014). Cooper (1989) called the latter “classroom follow-up” (p. 87). Classroom follow-up includes feedback provided by the teacher (e.g., written comments, marking homework, and incentives; Cooper, 1989, 2001). Hattie and Timperley (2007) defined feedback as the information provided by an educational agent or by the student (self) on aspects of performance. Feedback is an important source of information for checking answers (Narciss, 2004) and improving academic performance (Nicol and Macfarlane-Dick, 2006; Shute, 2008; Duijnhouwer et al., 2012). According to Walberg and Paik (2000), feedback is “the key to maximizing the positive impact of homework” (p. 9) because teachers take advantage of the opportunity to reinforce the work that was well done by the students or teach them something new that would help them improve their work. Moreover, Cooper (1989, 2001) argued that the way teachers manage students' homework assignments presented in the classroom may influence how much students benefit from homework.

Research on homework, with a particular focus on the homework follow-up practices commonly used by teachers, has looked into various practices such as homework control perceived by students (e.g., Trautwein et al., 2006a, b), teachers' feedback on homework (Cardelle and Corno, 1981; Elawar and Corno, 1985), and feedback on homework perceived by students (e.g., Xu, 2008, 2010). Studies conducted in several countries (e.g., Germany, Hong Kong, Singapore) reported homework control (i.e., checking whether students have completed their homework) as the homework follow-up practice teachers use in class most often at the elementary and middle school levels (see Trautwein et al., 2009a; Kaur, 2011; Zhu and Leung, 2012). However, studies carried out in mathematics and French as a second language concluded that controlling homework completion as reported by middle school students, or controlling students' homework style as reported by teachers (e.g., “By looking at a student's assignment, I can quickly tell how much effort he/she has put into it”), did not have any effect on middle school students' achievement (Trautwein et al., 2002, 2009a). To our knowledge, only the study by Trautwein et al. (2006b) found a positive predictive effect of homework control perceived by middle school students on students' homework effort in French as a Foreign Language at the student level, but not at the class level.

Regarding homework feedback, Walberg and Paik (2000) described “[homework feedback as] the key to maximizing the positive impact of homework” (p. 9). In fact, the literature has evidenced a positive relationship between homework feedback and students' outcomes. For example, Xu (2008, 2010) examined the benefits of homework feedback using a measure of teachers' feedback on homework. This measure assessed middle and high school students' perceptions on topics such as: discussing homework, collecting homework, checking homework, grading homework [i.e., assigning numerical grades for homework], and counting homework completion toward students' overall grade. However, Xu (2008, 2010) did not analyze the impact of any particular feedback practice. The same author found a positive relationship between homework feedback provided by teachers (as perceived by the middle and high school students) and students' interest in homework (Xu, 2008); students' homework management (Xu, 2012; Xu and Wu, 2013); and students' homework completion (Xu, 2011). More recently, Núñez et al. (2014) analyzed the relationship between teachers' homework feedback as perceived by students from the fifth to the twelfth grade and academic achievement, and reported an indirect relationship between homework feedback and academic achievement through students' homework behaviors (e.g., amount of homework completed).

Other studies on homework have examined the effects of written feedback on students' academic outcomes. In particular, Cardelle and Corno (1981) and Elawar and Corno (1985) examined the effects of three types of written homework feedback (i.e., praise, constructive criticism, and constructive criticism plus praise) using an experimental design, and concluded that students' performance when given constructive criticism plus praise was higher than when given the other two types of feedback, in primary education (Elawar and Corno, 1985) and in higher education (Cardelle and Corno, 1981). These results stress how important teachers' feedback may be, not only because of its positive effect on homework, but also because it provides students with information on how to improve their work (Cardelle and Corno, 1981). The synthesis by Walberg et al. (1985) confirmed the results of previous studies and showed that “commented upon or graded homework” (p. 76) increased the positive effect of homework on the academic achievement of elementary and secondary students.

The literature has shown the effect of some teachers' homework follow-up practices on students' homework behaviors and academic achievement ( Xu, 2012 ; Xu and Wu, 2013 ; Núñez et al., 2014 ), yet the use of different measures and sources of information (e.g., see Trautwein et al., 2006b , 2009a ) makes it difficult for researchers to draw conclusions about the benefits of the various types of homework follow-up practices. Moreover, Trautwein et al. (2006b) suggested that future studies should include other dimensions of teachers' homework practices (e.g., checking homework completion, grading homework). However, to our knowledge, research has not yet analyzed the effects of the various types of homework follow-up practices used by teachers.

To address this call, we used a quasi-experimental design in a study conducted in an authentic learning environment in order to analyze the relationship between five types of homework follow-up practices (i.e., 1, Checking homework completion ; 2, Answering questions about homework ; 3, Checking homework orally ; 4, Checking homework on the board ; and 5, Collecting and grading homework ) used by EFL teachers and their students' performance in English. Findings may be useful to school administrators and teachers as they may learn and reflect upon the effects of the homework follow-up practices used in class, which may in turn promote homework effectiveness and school success.

Considering the scarcity of results from prior studies, it was not possible to establish specific hypotheses regarding the relationship between the type of homework feedback and students' academic performance. However, taking into account the nature of each type of feedback and its implications for students' learning processes, in this study we hypothesize that:

(1) The types of homework feedback analyzed are differentially associated with student academic performance (increasing from types 1–5);

(2) The magnitude of the impact of the types of teacher homework feedback on academic performance is associated with students' prior level of performance.

Participants

A randomized-group design study was conducted in which 45 EFL teachers (classes) were randomly assigned to five homework follow-up conditions (nine EFL teachers per condition). Nineteen teachers were excluded from the study for various reasons (three were laid off, six did not give an accurate report of the procedures followed or submit the data requested, and 10 did not follow the protocol closely). In the end, 26 EFL teachers (20 females) aged 28–54 participated in the study. The final distribution of the teachers per condition was as follows: Type 1 (4); Type 2 (3); Type 3 (5); Type 4 (15); Type 5 (2). Participants had 3–30 years of teaching experience (M = 19) and taught English to a total of 553 sixth-graders at six state schools in the north of Portugal. Students' ages ranged from 10 to 13 (M = 11.05; SD = 0.87), and there were 278 girls (50.3%) and 275 boys (49.7%).

Learning English as a foreign language is compulsory from fifth to ninth grade in all Portuguese middle schools. Middle school is divided into two stages: the first stage includes fifth and sixth grade (age range 10–11), and the second stage includes seventh to ninth grade (age range 12–14). Our study was conducted with sixth grade students, which is the last year of the first stage. English is taught in two 90-min weekly lessons. As the Portuguese public school system has not enacted any specific homework policies, teachers are free to decide on the amount, frequency, and type of homework they design. This study was carried out in accordance with the recommendations of the ethics committee of the University of Minho, with written informed consent from all subjects enrolled (i.e., teachers and their students). All subjects gave written informed consent in accordance with the Declaration of Helsinki.

The two English performance measures used in this study were collected from the schools' secretary's offices. Prior performance (used as a pretest) was obtained from students' grades in a final English exam completed at the end of the previous school year (end of June). Fifth-grade EFL students from the six public schools enrolled in the study (all from the same region of the country) completed the same non-standardized exam at the end of the school year (June). This English exam comprised 30 questions on reading comprehension skills, vocabulary, and grammar, which were calibrated by a group of EFL teachers from all the intervening schools.

Final academic performance (used as a posttest) was obtained from the students' grades in a final English exam set up specifically for this study and completed at the end of it (beginning of November). The posttest exam was made up of 20 questions designed to assess students' reading comprehension skills, vocabulary, and grammar (contents covered in homework assignments 1, 2, 4, and 5), translation skills from English into Portuguese and vice versa, and writing of a short text (5–10 lines; contents covered in homework assignments 3 and 6). The exam lasted 45 min. Grades in the Portuguese compulsory educational system (first to ninth grade) range from 1 to 5, where 1 and 2 are fail, 3 is pass, 4 is good, and 5 is excellent.

To accomplish our goal, the types of homework follow-up practices were selected from the ones identified in the literature (e.g., Walberg et al., 1985 ; Murphy et al., 1987 ; Cooper, 1989 , 2001 ; Trautwein et al., 2006b ). To learn which homework follow-up practices were used by teachers in class to deal with students' delivery of homework assignments, 15 Portuguese middle school EFL teachers were invited to participate in two focus group interviews (one group comprised seven teachers and the other eight teachers). Note that these EFL teachers did not participate in the research intervention.

Findings from this ancillary study confirmed the two homework follow-up practices reported in the literature (i.e., checking homework completion, and collecting and grading homework), and identified three additional practices, which were used in the current study. Data from this ancillary study will not be described in detail due to space constraints. Nevertheless, some examples of each homework practice are presented in Table 1.


Table 1. The five types of homework follow-up practices exemplified with quotations from the participating teachers in the focus group interviews.

Five homework follow-up practices were included in our study as follows: (1) Checking homework completion; (2) Answering questions about homework; (3) Checking homework orally; (4) Checking homework on the board; and (5) Collecting and grading homework. Types 1 and 5 were based on the literature ( Walberg et al., 1985 ; Murphy et al., 1987 ; Trautwein et al., 2006b ), and types 2–4 emerged in the focus group interviews with the EFL teachers, and were included in this study because of their local relevance.

Data were collected at the beginning of the school year (between mid-September and end of October) after obtaining permission from schools' head offices. EFL teachers confirmed their intention to participate via email, and from those who had confirmed participation, 45 and their students were randomly selected. Two weeks before the beginning of the study, the 45 EFL teachers participated in a 4-h information meeting which explained the project's aims and the research design in detail (e.g., analysis and discussion of the format and content of the English exam to assess students' performance; and information on the frequency, number, and type of homework assignments; guidelines to mark the homework assignments; and the five types of homework follow-up practices). Additionally, teachers were informed that they would be randomly assigned to an experimental condition, and the associated methodological reasons were discussed with the participants. All teachers agreed and were then randomly assigned to one of the five homework follow-up conditions (nine teachers per condition). However, only 26 teachers completed the study (see Section Participants). At the meeting, all teachers agreed to assign homework to their students only once a week (in the first class of the week) and to check homework completion in the following class using the type of homework follow-up condition they had been assigned to. The six homework assignments were extracted from the English textbook and common to all participants. Two different types of homework were assigned. The first type had reading comprehension, vocabulary, and grammar questions (homework assignments 1, 2, 4, and 5). The second type (homework assignments 3 and 6) had a translation exercise from English into Portuguese and vice versa, and writing of a short text in English (5–10 lines). After selecting the homework exercises, teachers worked on the guidelines to mark each homework assignment, and built a grade tracking sheet to be filled in with information regarding each student and each homework. The grade tracking sheet filled in with students data was delivered to researchers in the following class.

At the end of each lesson, the students noted down the instructions for the homework assignment in their notebooks and completed it out of class.

The researchers gave the EFL teachers extensive training on the homework follow-up practices in order to guarantee that all the participants under each condition followed the same protocol. During the information meeting a combination of theory and practice, open discussion, and role-playing exercises were used.

For each condition, the protocol was as follows.

For homework follow-up condition no. 1 (checking homework completion), the teacher began the class by asking students whether they had completed their homework assignment (i.e., yes, no) and recorded the data on a homework assignment sheet.

For homework follow-up condition no. 2 (answering questions about homework), the teacher began the class by asking students if they had any questions about the homework assignment (e.g., “Please ask any questions if there is something in the homework which you did not understand.”), in which case the teacher would answer them.

For homework follow-up condition no. 3 (checking homework orally), the teacher began the class by checking homework orally. Under this condition the teachers proactively read the homework previously assigned to students and orally checked all the tasks or questions (i.e., the teacher read the questions and students answered them aloud, followed by an explanation of the mistakes made by students).

For homework follow-up condition no. 4 (checking homework on the board), the teacher started the class by writing the answer to each of the homework questions on the board. Following the explanation of a specific question or task, the EFL teachers explicitly asked the class, “Do you have any other questions?” and moved on to the next question.

In the case of homework follow-up condition no. 5 (collecting and grading homework), the teacher began the class by handing out individually checked and graded homework to students. For homework assignments 1, 2, 4, and 5 (i.e., reading comprehension, vocabulary, and grammar questions), the EFL teachers pointed out which of the answers were incorrect and provided the correct answer. A numerical grade for each of the exercises and a global grade were awarded. For the second type of homework assignment (3 and 6; i.e., translation from English into Portuguese and vice versa, and writing of a short text in English), the EFL teachers made comments on the text in terms of content and style, and gave a numerical grade. Students were encouraged to read the teachers' comments on their homework and were asked if they had any questions.

To guarantee the reliability of the measurements (i.e., whether the EFL teachers followed the protocol), three research assistants were present at the beginning of each class. For 15 min, the research assistants took notes on the type of homework follow-up used by the teachers using a diary log. The level of overall agreement among the research assistants was estimated with Fleiss's kappa (Fleiss, 1981). According to Landis and Koch (1977), the reliability among the research assistants may be rated as good (κ = 0.746; p < 0.001).
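
As an illustration of how this agreement statistic can be computed, here is a minimal sketch in Python using statsmodels; the ratings below are invented placeholders rather than the study's diary-log data:

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical data: each row is one observed lesson, each column one of the
# three research assistants; values are the follow-up type (1-5) they recorded.
ratings = np.array([
    [1, 1, 1],
    [3, 3, 2],
    [4, 4, 4],
    [5, 5, 5],
    [2, 2, 2],
])

# aggregate_raters turns raw ratings into an (items x categories) count table.
table, _ = aggregate_raters(ratings)
kappa = fleiss_kappa(table, method="fleiss")
print(f"Fleiss's kappa = {kappa:.3f}")  # the paper reports kappa = 0.746 ("good")
```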

Data from the 19 EFL teachers who did not follow the protocol for their assigned homework follow-up condition were not included in the data set. Three weeks after the study, the EFL teachers attended a 2-h post-research evaluation meeting with the aim of discussing their experience (e.g., comments and suggestions that could help in future studies; difficulties faced in implementing their experimental condition; reasons for not following the protocol) and analyzing preliminary data. At the end of the six homework follow-up sessions, students completed a final English exam as a measure of academic performance (posttest).

Data Analysis

Each of the five homework follow-up practices was to be administered by the same number of EFL teachers (nine). However, as mentioned above, 19 EFL teachers were excluded from the study, which led to an uneven distribution of the participating teachers across the five conditions. As the number of homework follow-up sessions was not the same across types, it was not possible to guarantee the independence of these two variables (i.e., number of homework follow-up sessions and type of homework follow-up practice). Thus, the amount of treatment (the number of homework follow-up sessions) was taken as a control variable. The effect of the EFL teachers nested within the treatment levels (the five homework follow-up practices) was also controlled, consistent with the type of design used (cluster randomized design). Furthermore, students' prior performance was controlled because of its potential to influence the relationship between homework and academic achievement (Trautwein et al., 2002, 2009b).

Finally, the design included an independent variable (type of homework follow-up), a dependent variable (post-homework follow-up academic performance), and two covariates (number of homework follow-up sessions administered and performance prior to homework follow-up). The statistical treatment of the data was carried out using analysis of covariance (ANCOVA). Data analysis followed a two-stage strategy. First, we examined whether prior performance (pretest) significantly explained academic performance at posttest (i.e., we tested whether the regression slopes were null). If the slopes were null, it would not be necessary to include the covariate in the model, and an ANOVA model would be fitted. If they were not, in the second stage it would be necessary to verify whether the regression slopes were parallel (that is, whether the relationship between prior and final performance was similar across the different types of homework follow-up). Finally, if the parallelism assumption was accepted, paired comparisons between the adjusted homework follow-up type means (i.e., purged of covariate correlations) would be run using the false discovery rate (FDR) method developed by Benjamini and Hochberg (1995) (BH).

Data were analyzed using SAS version 9.4 [ SAS Institute, Inc., (SAS), 2013 ]. The hypotheses referring to nullity and parallelism of the regression slopes were tested using SAS PROC MIXED with the solution proposed by Kenward and Roger (2009) . PROC MIXED allows the use of a linear model that relaxes the assumption of constant variance (for details, see Vallejo et al., 2010 ; Vallejo and Ato, 2012 ). The post-hoc contrasts were done using the ESTIMATE expression in SAS PROC MIXED and the BH/FDR option in SAS PROC MULTITEST.
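
The analyses were run in SAS; purely as an illustration of the two-stage strategy, a rough single-level analogue in Python with statsmodels might look like the sketch below. The data frame and column names are hypothetical, and ordinary least squares does not reproduce the heterogeneous-variance and Kenward and Roger adjustments described above:

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multitest import multipletests

# Hypothetical data frame: one row per student, with posttest and pretest EFL
# grades (1-5), the follow-up condition (1-5), and the number of sessions.
df = pd.read_csv("homework_followup.csv")  # placeholder file name

# Stage 1: allow separate slopes per condition and test nullity/parallelism of
# the covariate regression slopes via the type-by-covariate interaction terms.
full = smf.ols(
    "posttest ~ C(followup_type) * pretest + C(followup_type) * n_sessions",
    data=df,
).fit()
print(anova_lm(full, typ=3))

# Stage 2: if the interactions are negligible, fit the equal-slopes ANCOVA
# (the final model reported below also drops the number-of-sessions covariate).
ancova = smf.ols("posttest ~ C(followup_type) + pretest", data=df).fit()
print(ancova.summary())

# Post-hoc pairwise contrasts between conditions can then be adjusted with the
# Benjamini-Hochberg false discovery rate, for example:
pvals = [0.04, 0.01, 0.20, 0.03]  # placeholder p-values from the contrasts
reject, p_adjusted, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
```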

Descriptive Statistics

Table 2 shows the descriptive statistics of the homework follow-up type variable and the two covariates (prior performance and the number of homework follow-up sessions).


Table 2. Descriptive statistics of the variable homework follow-up practice and covariates (prior performance and number of times feedback is provided).

Analysis of Covariance

Null Regression Curve Test

To determine whether prior performance (pretest) significantly explained academic performance at posttest, a type III sum of squares model without an intercept was created. This model included the homework follow-up type (A) and the interactions of homework follow-up type with the covariates prior performance (X1) and number of homework follow-up sessions (X2); that is, A × X1 and A × X2. The information obtained in this analysis made it possible to estimate regression slopes for each level of the homework follow-up type variable, and to evaluate their nullity and, to a certain extent, their parallelism. In summary, the technique used aimed to determine whether the covariates (number of homework follow-up sessions administered and performance prior to homework follow-up) modified the interaction between homework follow-up type and final performance. Table 3 addresses this question and shows two model effects: the principal effect (A) and the secondary effects (A × X1 and A × X2).


Table 3. Estimators of interaction parameters obtained in the first modeling stage after creating a regression model without an intercept.

Data show that all regression coefficients involving the prior performance covariate were statistically significant (p < 0.001), with very similar values across the levels of the homework follow-up type variable (between 0.86 and 0.96). Thus, we may conclude that the slopes were not null. A strong similarity was also observed among the regression coefficients involving the number of homework follow-up sessions, which, with the exception of the coefficient corresponding to level 2 of the homework follow-up type variable (b = 0.15), were also statistically significant (p = 0.011).

Parallel Regression Slope Test

To test the hypothesis of parallelism of the regression slopes for the covariates prior performance (X1) and number of homework follow-up sessions (X2) on final academic performance, the interaction components A × X1 and A × X2 of Model A, shown in Table 4, are of particular interest.


Table 4. Results of fitting three ANCOVA models and one ANOVA model during the second stage of the modeling strategy .

The data show that the regression slope parallelism hypothesis was not rejected [F(4, 160) = 0.62, p = 0.646 and F(4, 144) = 2.20, p = 0.071], although the interaction between the number of homework follow-up sessions and the type of homework follow-up turned out to be only marginally non-significant. Thus, we provisionally adopted the ANCOVA model with equal slopes to describe the influence of the covariates on homework follow-up type. Note that the variance component for students who received homework follow-up type no. 1 was approximately five times that for students receiving type no. 5. Thus, to control for the heterogeneity of the data, the GROUP= option in SAS PROC MIXED was used together with the Kenward–Roger adjustment of the degrees of freedom (Kenward and Roger, 2009). Moreover, the variance component for EFL teachers nested within the homework follow-up types was not statistically significant (z = 0.15, p = 0.44), so we proceeded with the single-level ANCOVA model.
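As a rough sketch of how such a specification might look in SAS (names hypothetical, and merely illustrative of the options named above):

    /* Stage 2 (sketch, hypothetical names): equal-slopes ANCOVA with teachers
       nested within follow-up type and heterogeneous residual variances. */
    proc mixed data=hwdata method=reml covtest;   /* covtest: Wald z tests for variance components */
      class hwtype teacher;
      model posttest = hwtype pretest sessions / solution ddfm=kr;
      random teacher(hwtype);    /* variance component for teachers nested in type */
      repeated / group=hwtype;   /* a separate residual variance per follow-up type */
    run;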

Findings indicate that the differences among the various homework follow-up types did not depend on the teacher who used them. This preliminary result nevertheless stresses the relevance of conducting multilevel designs that analyze data at two levels, student and class. This finding is aligned with that of Rosário et al. (2013), who found a small effect in the relationship between teachers' reported approaches to teaching and students' reported approaches to learning.

Table 4 also shows information on the fit of the other ANCOVA models with identical slopes, Model B and Model C. Model B shows that the types of homework follow-up did not differ in terms of the number of homework follow-up sessions provided by the EFL teachers (X2) [F(1, 373) = 0.16, p = 0.689]. Note that the ANCOVA model with equal regression slopes that left out the number of homework follow-up sessions (Model C) was more parsimonious and showed the best fit: the model with the smallest values of the information criteria, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), is the model that best fits the data.
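For example, Model C (equal slopes, prior performance as the only covariate) could be fitted and its fit indices collected as sketched below, again with hypothetical names; among models fitted to the same data, the one with the smaller AIC and BIC is preferred.

    /* Model C (sketch, hypothetical names): common pretest slope only. */
    proc mixed data=hwdata method=reml;
      class hwtype teacher;
      model posttest = hwtype pretest / solution ddfm=kr;
      random teacher(hwtype);
      repeated / group=hwtype;
      ods output FitStatistics=fitC;   /* stores -2 log-likelihood, AIC, and BIC */
    run;

    proc print data=fitC noobs;        /* compare with the analogous tables for Models A and B */
    run;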

The ANCOVA model with equal slopes is shown in Figure 1. The essential characteristics of the model are worth noting: separate regression lines for each type of homework follow-up, with approximately parallel slopes across the homework follow-up types. Figure 1 also shows two subsets of means, each containing means that barely differed from each other and were thus considered statistically equal. These subsets encompassed, on the one hand, the first two levels of the homework follow-up type variable (types 1 and 2) and, on the other hand, the last three levels of the variable. The common regression slope (b = 0.882) relating prior performance to final performance, averaged over all levels of homework follow-up type, was statistically significant [t(467) = 36.86, p < 0.001].


Figure 1. Pretest performance level .

Comparisons between the Adjusted Homework Follow-up Type Means

The common slope (b = 0.882) was used to calculate the final performance means adjusted for the effect of the prior performance covariate. Purged of the correlation with the prior performance covariate, the adjusted final EFL performance means were A1 = 3.14, A2 = 3.11, A3 = 3.44, A4 = 3.88, and A5 = 4.03.
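For readers unfamiliar with covariate adjustment, the underlying logic is that of the textbook single-covariate ANCOVA correction,

    adjusted mean_j = posttest mean_j − b × (pretest mean_j − grand pretest mean),

although the values reported here were obtained from the fitted mixed model described above (which also accommodates the unequal variances), so this formula should be read only as an approximation of the logic involved rather than as the exact computation used.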

Given the two homogeneous subsets of means previously detected, the family of pairwise comparisons shown in Table 5 was tested. To control the type I error rate at the chosen level of significance (α = 0.05) across the specified family of contrasts, and allowing for heterogeneity of variances, the ESTIMATE statement in SAS PROC MIXED was used together with the BH/FDR option in SAS PROC MULTTEST. As indicated in the last column of Table 5, the procedure detected statistically significant differences (p < 0.05) in five of the six contrasts analyzed (see also Figure 2).
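A sketch of how these contrasts and the BH/FDR correction could be chained in SAS is shown below; the six contrasts are assumed to be the between-subset pairs, and all names, including the Probt column assumed for the ODS Estimates table, are illustrative rather than taken from the authors' code.

    /* Pairwise contrasts between adjusted follow-up-type means (sketch). */
    proc mixed data=hwdata method=reml;
      class hwtype teacher;
      model posttest = hwtype pretest / solution ddfm=kr;
      random teacher(hwtype);
      repeated / group=hwtype;                 /* heterogeneous residual variances */
      estimate 'Type 1 vs Type 3' hwtype 1 0 -1  0  0;
      estimate 'Type 1 vs Type 4' hwtype 1 0  0 -1  0;
      estimate 'Type 1 vs Type 5' hwtype 1 0  0  0 -1;
      estimate 'Type 2 vs Type 3' hwtype 0 1 -1  0  0;
      estimate 'Type 2 vs Type 4' hwtype 0 1  0 -1  0;
      estimate 'Type 2 vs Type 5' hwtype 0 1  0  0 -1;
      ods output Estimates=contrasts;          /* labels, estimates, raw p-values */
    run;

    /* Benjamini-Hochberg adjustment of the six raw p-values. */
    data pvals;
      set contrasts;
      raw_p = Probt;     /* raw_p is the variable name read from a PDATA= data set */
    run;

    proc multtest pdata=pvals fdr;
    run;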


Table 5. Pairwise comparisons between the homework follow-up practices based on ANCOVA BH/FDR that controlled for prior performance .


Figure 2. Types of homework follow-up practices .

Discussion of Results

This study analyzed whether the relationship between academic performance and homework follow-up practices depended on the type of homework follow-up practice used in class. We found that the five types of homework feedback were differentially associated with student academic performance, despite the unbalanced number of teachers in each condition and the low number of sessions (six). The magnitude of the effects found was small, which may be due to these two limitations. Data from the ancillary analysis (the two focus groups conducted to identify the types of homework follow-up used by EFL teachers in class) and from the post-research evaluation meeting held with the participating teachers contributed to the discussion of our findings.

Types of EFL Teachers' Homework Follow-up Practices and Academic Performance

As Model C (see Table 4) shows, once the effect of the pretest was controlled for, the differences among the types of EFL teachers' homework follow-up practices on students' performance were statistically significant, as hypothesized. Moreover, considering the positive value of the coefficients shown in Table 4, the data indicate that students' performance improved from homework follow-up types 1 to 5 (see also Figure 2), and also that the differences between the five homework follow-up types are not all of the same magnitude. In fact, after controlling the error rate for the family of comparisons using the FDR procedure, two homogeneous subsets of treatment means were identified. The first subset encompassed homework follow-up types 1 and 2, whereas the second comprised homework follow-up types 3–5. As shown in Table 5, significant differences were found between the adjusted treatment means of the two subsets (homework follow-up types 1 and 2 vs. homework follow-up types 3–5).

What are the commonalities and differences between these two subsets of homework follow-up types that could help explain these findings? Homework follow-up types 1 and 2 did not yield differences in school performance. One possible explanation might be that neither of these types of homework follow-up provides specific information about the mistakes made by students, information which could help them improve their learning in the way that EFL teachers' feedback does (Hattie and Timperley, 2007). Besides, as the control over homework completion is low for these two types of homework follow-up practices, students may not have put in the appropriate effort to complete their homework. The following statement, shared by most of the teachers who participated in the focus group, may help explain this latter finding: “[in class] I only ask students if they have done their homework. I know that this strategy does not help them correct their mistakes, but if I don't do it, I suspect they will give up doing their homework …” (F2P3).

In homework follow-up type 2, EFL teachers only addressed difficulties mentioned by the students, so some mistakes may not have been addressed and checked by the EFL teachers. This type of practice does not provide feedback to students, as the following quotation from a focus group participant reveals: “At the beginning of the class, I specifically ask students if they have any questions about their homework. The truth is, students who struggle to learn seldom ask questions…I guess that they don't do their homework, or they copy the answers from peers during the break, and just asking questions does not help a lot…but they are 28 in class.” (F2P4).

The second group of homework follow-up practices includes types 3–5. Our data indicate that there were no statistically significant differences among these three types of homework follow-up (intra-group comparisons) in posttest performance (see Table 4). Under each of these three conditions (homework follow-up types 3–5), homework contents were checked by the teacher, so students had opportunities to analyze EFL teachers' explanations and to check their mistakes, which may help explain our findings and those of previous studies (see Cardelle and Corno, 1981; Elawar and Corno, 1985).

According to Cooper's model (1989, 2001), homework follow-up type 5 may be considered the homework feedback practice proper, because when EFL teachers grade students' assignments and provide individual feedback, students' learning improves. This idea was mentioned by one of our participants: “I collect students' exercise books, not every day, but often enough. That is because I've learned that my students improve whenever I comment upon and grade their homework assignments. I wish I had time to do this regularly…That would be real feedback, that's for sure.” (F1P6).

When analyzing students' conceptions of feedback, Peterson and Irving (2008) concluded that students believe that having their reports graded is a “clearer and more honest” (p. 246) type of feedback. These authors also argued that good grades generate tangible evidence of students' work for parents, which may in turn open another opportunity for feedback (e.g., praise) delivered by parents and peers (Núñez et al., 2015). It is likely that students see graded homework as more worthwhile than other types of homework follow-up practices (e.g., answering questions about homework). This idea is consistent with studies that found a positive association between homework effort and achievement (e.g., Trautwein et al., 2006b, 2009b). Walberg et al. (1985) claimed that graded homework has a powerful effect on learning. However, Trautwein et al. (2009a) cautioned that graded homework may have a negative impact whenever it is experienced as overcontrolling, as “…students may feel tempted to copy from high-achieving classmates to escape negative consequences” (p. 185). These findings (Trautwein et al., 2006b, 2009a,b), together with ours, suggest the need to analyze homework feedback in more depth. For example, several variables were not considered in the current research (e.g., number of students per class, number of different grade levels or classes teachers teach, different levels of students' expertise in class, type of content domain, but also career-related issues such as frozen salaries and reduced retirement costs), which may help explain our results.

We also noticed that the effect of EFL teachers' homework follow-up practices on performance was affected by students' prior performance, confirming our second hypothesis, but not by the number of homework follow-up sessions (the number of sessions was only marginally non-significant as a secondary effect, not as a principal effect). A quotation from a teacher under the third condition may help illustrate this finding: “reflecting on my experience under condition 3 [checking homework orally], I can tell that students' prior knowledge was very important for explaining the variations in the efficacy of this strategy. Some of my students, for example, attend language schools and master vocabulary and grammar, but others clearly need extra help. For example, checking homework on the board so that students may copy the answers and study them at home would be very beneficial for many of my students” (M15).

The results of this preliminary study were obtained in a real learning environment and focused on homework follow-up practices commonly used by EFL teachers. We acknowledge the difficulties of setting up and running a randomized-group design in a real learning environment (i.e., motivating teachers to participate, training teachers to follow the protocol, controlling the process). Still, we believe in the importance of collecting data on-task, and we consider that our preliminary findings may help teachers and school administrators organize school-based teacher training and educational policies on homework. For example, studies conducted in several countries (e.g., Germany, Hong Kong, Singapore, Israel) reported that checking homework completion is the homework follow-up practice most often used by teachers to keep track of students' homework (e.g., Trautwein et al., 2009a; Kaur, 2011; Zhu and Leung, 2012), and in some cases the only homework follow-up practice used in class (e.g., see Kukliansky et al., 2014). However, this type of homework follow-up does not provide students with appropriate information on how they may improve their learning. Our data show that, when EFL teachers offer individual and specific information to help students progress (e.g., homework correction, graded homework), the impact on school performance is higher, even when this help is provided for only 6 weeks. This main finding, which should be further investigated, may inform teachers' in-class practices and contribute to fostering students' positive behaviors toward homework and school achievement.

In sum, our findings indicate that the time and effort teachers devote to assessing, presenting, and discussing homework with students is worthwhile. In fact, students consider limited feedback an impediment to homework completion, and recognize teachers' feedback as a facilitator of homework completion (Bang, 2011).

During the focus group interviews, and consistent with findings by Rosário et al. (2015), several EFL teachers stressed that, despite their positive beliefs about the efficacy of delivering feedback to students, they cannot find the necessary time to provide feedback in class (e.g., commenting on and grading homework). This is due to, among other reasons, the long list of contents to cover in class and the large number of students per class. Pelletier et al. (2002) showed that the major constraint teachers perceive in their job is the pressure to follow the school curriculum. Data from the focus groups helped us understand our findings and highlight the need for school administrators to become aware of the educational constraints faced daily by EFL teachers at school and to find alternatives that support the use of in-class homework follow-up practices. Thus, we believe that teachers, directly, and students, indirectly, would benefit from teacher training on effective homework follow-up practices with a focus on, for example, how to manage the extensive curriculum and the available time, and on learning about different homework follow-up practices, mainly feedback. Several authors (e.g., Elawar and Corno, 1985; Epstein and Van Voorhis, 2012; Núñez et al., 2014; Rosário et al., 2015) have warned about the importance of organizing school-based teacher training with an emphasis on homework (i.e., purposes of homework, type of homework feedback, amount of homework assigned, schools' homework policies, and written homework feedback practices).

From the focus group interviews we also learned that several EFL teachers did not differentiate feedback from other homework follow-up practices, such as checking homework completion (e.g., see the F2P7 statement in Table 1). EFL teachers termed all the homework follow-up practices used in class feedback, despite the fact that some of these practices did not deliver useful information to improve the quality of students' homework and promote progress. These data suggest a need to foster opportunities for teachers to reflect upon their in-class instructional practices (e.g., type and purposes of the homework assigned, number and type of questions asked in class) and their impact on the quality of the learning process. For example, school-based teacher training focused on discussing the various types of homework follow-up practices and their impact on homework quality and academic achievement would enhance teachers' practice and contribute to improving their approaches to teaching (Rosário et al., 2013).

Limitations of the Study and Future Research

This study is a preliminary examination of the relationship between five types of EFL teachers' homework follow-up practices and performance in the EFL class. Therefore, some limitations must be addressed, as they may play a role in our findings. First, participating EFL teachers were assigned to one and only one of five homework follow-up conditions, but 19 of them were excluded for not adhering to the protocol. As a result, the number of EFL teachers under each condition was unbalanced, especially in the case of homework follow-up condition number 5. This should be kept in mind when interpreting the conclusions.

Several reasons explain why 19 EFL teachers were excluded from our research protocol: three were laid off, six did not correctly report the work done or did not submit the data requested, and ten did not follow the protocol closely. Nevertheless, during the post-research evaluation meeting the EFL teachers addressed this topic, which helped us understand their motives for not adhering to the protocol. For example: “I'm sorry for abandoning your research, but I couldn't collect and grade homework every week. I have 30 students in class, as you know, and it was impossible for me to spend so many hours grading.” (M7). Our findings suggest that teachers' attitudes toward homework follow-up practices are important, as is the need to set up educational environments that may facilitate their use in class.

We acknowledge the difficulty of carrying out experimental studies in authentic teaching and learning environments. Nevertheless, we decided to address the call by Trautwein et al. (2006b) and investigate teachers' homework practices in as ecologically valid a way as possible, in the natural learning environment of teachers and students.

Future studies should find ways to combine optimal control of variables with an authentic learning environment.

Second, a mixed type of homework follow-up practice (e.g., combining homework control and checking homework on the board) was not considered in the current study as an additional level of the independent variable. In fact, some of the excluded EFL teachers highlighted the benefits of combining various homework follow-up practices, as one EFL teacher remarked: “I was “assigned” condition 5 [collecting and grading homework], but grading and noting homework every week is too demanding, as I have five more sixth grade classes to teach. So, although I am certain that giving individualized feedback is better for my students, I couldn't do it for the six homework assignments as required. In some sessions I checked homework orally.” (M24). Thus, future studies should consider analyzing the impact of different combinations of types of homework follow-up practices. Our research also focused on sixth grade EFL teachers only. To our knowledge, there are no studies examining the impact of homework follow-up practices at different education levels, but it is plausible that the type and intensity of the homework follow-up practices used by teachers vary from one educational level to another. Hence, it would be interesting to examine whether our findings can be replicated in other grade levels or in different subjects. Furthermore, it would be beneficial to conduct this study in other countries in order to explore whether the follow-up practices identified by Portuguese EFL teachers match those found in other teaching and learning cultures.

Third, the fact that in our study the differences found were small suggests the importance of examining the type of homework follow-up used and students' interpretation of teachers' practice. Future studies may analyze the hypothesis that students' behavior toward teacher homework follow-up practices (e.g., how students perceive their teachers' homework follow-up practices; what students do with the homework feedback information given by teachers) mediates the effect of homework on student learning and performance. In fact, the way students benefit from their teachers' homework follow-up practice may help explain the impact of these practices on students' homework performance and academic achievement. Future studies may also consider conducting more large-scale studies (i.e., with optimal sample sizes) using multilevel designs aimed at analyzing how student variables (e.g., cognitive, motivational, and affective) mediate the relationship between teacher homework follow-up type and students' learning and academic performance.

Finally, future research could also consider conducting qualitative research to analyze teachers' conceptions of homework follow-up practices, mainly feedback (Cunha et al., 2015). This information may be very useful for improving homework feedback measures in future quantitative studies. Investigating teachers' conceptions of homework follow-up practices may help identify other homework feedback practices implemented in authentic learning environments. It may also help us understand the reasons why teachers use specific types of homework feedback, and explore the constraints they face daily in class when giving homework feedback. As one teacher in the focus group put it: “Unfortunately, I don't have time to collect and grade homework, because I have too many students and the content that I have to cover each term is vast. So I just check whether all students completed their homework” (F2P1).

This research was supported by the Portuguese Foundation for Science and Technology and the Portuguese Ministry of Education and Science through national funds, co-financed, when applicable, by FEDER under the PT2020 Partnership Agreement (UID/PSI/01662/2013); by the Spanish Ministry of Education and Science (projects EDU2014-57571-P and PSI-2011-23395); and by the Council of Economy and Employment of the Government of the Principality of Asturias, Spain (project FC-15-GRUPIN14-053).

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Bang, H. (2011). What makes it easy or hard for you to do your homework? An account of newcomer immigrant youths' afterschool academic lives. Curr. Issues Educ. 14, 1–26.

Benjamini, Y., and Hochberg, Y. (1995). Controlling the false discovery rate: a practical and powerful approach to multiple testing. J. R. Stat. Soc. Ser. B 57, 289–300.


Buijs, M., and Admiraal, W. (2013). Homework assignments to enhance student engagement in secondary education. Eur. J. Psychol. Educ. 28, 767–779. doi: 10.1007/s10212-012-0139-0


Cardelle, M., and Corno, L. (1981). Effects on second language learning of variations in written feedback on homework assignments. TESOL Q. 15, 251–261. doi: 10.2307/3586751

Cooper, H. (1989). Synthesis of research on homework. Educ. Leadersh. 47, 85–91.

Cooper, H. (2001). The Battle Over Homework: Common Ground for Administrators, Teachers, and Parents, 2nd Edn . Thousand Oaks, CA: Sage Publications.

Cooper, H., Robinson, J., and Patall, E. (2006). Does homework improve academic achievement? A synthesis of research, 1987-2003. Rev. Educ. Res . 76, 1–62. doi: 10.3102/00346543076001001

Cunha, J., Rosário, P., Macedo, L., Nunes, A. R., Fuentes, S., Pinto, R., et al. (2015). Parents' conceptions of their homework involvement in elementary school. Psicothema 27, 159–165. doi: 10.7334/psicothema2014.210


Duijnhouwer, H., Prins, F., and Stokking, K. (2012). Feedback providing improvement strategies and reflection on feedback use: effects on students' writing motivation, process, and performance. Lear. Instr. 22, 171–184. doi: 10.1016/j.learninstruc.2011.10.003

Elawar, M. C., and Corno, L. (1985). A factorial experiment in teachers' written feedback on student homework: changing teacher behavior a little rather than a lot. J. Educ. Psychol. 77, 162–173. doi: 10.1037/0022-0663.77.2.162

Epstein, J. L., and Van Voorhis, F. L. (2001). More than ten minutes: teachers' roles in designing homework. Educ. Psychol. 36, 181–193. doi: 10.1207/S15326985EP3603_4

Epstein, J., and Van Voorhis, F. (2012). “The changing debate: from assigning homework to designing homework,” in Contemporary Debates in Child Development and Education , eds S. Suggate and E. Reese (London: Routledge), 263–273.

Fleiss, J. L. (1981). Statistical Methods for Rates and Proportions, 2nd Edn. New York, NY: John Wiley & Sons, Inc.

Hagger, M., Sultan, S., Hardcastle, S., and Chatzisarantis, N. (2015). Perceived autonomy support and autonomous motivation toward mathematics activities in educational and out-of-school contexts is related to mathematics homework behavior and attainment. Contemp. Educ. Psychol. 41, 111–123. doi: 10.1016/j.cedpsych.2014.12.002

Hattie, J., and Timperley, H. (2007). The power of feedback. Rev. Educ. Res. 77, 81–112. doi: 10.3102/003465430298487

Kaur, B. (2011). Mathematics homework: a study of three grade eight classrooms in Singapore. Inter. J. Sci. Math. Educ. 9, 187–206. doi: 10.1007/s10763-010-9237-0

Kenward, M. G., and Roger, J. H. (2009). An improved approximation to the precision of fixed effects from restricted maximum likelihood. Comput. Stat. Data Anal. 53, 2583–2595. doi: 10.1016/j.csda.2008.12.013

Kukliansky, I., Shosberger, I., and Eshach, H. (2014). Science teachers' voice on homework: beliefs, attitudes, and behaviors. Inter. J. Sci. Math. Educ. doi: 10.1007/s10763-014-9555-8. [Epub ahead of print].

Landis, J. R., and Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics 33, 159–174. doi: 10.2307/2529310

Murphy, J., Decker, K., Chaplin, C., Dagenais, R., Heller, J., Jones, R., et al. (1987). An exploratory analysis of the structure of homework assignments in high schools. Res. Rural Educ . 4, 61–71.

Narciss, S. (2004). The impact of informative feedback and self-efficacy on motivation and achievement in concept learning. Exp. Psychol. 51, 214–228. doi: 10.1027/1618-3169.51.3.214

Nicol, D., and Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud. Higher Educ . 31, 199–218. doi: 10.1080/03075070600572090

Núñez, J. C., Suárez, N., Rosário, P., Vallejo, G., Cerezo, R., and Valle, A. (2014). Teachers' feedback on homework, homework-related behaviors and academic achievement. J. Educ. Res . 108, 204–216. doi: 10.1080/00220671.2013.878298


Núñez, J. C., Suárez, N., Rosário, P., Vallejo, G., Valle, A., and Epstein, J. L. (2015). Relationships between perceived parental involvement in homework, student homework behaviors, and academic achievement: differences among elementary, junior high, and high school students. Metacogn. Learn. doi: 10.1007/s11409-015-9135-5. [Epub ahead of print].

Pelletier, L. G., Séguin-Lévesque, C., and Legault, L. (2002). Pressure from above and pressure from below as determinants of teachers' motivation and teaching behaviors. J. Educ. Psychol. 94, 186–196. doi: 10.1037/0022-0663.94.1.186

Peterson, E., and Irving, S. (2008). Secondary school students' conceptions of assessment and feedback. Learn. Instr. 18, 238–250. doi: 10.1016/j.learninstruc.2007.05.001

Rosário, P., Mourão, R., Baldaque, M., Nunes, T., Núñez, J. C., González-Pienda, J. A., et al. (2009). Homework, self-regulated learning and math achievement. Revista de Psicodidática 14, 179–192.

Rosário, P., Mourão, R., Trigo, L., Suárez, N., Fernández, E., and Tuero-Herrero, E. (2011). English as a foreign language (EFL) homework diaries: evaluating gains and constraints for self-regulated learning and achievement. Psicothema 23, 681–687.


Rosário, P., Núñez, J. C., Ferrando, P., Paiva, O., Lourenço, A., Cerezo, R., et al. (2013). The relationship between approaches to teaching and approaches to studying: a two-level structural equation model for biology achievement in high school. Metacogn. Learn. 8, 44–77. doi: 10.1007/s11409-013-9095-6

Rosário, P., Núñez, J. C., Vallejo, G., Cunha, J., Nunes, T., Mourão, R., et al. (2015). Does homework design matter? The role of homework's purpose in student mathematics achievement. Contemp. Educ. Psychol . 43, 10–24. doi: 10.1016/j.cedpsych.2015.08.001

SAS Institute Inc. (SAS). (2013). SAS/STAT® 13.1 User's Guide. Cary, NC: SAS Institute, Inc.

Shute, V. (2008). Focus on formative feedback. Rev. Educ. Res. 79, 153–189. doi: 10.3102/0034654307313795

Trautwein, U., Köller, O., Schmitz, B., and Baumert, J. (2002). Do homework assignments enhance achievement? A multilevel analysis in 7th-Grade Mathematics. Contemp. Educ. Psychol. 27, 26–50. doi: 10.1006/ceps.2001.1084

Trautwein, U., Lüdtke, O., Kastens, C., and Köller, O. (2006a). Effort on homework in grades 5–9: development, motivational antecedents, and the association with effort on classwork. Child Dev. 77, 1094–1111. doi: 10.1111/j.1467-8624.2006.00921.x

Trautwein, U., Ludtke, O., Schnyder, I., and Niggli, A. (2006b). Predicting homework effort: support for a domain-specific, multilevel homework model. J. Educ. Psychol. 98, 438–456. doi: 10.1037/0022-0663.98.2.438

Trautwein, U., Niggli, A., Schnyder, I., and Lüdtke, O. (2009a). Between-teacher differences in homework assignments and the development of students' homework effort, homework emotions, and achievement. J. Educ. Psychol. 101, 176–189. doi: 10.1037/0022-0663.101.1.176

Trautwein, U., Schnyder, I., Niggli, A., Neumann, M., and Lüdtke, O. (2009b). Chameleon effects in homework research: the homework-achievement association depends on the measures used and the level of analysis chosen. Contemp. Educ. Psychol. 34, 77–88. doi: 10.1016/j.cedpsych.2008.09.001

Vallejo, G., and Ato, M. (2012). Robust tests for multivariate factorial designs under heteroscedasticity. Behav. Res. Methods 44, 471–489. doi: 10.3758/s13428-011-0152-2

Vallejo, G., Ato, M., and Fernández, M. P. (2010). A robust approach for analyzing unbalanced factorial designs with fixed levels. Behav. Res. Methods 42, 607–617. doi: 10.3758/BRM.42.2.607

Walberg, H. J., and Paik, S. J. (2000). Effective Educational Practices. Educational Practices Series – 3 . Brussels: International Academy of Education.

Walberg, H. J., Paschal, R. A., and Weinstein, T. (1985). Homework's powerful effects on learning. Educ. Leadersh. 42, 76–79.

Xu, J. (2008). Models of secondary school students' interest in homework: a multilevel analysis. Am. Educ. Res. J . 45, 1180–1205. doi: 10.3102/0002831208323276

Xu, J. (2010). Predicting homework distraction at the secondary school level: a multilevel analysis. Teach. Coll. Rec . 112, 1937–1970.

Xu, J. (2011). Homework completion at the secondary school level: a multilevel analysis. J. Educ. Res . 104, 171–182. doi: 10.1080/00220671003636752

Xu, J. (2012). Predicting students' homework environment management at the secondary school level. Educ. Psychol. 32, 183–200. doi: 10.1080/01443410.2011.635639

Xu, J., and Wu, H. (2013). Self-regulation of homework behavior: homework management at the secondary school level. J. Educ. Res. 106, 1–13. doi: 10.1080/00220671.2012.658457

Zhu, Y., and Leung, F. (2012). Homework and mathematics achievement in Hong Kong: evidence from the TIMSS 2003. Int. J. Sci. Math. Educ. 10, 907–925. doi: 10.1007/s10763-011-9302-3

Keywords: types of homework follow-up, academic performance, English as a Foreign Language (EFL), homework, teachers' practices

Citation: Rosário P, Núñez JC, Vallejo G, Cunha J, Nunes T, Suárez N, Fuentes S and Moreira T (2015) The effects of teachers' homework follow-up practices on students' EFL performance: a randomized-group design. Front. Psychol . 6:1528. doi: 10.3389/fpsyg.2015.01528

Received: 01 August 2015; Accepted: 22 September 2015; Published: 13 October 2015.


Copyright © 2015 Rosário, Núñez, Vallejo, Cunha, Nunes, Suárez, Fuentes and Moreira. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Pedro Rosário, [email protected]

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.


The effects of teachers' homework follow-up practices on students' EFL performance: a randomized-group design

Pedro rosário.

1 Departamento de Psicologia Aplicada, Escola de Psicologia, Universidade do Minho, Braga, Portugal

José C. Núñez

2 Departamento de Psicologia, Universidad de Oviedo, Oviedo, Spain

Guillermo Vallejo

Jennifer cunha, tânia nunes, natalia suárez, sonia fuentes.

3 Vicerrectoría Académica, Universidad Central de Chile, Santiago de Chile, Chile

4 Facultad de Educación, Universidad Autónoma de Chile, Santiago de Chile, Chile

Tânia Moreira

This study analyzed the effects of five types of homework follow-up practices (i.e., checking homework completion; answering questions about homework; checking homework orally; checking homework on the board; and collecting and grading homework) used in class by 26 teachers of English as a Foreign Language (EFL) using a randomized-group design. Once a week, for 6 weeks, the EFL teachers used a particular type of homework follow-up practice they had previously been assigned to. At the end of the 6 weeks students completed an EFL exam as an outcome measure. The results showed that three types of homework follow-up practices (i.e., checking homework orally; checking homework on the board; and collecting and grading homework) had a positive impact on students' performance, thus highlighting the role of EFL teachers in the homework process. The effect of EFL teachers' homework follow-up practices on students' performance was affected by students' prior knowledge, but not by the number of homework follow-up sessions.

Introduction

Homework is defined as a set of school tasks assigned by teachers to be completed by students out of school (Cooper, 2001 ). Several studies have showed the positive impact of this instructional tool to enhance students' school performance and develop study skills, self-regulation, school engagement, discipline, and responsibility (e.g., Cooper et al., 2006 ; Rosário et al., 2009 , 2011 ; Buijs and Admiraal, 2013 ; Hagger et al., 2015 ).

In the homework process teachers have two major tasks: designing and setting activities (Epstein and Van Voorhis, 2001 , 2012 ; Trautwein et al., 2009a ), and checking and/or providing homework feedback to students (Trautwein et al., 2006b ; Núñez et al., 2014 ). Cooper ( 1989 ) called the later “classroom follow-up” (p. 87). Classroom follow-up includes feedback provided by the teacher (e.g., written comments, marking homework, and incentives; Cooper, 1989 , 2001 ). Hattie and Timperley ( 2007 ) defined feedback as the information provided by an educational agent or the student (self) on aspects of the performance. Feedback is an important source of information for checking answers (Narciss, 2004 ) and improving academic performance (Nicol and Macfarlane-Dick, 2006 ; Shute, 2008 ; Duijnhouwer et al., 2012 ). According to Walberg and Paik ( 2000 ), feedback is “the key to maximizing the positive impact of homework” (p. 9) because teachers take advantage of the opportunity to reinforce the work that was well-done by the students or teach them something new that would help them improve their work. Moreover, Cooper ( 1989 , 2001 ) argued that the way teachers manage students' homework assignments presented in classroom may influence how much students benefit from homework.

Research on homework, with a particular focus on the homework follow-up practices commonly used by teachers, has looked into various practices such as homework control perceived by students (e.g., Trautwein et al., 2006a , b ), teachers' feedback on homework (Cardelle and Corno, 1981 ; Elawar and Corno, 1985 ), and feedback on homework perceived by students (e.g., Xu, 2008 , 2010 ). Studies conducted in several countries (e.g., Germany, Hong Kong, Singapore) reported homework control (i.e., checking whether students have completed their homework) as the homework follow-up practice teachers use in class most often in elementary and middle school levels (see Trautwein et al., 2009a ; Kaur, 2011 ; Zhu and Leung, 2012 ). However, studies carried out in mathematics and French as a second language concluded that controlling homework completion reported by middle school students, or controlling students' homework style reported by teachers (e.g., “By looking at a student's assignment, I can quickly tell how much effort he/she has put into it”) did not have any effect on middle school students' achievement (Trautwein et al., 2002 , 2009a ). To our knowledge, only the study by Trautwein et al. ( 2006b ) found a positive predictive effect of homework control perceived by middle school students on students' homework effort in French as a Foreign Language at the student level but not at the class level.

Regarding homework feedback, Walberg and Paik ( 2000 ) described “[homework feedback as] the key to maximizing the positive impact of homework” (p. 9). In fact, the literature has evidenced a positive relationship between homework feedback and students' outcomes. For example, Xu ( 2008 , 2010 ) examined the benefits of homework feedback using a measure of teacher's feedback on homework. This measure assessed middle and high school students' perceptions on topics such as: discussing homework, collecting homework, checking homework, grading homework [i.e., assigning numerical grades for homework], and counting homework completion for students' overall grade. However, Xu ( 2008 ); Xu ( 2010 ) did notanalyzed the impact of any particular feedback practice. The same author found a positive relationship between homework feedback provided by teachers (as perceived by the middle and high school students) and students' interest in homework (Xu, 2008 ); students' homework management (Xu, 2012 ; Xu and Wu, 2013 ); and students' homework completion (Xu, 2011 ). More recently, Núñez et al. ( 2014 ) analyzed the relationship between teachers' homework feedback as perceived by students from the fifth to the twelfth grade and academic achievement, and reported an indirect relationship between homework feedback and academic achievement through students' homework behaviors (e.g., amount of homework completed).

Other studies on homework have examined the effects of written feedback on students' academic outcomes. In particular, Cardelle and Corno ( 1981 ) and Elawar and Corno ( 1985 ), examined the effects of three types of written homework feedback (i.e., praise, constructive criticism, constructive criticism plus praise) using an experimental design, and concluded that student's performance when given constructive criticism plus praise was higher than when given the other two types of feedback in primary education (Elawar and Corno, 1985 ) or in higher education (Cardelle and Corno, 1981 ). These results stress how important teachers' feedback may be not only because of its positive effect on homework, but also because it provides students with information on how to improve their work (Cardelle and Corno, 1981 ). The synthesis by Walberg et al. ( 1985 ) confirmed the results of previous studies and showed that “commented upon or graded homework” (p.76) increased the positive effect of homework on academic achievement of elementary and secondary students.

The literature has shown the effect of some teachers' homework follow-up practices on students' homework behaviors and academic achievement (Xu, 2012 ; Xu and Wu, 2013 ; Núñez et al., 2014 ), yet the use of different measures and sources of information (e.g., see Trautwein et al., 2006b , 2009a ) makes it difficult for researchers to draw conclusions about the benefits of the various types of homework follow-up practices. Moreover, Trautwein et al. ( 2006b ) suggested that future studies should include other dimensions of teachers' homework practices (e.g., checking homework completion, grading homework). However, to our knowledge, research has not yet analyzed the effects of the various types of homework follow-up practices used by teachers.

To address this call, we used a quasi-experimental design in a study conducted in an authentic learning environment in order to analyze the relationship between five types of homework follow-up practices (i.e., 1, Checking homework completion ; 2, Answering questions about homework ; 3, Checking homework orally ; 4, Checking homework on the board ; and 5, Collecting and grading homework ) used by EFL teachers and their students' performance in English. Findings may be useful to school administrators and teachers as they may learn and reflect upon the effects of the homework follow-up practices used in class, which may in turn promote homework effectiveness and school success.

Considering the scarce results of prior studies, it was not possible to establish specific hypotheses regarding the relationship between type of homework feedback and student academic performance. However, taking into account the nature of each type of feedback and its implications for student learning process, in this study we hypothesize that:

  • The types of homework feedback analyzed are differentially associated with student academic performance (increasing from types 1–5);
  • The magnitude of the impact of the types of teacher homework feedback on academic performance is associated with students' prior level of performance.

Participants

A randomized-group design study was conducted in which 45 EFL teachers (classes) were randomly assigned to five homework follow-up conditions (nine EFL teachers per condition). Nineteen teachers were excluded from the study for various reasons (three were laid off, six did not give an accurate report of the procedures followed or submitted the data requested, and 10 did not follow the protocol closely. In the end 26 EFL teachers (20 females) aged 28–54 participated in the study. The final distribution of the teachers per condition was as follows: Type 1 (4); Type 2 (3); Type 3 (5); Type 4 (15); Type 5 (2). Participants had 3–30 years of teaching experience ( M = 19) and taught English to a total of 553 sixth-graders at six state schools in the north of Portugal. Students' age ranged 10–13 ( M = 11.05; SD = 0.87), and there were 278 girls (50.3%) and 275 boys (49.7%).

Learning English as a foreign language is compulsory from fifth to ninth grade in all Portuguese middle schools. Middle school is divided into two stages: the first stage includes fifth and sixth grade (age range 10–11), and the second stage includes seventh to ninth grade (age range 12–14). Our study was conducted with sixth grade students, which is the last year of the first stage. English is taught in two 90-min weekly lessons. As the Portuguese public school system has not enacted any specific homework policies, teachers are free to decide on the amount, frequency, and type of homework they design. This study was carried out in accordance with the recommendations of the ethics committee of the University of Minho, with written informed consent from all subjects enrolled (i.e., teachers and their students). All subjects gave written informed consent in accordance with the Declaration of Helsinki.

The two English performance measures used in this study were collected from the schools' secretary's office. Prior performance (used as a pretest) was obtained from students' grades in a final English exam completed at the end of the previous school year (end of June). Fifth grade EFL students from the six public schools enrolled in the study (all from the same region of the country) completed the same non-standardized exam in the end of the school year (June). This English exam comprised 30 questions on reading comprehension skills, vocabulary, and grammar which were calibrated by a group of EFL teachers from all the intervening schools.

Final academic performance (used as a posttest) was obtained from the students' grades in a final English exam set up specifically for this study and completed at the end of it (beginning of November). The posttest exam was made up of 20 questions designed to assess students' reading comprehension skills, vocabulary, grammar (contents covered in homework assignments 1, 2, 4, and 5), translation skills from English into Portuguese and vice versa, and writing of a short text (5–10 lines; contents covered in homework assignments 3 and 6). The exam lasted 45 min. Grades in the Portuguese compulsory educational system (first to ninth grade) range from 1 to 5, where 1 and 2 is fail, 3 pass, 4 good, and 5 excellent.

To accomplish our goal, the types of homework follow-up practices were selected from the ones identified in the literature (e.g., Walberg et al., 1985 ; Murphy et al., 1987 ; Cooper, 1989 , 2001 ; Trautwein et al., 2006b ). To learn which homework follow-up practices were used by teachers in class to deal with students' delivery of homework assignments, 15 Portuguese middle school EFL teachers were invited to participate in two focus group interviews (one group comprised seven teachers and the other eight teachers). Note that these EFL teachers did not participate in the research intervention.

Findings from this ancillary study allowed the confirmation of the two homework follow-up practices reported in the literature (i.e., checking homework completion, collecting, and grading homework ), and identified three additional practices which were used in the current study. Data from this ancillary study will not be described in detail due to space constraints. Nevertheless, some examples of each homework practice are presented in Table ​ Table1 1 .

The five types of homework follow-up practices exemplified with quotations from the participating teachers in the focus group interviews .

Type 1 :
    “[in class] I just check and note down whether students did their homework. This is the only type of homework feedback I can provide…I wish had time for more” (F2P7)
Type 2
    “[in class] I just ask students if they did or did not understand their homework tasks. If any, I just answer questions about homework because I want to start the class as soon as possible. You know, I need to teach them all the contents in the course program and…” (F1P1)
Type 3 :
    “I usually check homework orally. By answering questions about homework tasks I have the opportunity to explain and suggest strategies to improve learning” (F2P8)
Type 4
    “I always check homework on the board because I want to see if students understood the contents and my explanations,” (F1P3)
Type 5
    “I collect students' notebooks […] because I learned that my students do better when I comment upon and grade their homework assignments…” (F1P6)

Five homework follow-up practices were included in our study as follows: (1) Checking homework completion; (2) Answering questions about homework; (3) Checking homework orally; (4) Checking homework on the board; and (5) Collecting and grading homework. Types 1 and 5 were based on the literature (Walberg et al., 1985 ; Murphy et al., 1987 ; Trautwein et al., 2006b ), and types 2–4 emerged in the focus group interviews with the EFL teachers, and were included in this study because of their local relevance.

Data were collected at the beginning of the school year (between mid-September and end of October) after obtaining permission from schools' head offices. EFL teachers confirmed their intention to participate via email, and from those who had confirmed participation, 45 and their students were randomly selected. Two weeks before the beginning of the study, the 45 EFL teachers participated in a 4-h information meeting which explained the project's aims and the research design in detail (e.g., analysis and discussion of the format and content of the English exam to assess students' performance; and information on the frequency, number, and type of homework assignments; guidelines to mark the homework assignments; and the five types of homework follow-up practices). Additionally, teachers were informed that they would be randomly assigned to an experimental condition, and the associated methodological reasons were discussed with the participants. All teachers agreed and were then randomly assigned to one of the five homework follow-up conditions (nine teachers per condition). However, only 26 teachers completed the study (see Section Participants). At the meeting, all teachers agreed to assign homework to their students only once a week (in the first class of the week) and to check homework completion in the following class using the type of homework follow-up condition they had been assigned to. The six homework assignments were extracted from the English textbook and common to all participants. Two different types of homework were assigned. The first type had reading comprehension, vocabulary, and grammar questions (homework assignments 1, 2, 4, and 5). The second type (homework assignments 3 and 6) had a translation exercise from English into Portuguese and vice versa, and writing of a short text in English (5–10 lines). After selecting the homework exercises, teachers worked on the guidelines to mark each homework assignment, and built a grade tracking sheet to be filled in with information regarding each student and each homework. The grade tracking sheet filled in with students data was delivered to researchers in the following class.

At the end of each lesson, the students noted down the instructions for the homework assignment in their notebooks and completed it out of class.

The researchers gave the EFL teachers extensive training on the homework follow-up practices in order to guarantee that all the participants under each condition followed the same protocol. During the information meeting a combination of theory and practice, open discussion, and role-playing exercises were used.

For each condition, the protocol was as follows. For homework follow-up condition no. 1 ( checking homework completion ), the teacher began the class asking students whether they had completed their homework assignment (i.e., yes, no) and recorded the data on a homework assignment sheet. For homework follow-up condition no. 2 ( answering questions about homework ), the teacher began the class asking students if they had any questions about the homework assignment (e.g., Please, ask any questions if there is something in the homework which you did not understand.), in which case the teacher would answer them. For homework follow-up condition no. 3 ( checking homework orally ), the teacher began the class checking homework orally. Under this condition the teachers proactively read the homework previously assigned to students and orally checked all the tasks or questions (i.e., the teacher read the questions and students answered them aloud, followed by an explanation of the mistakes made by students). For homework follow-up condition no. 4 ( checking homework on the board ), the teacher started the class by writing the answer to each of the homework questions on the board. Following the explanation to a specific question or task, the EFL teachers explicitly asked the class: “Do you have any other questions?” and moved on to the next question. In the case of homework follow-up condition no. 5 ( collecting and grading homework ), the teacher began the class handing out individually checked and graded homework to students. For homework assignments 1, 2, 4, and 5 (i.e., reading comprehension, vocabulary, and grammar questions) the EFL teachers pointed out which of answers were incorrect, and provided the correct answer. A numerical grade for each of the exercises and a global grade were awarded. For the second type of homework assignments (3 and 6; i.e., translation from English into Portuguese and vice versa, and writing of a short text in English), the EFL teachers made comments on the text in terms of contents and style, and gave a numerical grade. Students were encouraged to read the teachers' comments on their homework and asked if they had any questions.

To guarantee the reliability of the measurements (i.e., whether the EFL teachers followed the protocol), three research assistants were present at the beginning of each class. For 15 min, the research assistants took notes on the type of homework follow-up used by the teachers using a diary log. The level of overall agreement among the research assistants was estimated with Fleiss's Kappa (Fleiss, 1981 ). According to Landis and Koch ( 1977 ), the reliability among the research assistants may be rated as good (κ = 0.746; p < 0.001).

Data from the 19 EFL teachers who did not follow the protocol for their assigned homework follow-up condition were not included in the data set. Three weeks after the study, EFL teachers attended a 2-h post-research evaluation meeting with the aim to discuss their experience (e.g., comments and suggestions that could help in future studies; difficulties faced in implementing their experimental condition; reasons for not following the protocol), and analyze preliminary data. At the end of the six homework follow-up sessions, students completed a final English exam as a measure of academic performance (posttest).

Data analysis

Each of the five homework follow-up practices was to be administered by the same number of EFL teachers (nine). However, as mentioned above, 19 EFL teachers were excluded from the study, which led to an uneven distribution of the participating teachers under the five conditions. As the number of homework follow-up sessions was not even in terms of type, it was not possible to guarantee the independence of these two variables (i.e., number of homework follow-up and type of homework follow-up practice). Thus, the amount of treatments (number of homework follow-up sessions) was taken as a control variable. The effect of the EFL teachers nested within the treatment levels (the five homework follow-up practices) was also controlled, but within the type of design (cluster randomized design). Furthermore, students' prior performance was controlled because of its potential to influence the relationship between homework and academic achievement (Trautwein et al., 2002 , 2009b ).

Finally, the design included an independent variable (type of homework follow-up), a dependent variable (post-homework follow-up academic performance), and two covariates (number of homework follow-up sessions administered and performance prior to homework follow-up). The statistical treatment of the data was carried out using analysis of covariance (ANCOVA). Data analysis followed a two-stage strategy. First, we examined whether prior performance (pretest) significantly explained academic performance at posttest (which led to testing whether the regression slopes were null). If the result was positive, it would not be necessary to include any covariate in the model, and an ANOVA model would be fitted. On the other hand, if the result was negative, second stage, it would be necessary to verify whether the regression slopes were parallel (that is, whether the relationship between prior and final performance was similar across the different types of homework follow-up). Finally, in case the parallelism assumption were accepted, paired comparisons between the adjusted homework follow-up type variable measures (i.e., purged of covariate correlations) would be run using the method based on the false discovery rate (FDR) developed by Benjamini and Hochberg ( 1995 ) (BH).

Data were analyzed using SAS version 9.4 [SAS Institute, Inc., (SAS), 2013 ]. The hypotheses referring to the nullity and parallelism of the regression slopes were tested using SAS PROC MIXED with the solution proposed by Kenward and Roger ( 2009 ). PROC MIXED allows fitting a linear model that relaxes the assumption of constant variance (for details, see Vallejo et al., 2010 ; Vallejo and Ato, 2012 ). The post-hoc contrasts were computed using the ESTIMATE statement in SAS PROC MIXED and the BH/FDR option in SAS PROC MULTITEST.
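A rough Python analogue of that specification is sketched below. It adds a random intercept for teachers nested within conditions, but statsmodels does not implement the Kenward–Roger degrees-of-freedom correction or group-specific residual variances, so it only approximates the SAS PROC MIXED model; column and file names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("efl_homework.csv")  # hypothetical file

# Teachers are nested within follow-up conditions, so unique teacher IDs can
# serve directly as the grouping factor for a random intercept.
model = smf.mixedlm("posttest ~ C(followup) + pretest + sessions",
                    data=df, groups=df["teacher"])
result = model.fit(reml=True)
print(result.summary())
```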

Descriptive statistics

Table 2 shows the descriptive statistics of the homework follow-up type variable and the two covariates (prior performance and the number of homework follow-up sessions).

Table 2. Descriptive statistics of the variable homework follow-up practice and covariates (prior performance and number of times feedback is provided).

| Variable | N | Min | Max | M | SD |
| --- | --- | --- | --- | --- | --- |
| Prior performance | 553 | 2 | 5 | 3.55 | 0.92 |
| Final performance | 553 | 2 | 5 | 3.57 | 0.97 |
| Number of sessions | 6 | 1 | 6 | 4.43 | 1.62 |
| Homework follow-up | 5 | 1 | 5 | 3.18 | 1.20 |
| Homework follow-up _1, pretest | 85 | 2 | 5 | 3.36 | 0.88 |
| Homework follow-up _1, posttest | 85 | 1 | 5 | 3.27 | 0.99 |
| Homework follow-up _2, pretest | 65 | 2 | 5 | 3.34 | 0.87 |
| Homework follow-up _2, posttest | 65 | 2 | 5 | 3.26 | 0.94 |
| Homework follow-up _3, pretest | 104 | 2 | 5 | 3.42 | 0.93 |
| Homework follow-up _3, posttest | 104 | 2 | 5 | 3.52 | 0.97 |
| Homework follow-up _4, pretest | 264 | 2 | 5 | 3.68 | 0.96 |
| Homework follow-up _4, posttest | 264 | 2 | 5 | 3.73 | 0.97 |
| Homework follow-up _5, pretest | 35 | 2 | 5 | 3.74 | 0.78 |
| Homework follow-up _5, posttest | 35 | 2 | 5 | 3.83 | 0.78 |

N, total number of subjects; Min, minimum value; Max, maximum value; SD, standard deviation; M, mean; homework follow-up _1, checking homework completion; homework follow-up _2, answering questions about homework; homework follow-up _3, checking homework orally; homework follow-up _4, checking homework on the board; homework follow-up _5, collecting and grading homework; Pretest, performance before homework follow-up; posttest, performance after homework follow-up .

Analysis of covariance

Null regression slope test

To determine whether prior performance (pretest) significantly explained academic performance at posttest, a type III sum of squares model without an intercept was created. This model included the homework follow-up type (A) and the interactions of homework follow-up type with the covariates prior performance (X1) and number of homework follow-up sessions (X2); that is, A × X1 and A × X2. The information obtained in this analysis allowed us to estimate regression slopes for each level of the homework follow-up type variable and to evaluate their nullity and, to a certain extent, their parallelism. In summary, the technique used aimed to determine whether the covariates (number of homework follow-up sessions administered and performance prior to homework follow-up) modified the relationship between homework follow-up type and final performance. Table 3 addresses this question and shows two sets of model effects: the principal effect (A) and the interaction effects (A × X1 and A × X2).

Table 3. Estimators of interaction parameters obtained in the first modeling stage after creating a regression model without an intercept.

| Parameter | Estimate | SE | DF | t | p |
| --- | --- | --- | --- | --- | --- |
| [A = 1.00] | −0.04 | 0.34 | 538 | −0.11 | 0.915 |
| [A = 2.00] | −0.39 | 0.42 | 538 | −0.92 | 0.360 |
| [A = 3.00] | 0.64 | 0.24 | 538 | 2.67 | 0.008 |
| [A = 4.00] | 0.71 | 0.24 | 538 | 2.96 | 0.003 |
| [A = 5.00] | 0.41 | 0.27 | 538 | 1.53 | 0.127 |
| [A = 1.00] × Prior performance | 0.96 | 0.10 | 538 | 10.56 | < 0.001 |
| [A = 2.00] × Prior performance | 0.96 | 0.09 | 538 | 10.47 | < 0.001 |
| [A = 3.00] × Prior performance | 0.87 | 0.06 | 538 | 15.37 | < 0.001 |
| [A = 4.00] × Prior performance | 0.86 | 0.033 | 538 | 26.37 | < 0.001 |
| [A = 5.00] × Prior performance | 0.94 | 0.063 | 538 | 14.92 | < 0.001 |
| [A = 1.00] × Number of sessions | 0.02 | 0.03 | 538 | 0.59 | 0.552 |
| [A = 2.00] × Number of sessions | 0.15 | 0.06 | 538 | 2.56 | 0.011 |
| [A = 3.00] × Number of sessions | −0.03 | 0.04 | 538 | −0.76 | 0.446 |
| [A = 4.00] × Number of sessions | −0.03 | 0.04 | 538 | −0.85 | 0.398 |
| [A = 5.00] × Number of sessions | −0.04 | 0.04 | 538 | −0.85 | 0.395 |

[A = 1,…,5], homework follow-up practices .

The data show that all regression coefficients involving the prior performance covariate were statistically significant (p < 0.001), with very similar values across the levels of the homework follow-up type variable (between b = 0.86 and b = 0.96). Thus, we may conclude that the slopes were not null. A strong similarity was also observed among the regression coefficients involving the number of homework follow-up sessions; none was statistically significant, with the exception of the coefficient corresponding to level 2 of the homework follow-up type variable (b A2 × S = 0.15, p = 0.011).

Parallel regression slope test

To test the hypothesis of regression slope parallelism for the covariates prior performance (X1) and number of homework follow-up sessions (X2) on final academic performance, the interaction components A × X1 and A × X2 of Model A shown in Table 4 are particularly interesting.

Table 4. Results of fitting three ANCOVA models and one ANOVA model during the second stage of the modeling strategy.

Fixed effects, F(DF Num, DF Den) and p-value:

| Effect | Model A | Model B | Model C | ANOVA model |
| --- | --- | --- | --- | --- |
| A | F(4, 162) = 1.92, p = 0.109 | F(4, 183) = 2.81, p = 0.027 | F(4, 159) = 2.85, p = 0.027 | F(4, 150) = 6.99, p < 0.001 |
| X1 | F(1, 242) = 846.74, p < 0.001 | F(1, 465) = 1338.89, p < 0.001 | F(1, 467) = 1345.16, p < 0.001 | – |
| X2 | F(1, 252) = 0.54, p = 0.464 | F(1, 373) = 0.16, p = 0.689 | – | – |
| A × X1 | F(4, 160) = 0.62, p = 0.646 | – | – | – |
| A × X2 | F(4, 144) = 2.20, p = 0.071 | – | – | – |

Variance components, estimate (z, p):

| Component | Model A | Model B | Model C | ANOVA model |
| --- | --- | --- | --- | --- |
| UN (1) | 0.43 (6.41, < 0.001) | 0.42 (6.46, < 0.001) | 0.42 (6.48, < 0.001) | 0.98 (6.52, < 0.001) |
| UN (2) | 0.31 (5.57, < 0.001) | 0.34 (5.60, < 0.001) | 0.34 (5.66, < 0.001) | 0.88 (5.66, < 0.001) |
| UN (3) | 0.28 (7.14, < 0.001) | 0.28 (7.16, < 0.001) | 0.28 (7.17, < 0.001) | 0.94 (7.14, < 0.001) |
| UN (4) | 0.26 (11.42, < 0.001) | 0.26 (11.45, < 0.001) | 0.26 (11.46, < 0.001) | 0.94 (11.47, < 0.001) |
| UN (5) | 0.08 (4.01, < 0.001) | 0.08 (4.09, < 0.001) | 0.09 (4.11, < 0.001) | 0.62 (4.12, < 0.001) |
| T/A | 0.00 (0.15, 0.44) | – | – | – |

Fit statistics:

| Statistic | Model A | Model B | Model C | ANOVA model |
| --- | --- | --- | --- | --- |
| AIC | 900.1 | 889.8 | 875.0 | 1539.5 |
| BIC | 921.9 | 911.3 | 896.6 | 1549.9 |

A, homework follow-up type; X 1 , previous grade; X 2 , number of homework follow-up sessions; UN (1, 2, 3, 4, 5), variance of each homework follow-up type; T/A, teachers nested within the homework follow-up type variable; DF Num , degrees of freedom numerator; DF Den , degrees of freedom denominator .

The data show that the regression slope parallelism hypothesis was not rejected [F(4, 160) = 0.62, p = 0.646 and F(4, 144) = 2.20, p = 0.071], although the interaction between the number of homework follow-up sessions and the type of homework follow-up was only marginally non-significant. Thus, we provisionally adopted the ANCOVA model with equal slopes to describe the influence of the covariates on final performance across homework follow-up types. Note that the variance component of the students who received homework follow-up type no. 1 was approximately five times the variance of the students receiving type no. 5. Thus, to control the heterogeneity of the data, the GROUP option in SAS PROC MIXED was used with the Kenward–Roger solution to adjust the degrees of freedom (Kenward and Roger, 2009). Moreover, the variance component referring to EFL teachers nested within the homework follow-up types was not statistically significant (z = 0.15, p = 0.44), so we proceeded with the single-level ANCOVA model.

Findings indicate that the differences among the various homework follow-up types do not depend on the teacher who uses them. This preliminary result stresses the relevance of conducting multilevel designs analyzing data at two levels, students and class. This finding is aligned with those of Rosário et al. (2013), who found a small effect in the relationship between teachers' reported approaches to teaching and students' reported approaches to learning.

Table 4 also shows information regarding the fit of other ANCOVA models with identical slopes: Model B and Model C. Model B shows that the number of homework follow-up sessions provided by the EFL teachers (X2) did not significantly predict final performance [F(1, 373) = 0.16, p = 0.689]. Note that the ANCOVA model with equal regression slopes that left out the number of homework follow-up sessions (Model C) was more parsimonious and showed the best fit. The model with the lowest values of the information criteria, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), is the one that best fits the data.

The ANCOVA model with equal slopes is shown in Figure 1. The essential characteristic of the model is worth noting: separate regression lines for each type of homework follow-up and approximately parallel slopes among the homework follow-up types. Figure 1 also shows two subsets of means, each with means that barely differed from each other and were thus considered equal from a statistical standpoint. These subsets encompassed, on the one hand, the first two levels of the homework follow-up type variable (types 1 and 2) and, on the other hand, the last three levels of the variable. The common regression slope (b = 0.882) between prior performance and final performance, averaging over all levels of homework follow-up type, was statistically significant [t(467) = 36.86, p < 0.001].

[Figure 1 (fpsyg-06-01528-g0001.jpg). ANCOVA model with equal slopes: separate, approximately parallel regression lines for each homework follow-up type; x-axis: pretest performance level.]

Comparisons between the adjusted homework follow-up type means

The common slope ( b = 0.882) was used to calculate the final performance means adjusted to the effect of the prior performance covariate. Purged of the correlation with the prior performance covariate, the adjusted final EFL performance means were A 1 = 3.14; A 2 = 3.11; A 3 = 3.44; A 4 = 3.88; and A 5 = 4.03.
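For reference, the textbook form of this adjustment corrects each group's observed posttest mean by its distance from the grand pretest mean, weighted by the common slope; the values reported above also reflect the heteroscedastic mixed-model fit, so the formula below conveys only the basic idea:

$$\bar{Y}_{j}^{\mathrm{adj}} = \bar{Y}_{j} - b\,(\bar{X}_{j} - \bar{X}_{\cdot\cdot}), \qquad b = 0.882,$$

where Y̅_j and X̅_j are the posttest and pretest means of homework follow-up type j, and X̅.. is the grand pretest mean.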

Given the two homogeneous subsets of means previously detected, the family of pairwise comparisons shown in Table 5 was tested. To control the probability of making one or more type I errors at the chosen level of significance (α = 0.05) for the specified family of contrasts, assuming heterogeneity, the ESTIMATE statement in SAS PROC MIXED was used, as was the BH/FDR option in SAS PROC MULTITEST. As indicated in the last column of Table 5, the procedure detected statistically significant differences (p < 0.05) in five of the six contrasts analyzed (see Figure 2 as well).

Table 5. Pairwise comparisons between the homework follow-up practices based on ANCOVA BH/FDR that controlled for prior performance.

| Contrast | Estimate | SE | DF | t | p | Raw p | FDR p |
| --- | --- | --- | --- | --- | --- | --- | --- |
| A1-A3 | −0.19 | 0.09 | 161 | −2.14 | < 0.03 | 0.034 | 0.050 |
| A1-A4 | −0.18 | 0.08 | 120 | −2.29 | < 0.02 | 0.02 | 0.050 |
| A1-A5 | −0.22 | 0.09 | 119 | −2.61 | < 0.01 | 0.01 | 0.050 |
| A2-A3 | −0.17 | 0.09 | 126 | −1.95 | < 0.05 | 0.05 | 0.053 |
| A2-A4 | −0.16 | 0.08 | 91 | −2.07 | < 0.04 | 0.04 | 0.050 |
| A2-A5 | −0.21 | 0.09 | 99 | −2.41 | < 0.02 | 0.02 | 0.050 |
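The BH/FDR step applied to a family of contrasts like this one can be reproduced generically in Python with statsmodels instead of SAS PROC MULTITEST; the sketch below uses made-up placeholder p-values, not the values reported in Table 5.

```python
from statsmodels.stats.multitest import multipletests

contrasts = ["A1-A3", "A1-A4", "A1-A5", "A2-A3", "A2-A4", "A2-A5"]
raw_p = [0.008, 0.012, 0.021, 0.055, 0.034, 0.017]  # hypothetical placeholders

# Benjamini-Hochberg step-up procedure controlling the false discovery rate.
reject, adj_p, _, _ = multipletests(raw_p, alpha=0.05, method="fdr_bh")
for name, p, adj, sig in zip(contrasts, raw_p, adj_p, reject):
    print(f"{name}: raw p = {p:.3f}, BH-adjusted p = {adj:.3f}, significant: {sig}")
```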

[Figure 2 (fpsyg-06-01528-g0002.jpg). Adjusted final performance by type of homework follow-up practice.]

Discussion of results

This study analyzed whether the relationship between academic performance and homework follow-up practices depended on the type of homework follow-up practice used in class. We found that the five types of homework follow-up were differentially associated with student academic performance, despite the unbalanced number of teachers in each condition and the low number of sessions (six sessions). The magnitude of the effects found was small, which may be due to these two limitations. Data from the ancillary analysis of the two focus groups run to identify the types of homework follow-up used by EFL teachers in class, and data from the post-research evaluation meeting held with the participating teachers, contributed to the discussion of our findings.

Types of EFL teachers' homework follow-up practices and academic performance

As Model C (see Table 4) shows, and once the effect of the pretest was controlled for, the differences among the types of EFL teachers' homework follow-up practices on students' performance were statistically significant, as hypothesized. Moreover, considering the positive value of the coefficients shown in Table 4, the data indicate that students' performance improved from homework follow-up type 1 to type 5 (see also Figure 2), and also that the differences between the five homework follow-up types were not of the same magnitude. In fact, after controlling the error rate for the family of comparisons using the FDR procedure, two homogeneous subsets of treatment means were identified. The first subset encompassed homework follow-up types 1 and 2, whereas the second accounted for homework follow-up types 3–5. As shown in Table 5, significant differences were found between the adjusted treatment means of the two subsets (homework follow-up types 1 and 2 vs. homework follow-up types 3–5).

What are the commonalities and differences between these two subsets of homework follow-up types that could help explain the findings? Homework follow-up types 1 and 2 did not yield differences in school performance. One possible explanation might be that neither of these types of homework follow-up provides specific information about the mistakes made by students; information which could help them improve their learning in a similar way to when EFL teachers provide feedback (Hattie and Timperley, 2007). Besides, as the control for homework completion is low in these two types of homework follow-up practices, students may not have put in the appropriate effort to complete the homework. The following statement was shared by most of the teachers who participated in the focus group and may help explain this latter finding: “[in class] I only ask students if they have done their homework. I know that this strategy does not help them correct their mistakes, but if I don't do it, I suspect they will give up doing their homework …” (F2P3).

In homework follow-up type 2, EFL teachers only addressed difficulties mentioned by the students, so some mistakes may not have been addressed and checked by the EFL teachers. This type of practice does not provide feedback to students. As the following quotation from a participant in the focus group revealed: “At the beginning of the class, I specifically ask students if they have any questions about their homework. The truth is, students who struggle to learn seldom ask questions…I guess that they don't do their homework, or they copy the answers from peers during the break, and just asking questions does not help a lot…but they are 28 in class.” (F2P4).

The second group of homework follow-up practices includes types 3–5. Our data indicate that there were no statistically significant differences among these three types of homework follow-up (intra-group comparisons) at posttest performance (see Table 4). Under each of these three conditions (homework follow-up types 3–5) homework contents were checked by the teacher. In these three types of homework follow-up, students had opportunities to analyze EFL teachers' explanations and to check their mistakes, which may help explain our findings and those of previous studies (see Cardelle and Corno, 1981; Elawar and Corno, 1985).

According to Cooper's model (1989, 2001), homework follow-up type 5 may be considered the homework feedback practice, because when EFL teachers grade students' assignments and provide individual feedback, students' learning improves. This idea was mentioned by one of our participants: “I collect students' exercise books, not every day, but often enough. That is because I've learned that my students improve whenever I comment upon and grade their homework assignments. I wish I had time to do this regularly…That would be real feedback, that's for sure.” (F1P6).

When analyzing students' conceptions of feedback, Peterson and Irving (2008) concluded that students believe that having their reports graded is a “clearer and more honest” (p. 246) type of feedback. These authors also argued that good grades generate tangible evidence of students' work for parents, which may in turn open another opportunity for feedback (e.g., praise) delivered by parents and peers (Núñez et al., 2015). It is likely that students see graded homework as more worthwhile than other types of homework follow-up practices (e.g., answering questions about homework). This idea supports studies which found a positive association between homework effort and achievement (e.g., Trautwein et al., 2006b, 2009b). Walberg et al. (1985) claimed that graded homework has a powerful effect on learning. However, Trautwein et al. (2009a) warned that graded homework may have a negative impact whenever it is experienced as overcontrolling, as “…students may feel tempted to copy from high-achieving classmates to escape negative consequences” (p. 185). These findings (Trautwein et al., 2006b, 2009a, b), aligned with ours, suggest the need to analyze homework feedback in more depth. For example, several variables were not considered in the current research (e.g., number of students per class, number of different grade levels or classes teachers teach, different levels of students' expertise in class, type of content domain, but also career-related issues such as frozen salaries and reduced retirement benefits), which may help explain our results.

We also noticed that the effect of EFL teachers' homework follow-up practices on performance was affected by students' prior performance, confirming our second hypothesis, but not by the number of homework follow-up sessions (i.e., the number of homework follow-up sessions was only marginally non-significant as a secondary factor, not as the principal factor). A quotation from a teacher under the third condition may help illustrate this finding: “reflecting on my experience under condition 3 [checking homework orally], I can tell that students' prior knowledge was very important for explaining the variations in the efficacy of this strategy. Some of my students, for example, attend language schools and master vocabulary and grammar, but others clearly need extra help. For example, checking homework on the board so that students may copy the answers and study them at home would be very beneficial for many of my students” (M15).

The results of this preliminary study were obtained in a real learning environment and focused on homework follow-up practices commonly used by EFL teachers. We acknowledge the difficulties of setting up and running a randomized-group design in a real learning environment (i.e., motivating teachers to participate, training teachers to follow the protocol, controlling the process). Still, we believe in the importance of collecting data on task. Moreover, we consider that our preliminary findings may help teachers and school administrators organize school-based teacher training and educational policies on homework. For example, studies conducted in several countries (e.g., Germany, Hong Kong, Singapore, Israel) reported that checking homework completion is the homework follow-up practice most often used by teachers to keep track of students' homework (e.g., Trautwein et al., 2009a; Kaur, 2011; Zhu and Leung, 2012), and in some cases the only homework follow-up practice used in class (e.g., see Kukliansky et al., 2014). However, this type of homework follow-up does not provide students with appropriate information on how they may improve their learning. Our data show that, when EFL teachers offer individual and specific information to help students progress (e.g., homework correction, graded homework), the impact on school performance is higher, even when this help is provided for only 6 weeks. This main finding, which should be further investigated, may inform teachers' in-class practices and contribute to fostering students' positive behaviors toward homework and school achievement.

In sum, our findings indicate that the time and effort teachers devote to assessing, presenting, and discussing homework with students is worthwhile. In fact, students consider limited feedback an impediment to homework completion, and recognize teachers' feedback as a homework completion facilitator (Bang, 2011).

During the focus group interviews, and consistent with findings by Rosário et al. (2015), several EFL teachers stressed that, despite their positive beliefs about the efficacy of delivering feedback to students, they do not find the necessary time to provide feedback in class (e.g., commenting on homework and grading homework). This is due to, among other reasons, the long list of contents to cover in class and the large number of students per class. Pelletier et al. (2002) showed that the major constraint perceived by teachers in their job is the pressure to follow the school curriculum. Data from the focus groups helped us understand our findings and highlight the need for school administrators to become aware of the educational constraints faced daily by EFL teachers at school and to find alternatives to support the use of in-class homework follow-up practices. Thus, we believe that teachers, directly, and students, indirectly, would benefit from teacher training on effective homework follow-up practices with a focus on, for example, how to manage the extensive curriculum and time, and on learning about different homework follow-up practices, mainly feedback.

Some authors (e.g., Elawar and Corno, 1985; Epstein and Van Voorhis, 2012; Núñez et al., 2014; Rosário et al., 2015) have warned about the importance of organizing school-based teacher training with an emphasis on homework (i.e., purposes of homework, homework feedback type, amount of homework assigned, school homework policies, and written homework feedback practices). From the focus group interviews we learned that several EFL teachers did not differentiate feedback from other homework follow-up practices, such as checking homework completion (e.g., see the F2P7 statement, Table 1). EFL teachers termed all the homework follow-up practices used in class as feedback, despite the fact that some of these practices did not deliver useful information to improve the quality of students' homework and promote progress. These data suggest a need to foster opportunities for teachers to reflect upon their in-class instructional practices (e.g., type and purposes of the homework assigned, number and type of questions asked in class) and their impact on the quality of the learning process. For example, school-based teacher training focused on discussing the various types of homework follow-up practices and their impact on homework quality and academic achievement would enhance teachers' practice and contribute to improving their approaches to teaching (Rosário et al., 2013).

Limitations of the study and future research

This study is a preliminary examination of the relationship between five types of EFL teachers' homework follow-up practices and performance in the EFL class. Therefore, some limitations must be addressed, as they may play a role in our findings. First, participating EFL teachers were assigned to one and only one of the five homework follow-up conditions, but 19 of them were excluded for not adhering to the protocol. As a result, the number of EFL teachers under each condition was unbalanced, especially in the case of homework follow-up condition no. 5. This fact should be considered when interpreting the conclusions.

Several reasons may explain why 19 EFL teachers were excluded from our research protocol (i.e., three were laid off, six did not report the work done correctly or did not submit the data requested, and ten did not follow the protocol closely). Nevertheless, during the post-research evaluation meeting the EFL teachers addressed this topic, which helped us understand their motives for not adhering to the protocol. For example: “I'm sorry for abandoning your research, but I couldn't collect and grade homework every week. I have 30 students in class, as you know, and it was impossible for me to spend so many hours grading.” (M7). Our findings suggest that teachers' attitudes toward homework follow-up practices are important, as is the need to set up educational environments that may facilitate their use in class.

We acknowledge the difficulty of carrying out experimental studies in authentic teaching and learning environments. Nevertheless, we decided to address the call by Trautwein et al. (2006b) and investigate teachers' homework practices in as ecologically valid a way as possible, in the natural learning environment of teachers and students.

Future studies should find a way to combine optimal control of variables with an authentic learning environment.

Second, a mixed type of homework follow-up practice (e.g., combining homework control and checking homework on the board) was not considered in the current study as an additional level of the independent variable. In fact, some of the excluded EFL teachers highlighted the benefits of combining various homework follow-up practices, as one EFL teacher remarked: “I was “assigned” condition 5 [collecting and grading homework], but grading and noting homework every week is too demanding, as I have five more sixth grade classes to teach. So, although I am certain that giving individualized feedback is better for my students, I couldn't do it for the six homework assignments as required. In some sessions I checked homework orally.” (M24). Thus, future studies should consider analyzing the impact of different combinations of types of homework follow-up practices. Our research focused on sixth grade EFL teachers only. To our knowledge, there are no studies examining the impact of homework follow-up practices at different education levels, but it is plausible that the type and intensity of the homework follow-up practices used by teachers may vary from one educational level to another. Hence, it would be interesting to examine whether our findings can be replicated in other grade levels or in different subjects. Furthermore, it would be beneficial to conduct this study in other countries in order to explore whether the follow-up practices identified by Portuguese EFL teachers match those found in other teaching and learning cultures.

Third, the fact that in our study the differences found were small suggests the importance of examining the type of homework follow-up used and students' interpretation of teachers' practice. Future studies may analyze the hypothesis that students' behavior toward teacher homework follow-up practices (e.g., how students perceive their teachers' homework follow-up practices; what students do with the homework feedback information given by teachers) mediates the effect of homework on student learning and performance. In fact, the way students benefit from their teachers' homework follow-up practice may help explain the impact of these practices on students' homework performance and academic achievement. Future studies may also consider conducting more large-scale studies (i.e., with optimal sample sizes) using multilevel designs aimed at analyzing how student variables (e.g., cognitive, motivational, and affective) mediate the relationship between teacher homework follow-up type and students' learning and academic performance.

Finally, future research could also consider conducting qualitative research to analyze teachers' conceptions of homework follow-up practices, mainly feedback (Cunha et al., 2015). This information may be very useful for improving homework feedback measures in future quantitative studies. Investigating teachers' conceptions of homework follow-up practices may help identify other homework feedback practices implemented in authentic learning environments. It may also help us understand the reasons why teachers use specific types of homework feedback, and explore the constraints faced daily in class when giving homework feedback. As one teacher in the focus group claimed: “Unfortunately, I don't have time to collect and grade homework, because I have too many students and the content that I have to cover each term is vast. So I just check whether all students completed their homework” (F2P1).

This research was supported by the Portuguese Foundation for Science and Technology and the Portuguese Ministry of Education and Science through national funds and, when applicable, co-financed by FEDER under the PT2020 Partnership Agreement (UID/PSI/01662/2013), by the Spanish Ministry of Education and Science (Projects: EDU2014-57571-P and PSI-2011-23395), and by the Council of Economy and Employment of the Government of the Principality of Asturias, Spain (Project: FC-15-GRUPIN14-053).

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

  • Bang H. (2011). What makes it easy or hard for you to do your homework? An account of newcomer immigrant youths' afterschool academic lives . Curr. Issues Educ. 14 , 1–26. [ Google Scholar ]
  • Benjamini Y., Hochberg Y. (1995). Controlling the false discovery rate: a practical and powerful approach to multiple testing . J. R. Stat. Soc. Ser. B 57 , 289–300. [ Google Scholar ]
  • Buijs M., Admiraal W. (2013). Homework assignments to enhance student engagement in secondary education . Eur. J. Psychol. Educ. 28 , 767–779. 10.1007/s10212-012-0139-0 [ CrossRef ] [ Google Scholar ]
  • Cardelle M., Corno L. (1981). Effects on second language learning of variations in written feedback on homework assignments . TESOL Q. 15 , 251–261. 10.2307/3586751 [ CrossRef ] [ Google Scholar ]
  • Cooper H. (1989). Synthesis of research on homework . Educ. Leadersh. 47 , 85–91. [ Google Scholar ]
  • Cooper H. (2001). The Battle Over Homework: Common Ground for Administrators, Teachers, and Parents, 2nd Edn. Thousand Oaks, CA: Sage Publications. [ Google Scholar ]
  • Cooper H., Robinson J., Patall E. (2006). Does homework improve academic achievement? A synthesis of research, 1987-2003 . Rev. Educ. Res . 76 , 1–62. 10.3102/00346543076001001 [ CrossRef ] [ Google Scholar ]
  • Cunha J., Rosário P., Macedo L., Nunes A. R., Fuentes S., Pinto R., et al.. (2015). Parents' conceptions of their homework involvement in elementary school . Psicothema 27 , 159–165. 10.7334/psicothema2014.210 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Duijnhouwer H., Prins F., Stokking K. (2012). Feedback providing improvement strategies and reflection on feedback use: effects on students' writing motivation, process, and performance . Learn. Instr. 22 , 171–184. 10.1016/j.learninstruc.2011.10.003 [ CrossRef ] [ Google Scholar ]
  • Elawar M. C., Corno L. (1985). A factorial experiment in teachers' written feedback on student homework: changing teacher behavior a little rather than a lot . J. Educ. Psychol. 77 , 162–173. 10.1037/0022-0663.77.2.162 [ CrossRef ] [ Google Scholar ]
  • Epstein J. L., Van Voorhis F. L. (2001). More than ten minutes: teachers' roles in designing homework . Educ. Psychol. 36 , 181–193. 10.1207/S15326985EP3603_4 [ CrossRef ] [ Google Scholar ]
  • Epstein J., Van Voorhis F. (2012). The changing debate: from assigning homework to designing homework, in Contemporary Debates in Child Development and Education , eds Suggate S., Reese E. (London: Routledge), 263–273. [ Google Scholar ]
  • Fleiss J. L. (1981). Statistical Methods for Rates and Proportions, 2nd Edn. New York, NY: John Wiley & Sons, Inc. [ Google Scholar ]
  • Hagger M., Sultan S., Hardcastle S., Chatzisarantis N. (2015). Perceived autonomy support and autonomous motivation toward mathematics activities in educational and out-of-school contexts is related to mathematics homework behavior and attainment . Contemp. Educ. Psychol. 41 , 111–123. 10.1016/j.cedpsych.2014.12.002 [ CrossRef ] [ Google Scholar ]
  • Hattie J., Timperley H. (2007). The power of feedback . Rev. Educ. Res. 77 , 81–112. 10.3102/003465430298487 [ CrossRef ] [ Google Scholar ]
  • Kaur B. (2011). Mathematics homework: a study of three grade eight classrooms in Singapore . Inter. J. Sci. Math. Educ. 9 , 187–206. 10.1007/s10763-010-9237-0 [ CrossRef ] [ Google Scholar ]
  • Kenward M. G., Roger J. H. (2009). An improved approximation to the precision of fixed effects from restricted maximum likelihood . Comput. Stat. Data Anal. 53 , 2583–2595. 10.1016/j.csda.2008.12.013 [ CrossRef ] [ Google Scholar ]
  • Kukliansky I., Shosberger I., Eshach H. (2014). Science teachers' voice on homework: beliefs, attitudes, and behaviors . Inter. J. Sci. Math. Educ. 10.1007/s10763-014-9555-8. [Epub ahead of print]. [ CrossRef ] [ Google Scholar ]
  • Landis J. R., Koch G. G. (1977). The measurement of observer agreement for categorical data . Biometrics 33 , 159–174. 10.2307/2529310 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Murphy J., Decker K., Chaplin C., Dagenais R., Heller J., Jones R., et al. (1987). An exploratory analysis of the structure of homework assignments in high schools . Res. Rural Educ . 4 , 61–71. [ Google Scholar ]
  • Narciss S. (2004). The impact of informative feedback and self-efficacy on motivation and achievement in concept learning . Exp. Psychol. 51 , 214–228. 10.1027/1618-3169.51.3.214 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Nicol D., Macfarlane-Dick D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice . Stud. Higher Educ . 31 , 199–218. 10.1080/03075070600572090 [ CrossRef ] [ Google Scholar ]
  • Núñez J. C., Suárez N., Rosário P., Vallejo G., Cerezo R., Valle A. (2014). Teachers' feedback on homework, homework-related behaviors and academic achievement . J. Educ. Res . 108 , 204–216. 10.1080/00220671.2013.878298 [ CrossRef ] [ Google Scholar ]
  • Núñez J. C., Suárez N., Rosário P., Vallejo G., Valle A., Epstein J. L. (2015). Relationships between perceived parental involvement in homework, student homework behaviors, and academic achievement: differences among elementary, junior high, and high school students . Metacogn. Learn. 10.1007/s11409-015-9135-5. [Epub ahead of print]. [ CrossRef ] [ Google Scholar ]
  • Pelletier L. G., Séguin-Lévesque C., Legault L. (2002). Pressure from above and pressure from below as determinants of teachers' motivation and teaching behaviors . J. Educ. Psychol. 94 , 186–196. 10.1037/0022-0663.94.1.186 [ CrossRef ] [ Google Scholar ]
  • Peterson E., Irving S. (2008). Secondary school students' conceptions of assessment and feedback . Learn. Instr. 18 , 238–250. 10.1016/j.learninstruc.2007.05.001 [ CrossRef ] [ Google Scholar ]
  • Rosário P., Mourão R., Baldaque M., Nunes T., Núñez J. C., González-Pienda J. A., et al. (2009). Homework, self-regulated learning and math achievement . Revista de Psicodidática 14 , 179–192. [ Google Scholar ]
  • Rosário P., Mourão R., Trigo L., Suárez N., Fernández E., Tuero-Herrero E. (2011). English as a foreign language (EFL) homework diaries: evaluating gains and constraints for self-regulated learning and achievement . Psicothema 23 , 681–687. [ PubMed ] [ Google Scholar ]
  • Rosário P., Núñez J. C., Ferrando P., Paiva O., Lourenço A., Cerezo R., et al. (2013). The relationship between approaches to teaching and approaches to studying: a two-level structural equation model for biology achievement in high school . Metacogn. Learn. 8 , 44–77. 10.1007/s11409-013-9095-6 [ CrossRef ] [ Google Scholar ]
  • Rosário P., Núñez J. C., Vallejo G., Cunha J., Nunes T., Mourão R., et al. (2015). Does homework design matter? The role of homework's purpose in student mathematics achievement . Contemp. Educ. Psychol . 43 , 10–24. 10.1016/j.cedpsych.2015.08.001 [ CrossRef ] [ Google Scholar ]
  • SAS Institute Inc. (SAS). (2013). SAS/STAT® 13.1 User's Guide. Cary, NC: SAS Institute, Inc. [ Google Scholar ]
  • Shute V. (2008). Focus on formative feedback . Rev. Educ. Res. 79 , 153–189. 10.3102/0034654307313795 [ CrossRef ] [ Google Scholar ]
  • Trautwein U., Köller O., Schmitz B., Baumert J. (2002). Do homework assignments enhance achievement? A multilevel analysis in 7th-Grade Mathematics . Contemp. Educ. Psychol. 27 , 26–50. 10.1006/ceps.2001.1084 [ CrossRef ] [ Google Scholar ]
  • Trautwein U., Lüdtke O., Kastens C., Köller O. (2006a). Effort on homework in grades 5–9: development, motivational antecedents, and the association with effort on classwork . Child Dev. 77 , 1094–1111. 10.1111/j.1467-8624.2006.00921.x [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Trautwein U., Ludtke O., Schnyder I., Niggli A. (2006b). Predicting homework effort: support for a domain-specific, multilevel homework model . J. Educ. Psychol. 98 , 438–456. 10.1037/0022-0663.98.2.438 [ CrossRef ] [ Google Scholar ]
  • Trautwein U., Niggli A., Schnyder I., Lüdtke O. (2009a). Between-teacher differences in homework assignments and the development of students' homework effort, homework emotions, and achievement . J. Educ. Psychol. 101 , 176–189. 10.1037/0022-0663.101.1.176 [ CrossRef ] [ Google Scholar ]
  • Trautwein U., Schnyder I., Niggli A., Neumann M., Lüdtke O. (2009b). Chameleon effects in homework research: the homework-achievement association depends on the measures used and the level of analysis chosen . Contemp. Educ. Psychol. 34 , 77–88. 10.1016/j.cedpsych.2008.09.001 [ CrossRef ] [ Google Scholar ]
  • Vallejo G., Ato M. (2012). Robust tests for multivariate factorial designs under heteroscedasticity . Behav. Res. Methods 44 , 471–489. 10.3758/s13428-011-0152-2 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Vallejo G., Ato M., Fernández M. P. (2010). A robust approach for analyzing unbalanced factorial designs with fixed levels . Behav. Res. Methods 42 , 607–617. 10.3758/BRM.42.2.607 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Walberg H. J., Paik S. J. (2000). Effective Educational Practices. Educational Practices Series – 3 . Brussels: International Academy of Education. [ Google Scholar ]
  • Walberg H. J., Paschal R. A., Weinstein T. (1985). Homework's powerful effects on learning . Educ. Leadersh. 42 , 76–79. [ Google Scholar ]
  • Xu J. (2008). Models of secondary school students' interest in homework: a multilevel analysis . Am. Educ. Res. J . 45 , 1180–1205. 10.3102/0002831208323276 [ CrossRef ] [ Google Scholar ]
  • Xu J. (2010). Predicting homework distraction at the secondary school level: a multilevel analysis . Teach. Coll. Rec . 112 , 1937–1970. [ Google Scholar ]
  • Xu J. (2011). Homework completion at the secondary school level: a multilevel analysis . J. Educ. Res . 104 , 171–182. 10.1080/00220671003636752 [ CrossRef ] [ Google Scholar ]
  • Xu J. (2012). Predicting students' homework environment management at the secondary school level . Educ. Psychol. 32 , 183–200. 10.1080/01443410.2011.635639 [ CrossRef ] [ Google Scholar ]
  • Xu J., Wu H. (2013). Self-regulation of homework behavior: homework management at the secondary school level . J. Educ. Res. 106 , 1–13. 10.1080/00220671.2012.658457 [ CrossRef ] [ Google Scholar ]
  • Zhu Y., Leung F. (2012). Homework and mathematics achievement in Hong Kong: evidence from the TIMSS 2003 . Int. J. Sci. Math. Educ. 10 , 907–925. 10.1007/s10763-011-9302-3 [ CrossRef ] [ Google Scholar ]

Is Homework Good for Kids? Here’s What the Research Says

As kids return to school, debate is heating up once again over how they should spend their time after they leave the classroom for the day.

The no-homework policy of a second-grade teacher in Texas went viral last week, earning praise from parents across the country who lament the heavy workload often assigned to young students. Brandy Young told parents she would not formally assign any homework this year, asking students instead to eat dinner with their families, play outside and go to bed early.

But the question of how much work children should be doing outside of school remains controversial, and plenty of parents take issue with no-homework policies, worried their kids are losing a potential academic advantage. Here’s what you need to know:

For decades, the homework standard has been a “10-minute rule,” which recommends a daily maximum of 10 minutes of homework per grade level. Second graders, for example, should do about 20 minutes of homework each night. High school seniors should complete about two hours of homework each night. The National PTA and the National Education Association both support that guideline.

But some schools have begun to give their youngest students a break. A Massachusetts elementary school has announced a no-homework pilot program for the coming school year, lengthening the school day by two hours to provide more in-class instruction. “We really want kids to go home at 4 o’clock, tired. We want their brain to be tired,” Kelly Elementary School Principal Jackie Glasheen said in an interview with a local TV station. “We want them to enjoy their families. We want them to go to soccer practice or football practice, and we want them to go to bed. And that’s it.”

A New York City public elementary school implemented a similar policy last year, eliminating traditional homework assignments in favor of family time. The change was quickly met with outrage from some parents, though it earned support from other education leaders.

New solutions and approaches to homework differ by community, and these local debates are complicated by the fact that even education experts disagree about what’s best for kids.

The research

The most comprehensive research on homework to date comes from a 2006 meta-analysis by Duke University psychology professor Harris Cooper, who found evidence of a positive correlation between homework and student achievement, meaning students who did homework performed better in school. The correlation was stronger for older students—in seventh through 12th grade—than for those in younger grades, for whom there was a weak relationship between homework and performance.

Cooper’s analysis focused on how homework impacts academic achievement—test scores, for example. His report noted that homework is also thought to improve study habits, attitudes toward school, self-discipline, inquisitiveness and independent problem solving skills. On the other hand, some studies he examined showed that homework can cause physical and emotional fatigue, fuel negative attitudes about learning and limit leisure time for children. At the end of his analysis, Cooper recommended further study of such potential effects of homework.

Despite the weak correlation between homework and performance for young children, Cooper argues that a small amount of homework is useful for all students. Second-graders should not be doing two hours of homework each night, he said, but they also shouldn’t be doing no homework.

Not all education experts agree entirely with Cooper’s assessment.

Cathy Vatterott, an education professor at the University of Missouri-St. Louis, supports the “10-minute rule” as a maximum, but she thinks there is not sufficient proof that homework is helpful for students in elementary school.

“Correlation is not causation,” she said. “Does homework cause achievement, or do high achievers do more homework?”

Vatterott, the author of Rethinking Homework: Best Practices That Support Diverse Needs, thinks there should be more emphasis on improving the quality of homework tasks, and she supports efforts to eliminate homework for younger kids.

“I have no concerns about students not starting homework until fourth grade or fifth grade,” she said, noting that while the debate over homework will undoubtedly continue, she has noticed a trend toward limiting, if not eliminating, homework in elementary school.

The issue has been debated for decades. A TIME cover in 1999 read: “Too much homework! How it’s hurting our kids, and what parents should do about it.” The accompanying story noted that the launch of Sputnik in 1957 led to a push for better math and science education in the U.S. The ensuing pressure to be competitive on a global scale, plus the increasingly demanding college admissions process, fueled the practice of assigning homework.

“The complaints are cyclical, and we’re in the part of the cycle now where the concern is for too much,” Cooper said. “You can go back to the 1970s, when you’ll find there were concerns that there was too little, when we were concerned about our global competitiveness.”

Cooper acknowledged that some students really are bringing home too much homework, and their parents are right to be concerned.

“A good way to think about homework is the way you think about medications or dietary supplements,” he said. “If you take too little, they’ll have no effect. If you take too much, they can kill you. If you take the right amount, you’ll get better.”


The Journal of Studies on Educ ..., 2019, Volume 20 (2018-2019), Issue 2, Pages 27-39. Published 2019; available on J-STAGE: May 08, 2021.

Graduate School of Education, The University of Tokyo [Japan]; Japan Society for the Promotion of Science [Japan].

Homework can have both positive and negative effects on student learning. To overcome the negative effects and facilitate the positive ones, it is important for teachers to understand the underlying mechanisms of homework and how it relates to learning so that they can use the most effective methods of instruction and guidance. To provide a useful guide, this paper reviewed previous research studies and considered the roles of homework and effective instructional strategies from three psychological perspectives: behavioral, information-processing, and social constructivism. From a behavioral perspective, homework can be viewed as increasing opportunities for the repeated practice of knowledge and skills, whereas the information processing perspective places greater importance on the capacity of homework to promote deeper understanding and metacognition. Viewed from a social constructivist perspective, homework can promote the establishment of connections in the learning that occurs in school, at home, and in the wider community. Studies have shown that each of these roles of homework can contribute to the facilitation of meaningful learning and the support of students toward becoming self-initiated learners. However, there are some crucial challenges that remain in applying this knowledge to the actual school setting. This paper’s conclusion discusses possible directions for much-needed future research and suggests potential solutions.


The relationship between teachers' homework feedback, students' homework emotions, and academic self-esteem: A multi-group analysis of gender differences

  • Published: 09 March 2024


  • Rui Gou   ORCID: orcid.org/0009-0003-0573-6877 1 ,
  • Xin Yang   ORCID: orcid.org/0000-0001-5835-810X 2 ,
  • Xiaohui Chen 1 ,
  • Chun Cao 2 &
  • Ning Chen 3  


Students’ homework emotions greatly influence the quality of their homework and learning activities, and even their academic achievement and burden. Therefore, encouraging students’ positive homework emotions is essential for their development. This study investigated the relationships between three types of teachers’ homework feedback (checking homework on the board, grading homework, and constructive comments) and students’ positive and negative homework emotions in the Chinese language subject, taking into account the mediating effect of academic self-esteem and gender differences in these underlying relationships. A total of 928 elementary school students in grades 4–6 participated in the survey and completed the scales. Results showed that (1) checking homework on the board and constructive comments positively impacted students' positive emotions, while checking homework on the board negatively influenced students’ negative emotions; in contrast, constructive comments did not impact students’ negative emotions, and grading homework had no significant effect on students’ emotions; (2) academic self-esteem mediated the relationship between teachers' homework feedback and students’ homework emotions; and (3) gender moderated some of the underlying relationships between teachers’ homework feedback, students’ homework emotions, and academic self-esteem. This study has implications for teachers in designing and choosing high-quality homework feedback, encouraging students’ positive homework emotions, and reducing students’ negative homework emotions.
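One common way to probe the kind of mediation reported here (teacher feedback, academic self-esteem, homework emotions) is a product-of-coefficients test with a bootstrap confidence interval. The sketch below is a generic Python illustration under hypothetical column and file names, not the authors' multi-group structural equation model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical columns: feedback (e.g., constructive comments), self_esteem
# (academic self-esteem), pos_emotion (positive homework emotions).
df = pd.read_csv("homework_emotions.csv")  # hypothetical file

def indirect_effect(data):
    # a-path: feedback -> self-esteem; b-path: self-esteem -> emotion, controlling for feedback.
    a = smf.ols("self_esteem ~ feedback", data=data).fit().params["feedback"]
    b = smf.ols("pos_emotion ~ self_esteem + feedback",
                data=data).fit().params["self_esteem"]
    return a * b

# Percentile bootstrap for the indirect effect a * b.
boot = np.array([indirect_effect(df.sample(frac=1.0, replace=True))
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(df):.3f}, 95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")
```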


Abdel Aal, H. M. M. (2023). Academic self-esteem and its relationship to practicing extracurricular activities among university students. Cypriot Journal of Educational Science, 18 (1), 228–238. https://doi.org/10.18844/cjes.v18i1.8306


Akhtar, M., & Bilour, N. (2020). State of mental health among transgender individuals in Pakistan: Psychological resilience and self-esteem. Community Mental Health Journal, 56 (5), 626–634. https://doi.org/10.1007/s10597-019-00522-5


Akter, S., Zaman, T., Zaman, F. T. Z. B., & Muhammed, N. (2018). Self-esteem and cognitive emotion regulation of young adults in Bangladesh. International Journal of Indian Psychology, 6 (4), 4–11. https://doi.org/10.25215/0604.101

Anderson, J. C., & Gerbing, D. W. (1988). Structural equation modeling in practice: A review and recommended two-step approach. Psychological Bulletin, 103 (3), 411. https://doi.org/10.1037//0033-2909.103.3.411

Annett, J. (1969). Feedback and human behaviour: The effects of knowledge of results, incentives and reinforcement on learning and performance . Penguin Books.


Arbuckle, J. L. (2008). Amos 17.0 User’s Guide . SPSS Inc.

Bagozzi, R. P., & Yi, Y. (1988). On the evaluation of structural equation models. Journal of the Academy of Marketing Science, 16 (3), 74–94. https://doi.org/10.1007/BF02723327

Bang, H. (2012). Promising homework practices: Teachers’ perspectives on making homework work for newcomer immigrant students. The High School Journal, 95 (2), 3–31. https://doi.org/10.1353/hsj.2012.0001

Baumeister, R. F., Campbell, J. D., Krueger, J. I., & Vohs, K. D. (2003). Does high self-esteem cause better performance, interpersonal success, happiness, or healthier lifestyles? Psychological Science in the Public Interest, 4 (1), 1–44. https://doi.org/10.1111/1529-1006.01431

Bieleke, M., Goetz, T., Yanagida, T., Botes, E., Frenzel, A. C., & Pekrun, R. (2022). Measuring emotions in mathematics: The Achievement Emotions Questionnaire—Mathematics (AEQ-M). ZDM–Mathematics Education, 55 (2), 269–284. https://doi.org/10.1007/s11858-022-01425-8

Brookhart, S. M. (2008). How to give effective feedback to your students . Alexandria: ASCD.

Brown, C. S., & Stone, E. A. (2016). Gender stereotypes and discrimination: How sexism impacts development. Advances in Child Development and Behavior, 50 , 105–133. https://doi.org/10.1016/bs.acdb.2015.11.001

Brown, H. D. (1994). Teaching by principles: An interactive approach to language pedagogy . Prentice Hall Regents.

Cardelle, M., & Corno, L. (1981). Effects on second language learning of variations in written feedback on homework assignments. Tesol Quarterly, 15 (3), 251–261. https://doi.org/10.2307/3586751

Chennamaneni, P. R., Echambadi, R., Hess, J. D., & Syam, N. (2016). Diagnosing harmful collinearity in moderated regressions: A roadmap. International Journal of Research in Marketing, 33 (1), 172–182. https://doi.org/10.1016/j.ijresmar.2015.08.004

Cooper, H. (2007). The battle over homework: Common ground for administrators, teachers, and parents (3rd ed.). Corwin Press. https://doi.org/10.4135/9781483329420


Cooper, H., Robinson, J. C., & Patall, E. A. (2006). Does homework improve academic achievement? A synthesis of research, 1987–2003. Review of Educational Research, 76 (1), 1–62. https://doi.org/10.3102/00346543076001001

Coopersmith, S. (1967). The antecedents of self-esteem . Freeman & Company.

Costa, M., Cardoso, A. P., Lacerda, C., Lopes, A., & Gomes, C. (2016). Homework in primary education from the perspective of teachers and pupils. Procedia-Social and Behavioral Sciences, 217 (5), 139–148. https://doi.org/10.1016/j.sbspro.2016.02.047

Crocker, J., & Wolfe, C. T. (2001). Contingencies of self-worth. Psychological Review, 108 (3), 593–623. https://doi.org/10.1037/0033-295x.108.3.593


Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16 (3), 297–334. https://doi.org/10.1007/BF02310555

Cunha, J., Rosário, P., Núñez, J. C., Nunes, A. R., Moreira, T., & Nunes, T. (2018). “Homework feedback is…”: Elementary and middle school teachers’ conceptions of homework feedback. Frontiers in Psychology, 9 , 32. https://doi.org/10.3389/fpsyg.2018.00032


Cunha, J., Rosário, P., Núñez, J. C., Vallejo, G., Martins, J., & Högemann, J. (2019). Does teacher homework feedback matter to 6th graders’ school engagement?: A mixed methods study. Metacognition and Learning, 14 (8), 89–129. https://doi.org/10.1007/s11409-019-09200-z

Deci, E. L., & Ryan, R. M. (2000). The “what” and “why” of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11 (4), 227–268. https://doi.org/10.1207/S15327965PLI1104_01

Demanet, J., & Van Houtte, M. (2012). Teachers’ attitudes and students’ opposition. School misconduct as a reaction to teachers’ diminished effort and affect. Teaching and Teacher Education, 28 (6), 860–869. https://doi.org/10.1016/j.tate.2012.03.008

Dettmers, S., Trautwein, U., Ludtke, O., Goetz, T., Frenzel, A. C., & Pekrun, R. (2011). Students’ emotions during homework in mathematics: Testing a theoretical model of antecedents and achievement outcomes. Contemporary Educational Psychology, 36 (1), 25–35. https://doi.org/10.1016/j.cedpsych.2010.10.001

Diamond, L. M., & Aspinwall, L. G. (2003). Emotion regulation across the life span: An integrative perspective emphasizing self-regulation, positive affect, and dyadic processes. Motivation and Emotion, 27 (6), 125–156. https://doi.org/10.1023/A:1024521920068

Diseth, A., Meland, E., & Breidablik, H. J. (2014). Self-beliefs among students: Grade level and gender differences in self-esteem, self-efficacy, and implicit theory of intelligence. Learning and Individual Differences, 35 , 1–8. https://doi.org/10.1016/j.lindif.2014.06.003

Dong, Y., Yu, G., & Zhou, X. (2013). On the factors affecting the academic emotions of adolescents with and without learning disabilities [in Chinese language]. Chinese Journal of Special Education, 4 , 42–47. https://doi.org/10.3969/j.issn.1007-3728.2013.04.009

Doron, J., Thomas-Ollivier, V., Vachon, H., & Fortes-Bourbousson, M. (2013). Relationships between cognitive coping, self-esteem, anxiety and depression: A cluster-analysis approach. Personality and Individual Differences, 55 (5), 515–520. https://doi.org/10.1016/j.paid.2013.04.017

Dumani, B., & Gencel, N. (2023). Comparison of face-to-face and distance education: An example of a vocational high school. International Journal of Progressive Education, 19 (1), 131–153. https://doi.org/10.29329/ijpe.2023.517.9

Dumont, M., & Provost, M. (1999). Resilience in adolescents: Protective role of social support, coping strategies, self-esteem, and social activities on experience of stress and depression. Journal of Youth and Adolescence, 28 (3), 343–363. https://doi.org/10.1023/A:1021637011732

Elawar, M. C., & Corno, L. (1985). A factorial experiment in teachers’ written feedback on student homework: Changing teacher behavior a little rather than a lot. Journal of Educational Psychology, 77 (2), 162–173. https://doi.org/10.1037/0022-0663.77.2.162

Evans, C., & Waring, M. (2011). Student teacher assessment feedback preferences: The influence of cognitive styles and gender. Learning and Individual Differences, 21 (3), 271–280. https://doi.org/10.1016/j.lindif.2010.11.011

Farkas, G., Grobe, R. P., Sheehan, D., & Shuan, Y. (1990). Cultural resources and school success: Gender, ethnicity, and poverty groups within an urban school district. American Sociological Review, 55 (1), 127–142. https://doi.org/10.2307/2095708

Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18 (1), 39–50. https://doi.org/10.1177/002224378101800104

Garnefski, N., Kraaij, V., & Spinhoven, P. (2001). Negative life events, cognitive emotion regulation and emotional problems. Personality and Individual Differences, 30 (8), 1311–1327. https://doi.org/10.1016/S0191-8869(00)00113-6

Goetz, T., Nett, U. E., Martiny, S. E., Hall, N. C., Pekrun, R., Dettmers, S., et al. (2012). Students’ emotions during homework: Structures, self-concept antecedents, and achievement outcomes. Learning and Individual Difference, 22 (2), 225–234. https://doi.org/10.1016/j.lindif.2011.04.006

Guo, W., & Zhou, W. (2021). Relationships between teacher feedback and student motivation: A comparison between male and female students. Frontiers in Psychology, 12 , 679575. https://doi.org/10.3389/fpsyg.2021.679575

Habrat, A. (2018). The role of self-esteem in foreign language learning and teaching . Springer International Publishing. https://doi.org/10.1007/978-3-319-75283-9

Hair, J. F., Jr., Sarstedt, M., Hopkins, L., & Kuppelwieser, V. G. (2014). Partial least squares structural equation modeling (PLS-SEM) An emerging tool in business research. European Business Review, 26 (2), 106–121. https://doi.org/10.1108/EBR-10-2013-0128

Harter, S. (1999). Symbolic interactionism revisited: Potential liabilities for the self-constructed in the crucible of interpersonal relationships. Merrill-Palmer Quarterly, 45 (4), 677–703. https://doi.org/10.2307/23093377

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77 (1), 81–112. https://doi.org/10.3102/003465430298487

Heimpel, S. A., Elliot, A. J., & Wood, J. V. (2006). Basic personality dispositions, self-esteem, and personal goals: An approach-avoidance analysis. Journal of Personality, 74 (5), 1293–1320. https://doi.org/10.1111/j.1467-6494.2006.00410.x

Helm, C. (2007). Teacher dispositions affecting self-esteem and student performance. The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 80 (3), 109–110. https://doi.org/10.3200/TCHS.80.3.109-110

Hoge, D. R., Smit, E. K., & Hanson, S. L. (1990). School experiences predicting changes in self-esteem of sixth and seventh grade students. Journal of Educational Psychology, 82 (1), 117–127. https://doi.org/10.1037/0022-0663.82.1.117

Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling- A Multidisciplinary Journal, 6 (1), 1–55. https://doi.org/10.1080/10705519909540118

Huang, C. (2013). Gender differences in academic self-efficacy: A meta-analysis. European Journal of Psychology of Education, 28 (3), 1–35. https://doi.org/10.1007/s10212-011-0097-y

Irving, S., Harris, L., & Peterson, E. (2011). ‘One assessment doesn’t serve all the purposes’ or does it? New Zealand teachers describe assessment and feedback. Asia Pacific Education Review, 12 (9), 413–426. https://doi.org/10.1007/s12564-011-9145-1

Jiang, L. (2001). A study of optimizing feedback sessions for elementary school students’ out-of-class homework. Education Science, 4 , 31–32.

Kaur, B. (2011). Mathematics homework: A study of three grade eight classrooms in Singapore. International Journal of Science and Mathematics Education, 9 (2), 187–206. https://doi.org/10.1007/s10763-010-9237-0

Article   ADS   Google Scholar  

Khaleghinezhad, S. A., Shabani, M., Hakimzadeh, R., Nazari Shaker, H., & Amerian, M. (2016). Prediction of high school students’ life satisfaction and academic performance based on locus of control and self-esteem. International Journal of School Health, 3 (3), 1–7. https://doi.org/10.17795/intjsh-31924

Kline, R. B. (2022). Principles and practice of structural equation modeling . Guilford Press.

Kling, K. C., Hyde, J. S., Showers, C. J., & Buswell, B. N. (1999). Gender differences in self-esteem: A meta-analysis. Psychological Bulletin, 125 (4), 470. https://doi.org/10.1037/0033-2909.125.4.470

Knollmann, M., & Wild, E. (2007). Quality of parental support and students’ emotions during homework: Moderating effects of students’ motivational orientations. European Journal of Psychology of Education, 22 (3), 63–76. https://doi.org/10.1007/BF03173689

Koka, A., & Hein, V. (2003). Perceptions of teacher’s feedback and learning environment as predictors of intrinsic motivation in physical education. Psychology of Sport and Exercise, 4 (4), 333–346. https://doi.org/10.1016/S1469-0292(02)00012-2

Krijgsman, C. (2021). Assessment and motivation: a self-determination theory perspective on performance grading, goal clarification and process feedback in physical education [Doctoral dissertation, Ghent University]. https://doi.org/10.33540/463

Larson, R. W., & Brown, J. R. (2007). Emotional development in adolescence: What can be learned from a high school theater program? Child Development, 78 (4), 1083–1099. https://doi.org/10.1111/j.1467-8624.2007.01054.x

Lee, H., & Jung, E. (2023). An analysis of the longitudinal changes in the determining factors for adolescents’ self-esteem with random forests. In J. S. Park, L. T. Yang, Y. Pan, & J. H. Park (Eds.), Advances in Computer science and ubiquitous computing. CUTECSA 2022. Lecture Notes in Electrical Engineering. (Vol. 1028). Springer. https://doi.org/10.1007/978-981-99-1252-0_16

Chapter   Google Scholar  

Leone, C. M., & Richards, M. H. (1989). Classwork and homework in early adolescence: The ecology of achievement. Journal of Youth and Adolescence, 18 (12), 531–548. https://doi.org/10.1007/BF02139072

Li, J., Han, X., Wang, W., Sun, G., & Cheng, Z. (2018). How social support influences university students’ academic achievement and emotional exhaustion: The mediating role of self-esteem. Learning and Individual Differences, 61 , 120–126. https://doi.org/10.1016/j.lindif.2017.11.016

Liu, Y., Gong, S., & Xiong, J. (2016). The Influence of perceived mathematics homework quality, perceived control and homework emotion on homework effort for middle school students. Journal of Psychological Science, 39 (2), 357–363. https://doi.org/10.16719/j.cnki.1671-6981.20160216

Mahfoodh, O. H. A. (2017). “I feel disappointed”: EFL university students’ emotional responses towards teacher written feedback. Assessing Writing, 31 , 53–72. https://doi.org/10.1016/j.asw.2016.07.001

Manna, G., Giorgio, F., Sonia, I., Como, M. R., & De Santis, S. (2016). The relationship between self-esteem, depression and anxiety: Comparing vulnerability and scar model in the Italian context. Mediterranean Journal of Clinical Psychology, 4 (3), 1–16. https://doi.org/10.6092/2282-1619/2016.4.1328

Marsh, H. W. (1993). Academic self-concept: Theory, measurement and research. In J. M. Suls (Ed.), Psychological perspectives on the self (pp. 59–98). Lawrence Erlbaum. https://doi.org/10.4324/9781315806976

Midgley, C., Feldlaufer, H., & Eccles, J. S. (1989). Student/teacher relations and attitudes toward mathematics before and after the transition to junior high school. Child Development, 60 (4), 981–992. https://doi.org/10.2307/1131038

Mouratidis, M., Vansteenkiste, M., Lens, W., & Sideridis, G. (2008). The motivating role of positive feedback in sport and physical education: Evidence for a motivational model. Journal of Sport and Exercise Psychology, 30 (2), 240–268. https://doi.org/10.1123/jsep.30.2.240

Mruk, C. J. (2013). Self-esteem and positive psychology: Research, theory, and practice . Springer Publishing Company.

Murphy, J., Decker, K., Chaplin, C., Dagenais, R., Heller, J., Jones, R., et al. (1987). An exploratory analysis of the structure of homework assignments in high schools. Research in Rural Education, 4 (2), 61–71.

Nicaise, V., Bois, E. J., Fairclough, S. J., Amorose, A. J., & Cogérino, G. (2007). Girls’ and boys’ perceptions of physical education teachers’ feedback: Effects on performance and psychological response. Journal of Sports Sciences, 25 (8), 915–926. https://doi.org/10.1080/02640410600898095

Núñez, J. C., Suárez, N., Rosário, P., Vallejo, G., Cerezo, R., & Valle, A. (2015). Teachers’ feedback on homework, homework-related behaviors, and academic achievement. The Journal of Educational Research, 108 (3), 204–216. https://doi.org/10.1080/00220671.2013.878298

Okoko‚ W. O. (2012). Self-esteem and academic performance of students in public secondary schools in Ndhiwa Distrist, Kenya. [Master dissertation, University of Nairobi, Kenya].

Orth, U., & Robins, R. W. (2013). Understanding the link between low self-esteem and depression. Current Directions in Psychological Science, 22 (6), 455–460. https://doi.org/10.1177/0963721413492763

Pekrun, R. (2006). The control-value theory of achievement emotions: Assumptions, corollaries, and implications for educational research and practice. Educational Psychology Review, 18 (4), 315–341. https://doi.org/10.1007/s10648-006-9029-9

Pekrun, R., Goetz, T., Titz, W., & Perry, R. P. (2002). Academic emotions in students’ self-regulated learning and achievement: A program of qualitative and quantitative research. Educational Psychologist, 37 (2), 91–105. https://doi.org/10.1207/S15326985EP3702_4

Perveen, F., Altaf, S., & Tehreem, H. (2022). Relationship between self-esteem and academic performance: A gendered perspective. Pakistan Journal of Social Research, 4 (3), 780–785. https://doi.org/10.52567/pjsr.v4i03.768

Peterson, E., & Irving, S. (2008). Secondary school students’ conceptions of assessment and feedback. Learning and Instruction, 18 (3), 238–250. https://doi.org/10.1016/j.learninstruc.2007.05.001

Plante, I., De la Sablonnière, R., Aronson, J. M., & Théorêt, M. (2013). Gender stereotype endorsement and achievement-related outcomes: The role of competence beliefs and task values. Contemporary Educational Psychology, 38 (3), 225–235. https://doi.org/10.1016/j.cedpsych.2013.03.004

Podsakoff, P. M., & Organ, D. W. (1986). Self-reports in organizational research: Problems and prospects. Journal of Management, 12 (4), 531–544. https://doi.org/10.1177/014920638601200408

Pulfrey, C., Darnon, C., & Butera, F. (2013). Autonomy and task performance: Explaining the impact of grades on intrinsic motivation. Journal of Educational Psychology, 105 (1), 39–57. https://doi.org/10.1037/a0029376

Reeve, J. (2012). A self-determination theory perspective on student engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 149–172). New York: Springer. https://doi.org/10.1007/978-1-4614-2018-7_7

Ren, X., Jing, B., Li, H., & Wu, C. (2022). The impact of perceived teacher support on Chinese junior high school students’ academic self-efficacy: The mediating roles of achievement goals and academic emotions. Frontiers in Psychology, 13 , 1028722. https://doi.org/10.3389/fpsyg.2022.1028722

Rosário, P., Cunha, J., Nunes, A. R., Moreira, T., Núñez, J. C., & Xu, J. (2019). “Did you do your homework?” Mathematics teachers’ homework follow-up practices at middle school level. Psychology in the Schools, 56 (1), 1–17. https://doi.org/10.1002/pits.22198

Rosário, P., Núñez, J. C., Vallejo, G., Cunha, J., Nunes, T., Suárez, N., et al. (2015). The effects of teachers’ homework follow-up practices on students’ EFL performance: A randomized-group design. Frontiers in Psychology, 6 , 1528. https://doi.org/10.3389/fpsyg.2015.01528

Rosenberg, M., Schooler, C., Schoenbach, C., & Rosenberg, F. (1995). Global self-esteem and specific self-esteem: Different concepts, different outcomes. American Sociological Review, 60 (1), 141–156. https://doi.org/10.2307/2096350

Schunk, D. H. (1995). Self-efficacy, motivation, and performance. Journal of Applied Sport Psychology, 7 (2), 112–137. https://doi.org/10.1080/10413209508406961

Article   MathSciNet   Google Scholar  

Simonton, K. L., & Layne, T. E. (2023). Investigating middle school students’ physical education emotions, emotional antecedents, self-esteem, and intentions for physical activity. Journal of Teaching in Physical Education, 42 (4), 1–10. https://doi.org/10.1123/jtpe.2022-0193

Singh, R., Saleem, M., Pradhan, P., Heffernan, C., Heffernan, N. T., Razzaq, L., Dailey, M. D., Oonnor, C., & Mulcahy, C. (2011). Feedback during Web-Based Homework: The Role of Hints. In G. Biswas, S. Bull, J. Kay, & A. Mitrovic (Eds.), Artificial Intelligence in Education. AIED 2011. Lecture Notes in Computer Science. (Vol. 6738). Springer, Berlin. https://doi.org/10.1007/978-3-642-21869-9_43

Smith, C. A., & Lazarus, R. S. (1990). Emotion and adaptation. Contemporary Sociology, 21 (4), 609–637. https://doi.org/10.2307/2075902

Sowislo, J. F., & Orth, U. (2013). Does low self-esteem predict depression and anxiety? A meta-analysis of longitudinal studies. Psychological Bulletin, 139 (1), 213–240. https://doi.org/10.1037/a0028931

Strandell, J. (2016). Culture, cognition and behavior in the pursuit of self-esteem. Poetics, 54 (5), 14–24. https://doi.org/10.1016/j.poetic.2015.08.007

Stroet, K., Opdenakker, M. C., & Minnaert, A. (2013). Effects of need supportive teaching on early adolescents’ motivation and engagement: A review of the literature. Educational Research Review, 9 , 65–87. https://doi.org/10.1016/j.edurev.2012.11.003

Tafarodi, R. W., & Swann, W. B., Jr. (1995). Self-liking and self-competence as dimensions of global self-esteem: Initial validation of a measure. Journal of Personality Assessment, 65 (2), 322–342. https://doi.org/10.1207/s15327752jpa6502_8

Teoh, H. J., & Nur Afiqah, R. (2010). Self-esteem amongst young adults: The effect of gender, social support and personality. Malaysian Journal of Psychiatry, 19 (2), 41–49.

Trautwein, U., Lüdtke, O., Kastens, C., & Köller, O. (2006). Effort on homework in grades 5–9: Development, motivational antecedents, and the association with effort on classwork. Child Development, 77 (4), 1094–1111. https://doi.org/10.1111/j.1467-8624.2006.00921.x

Trautwein, U., Niggli, A., Schnyder, I., & Lüdtke, O. (2009). Between-teacher differences in homework assignments and the development of students’ homework effort, homework emotions, and achievement. Journal of Educational Psychology, 101 (1), 176. https://doi.org/10.1037/0022-0663.101.1.176

Trautwein, U., Schnyder, I., Niggli, A., Neumann, M., & Lüdtke, O. (2009). Chameleon effects in homework research: The homework-achievement association depends on the measures used and the level of analysis chosen. Contemporary Educational Psychology, 34 (1), 77–88. https://doi.org/10.1016/j.cedpsych.2008.09.001

Velotti, P., Garofalo, C., Bottazzi, F., & Caretti, V. (2017). Faces of shame: Implications for self-esteem, emotion regulation, aggression and well-being. The Journal of Psychology, 151 (2), 171–184. https://doi.org/10.1080/00223980.2016.1248809

Verma, S., Sharma, D., & Larson, R. W. (2002). School stress in India: Effects on time and daily emotions. International Journal of Behavioral Development, 26 (6), 500–508. https://doi.org/10.1080/01650250143000454

Wang, F., & Wang, S. (2012). A comparative study on the influence of automated evaluation system and teacher grading on students’ English writing. Procedia Engineering, 29 (2), 993–997. https://doi.org/10.1016/j.proeng.2012.01.077

Woltering, S., & Lewis, M. D. (2009). Developmental pathways of emotion regulation in childhood: A neuropsychological perspective. Mind, Brain, and Education, 3 (3), 160–169. https://doi.org/10.1111/j.1751-228X.2009.01066.x

Xu, J. (2011). Homework emotion management at the secondary school level: Antecedents and homework completion. Teachers College Record, 113 (3), 529–560. https://doi.org/10.1177/016146811111300303

Xu, J. (2018). Emotion regulation in mathematics homework: An empirical study. The Journal of Educational Research, 111 (1), 1–11. https://doi.org/10.1080/00220671.2016.1175409

Article   MathSciNet   CAS   Google Scholar  

Xu, J., Du, J., & Fan, X. (2017). Self-regulation of mathematics homework behavior: An empirical investigation. The Journal of Educational Research, 110 (5), 467–477. https://doi.org/10.1080/00220671.2015.1125837

Zhang, J. (2023). A comparative study of "writing peer review" in high school chemistry error-prone concept learning. [Master dissertation, East China Normal University].

Zhang, Q. (2016). Basic characteristics of the content of the teaching and learning of Chinese subjects. Curriculum, Teaching Material and Method, 36 (1), 82–87. https://doi.org/10.19877/j.cnki.kcjcjf.2016.01.014

Zhu, Y., & Leung, F. (2012). Homework and mathematics achievement in Hong Kong: evidence from the TIMSS2003. International Journal of Science and Mathematics Education, 10 (8), 907–925. https://doi.org/10.1007/s10763-011-9302-3

Download references

Acknowledgements

The authors thank the participating students, teachers, and schools for their time and support.

This work was supported by the National Social Science Fund of China "14th Five-Year Plan" 2022 Youth Project in Education, "Research on the formation mechanism of schoolwork burden of primary and secondary school students and the accurate reduction mechanism of big data" [grant number CHA220299].

Author information

Authors and Affiliations

Faculty of Information Science and Technology, Northeast Normal University, #2555 Jingyue Avenue, Changchun City, 130117, Jilin, China

Rui Gou & Xiaohui Chen

Faculty of Education, Northeast Normal University, #5268 Renming Avenue, Changchun City, 130024, Jilin, China

Xin Yang & Chun Cao

School of Communication, Qufu Normal University, #80 North Yantai Road, Rizhao City, 276826, Shandong, China


Contributions

Conceptualization: RG, XY, CC; methodology: RG, XY; formal analysis and investigation: NC, RG, XY; writing (original draft preparation): RG, XY; writing (review and editing): RG, XY, CC, XC; funding acquisition: XY; resources: XC.

Corresponding author

Correspondence to Xin Yang.

Ethics declarations

Conflict of interest

The authors have no financial or proprietary interests in any material discussed in this article.

Ethics approval

Ethical approval was obtained from Northeast Normal University.

Consent to participate

Written consent was obtained from participants.

Consent for publication

This submission has been approved by all co-authors.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Gou, R., Yang, X., Chen, X. et al. The relationship between teachers' homework feedback, students' homework emotions, and academic self-esteem: A multi-group analysis of gender differences. Soc Psychol Educ (2024). https://doi.org/10.1007/s11218-024-09897-0


Received: 22 June 2023

Accepted: 06 February 2024

Published: 09 March 2024

DOI: https://doi.org/10.1007/s11218-024-09897-0


Keywords

  • Homework feedback
  • Homework emotions
  • Academic self-esteem


Hattie Ranking: 252 Influences And Effect Sizes Related To Student Achievement

John Hattie developed a way of synthesizing various influences in different meta-analyses according to their effect size (Cohen’s d). In his ground-breaking study “ Visible Learning ” he ranked 138 influences that are related to learning outcomes from very positive effects to very negative effects. Hattie found that the average effect size of all the interventions he studied was 0.40. Therefore he decided to judge the success of influences relative to this ‘hinge point’, in order to find an answer to the question “What works best in education?”
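To make the "hinge point" idea concrete, here is a minimal Python sketch of how an effect size can be read against the d = 0.40 threshold. The example values are taken from the ranking table below; the helper function is my own illustration, not code from Hattie's work.

```python
# Minimal sketch: reading an effect size against Hattie's d = 0.40 "hinge point".
# The sample values are copied from the ranking table below; the function itself
# is only an illustration, not part of Hattie's work.

HINGE_POINT = 0.40  # average effect size across the interventions Hattie synthesized

def interpret(influence: str, d: float) -> str:
    """Label an influence by where its effect size sits relative to the hinge point."""
    if d >= HINGE_POINT:
        return f"{influence}: d = {d:.2f} (at or above the hinge point)"
    if d > 0:
        return f"{influence}: d = {d:.2f} (positive, but below the hinge point)"
    return f"{influence}: d = {d:.2f} (zero or negative effect)"

for name, d in [("Collective teacher efficacy", 1.57),
                ("Homework", 0.29),
                ("Retention (holding students back)", -0.32)]:
    print(interpret(name, d))
```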

Originally, Hattie studied six areas that contribute to learning: the student, the home, the school, the curricula, the teacher, and teaching and learning approaches. (The updated list also includes the classroom.) But Hattie did not only provide a list of the relative effects of different influences on student achievement; he also tells the story underlying the data. He found that the key to making a difference was making teaching and learning visible. He further explained this story in his book "Visible Learning for Teachers".

John Hattie updated his list of 138 effects to 150 effects in Visible Learning for Teachers (2011), and more recently to a list of 195 effects in The Applicability of Visible Learning to Higher Education (2015). His research is now based on nearly 1,200 meta-analyses, up from the roughly 800 when Visible Learning came out in 2009. According to Hattie, the story underlying the data has hardly changed over time, even though some effect sizes were updated and there are some new entries at the top, in the middle, and at the end of the list.

Below you can find an updated version of our first , second and third visualization of effect sizes related to student achievement.


Rank | Influence | Effect size d (Dec 2017) | Effect size d (Aug 2017) | Subdomain | Domain
1 | Collective teacher efficacy | 1.57 | 1.57 | Leadership | SCHOOL
2 | Self-reported grades | 1.33 | 1.33 | Prior knowledge and background | STUDENT
3 | Teacher estimates of achievement | 1.29 | 1.62 | Teacher attributes | TEACHER
4 | Cognitive task analysis | 1.29 | 1.29 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
5 | Response to intervention | 1.29 | 1.29 | Strategies emphasizing feedback | TEACHING: Focus on teaching/instructional strategies
6 | Piagetian programs | 1.28 | 1.28 | Prior knowledge and background | STUDENT
7 | Jigsaw method | 1.2 | 1.2 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
8 | Conceptual change programs | 0.99 | 0.99 | Other curricula programs | CURRICULA
9 | Prior ability | 0.94 | 0.94 | Prior knowledge and background | STUDENT
10 | Strategy to integrate with prior knowledge | 0.93 | 0.93 | Learning strategies | TEACHING: Focus on student learning strategies
11 | Self-efficacy | 0.92 | 0.92 | Beliefs, attitudes and dispositions | STUDENT
12 | Teacher credibility | 0.9 | 0.9 | Teacher attributes | TEACHER
13 | Micro-teaching/video review of lessons | 0.88 | 0.88 | Teacher education | TEACHER
14 | Transfer strategies | 0.86 | 0.86 | Strategies emphasizing student meta-cognitive/self-regulated learning | TEACHING: Focus on student learning strategies
15 | Classroom discussion | 0.82 | 0.82 | Strategies emphasizing feedback | TEACHING: Focus on teaching/instructional strategies
16 | Scaffolding | 0.82 | 0.82 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
17 | Deliberate practice | 0.79 | 0.82 | Learning strategies | TEACHING: Focus on student learning strategies
18 | Summarization | 0.79 | 0.79 | Learning strategies | TEACHING: Focus on student learning strategies
19 | Effort | 0.77 | 0.79 | Learning strategies | TEACHING: Focus on student learning strategies
20 | Interventions for students with learning needs | 0.77 | 0.77 | Implementations that emphasize school-wide teaching strategies | TEACHING: Focus on implementation method
21 | Mnemonics | 0.76 | 0.77 | Learning strategies | TEACHING: Focus on student learning strategies
22 | Planning and prediction | 0.76 | 0.76 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
23 | Repeated reading programs | 0.75 | 0.76 | Reading, writing and the arts | CURRICULA
24 | Teacher clarity | 0.75 | 0.75 | Teacher attributes | TEACHER
25 | Elaboration and organization | 0.75 | 0.75 | Strategies emphasizing student meta-cognitive/self-regulated learning | TEACHING: Focus on student learning strategies
26 | Evaluation and reflection | 0.75 | 0.75 | Strategies emphasizing student meta-cognitive/self-regulated learning | TEACHING: Focus on student learning strategies
27 | Reciprocal teaching | 0.74 | 0.75 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
28 | Rehearsal and memorization | 0.73 | 0.74 | Learning strategies | TEACHING: Focus on student learning strategies
29 | Comprehensive instructional programs for teachers | 0.72 | 0.73 | Reading, writing and the arts | CURRICULA
30 | Help seeking | 0.72 | 0.83 & 0.60 | Strategies emphasizing student meta-cognitive/self-regulated learning | TEACHING: Focus on student learning strategies
31 | Phonics instruction | 0.7 | 0.7 | Reading, writing and the arts | CURRICULA
32 | Feedback | 0.7 | 0.7 | Strategies emphasizing feedback | TEACHING: Focus on teaching/instructional strategies
33 | Deep motivation and approach | 0.69 | 0.69 | Motivational approach, orientation | STUDENT
34 | Field independence | 0.68 | 0.68 | Prior knowledge and background | STUDENT
35 | Acceleration programs | 0.68 | 0.68 | School curricula for gifted students | CLASSROOM
36 | Learning goals vs. no goals | 0.68 | 0.68 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
37 | Problem-solving teaching | 0.68 | 0.68 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
38 | Outlining and transforming | 0.66 | 0.66 | Learning strategies | TEACHING: Focus on student learning strategies
39 | Concept mapping | 0.64 | 0.64 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
40 | Vocabulary programs | 0.62 | 0.62 | Reading, writing and the arts | CURRICULA
41 | Creativity programs | 0.62 | 0.62 | Other curricula programs | CURRICULA
42 | Behavioral intervention programs | 0.62 | 0.62 | Classroom influences | CLASSROOM
43 | Setting standards for self-judgement | 0.62 | 0.62 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
44 | Teachers not labeling students | 0.61 | 0.61 | Teacher-student interactions | TEACHER
45 | Relations of high school to university achievement | 0.6 | 0.6 | Prior knowledge and background | STUDENT
46 | Meta-cognitive strategies | 0.6 | 0.6 | Strategies emphasizing student meta-cognitive/self-regulated learning | TEACHING: Focus on student learning strategies
47 | Spaced vs. mass practice | 0.6 | 0.6 | Learning strategies | TEACHING: Focus on student learning strategies
48 | Direct instruction | 0.6 | 0.6 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
49 | Mathematics programs | 0.59 | 0.59 | Math and sciences | CURRICULA
50 | Appropriately challenging goals | 0.59 | 0.59 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
51 | Spelling programs | 0.58 | 0.58 | Reading, writing and the arts | CURRICULA
52 | Tactile stimulation programs | 0.58 | 0.58 | Other curricula programs | CURRICULA
53 | Strategy monitoring | 0.58 | 0.58 | Strategies emphasizing student meta-cognitive/self-regulated learning | TEACHING: Focus on student learning strategies
54 | Service learning | 0.58 | 0.58 | Implementations using out-of-school learning | TEACHING: Focus on implementation method
55 | Working memory strength | 0.57 | 0.57 | Prior knowledge and background | STUDENT
56 | Full compared to pre-term/low birth weight | 0.57 | 0.57 | Physical influences | STUDENT
57 | Mastery learning | 0.57 | 0.57 | Strategies emphasizing success criteria | TEACHING: Focus on teaching/instructional strategies
58 | Explicit teaching strategies | 0.57 | 0.57 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
59 | Technology with learning needs students | 0.57 | 0.57 | Implementations using technologies | TEACHING: Focus on implementation method
60 | Concentration/persistence/engagement | 0.56 | 0.56 | Beliefs, attitudes and dispositions | STUDENT
61 | Prior achievement | 0.55 | 0.55 | Prior knowledge and background | STUDENT
62 | Visual-perception programs | 0.55 | 0.55 | Reading, writing and the arts | CURRICULA
63 | Self-verbalization and self-questioning | 0.55 | 0.55 | Strategies emphasizing student meta-cognitive/self-regulated learning | TEACHING: Focus on student learning strategies
64 | Cooperative vs. individualistic learning | 0.55 | 0.55 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
65 | Technology in other subjects | 0.55 | 0.55 | Implementations using technologies | TEACHING: Focus on implementation method
66 | Practice testing | 0.54 | 0.54 | Learning strategies | TEACHING: Focus on student learning strategies
67 | Interactive video methods | 0.54 | 0.54 | Implementations using technologies | TEACHING: Focus on implementation method
68 | Second/third chance programs | 0.53 | 0.53 | Reading, writing and the arts | CURRICULA
69 | Enrichment programs | 0.53 | 0.53 | School curricula for gifted students | CLASSROOM
70 | Positive peer influences | 0.53 | 0.53 | Classroom influences | CLASSROOM
71 | Peer tutoring | 0.53 | 0.53 | Strategies emphasizing student perspectives in learning | TEACHING: Focus on student learning strategies
72 | Cooperative vs. competitive learning | 0.53 | 0.53 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
73 | Positive family/home dynamics | 0.52 | 0.52 | Home environment | HOME
74 | Socio-economic status | 0.52 | 0.52 | Family resources | HOME
75 | Teacher-student relationships | 0.52 | 0.52 | Teacher-student interactions | TEACHER
76 | Self-regulation strategies | 0.52 | 0.52 | Strategies emphasizing student meta-cognitive/self-regulated learning | TEACHING: Focus on student learning strategies
77 | Record keeping | 0.52 | 0.52 | Learning strategies | TEACHING: Focus on student learning strategies
78 | Play programs | 0.5 | | Other curricula programs | CURRICULA
79 | Parental involvement | 0.5 | 0.5 | Home environment | HOME
80 | Student rating of quality of teaching | 0.5 | 0.5 | Teacher-student interactions | TEACHER
81 | Note taking | 0.5 | 0.5 | Learning strategies | TEACHING: Focus on student learning strategies
82 | Underlining and highlighting | 0.5 | 0.5 | Learning strategies | TEACHING: Focus on student learning strategies
83 | Time on task | 0.49 | 0.49 | Learning strategies | TEACHING: Focus on student learning strategies
84 | Science programs | 0.48 | 0.48 | Math and sciences | CURRICULA
85 | Generalized school effects | 0.48 | 0.48 | Other school factors | SCHOOL
86 | Clear goal intentions | 0.48 | 0.48 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
87 | Providing formative evaluation | 0.48 | 0.48 | Strategies emphasizing feedback | TEACHING: Focus on teaching/instructional strategies
88 | Questioning | 0.48 | 0.48 | Strategies emphasizing feedback | TEACHING: Focus on teaching/instructional strategies
89 | Intelligent tutoring systems | 0.48 | 0.48 | Implementations using technologies | TEACHING: Focus on implementation method
90 | Comprehension programs | 0.47 | 0.47 | Reading, writing and the arts | CURRICULA
91 | Integrated curricula programs | 0.47 | 0.47 | Other curricula programs | CURRICULA
92 | Small group learning | 0.47 | 0.47 | Classroom composition effects | CLASSROOM
93 | Information communications technology (ICT) | 0.47 | 0.47 | Implementations using technologies | TEACHING: Focus on implementation method
94 | Perceived task value | 0.46 | 0.46 | Beliefs, attitudes and dispositions | STUDENT
95 | Study skills | 0.46 | 0.46 | Learning strategies | TEACHING: Focus on student learning strategies
96 | Relative age within a class | 0.45 | 0.45 | Physical influences | STUDENT
97 | Writing programs | 0.45 | 0.45 | Reading, writing and the arts | CURRICULA
98 | Imagery | 0.45 | 0.45 | Learning strategies | TEACHING: Focus on student learning strategies
99 | Achieving motivation and approach | 0.44 | 0.44 | Motivational approach, orientation | STUDENT
100 | Early years’ interventions | 0.44 | 0.44 +0.29 +0.27 | Home environment | HOME
101 | Strong classroom cohesion | 0.44 | 0.44 | Classroom influences | CLASSROOM
102 | Inductive teaching | 0.44 | 0.44 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
103 | Technology with elementary students | 0.44 | 0.44 | Implementations using technologies | TEACHING: Focus on implementation method
104 | Exposure to reading | 0.43 | 0.43 | Reading, writing and the arts | CURRICULA
105 | Outdoor/adventure programs | 0.43 | 0.43 | Other curricula programs | CURRICULA
106 | School size (600-900 students at secondary) | 0.43 | 0.43 | School compositional effects | SCHOOL
107 | Teacher expectations | 0.43 | 0.43 | Teacher attributes | TEACHER
108 | Philosophy in schools | 0.43 | 0.43 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
109 | Teaching communication skills and strategies | 0.43 | 0.43 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
110 | Motivation | 0.42 | 0.42 | Motivational approach, orientation | STUDENT
111 | Reducing anxiety | 0.42 | 0.42 | Motivational approach, orientation | STUDENT
112 | Elaborative interrogation | 0.42 | 0.42 | Strategies emphasizing student meta-cognitive/self-regulated learning | TEACHING: Focus on student learning strategies
113 | Behavioral organizers | 0.42 | 0.42 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
114 | Technology in writing | 0.42 | 0.42 | Implementations using technologies | TEACHING: Focus on implementation method
115 | Technology with college students | 0.42 | 0.42 | Implementations using technologies | TEACHING: Focus on implementation method
116 | Positive self-concept | 0.41 | 0.41 | Beliefs, attitudes and dispositions | STUDENT
117 | Professional development programs | 0.41 | 0.41 | Teacher education | TEACHER
118 | Relating creativity to achievement | 0.4 | 0.4 | Prior knowledge and background | STUDENT
119 | Goal commitment | 0.4 | 0.4 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
120 | Cooperative learning | 0.4 | 0.4 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
121 | Inquiry-based teaching | 0.4 | 0.4 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
122 | After-school programs | 0.4 | 0.4 | Implementations using out-of-school learning | TEACHING: Focus on implementation method
123 | Social skills programs | 0.39 | 0.39 | Other curricula programs | CURRICULA
124 | Relations of high school achievement to career performance | 0.38 | 0.38 | Prior knowledge and background | STUDENT
125 | Drama/arts programs | 0.38 | 0.38 | Reading, writing and the arts | CURRICULA
126 | Career interventions | 0.38 | 0.38 | Other curricula programs | CURRICULA
127 | Music programs | 0.37 | 0.37 | Reading, writing and the arts | CURRICULA
128 | Worked examples | 0.37 | 0.37 | Strategies emphasizing success criteria | TEACHING: Focus on teaching/instructional strategies
129 | Mobile phones | 0.37 | 0.37 | Implementations using technologies | TEACHING: Focus on implementation method
130 | Bilingual programs | 0.36 | 0.36 | Other curricula programs | CURRICULA
131 | Student-centered teaching | 0.36 | 0.36 | Student-focused interventions | TEACHING: Focus on student learning strategies
132 | Attitude to content domains | 0.35 | 0.35 | Beliefs, attitudes and dispositions | STUDENT
133 | Counseling effects | 0.35 | 0.35 | Other school factors | SCHOOL
134 | Classroom management | 0.35 | 0.35 | Classroom influences | CLASSROOM
135 | Gaming/simulations | 0.35 | 0.35 | Implementations using technologies | TEACHING: Focus on implementation method
136 | Chess instruction | 0.34 | 0.34 | Other curricula programs | CURRICULA
137 | Motivation/character programs | 0.34 | 0.34 | Other curricula programs | CURRICULA
138 | Decreasing disruptive behavior | 0.34 | 0.34 | Classroom influences | CLASSROOM
139 | Collaborative learning | 0.34 | 0.34 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
140 | Teaching creative thinking | 0.34 | 0.34 | Implementations that emphasize school-wide teaching strategies | TEACHING: Focus on implementation method
141 | Stereotype threat | 0.33 | 0.33 | Beliefs, attitudes and dispositions | STUDENT
142 | Technology in mathematics | 0.33 | 0.33 | Implementations using technologies | TEACHING: Focus on implementation method
143 | ADHD – treatment with drugs | 0.32 | 0.32 | Physical influences | STUDENT
144 | Principals/school leaders | 0.32 | 0.32 | Leadership | SCHOOL
145 | School climate | 0.32 | 0.32 | Leadership | SCHOOL
146 | Average teacher effects | 0.32 | 0.32 | Teacher attributes | TEACHER
147 | Adjunct aids | 0.32 | 0.32 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
148 | External accountability systems | 0.31 | 0.31 | School resourcing | SCHOOL
149 | Matching style of learning | 0.31 | 0.31 | Student-focused interventions | TEACHING: Focus on student learning strategies
150 | Manipulative materials on math | 0.3 | 0.3 | Math and sciences | CURRICULA
151 | Ability grouping for gifted students | 0.3 | 0.3 | School curricula for gifted students | CLASSROOM
152 | Teaching test taking and coaching | 0.3 | 0.3 | Learning strategies | TEACHING: Focus on student learning strategies
153 | Technology with high school students | 0.3 | 0.3 | Implementations using technologies | TEACHING: Focus on implementation method
154 | Mindfulness | 0.29 | 0.29 | Beliefs, attitudes and dispositions | STUDENT
155 | Home visiting | 0.29 | 0.29 | Home environment | HOME
156 | Cognitive behavioral programs | 0.29 | 0.29 | Classroom influences | CLASSROOM
157 | Online and digital tools | 0.29 | 0.29 | Implementations using technologies | TEACHING: Focus on implementation method
158 | Technology in reading/literacy | 0.29 | 0.29 | Implementations using technologies | TEACHING: Focus on implementation method
159 | Homework | 0.29 | 0.29 | Implementations using out-of-school learning | TEACHING: Focus on implementation method
160 | Desegregation | 0.28 | 0.28 | School compositional effects | SCHOOL
161 | Pre-school programs | 0.28 | 0.26 | Other school factors | SCHOOL
162 | Whole-school improvement programs | 0.28 | 0.28 | Implementations that emphasize school-wide teaching strategies | TEACHING: Focus on implementation method
163 | Use of calculators | 0.27 | 0.27 | Math and sciences | CURRICULA
164 | Mainstreaming/inclusion | 0.27 | 0.27 | Classroom composition effects | CLASSROOM
165 | Student personality attributes | 0.26 | 0.26 | Beliefs, attitudes and dispositions | STUDENT
166 | Exercise/relaxation | 0.26 | 0.26 | Physical influences | STUDENT
167 | Lack of illness | 0.26 | 0.26 | Physical influences | STUDENT
168 | Out-of-school curricula experiences | 0.26 | 0.26 | School compositional effects | SCHOOL
169 | Volunteer tutors | 0.26 | 0.26 | Strategies emphasizing student perspectives in learning | TEACHING: Focus on student learning strategies
170 | Problem-based learning | 0.26 | 0.26 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
171 | Use of PowerPoint | 0.26 | 0.26 | Implementations using technologies | TEACHING: Focus on implementation method
172 | Grit/incremental vs. entity thinking | 0.25 | 0.25 | Beliefs, attitudes and dispositions | STUDENT
173 | Adopted vs non-adopted care | 0.25 | 0.25 | Family structure | HOME
174 | Religious schools | 0.24 | 0.24 | Types of school | SCHOOL
175 | Competitive vs. individualistic learning | 0.24 | 0.24 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
176 | Intact (two-parent) families | 0.23 | 0.23 | Family structure | HOME
177 | Summer school | 0.23 | 0.23 | Types of school | SCHOOL
178 | Teacher personality attributes | 0.23 | 0.23 | Teacher attributes | TEACHER
179 | Individualized instruction | 0.23 | 0.23 | Student-focused interventions | TEACHING: Focus on student learning strategies
180 | Programmed instruction | 0.23 | 0.23 | Implementations using technologies | TEACHING: Focus on implementation method
181 | Technology in science | 0.23 | 0.23 | Implementations using technologies | TEACHING: Focus on implementation method
182 | Teacher verbal ability | 0.22 | 0.22 | Teacher attributes | TEACHER
183 | Clickers | 0.22 | 0.22 | Implementations using technologies | TEACHING: Focus on implementation method
184 | Visual/audio-visual methods | 0.22 | 0.22 | Implementations using technologies | TEACHING: Focus on implementation method
185 | Finances | 0.21 | 0.21 | School resourcing | SCHOOL
186 | Reducing class size | 0.21 | 0.21 | Classroom composition effects | CLASSROOM
187 | Interleaved practice | 0.21 | 0.21 | Learning strategies | TEACHING: Focus on student learning strategies
188 | Discovery-based teaching | 0.21 | 0.21 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
189 | Technology in small groups | 0.21 | 0.21 | Implementations using technologies | TEACHING: Focus on implementation method
190 | Student support programs – college | 0.21 | 0.21 | Implementations that emphasize school-wide teaching strategies | TEACHING: Focus on implementation method
191 | Extra-curricula programs | 0.2 | 0.2 | Other curricula programs | CURRICULA
192 | Engaged vs disengaged fathers | 0.2 | 0.2 | Family structure | HOME
193 | Aptitude/treatment interactions | 0.19 | 0.19 | Student-focused interventions | TEACHING: Focus on student learning strategies
194 | Learning hierarchies-based approach | 0.19 | 0.19 | Strategies emphasizing learning intentions | TEACHING: Focus on teaching/instructional strategies
195 | Co- or team teaching | 0.19 | 0.19 | Implementations that emphasize school-wide teaching strategies | TEACHING: Focus on implementation method
196 | Within class grouping | 0.18 | 0.18 | Classroom composition effects | CLASSROOM
197 | Web-based learning | 0.18 | 0.18 | Implementations using technologies | TEACHING: Focus on implementation method
198 | Lack of stress | 0.17 | 0.17 | Motivational approach, orientation | STUDENT
199 | Other family structure | 0.16 | 0.16 | Family structure | HOME
200 | One-on-one laptops | 0.16 | 0.16 | Implementations using technologies | TEACHING: Focus on implementation method
201 | Home-school programs | 0.16 | 0.16 | Implementations using out-of-school learning | TEACHING: Focus on implementation method
202 | Sentence combining programs | 0.15 | 0.15 | Reading, writing and the arts | CURRICULA
203 | Parental autonomy support | 0.15 | 0.15 | Home environment | HOME
204 | Distance education | 0.13 | 0.13 | Implementations using out-of-school learning | TEACHING: Focus on implementation method
205 | Morning vs. evening | 0.12 | 0.12 | Beliefs, attitudes and dispositions | STUDENT
206 | Positive ethnic self-identity | 0.12 | 0.12 | Beliefs, attitudes and dispositions | STUDENT
207 | Juvenile delinquent programs | 0.12 | 0.12 | Other curricula programs | CURRICULA
208 | School choice programs | 0.12 | 0.12 | School compositional effects | SCHOOL
209 | Tracking/streaming | 0.12 | 0.12 | Classroom composition effects | CLASSROOM
210 | Mentoring | 0.12 | 0.12 | Classroom influences | CLASSROOM
211 | Initial teacher training programs | 0.12 | 0.12 | Teacher education | TEACHER
212 | Different types of testing | 0.12 | 0.12 | Strategies emphasizing feedback | TEACHING: Focus on teaching/instructional strategies
213 | Teacher subject matter knowledge | 0.11 | 0.11 | Teacher education | TEACHER
214 | Diverse student body | 0.1 | 0.1 | School compositional effects | SCHOOL
215 | Background music | 0.1 | 0.1 | Classroom influences | CLASSROOM
216 | Diversity courses | 0.09 | 0.09 | Other curricula programs | CURRICULA
217 | Charter schools | 0.09 | 0.09 | Types of school | SCHOOL
218 | Modifying school calendars/timetables | 0.09 | 0.09 | Other school factors | SCHOOL
219 | Detracking | 0.09 | 0.09 | Classroom composition effects | CLASSROOM
220 | Gender on achievement | 0.08 | 0.08 | Physical influences | STUDENT
221 | Perceptual-motor programs | 0.08 | 0.08 | Other curricula programs | CURRICULA
222 | Single-sex schools | 0.08 | 0.08 | Types of school | SCHOOL
223 | Middle schools’ interventions | 0.08 | 0.08 | School compositional effects | SCHOOL
224 | Mastery goals | 0.06 | 0.06 | Motivational approach, orientation | STUDENT
225 | Whole language approach | 0.06 | 0.06 | Reading, writing and the arts | CURRICULA
226 | College halls of residence | 0.05 | 0.05 | School compositional effects | SCHOOL
227 | Teacher performance pay | 0.05 | 0.05 | Teacher attributes | TEACHER
228 | Breastfeeding | 0.04 | 0.04 | Physical influences | STUDENT
229 | Multi-grade/age classes | 0.04 | 0.04 | Classroom composition effects | CLASSROOM
230 | Humor | 0.04 | 0.04 | Teaching/instructional strategies | TEACHING: Focus on teaching/instructional strategies
231 | Parental employment | 0.03 | 0.03 | Family resources | HOME
232 | Student control over learning | 0.02 | 0.02 | Student-focused interventions | TEACHING: Focus on student learning strategies
233 | Non-immigrant background | 0.01 | 0.01 | Family resources | HOME
234 | Open vs. traditional classrooms | 0.01 | 0.01 | Classroom composition effects | CLASSROOM
235 | Technology in distance education | 0.01 | 0.01 | Implementations using technologies | TEACHING: Focus on implementation method
236 | Performance goals | -0.01 | -0.01 | Motivational approach, orientation | STUDENT
237 | Summer vacation effect | -0.02 | -0.02 | Types of school | SCHOOL
238 | Lack of sleep | -0.05 | -0.05 | Physical influences | STUDENT
239 | Surface motivation and approach | -0.11 | -0.11 | Motivational approach, orientation | STUDENT
240 | Family on welfare/state aid | -0.12 | -0.12 | Family resources | HOME
241 | Parental military deployment | -0.16 | -0.16 | Home environment | HOME
242 | Television | -0.18 | -0.18 | Home environment | HOME
243 | Students feeling disliked | -0.19 | -0.19 | Classroom influences | CLASSROOM
244 | Suspension/expelling students | -0.2 | -0.2 | Other school factors | SCHOOL
245 | Non-standard dialect use | -0.29 | -0.29 | Prior knowledge and background | STUDENT
246 | Retention (holding students back) | -0.32 | -0.32 | Classroom composition effects | CLASSROOM
247 | Corporal punishment in the home | -0.33 | -0.33 | Home environment | HOME
248 | Moving between schools | -0.34 | -0.34 | Home environment | HOME
249 | Depression | -0.36 | -0.36 | Motivational approach, orientation | STUDENT
250 | Boredom | -0.49 | -0.49 | Motivational approach, orientation | STUDENT
251 | Deafness | -0.61 | -0.61 | Physical influences | STUDENT
252 | ADHD | -0.9 | -0.9 | Physical influences | STUDENT
Older versions and backups of the Hattie ranking:

  • visible-learning.org/2016/04/hattie-ranking-backup-of-138-effects/
  • visible-learning.org/hattie-ranking-backup-195-effects/
  • visible-learning.org/nvd3/visualize/hattie-ranking-interactive-2009-2011-2015.html
  • https://visible-learning.org/backup-hattie-ranking-256-effects-2017/


60 comments on "Hattie Ranking: 252 Influences And Effect Sizes Related To Student Achievement"


Hi there – thanks for sharing the graphic – not sure if someone has already pointed out to you the error. You have “Classroom Behavioural” with an effect size of 0.8

I was looking for Classroom Discussion and assume you must have got those mixed up. Classroom Behavioural has an effect size of only 0.62.

Hope this helps with a revision of the graphic – cheers


Hi Tom, Thanks for pointing that out! I double checked the issue with Hattie's two books about "Visible Learning". The list I visualized for this website is based on Hattie (2009) Visible Learning. Hattie constantly updates his list with more meta-studies. I suspect that your comment relates to the updated list in Hattie (2011) Visible Learning for Teachers? Cheers, Sebastian


Can someone help me please? I have seen many different tables of Hattie's effect sizes, and the order and effect sizes seem to differ quite significantly between them. Why is this? I am trying to use them for an evaluative model and I am confused as to which order and effect sizes I should use.

With thanks for any clarification you can offer.

Hi Clare, As Hattie has updated the ranking in his newer books, I would recommend using the latest version of the list in "Visible Learning for Teachers", which cites over 900 meta-studies.


Could you please explain the negative probabilities in the work that I’ve read about here: https://ollieorange2.wordpress.com/2014/08/25/people-who-think-probabilities-can-be-negative-shouldnt-write-books-on-statistics/comment-page-1/#comment-545

Hello, the CLE calculations have been wrong in earlier editions of Visible Learning. The Common Language Effect Size (CLE) is a probability measure and by definition must be between 0% and 100%. This error has been corrected in newer editions and translations of the book. From the very beginning, the story of Visible Learning has mainly been based on the effect sizes (Cohen's d), which are correct.

Here’s what John Hattie says about about it: “At the last minute in editing I substituted the wrong column of data into the CLE column and did not pick up this error; I regret this omission. In each subsequent edition the references to CLE and their estimates will be dropped – with no loss to the story.” http://leadershipacademy.wiki.inghamisd.org/file/view/Corrections%20in%20VL2.pdf/548965844/Corrections%20in%20VL2.pdf
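For readers wondering how the CLE relates to Cohen's d: for two independent, normally distributed groups the common language effect size is usually computed as Φ(d/√2) (McGraw & Wong, 1992), which is always a probability between 0% and 100%. Below is a minimal sketch, assuming SciPy is available; this is the textbook conversion, not a calculation taken from the Visible Learning books.

```python
# Sketch: converting Cohen's d to the Common Language Effect Size (CLE) for two
# independent, normally distributed groups: CLE = Phi(d / sqrt(2)) (McGraw & Wong, 1992).
# This is the textbook conversion, not a calculation from the Visible Learning books.
import math
from scipy.stats import norm

def common_language_effect_size(d: float) -> float:
    """Probability that a random score from the higher-scoring group exceeds
    a random score from the other group."""
    return norm.cdf(d / math.sqrt(2))

for d in (0.29, 0.40, 0.70):  # e.g. homework, the hinge point, feedback
    print(f"d = {d:.2f} -> CLE = {common_language_effect_size(d):.1%}")
```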


That is not the main issue. The bigger problem is conceptual. For example, 'instructional strategies' is not a strategy, any more than 'vehicle' is a specific vehicle. A child's wagon, a wheelbarrow, a half-ton truck and a five-ton truck are all vehicles. If we used 'vehicles' to move gravel from point A to point B, and we calculated an effect size on vehicles, we would suffer from 'regression towards the mean': the child's wagon will look more powerful than it is (a higher effect size) and the five-ton truck will look worse (a lower effect size). The same issue arises with Cooperative Learning. Cooperative learning is a label for a belief system about how students learn; it has approximately 200 group structures that go from simple to complex (Numbered Heads to Think Pair Share to Jigsaw to Group Investigation). To provide a single effect size for cooperative learning is imprecise, for the same reason: regression towards the mean.

Also, Concept Mapping (Joseph Novak's work) is an example of an instructional strategy. He wisely does not provide an effect size for 'graphic organizers', because graphic organizers are not a specific instructional method (that would include flow charts, ranking ladders, Venn diagrams, Fishbone diagrams, Mind Maps and Concept Maps).

For a 'drug' example, imagine calculating the effect size for 10 mg, 50 mg, 100 mg, and 150 mg of the same drug, then averaging them to tell people that this 'pain medicine' has an effect size of, say, 0.58. Clearly, that is imprecise. Cheers, bbb
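A trivial numeric illustration of that point (made-up numbers): a single pooled average can describe none of the underlying 'doses' well.

```python
# Made-up illustration: one pooled average can hide large differences between variants.
variants = {"10 mg": 0.10, "50 mg": 0.37, "100 mg": 0.80, "150 mg": 1.05}
average = sum(variants.values()) / len(variants)
print(f"Pooled 'drug' effect size: {average:.2f}")  # prints 0.58
for dose, d in variants.items():
    print(f"  {dose}: d = {d:.2f}")
```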


I am looking at this graph and am curious as to what age group this research covers.

Dear Erica, in an interview John Hattie explains: “I was interested in 4-20 year olds and for every influence was very keen to evaluate any moderators – but found very few indeed. The story underlying the data seems applicable to this age range.” Best wishes, Sebastian

http://visible-learning.org/2013/01/john-hattie-visible-learning-interview/


Very interesting looking at the things that you do in your classroom that you feel are really getting the ideas across well, and finding out that you may be missing a big chunk of your class just by the way you are presenting material to them!


I’ve been reading a book called Spark, by John Ratey. In it, he argues that cardio exercise has a large influence on student success. Does anyone know where this might fit into Hattie’s effects, or any related studies?


I note that peer tutoring has a 0.55 effect, but mentoring, which Hattie states is a form of peer tutoring, has a 0.15 effect. How can there be this level of difference? One could assume from this that mentoring is not a particularly worthwhile investment, but there would be few people who have achieved eminence in their fields who were not heavily influenced by a mentor.


“Peer mentoring” is a specific kind of program. Likewise, I’m guessing Hattie’s “mentoring” isn’t what you have in mind. If you look at mentoring programs, it’s not like having a single brilliant individual who intimately guides you throughout a period of life. This is very hard to do well in the broader school system: you need far too many mentors to be practical, not to mention paying them and matching them up. Also, not all students respond well. Great people have generally relied on and responded to mentors in their development. But try fixing up a typical student with a typical mentor, and you’ll see it’s hard to predict the outcome.


Hello, I am about to buy the book but I wondered if someone could just quickly fill me in here on what statistic is being used to represent the effect size, e.g. r or r^2 or z? Thanks.

Hello Daniel, Hattie uses Cohen’s d to represent the effect size. Cohen’s d is defined as the difference between two means divided by a standard deviation of the pooled groups or of the control group alone. Cheers, Sebastian
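To make that definition concrete, here is a short, self-contained sketch of the pooled-standard-deviation version of Cohen's d; the two score lists are invented purely for illustration.

```python
# Sketch of Cohen's d as described above: the difference between two group means
# divided by the pooled standard deviation. The scores below are invented.
import statistics

def cohens_d(treatment: list[float], control: list[float]) -> float:
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

treatment = [72, 75, 78, 80, 83, 85]  # post-test scores, intervention group
control = [68, 70, 73, 74, 76, 79]    # post-test scores, comparison group
print(f"d = {cohens_d(treatment, control):.2f}")
```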


So in a group with a large standard deviation (e.g. a wide range of abilities), the effect size for the same improvement in the mean always looks smaller than in a group with a smaller standard deviation (a narrower range of abilities)? That hardly seems a valid tool for comparison.

Hello Mark, effect size d isn’t a perfect measure (that doesn’t exist), but it’s a good and practical way to compare results from samples of different sizes. Moreover, taking the standard deviation into account helps to better interpret mean differences. Taking your example of a large standard deviation before the intervention (e.g. a wide range of abilities): imagine an intervention that results in only a small mean difference. Your intervention may still have a large effect size d if you manage to bring the group of learners together and lower the standard deviation.
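A tiny worked example of that point (numbers invented): the same raw 5-point gain translates into very different values of d depending on the standard deviation it is divided by.

```python
# Illustration only: the same raw gain looks larger or smaller in d-units
# depending on the standard deviation it is divided by.
mean_gain = 5.0
for sd in (20.0, 10.0, 5.0):
    print(f"gain = {mean_gain}, SD = {sd:4} -> d = {mean_gain / sd:.2f}")
```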


OK. I am not a statistician but I have some questions about Hattie’s explanation as to how publication bias does not affect his results. You can find the questions here: https://sites.google.com/a/lsnepal.com/hattie-funnel-plot/

Hi Brad, I found this paper with a more detailed funnel plot. It’s a follow-up on Hattie’s “reading methods”: http://ivysherman.weebly.com/uploads/1/7/4/2/17421639/post_-_edd_1007_final_paper_pr.pdf
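For readers unfamiliar with funnel plots: they plot each study's effect size against a measure of its precision (typically the standard error), and marked asymmetry around the pooled estimate is one informal sign of publication bias. Below is a minimal sketch on simulated data, not data from Hattie's syntheses.

```python
# Sketch of a funnel plot on simulated studies: effect size on the x-axis, standard
# error on an inverted y-axis so the most precise studies sit at the top. Asymmetry
# around the assumed true effect is one informal indicator of publication bias.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
true_effect = 0.4
standard_errors = rng.uniform(0.05, 0.5, size=200)   # varying study precision
effects = rng.normal(true_effect, standard_errors)   # noisier estimates when SE is large

plt.scatter(effects, standard_errors, alpha=0.5)
plt.axvline(true_effect, linestyle="--", label="assumed true effect")
plt.gca().invert_yaxis()
plt.xlabel("Effect size (d)")
plt.ylabel("Standard error")
plt.legend()
plt.show()
```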


Comprehensive school entry screening is not specifically mentioned by Prof. John Hattie. However, certain elements are: Feedback, Evaluation, Classroom Behaviour, Interventions for the Learning Disabled, Prior Achievement, Home Environment, Early Intervention, Parent Involvement, Preterm Birth Weight, Reducing Anxiety, and SES. Others are missing, e.g. the division of Behaviour into Internal and External, the effects of below-average Speech-Language level, Resilience, etc. The validity of the 20 years of research on Parent-, Teacher- and Child-based school entry screening is contained in Reddington & Wheeldon (2009), which can be sent to Prof. Hattie (also presented at the International Conference on Applied Psychology, Paris, July 2014). Prof. Hattie's hierarchies are an extremely helpful guide, and checklist, against which to compare the Parent-, Teacher- and Child-based items of the school entry screening system.


I purchased the Visible Learning book and appreciate the ranking and effect sizes.

However, there isn't a place anywhere in the book where the intervention labels are explained in detail.

For instance, what does Piagetian programs mean; what do creativity programs entail; how are repeated reading programs executed?

Is there a way I can find out more about what John Hattie means by each of these labels?


This might be of some help: http://visible-learning.org/glossary/#2_Piagetian_programs


The explanation in this link is backed up with another link; that second link is to an abstract about a study that compared Piagetian tests with IQ tests to see which was the better predictor of school performance. It is not very surprising that Piagetian tests were better predictors (since these correspond to school tasks more closely than IQ tests do). The main problem for me is that the study does not deal with 'Piagetian programs', just a test. I am struggling to find an endorsement of 'Piagetian programs', though I can find plenty of studies that point out gaps in Piaget's approach, including Piaget's own admission (late in life, but all the more creditworthy to acknowledge at that stage) that he was wrong about language being secondary to learning. Where are the studies that show strong effect sizes for Piagetian programmes?


Hi Mr. Hattie, How was your ranking calculated mathematically?

Do you use the data from visible learning to make your calculations?

Would you take video submissions to run through your visible learning process complete with transcripts and data analysis?

Is there a charge for visible learning?

How long does it take to get feedback?

I’m fascinated by the idea that you are quantifying teaching strategies and want to better understand the process.

Thanks, Kendra Henry


Here are some thoughts on the mathematics behind Hattie's use of statistics: http://literacyinleafstrewn.blogspot.no/2012/12/can-we-trust-educational-research_20.html

…and here https://ollieorange2.wordpress.com/2014/08/25/people-who-think-probabilities-can-be-negative-shouldnt-write-books-on-statistics/

Hello GL, Thank you for the links discussing the issues related to Hattie's use of the CLE. The CLE calculations have been wrong in earlier editions of Visible Learning. The Common Language Effect Size (CLE) is a probability measure and by definition must be between 0% and 100%. This error has been corrected in newer editions and translations of the book. From the very beginning, the story of Visible Learning has mainly been based on the effect sizes (Cohen's d), which are correct.

Here’s what John Hattie says about about it: “At the last minute in editing I substituted the wrong column of data into the CLE column and did not pick up this error; I regret this omission. In each subsequent edition the references to CLE and their estimates will be dropped – with no loss to the story.” http://leadershipacademy.wiki.inghamisd.org/file/view/Corrections%20in%20VL2.pdf/548965844/Corrections%20in%20VL2.pdf

Best wishes, Sebastian

Dear Kendra, For transcripts and data analysis you might check out the Visible Classroom project: http://visibleclassroom.com For a better understanding of the process I would recommend reading the book “Visible Learning for Teachers”. Best wishes, Sebastian

Hi Mr. Hattie, Is it possible to get access to your powerpoint? Thanks, Kendra Henry

Is there an explanation, on your site, of the new top two effects? Teacher estimates of achievement and Collective Teacher Efficacy?

Well, I (stupidly) rented a Kindle version of the VL for Teachers that your link led me to on Amazon. I am trying to learn precisely what is meant by the new top two effects. I didn’t notice that it was the 2012 version, which I already own. (Old eyes shouldn’t buy books on a smartphone, I suppose.)

If the 2012 list was the Gold Standard of effect sizes, how is it that the 2015 list is topped by two brand-new effects?

Dear Chuck, Thanks for your comment! I have also been wondering about these two brand-new effects since I visualized the new list. Unfortunately John Hattie gives little detail in his paper from 2015. I found this short introduction video on “Collective teacher efficacy” helpful: https://www.youtube.com/watch?v=FUfEWZGLFZE . And I think this is one of the meta-analyses Hattie relates to: Eells (2011), “Meta-Analysis of the Relationship Between Collective Teacher Efficacy and Student Achievement”: http://ecommons.luc.edu/cgi/viewcontent.cgi?article=1132&context=luc_diss Best regards, Sebastian

Question on one of the top effect sizes…

I looked up and read the dissertation by Jenni Anne Marie Donohoo on Collective Efficacy.

There is zero mention anywhere in the paper about any effect size over 0.63. Can you find out how Hattie (or anyone) got the 1.57 effect size on collective efficacy?
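One possible explanation, offered here only as a hedged illustration and not as a confirmed account of how the 1.57 figure was obtained: meta-analyses of collective teacher efficacy typically report correlations (r) rather than standardized mean differences, and a common way to place a correlation on the d scale is the conversion d = 2r / sqrt(1 − r²). Under that formula, a modest-looking correlation of about 0.62 already maps to d ≈ 1.58, which is in the neighbourhood of the reported 1.57.

```python
# Hedged illustration: the standard r-to-d conversion that could turn a
# correlation of ~0.62 into an effect size near 1.57. Whether this is how
# the 1.57 figure was actually produced is not documented here.
import math

def cohens_d_from_r(r: float) -> float:
    """Convert a correlation coefficient r into a Cohen's d value."""
    return 2 * r / math.sqrt(1 - r ** 2)

for r in [0.30, 0.50, 0.62, 0.70]:
    print(f"r = {r:.2f}  ->  d = {cohens_d_from_r(r):.2f}")
```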

It is great to be reading about research from the horse’s mouth and linking it to the practices of our school, which are strongly influenced by Hattie.

I’m just wondering why diagnostic and remediation programs to overcome students’ weaknesses in science concepts and other disciplines have not been included in this review. I have been working in this area since 1990; one example of this work appears in my PhD thesis at Monash, 1990, ‘Remediation of weaknesses in physics concepts’.

Did I miss ‘focus’? To those of us ‘on the front lines’, one of the most important variables in learning is focus, or the lack thereof. Add the co-morbidity of anxiety and depression, and it affects the student-teacher relationship and contributes to the lack of retention and of big-picture learning. Focus, and its deficit, will impact not only the student, but the home, the school, the curricula, the teacher, and learning approaches… sigh…

Thanks for the diagram Sebastian – what are these effects influencing specifically? It says “learning outcomes” at the start – is there anywhere that the specific learning outcomes are listed, along with how they are objectively measured?

Kind regards, Kevin

Hi there! What is meant by “classroom behavior”? What is the definition of this concept? I’m looking forward to seeing your explanation.

Have a good day!

Simon, I find a similar issue with the topics listed. In my field as a Behavior Analyst, the topography of behavior is important as an objective means of gathering data. An unbiased observer should be able to collect data against a specific behavioral definition for the data to be validly analyzed. Rebekah

Is there somewhere I can look to see just what topics are included under each heading for effect size? For instance, under which heading would ‘memorization’ fall?

I have been a Hattie follower since 2009 and really believe in his research. My question is, does anyone know why the 2015 list of 195 influences is not published in later books (e.g. Teaching Literacy in the VL Classroom, 2017)?

How can we get the list of the 1,400 meta-analyses?

Hello! Where can I find descriptions of all 195 influences, with good and bad examples, so I can read about good practice and bad practice – something very practical? Thanks!

I would recommend reading Hattie’s book “Visible Learning for Teachers”: https://www.amazon.ca/Visible-Learning-Teachers-John-Hattie/dp/0415690153

Hi there – very interested in all of this – however, differentiation doesn’t seem to appear. What am I missing here? Thanks, Martin

I agree with your opinion!

Differentiation is not a specific strategy–it is an application of strategies. Many of the specific strategies are examples that are used for differentiation.

A question about the effect of a larger, maybe more conceptual, item: academic standards. That doesn’t seem to be on the list, but “Teacher Efficacy,” “Teacher Credibility,” and “Teacher Clarity” do appear. So should one assume that clear academic performance expectations are woven into other contributors/factors?

Also, “Subject Matter Knowledge” is rather low on the list – but I’ve read numerous articles and attended many conferences that discuss the importance of teachers being subject matter experts.

What is number 141 (July 4, 1900)?

Thanks for pointing out this error. Number 141 is “stereotype threat”. I have corrected this.

The teaching and learning variables are numerous and often amazing! “Hats off to Hattie” for diving deep into what makes us all tick. The uniqueness of each student, and all of the influences that abound in each student’s life, is often overwhelming. Prayer and a conscientious professional learning community (PLC) are a great place to start for the benefit of each student. I’m grateful for your help and perseverance in providing us with valuable research. I would like to see the old PTA (Parent Teacher Association) revitalized. Active stakeholders are needed in the learning process, in my view.

Hello: I am a reading consultant organizing a class for students reading two or more years below grade level. I believe the teaching of specific strategies (summarizing, main idea, theme, elements of a story, highlighting, skimming/scanning) is essential, but I would like to see Hattie’s scale on the efficacy of teaching specific skills such as these to middle schoolers (ages 13-15, grades 7 and 8). Please advise as to where to find them (of course, I know where summarizing is on the scale… but the others…).

I am working on my Master of Education and am interested in including Hattie’s studies in my research. Can you tell me why I can’t find this particular research in any peer-reviewed journals? I’d love to analyse his study and thoroughly read his research methods. It seems suspicious that I can only view his study by purchasing his book.

I have been trying to find research on the effect size of two current trends in elementary classrooms: Flexible Seating and Blended Learning. Has Hattie, or anyone, gathered data on either of these?

What is meant by “gender on achievement”?

The effect of gender on learning outcomes (“Do boys learn less than girls?”).

I have tried reading up on the mathematics of effect sizes, but I find it quite complicated. If the effect size of inductive teaching is 0.44 and the effect size of micro-teaching is 0.88, does that mean that micro-teaching is twice as effective as inductive teaching? Is it a linear relation or something else?

Thanks for your help.
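Since questions like this come up repeatedly in the thread, here is a minimal sketch of one common way to read these numbers; it is a simplified illustration, not an official reply from the site. Cohen’s d expresses the difference between group means in standard-deviation units, so d = 0.88 says the average student under micro-teaching scores 0.88 SD above the control average, versus 0.44 SD for inductive teaching. The gap is twice as large in SD units, but translated into percentile terms the relation is not linear:

```python
# Sketch: converting Cohen's d into "the percentile of the control group the
# average treated student would reach", assuming normal distributions.
# Values are for illustrating the (non-linear) translation only.
from statistics import NormalDist

for d in [0.44, 0.88]:
    percentile = NormalDist().cdf(d)
    print(f"d = {d}: average treated student sits at about the "
          f"{percentile:.0%} percentile of the control group")
```

So an effect size of 0.44 moves the average student to roughly the 67th percentile of the untreated group, while 0.88 moves them to roughly the 81st; doubling d doubles the difference in SD units, but it does not double any natural notion of “how much students benefit”.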

Hi, this is very interesting. Could you explain what is meant by collective teacher efficacy? What parameters went into this category? Thank you very much.

Hi Klara, here is a short overview on CTE: https://visible-learning.org/2018/03/collective-teacher-efficacy-hattie/


How New Teachers Can Set Healthy Emotional Boundaries With Students

Teacher-tested tips for responding to students’ distress with kindness and compassion—without taking their burdens onto your own shoulders.

Shayla Ewing never had a cavity in her life when, in 2020, she began experiencing intense dental pain. An appointment with her dentist revealed that her teeth were intact, but the English and drama teacher had been clenching her jaw so tightly that serious health symptoms were likely to develop unless she took steps to reduce the work-induced stress and anxiety at the root of the problem.

At the time, Ewing was “supporting student after student and teacher after teacher” through the early months of the pandemic. She had trouble sleeping, struggled to “muster the same passion I once had for my classroom work,” and lived with “an always present feeling of anxiety and constant fatigue,” she wrote in an essay for Education Week.

Today, several years later and after “seeking counseling to get my life back in order,” Ewing is a newly minted principal preparing for the new school year at Pekin Community High School in Illinois. She recognizes her symptoms from 2020 as the “cost of caring” a little too much, the inevitable price of “giving of myself to my students, and empathizing with their lives and their worlds,” she says. “You can’t just give yourself for free, and that’s what was happening to me. I had to figure out what went wrong with me.”

Training for Resilience

New teachers aren’t usually trained to respond to students’ needs without taking on too much emotional burden themselves, says Patricia Jennings, a professor of education at the University of Virginia. “Teachers hear these stories of adversity and trauma from students, and it’s heart-wrenching,” says Jennings. “Without the skills and the training to deal with this, that psychological or empathetic distress adds to the existing emotional toll of teaching.”

Empathizing with students is both laudable and unavoidable, but “it has risks when it becomes distressing,” write Jennings and co-author Helen H. Min in a 2023 research review. Cultivating compassion, through a personal practice or a professional learning program such as CARE, where the emphasis is on developing emotional awareness, resilience, and personal boundaries, prepares teachers to “address students’ needs and protects teachers from the negative effects of empathy-based stress,” the researchers conclude.

When we asked veteran teachers for advice to help new teachers set healthy emotional boundaries with students—strategies that allow novice educators to be kind and proactive but prevent them from becoming overly invested in students’ burdens—we heard from hundreds of teachers on our social channels. Here are the best teacher-tested tips they shared with us.

Find Your Support Team

As a new teacher, get to know the support resources available to students in your building as well as how to connect kids to these people and services. “Often, people who are new to a caring profession like teaching become isolated. You put on yourself the idea that this kid needs you and only you can provide them the support they need,” says Ewing. “That’s a dangerous place to be. You’re not meant to carry that load, and you’re not trained to do it.” 

As issues with students come up, let other adults in the building know, suggests high school music teacher John Stevenson. “Make fast friends with the rest of your support team—administrators, counselors, team leaders—and keep them in the loop from the first time a student confides in you about an issue,” Stevenson writes on Facebook. “Document everything and send your notes to someone else on your team as soon as possible.” 

Resist the urge to try fixing everything yourself, says veteran educator Laura Bradley. “The reality is that it does take a village,” Bradley notes. “I guess my boundary is to wrap a circle of my support team around me so that I am not taking it all on by myself.”

Set Up Student Care Routines Early

Long before students lose their cool and need a reset, it’s helpful to make them aware of the care options, resources, and processes already in place. When educator Chelsea Ayn Nelson notices a student in her classroom becoming dysregulated, she’ll have a “quick, empathetic talk” with the child and ask, “Do you know what you need now?”

Options might include going to the quiet corner, a quick “walk for water,” a chat with a counselor, a cold towel on the head or neck, or even a housekeeping task like starting to “organize manipulatives,” Nelson says. “Do any of these sound good?” she’ll ask before letting the student know that she needs to return to teaching, drawing a kind but clear boundary. “‘I’ve got to handle the class now, but I’ll check in. Let me know if something changes or gets worse.’ Then I teach.”

Don’t ‘Give Your All’

“It’s hard to not be ‘all in’ with what a student is feeling or experiencing, or to not feel some responsibility for their happiness,” Ewing says. “It’s a vulnerable place, and it’s really hard to step out of that.” But showing up for a student “doesn’t mean you have to lose yourself in the process or sacrifice your own well-being,” says Cathleen B. “If you’re drained and overwhelmed, you’re not helping yourself or your students.”

Sometimes drawing a protective barrier can be a small action: “I close my door at lunch to recharge,” Colin Caldwell writes. “And from time to time, I need a mental health day off from work. It’s not a perfect system, and even for me, a teacher who generally enjoys helping students, it can get overwhelming. You are a teacher, not a social worker.” 

At Ewing’s school, teachers set up “walk buddies,” colleagues who head outdoors together during planning periods for a quick 10-minute reset walk—allowing teachers to move their bodies and collect themselves for a few moments in the school day. 

Meanwhile, Vanessa M. looks at the misses in her classroom—moments when she might have handled situations with her students differently—and accepts them as “feedback,” then checks in with herself to see if “I’m calibrating correctly, if I’m in the right headspace to meet my students’ needs.” If things seem headed in the wrong direction, she might suggest a short break—for students and herself, “a moment to color or read for five minutes.” 

Be an Anthropologist 

When students misbehave, try to act “like an anthropologist,” says sixth-grade teacher Kaari Berg Rodriguez. Pause and adopt an exploratory, curious mindset instead of immediately reacting—admittedly tough in the heat of the moment—so you can consider students’ behavior and “what skills they are missing” before determining next steps, says Berg Rodriguez. 

Most important, “do not take it personally,” says early-grade teacher Karen Hertlein, drawing a line in the sand when confronting circumstances you might otherwise take to heart. “Remember that the behavior of your students is not your fault.” 

Of course, clarity around classroom rules and expectations is the starting point for a well-run classroom, says Jennings, the University of Virginia professor, and can curb meltdowns before they happen. “There’s a lot of research showing that when children are given really clear expectations, and are shown what these look like, and given an opportunity to practice, then they learn how to do it themselves,” Jennings says. “The problem is that there’s often ambiguity and a lack of consistency.” 

Find Your Mentors

Mentors are sometimes assigned to new teachers through a formal district-level program, but Ewing notes that many schools don’t have these programs, so finding informal mentors can be an important step for new teachers trying to develop a healthy approach to teaching. “Maybe there’s someone in your building who you’ve connected with, or maybe it’s someone who teaches the same subject as you,” Ewing suggests. 

When challenging situations arise with students, having a mentor as a sounding board outside of the classroom can be a valuable resource, offering expert feedback in situations that new teachers may not feel prepared to tackle on their own. “They’re going to understand that you’re still learning the skill of how to support students, and also still learning how to support yourself at the same time,” Ewing says.

Reframe Relationships

Though most new teachers know that relationships are an important part of learning and engagement, clear guidance about what appropriate connections with students look like is often missing from their training. “Relationships are an education buzzword, right?” says Ewing. “People talk a lot about forming relationships as if it’s this solution: ‘Oh, you have classroom management problems? Well, do you know your kids? Did you form relationships with them?’” 

Clarifying the line between appropriate connection and the murky zone that lies beyond is tricky, especially at the beginning of a teaching career. For students, a teacher’s warm, caring attention to their struggles can seem like a reprieve from difficulties at home or in school. But you are not “friends,” cautions Jennings, and sometimes it’s enough to “know them, and a little bit about their lives and families.” Though a new teacher’s instinct may be to connect deeply with students, Jennings says that even a baseline connection can go a long way in the classroom. “One of the most powerful motivators for human beings generally is to be seen for who you are, and to be valued as an individual,” Jennings says.

For teachers, Ewing suggests leading with the idea that “I am an instructional leader in my room. I’m here for the student to learn the content, and I want to know their personalities, their likes,” she says. “But when it comes to telling me about their trauma, harmful things that are currently happening to them, or happened in the past, then I need to partner with someone else to help me carry that. That’s the line I draw.”

When students share things beyond the realm of what Ewing can support, she might say, “Oh my gosh, that’s so important. Thank you for sharing that with me; that’s very courageous. Let’s get some help together.”


Education scholar Denise Pope has found that too much homework has negative effects on student well-being and behavioral engagement. (Image credit: L.A. Cicero)

A Stanford researcher found that too much homework can negatively affect kids, especially their lives away from school, where family, friends and activities matter.

“Our findings on the effects of homework challenge the traditional assumption that homework is inherently good,” wrote Denise Pope, a senior lecturer at the Stanford Graduate School of Education and a co-author of a study published in the Journal of Experimental Education.

The researchers used survey data to examine perceptions about homework, student well-being and behavioral engagement in a sample of 4,317 students from 10 high-performing high schools in upper-middle-class California communities. Along with the survey data, Pope and her colleagues used open-ended answers to explore the students’ views on homework.

Median household income exceeded $90,000 in these communities, and 93 percent of the students went on to college, either two-year or four-year.

Students in these schools average about 3.1 hours of homework each night.

“The findings address how current homework practices in privileged, high-performing schools sustain students’ advantage in competitive climates yet hinder learning, full engagement and well-being,” Pope wrote.

Pope and her colleagues found that too much homework can diminish its effectiveness and even be counterproductive. They cite prior research indicating that homework benefits plateau at about two hours per night, and that 90 minutes to two and a half hours is optimal for high school.

Their study found that too much homework is associated with:

* Greater stress: 56 percent of the students considered homework a primary source of stress, according to the survey data. Forty-three percent viewed tests as a primary stressor, while 33 percent put the pressure to get good grades in that category. Less than 1 percent of the students said homework was not a stressor.

* Reductions in health: In their open-ended answers, many students said their homework load led to sleep deprivation and other health problems. The researchers asked students whether they experienced health issues such as headaches, exhaustion, sleep deprivation, weight loss and stomach problems.

* Less time for friends, family and extracurricular pursuits: Both the survey data and student responses indicate that spending too much time on homework meant that students were “not meeting their developmental needs or cultivating other critical life skills,” according to the researchers. Students were more likely to drop activities, not see friends or family, and not pursue hobbies they enjoy.

A balancing act

The results offer empirical evidence that many students struggle to find balance between homework, extracurricular activities and social time, the researchers said. Many students felt forced or obligated to choose homework over developing other talents or skills.

Also, there was no relationship between the time spent on homework and how much the student enjoyed it. The research quoted students as saying they often do homework they see as “pointless” or “mindless” in order to keep their grades up.

“This kind of busy work, by its very nature, discourages learning and instead promotes doing homework simply to get points,” Pope said.

She said the research calls into question the value of assigning large amounts of homework in high-performing schools. Homework should not be simply assigned as a routine practice, she said.

“Rather, any homework assigned should have a purpose and benefit, and it should be designed to cultivate learning and development,” wrote Pope.

High-performing paradox

In places where students attend high-performing schools, too much homework can reduce their time to foster skills in the area of personal responsibility, the researchers concluded. “Young people are spending more time alone,” they wrote, “which means less time for family and fewer opportunities to engage in their communities.”

Student perspectives

The researchers say that while their open-ended or “self-reporting” methodology to gauge student concerns about homework may have limitations – some might regard it as an opportunity for “typical adolescent complaining” – it was important to learn firsthand what the students believe.

The paper was co-authored by Mollie Galloway from Lewis and Clark College and Jerusha Conner from Villanova University.


California gives schools homework assignment: Build housing for teachers

In a flurry of recent legislation and initiatives, California officials are pushing school districts to convert their surplus property into housing for teachers, school staff and even students and families.

Some districts have already started, and now the state wants every district to become a landlord.

“I believe that California has enough resources and ingenuity to solve (the housing shortage), and the data shows that California’s schools have the land to make this happen,” state Superintendent of Public Instruction Tony Thurmond said at a press conference in July. “As school leaders, we can get this done for our communities and restore the California Dream.”

But some superintendents and education analysts are skeptical, saying the idea won’t work everywhere and school districts might be better off focusing on education, not real estate development.

“I’m grateful someone’s paying attention to this, but I feel like educators are being asked to solve so many problems,” said Mendocino County Superintendent Nicole Glentzer. “Student performance, attendance, behavior … and now the housing crisis? It’s too much.”

Last month, Thurmond pledged financial incentives for districts that pass bonds to build staff housing, and the Department of Education is sponsoring a workshop for district officials to learn the ins and outs of real estate development.

His move comes on the heels of a report from the University of California, Berkeley, and the University of California, Los Angeles, that found school districts in California own 75,000 acres of developable land, enough to build 2.3 million residences — which could wipe out the state’s housing shortage.

It also follows the Teacher Housing Act of 2016, which allows school districts to pursue funding sources for housing projects, including state and federal tax credits. Other pieces of legislation, including a 2022 law that went into effect in January, further streamlined the development and funding process. Other laws allow teachers to live in affordable housing even though their income might exceed the qualifying limits.

If Proposition 2, a $10 billion school facilities bond, passes this fall, schools could use that money to not only repair classrooms and other structures, but build teacher housing.

‘It’s changed my life’

A handful of districts have already embarked on projects.

The Los Angeles Unified School District owns several buildings, including a 90-dwelling building that opened in April and a 26-dwelling building reserved for low-income families. The San Francisco Unified School District plans to open a 135-residence building this fall.

The Santa Clara Unified School District has owned a 70-dwelling complex for more than two decades. In San Mateo County, the Office of Education is working with a public-private housing nonprofit to buy an existing apartment building for local teachers.

In Marin, the county education office joined with the county and state to plan teacher housing on state-owned land near San Quentin prison; the Novato Unified School District is exploring workforce housing at several properties it owns; and the College of Marin has formed a coalition of organizations to promote staff housing.

In San Diego, preschool teacher Carolina Sanchez Garcia said she cried when she learned she won a spot at the 264-apartment Scripps Ranch complex, built through a partnership between the San Diego Unified School District and an affordable housing developer.

Due to the high cost of housing in San Diego, she had been commuting from Tijuana, Mexico, for more than a decade. To get to work on time, she’d get up at 2 a.m., move her five kids into the car where they’d go back to sleep, and make the trek across the border to work. Her kids would brush their teeth and get ready for school at a Starbucks.

Now, her commute is only 15 minutes.

“It’s changed my life,” Garcia said. “My kids are sleeping more. I’m sleeping more. It’s made me a better mother and a better teacher. Now, I start my day feeling positive and energized.”

A rendering of Oak Hill Apartments, the complex proposed for state land near San Quentin prison. (Provided by Eden & Education Housing Partners)

Garcia pays $1,300 a month for a three-bedroom apartment, roughly half of market rate. The rent is similar to what she paid in Tijuana, but now she has time to cook dinner for her family, prepare for class and help her children with homework. Her kids can participate in after-school activities and spend time with friends. Her gas bill is also lower.

“I am so grateful,” Garcia said. “I think all districts should do this. Teachers need help.”

Kyle Weinberg, a special education teacher who’s head of the San Diego teachers union, said the district’s housing endeavors have been successful because teachers share in the planning process, ensuring that the location, size and rents meet teachers’ needs. The district paid for the Scripps Ranch development through an agreement with a private developer, and plans to pay for the next development with money from Measure U, a $3.2 billion school facilities bond that passed in 2022.

Subsidized housing is necessary, Weinberg said, because of the high cost of living in San Diego. To live in a one-bedroom apartment there, starting teachers, who earn about $60,000, would have to pay roughly 63% of their take-home pay on rent. Teachers have long commutes and suffer from burnout, he said.
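The 63% figure is easy to sanity-check with back-of-the-envelope arithmetic, as in the sketch below. The monthly rent and the take-home fraction are assumptions chosen for illustration, not numbers reported by the article or the union.

```python
# Back-of-the-envelope check of the "roughly 63% of take-home pay" claim.
# The rent and take-home fraction below are assumptions, not reported data.
gross_salary = 60_000          # starting teacher salary cited in the article
take_home_fraction = 0.75      # assumed share left after taxes and deductions
assumed_monthly_rent = 2_350   # assumed market-rate one-bedroom in San Diego

monthly_take_home = gross_salary * take_home_fraction / 12
rent_burden = assumed_monthly_rent / monthly_take_home
print(f"Estimated rent burden: {rent_burden:.0%} of take-home pay")  # ~63%
```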

The union’s goal is to have 700 dwellings available, serving at least 10% of the teaching staff.

“We have a staffing crisis in our district,” Weinberg said. “We need to explore all possible solutions. Along with salaries and benefits, expanding workforce housing is one of those options.”

Negligible turnover

The model state officials often point to is 705 Serramonte in Daly City. The Jefferson Union High School District opened the 122-apartment complex in 2022, and it now houses a quarter of the district staff. A one-bedroom apartment rents for $1,450 a month, about half the market rate.

The district paid for the $75 million project by passing a $33 million bond specifically for teacher housing, and borrowing the rest. The rents generated by the project cover the bond payments. The district hired a property management company to handle maintenance and other issues.

Daly City is sandwiched between Silicon Valley and San Francisco, which have some of the highest rents in the country. Teachers commute from the East Bay and beyond, and the district grappled with a persistent 25% staff turnover rate annually, said district spokesperson Denise Shreve.

Since 705 Serramonte opened, the district has had near-zero turnover.

“Students now start off the school year with a teacher in their classroom, instead of a long-term substitute,” Shreve said. “You have to look at the long-term benefits. We now have teacher retention and students are better off because of it.”

Lisa Raskin, a social science teacher and instructional coach for the district, said she’s struggled with housing over her 20-year career but never considered leaving. A San Francisco native, she’s committed to staying in the area — which has meant that she’s always had roommates.

When she moved into 705 Serramonte, it was her first time living in her own apartment.

“I can be with community if I want, or I can be alone. I love that,” Raskin said, noting that her neighbors and colleagues often host barbecues, game nights and other gatherings. “We call it ‘adult dorms.’ I feel safe here.”

Overworked superintendents

But not every district can pass a bond for teacher housing. Many can’t even pass bonds to repair school campuses. And some superintendents say they’re already so overworked that undertaking a complicated project like real estate development is a near impossibility. California had a superintendent turnover rate of more than 18% last year, according to research from the Superintendent Lab, in part due to workload.

Glentzer, the Mendocino County superintendent, said housing development would be a challenge for smaller, rural and lower-income districts. Those districts face teacher and housing shortages like their wealthier, urban counterparts, but lack the ability to raise the money and hire the staff to oversee projects.

Besides, the housing shortage affects lots of people in the community — not just teachers. Mendocino County has been scarred by numerous wildfires over the past few years, plus a boom in vacation rentals that have decimated the local housing market, leaving some people to live in trailers or even their cars.

A better solution, she said, would be for housing to be left to regional authorities and for the state to fund school districts sufficiently to pay their teachers more.

Still, she understands the need. She herself lived in a district-owned home when she was superintendent of the Potter Valley Community Unified School District northeast of Ukiah. The two-bedroom bungalow was next to the football field, and she enjoyed the reduced rent and proximity to work.

“There’s no question we need housing,” Glentzer said. “But when you’re the superintendent and the principal and head of maintenance and you’re teaching Spanish, how are you supposed to find the bandwidth for this? I have a degree in education. I never took a real estate course.”

Marguerite Roza, director of the policy research center Georgetown Edunomics Lab, agreed. School districts might be better off paying teachers more or targeting raises for teachers who are in high demand, such as those who work in special education, math or science.

She also noted that except in those three fields, the teacher shortage is ebbing. With federal coronavirus relief money expiring and student enrollment declining, many districts might be laying off teachers — not hiring, she said. EdJoin, a teacher hiring board, showed nearly 2,000 openings this month for special education teachers in California, for example, but fewer than 100 for third-grade teachers.

“By building housing, districts might be addressing a crisis that no longer exists,” Roza said. “School districts’ expertise and focus is to provide education. To assume school districts could take on the responsibility of being landlords efficiently is concerning.”

Growing interest

To help school districts learn the basics of real estate development, the California School Boards Association has been hosting workshops and providing resources for the past two years. So far, 152 of the state’s 1,000 school districts have signed up to study the idea, and the numbers have been growing, said spokesperson Troy Flint.

He acknowledged that smaller districts might not have the staff to get projects off the ground, but some are working on projects together or collaborating with their local county offices of education, he said.

“Districts see the immense value workforce housing can offer their staff, students, and communities,” Flint said. “There is widespread interest in education workforce housing as an elegant way to address the housing affordability crisis. Workforce housing also brings quality-of-life, community and environmental benefits — and may even help address declining enrollment as district staff can afford to live with their families in the communities they serve.”

The Independent Journal contributed to this report.


More bullying, teacher dissatisfaction with the chancellor: 5 takeaways from NYC’s 2024 school survey


More New York City students than at any point in the past five years say kids in their schools regularly bully each other, newly released 2024 school survey results show.

More than half of the roughly 355,000 middle and high school students who responded to the city school system’s annual survey earlier this year said their classmates sometimes or often bullied, harassed, or intimidated each other last school year – up from 44% in 2019, the first year for which citywide survey numbers are available.

The rise in student reports of bullying comes as city schools confront a swirl of social challenges including the lingering effects of the pandemic, cyberbullying, social media, and global tumult such as the Israel-Hamas war and mass migration into New York City.

The rising student bullying numbers are among several notable trends from the latest annual questionnaire, which also reached roughly 66,000 teachers and over 400,000 parents and guardians this year.

For educators, the survey suggests that sweeping new curriculum mandates are trickling down into classrooms, with teachers reporting less influence over selecting curriculum, but more coherence in how their schools are approaching instruction. And nearly a third of city teachers reported dissatisfaction with the schools chancellor – a figure that’s nearly doubled since 2019.

Parents, meanwhile, continued to express a desire for more extracurricular and after-school options.

Here are five takeaways from the 2024 school survey. You can access the full results by navigating to the Education Department’s school survey site and clicking on the “results page” link.

Cyberbullying, discrimination on the rise

In all five questions on the 2024 survey involving bullying and harassment, students reported worse outcomes than in previous years.

Among the most notable changes: Roughly 43% of students reported seeing regular bullying and harassment online this year, compared to 35% in 2019.

Educators across the city have sounded the alarm about the effects of social media and online bullying spilling into classrooms. One Queens school administrator threatened last fall to suspend students for following an Instagram account that posted vulgar and insulting comments about students.

Increasing awareness of the harms of online harassment has also played a role in some schools’ decisions to ban cell phones during the school day, and in the city’s debate over instituting a systemwide ban.

More students also report seeing classmates bullied over their backgrounds, with 40% of students seeing harassment based on race, ethnicity, religion, or immigration status, up from 30% in 2019.

The reports came during a school year marked by explosive conflicts over the Israel-Hamas war and the continued arrival of tens of thousands of migrant students – some of whom expressed concerns that their immigration status made them targets.

As in past years, girls – and Black girls in particular – reported the highest rates of bullying and harassment.

Education Department spokesperson Jenna Lyle said, “Bullying has absolutely no place in our school communities.” She pointed to a citywide Respect for All initiative that provides professional development for school staff to deal with bullying and designates a liaison in every school.

Teacher dissatisfaction with the chancellor is up

Roughly 32% of teachers said they were dissatisfied with schools Chancellor David Banks – up from 30% last year and 17% in 2019, when former Chancellor Richard Carranza was at the helm.

A plurality of teachers, 44%, reported that they were satisfied with the chancellor, while 24% said they didn’t know.

It’s difficult to know exactly what’s driving the rise in dissatisfaction. Banks has launched several major curriculum initiatives that have forced big changes in how schools teach reading and algebra, drawing mixed responses from educators.

Earlier in Banks’ tenure, he and Mayor Eric Adams presided over deeply unpopular budget cuts as federal pandemic aid dried up, though officials replaced much of the expiring funding last year.

Teacher dissatisfaction with the schools chancellor has swung dramatically in previous years as well. It dropped from 43% to 18% between 2014 and 2015 when Carmen Fariña took over from Dennis Walcott.

Lyle, the Education Department spokesperson, said Banks is making “needed changes” to the school system, but said he is “committed to supporting staff” during those changes. “95% of families are satisfied with the education their children are receiving and teachers report high levels of satisfaction with their work, their schools, and the professional development they are receiving,” she added.

Effects of curriculum changes take hold

Rising dissatisfaction with the chancellor may not be the only impact of the big curriculum changes.

Teachers, particularly those in elementary schools, where officials have mandated superintendents to choose one of three pre-approved literacy curriculums, report having less influence over selecting curriculum in their schools, with 67% saying they have some or a great deal of influence, down from 71% last year. Only 58% of elementary teachers said they had influence over choosing curriculum materials.

On the other hand, teachers reported more “instructional coherence,” with 81% saying that when their school started a new program, they followed up to make sure it was working, up three percentage points from last year.

Parents continue to crave more extracurricular options

When asked which improvement they most wanted at their kids’ schools, the largest group of parents asked for stronger enrichment programs like clubs and after-school activities.

That tracks with past years. Stronger enrichment programming has been the top request of parents for years, with smaller class sizes next in line. In 2024, 24% of parents said more enrichment programs were their top priority, while 15% said smaller class sizes. In 2017, 24% of families chose enrichment programs, while 21% chose class sizes.

Enrichment programs can be particularly difficult for small schools to offer because they are funded on a per-pupil basis. As enrollment has shrunk, the number of tiny schools has also grown.

Average class sizes for elementary, middle, and high schools are down slightly compared to 2019, though they’re up compared to 2022. The 2022 state class size law will require the city to continue significantly reducing class sizes in the coming years.

Mental health, classroom behavior remain challenges

Educators across the city and country have confronted growing challenges with student mental health and behavior in the wake of the pandemic, and those issues remain challenges, the survey data indicates.

This year, 68% of teachers said their students regularly do their work when they’re supposed to, down from 69% last year and from 75% in 2022 and 2019. Roughly 77% of teachers said adults in their buildings had access to mental health support, down two percentage points from last year, and eight percentage points compared to 2022.

Michael Elsen-Rooney is a reporter for Chalkbeat New York, covering NYC public schools. Contact Michael at [email protected]


Teacher faces multiple sexual contact charges with student on school property in Monroe County

Sexual contact with student spanned school, park, and hotel

Brandon Carr, Digital Content Producer

Christopher Wilhelm, 38, of Toledo, was charged in the First District Court of Monroe on four counts of criminal sexual conduct in the third degree, two counts of criminal sexual conduct in the fourth degree, and one count of use of a computer in the commission of a crime.

Wilhelm is a former teacher and football coach from Bedford High School, and the charges stem from a reported sexual relationship he had with a student in 2022.

The student reported that Wilhelm had sexual contact with her multiple times on school property, at a local park, at his residence, and at a hotel in Toledo, Ohio.

Officials say the Toledo police are investigating the incidents of sexual contact in Toledo.

Anyone with information should contact D/Sgt. Michael Peterson with the Michigan State Police at 734-242-3500 or Crime Stoppers at 1-800-Speak Up.

All tips to Crime Stoppers are anonymous; tips can also be submitted online.

