
7 Steps for How to Write an Evaluation Essay (Example & Template)


Chris Drew (PhD)

Dr. Chris Drew is the founder of the Helpful Professor. He holds a PhD in education and has published over 20 articles in scholarly journals. He is the former editor of the Journal of Learning Development in Higher Education.


In this ultimate guide, I will explain to you exactly how to write an evaluation essay.

1. What is an Evaluation Essay?

An evaluation essay should provide a critical analysis of something.

You’re literally ‘evaluating’ the thing you’re looking at.

Here are a couple of quick definitions of what we mean by ‘evaluate’:

  • Merriam-Webster defines evaluation as: “to determine the significance, worth, or condition of usually by careful appraisal and study”
  • Collins Dictionary says: “If you evaluate something or someone, you consider them in order to make a judgment about them, for example about how good or bad they are.”

Here are some synonyms for ‘evaluate’: assess, appraise, judge, critique, and review.

So, we could say that an evaluation essay should carefully examine the ‘thing’ and provide an overall judgement of it.

Here are some common things you may be asked to write an evaluation essay on: a book or novel, a film or play, an artwork, a product or service, a scholarly article or theory, or the impact of an event, technology, or policy.

This is by no means an exhaustive list. Really, you can evaluate just about anything!


2. How to write an Evaluation Essay

There are two secrets to writing a strong evaluation essay. The first is to aim for objective analysis before forming an opinion. The second is to use evaluation criteria.

Aim to Appear Objective before giving an Evaluation Argument

Your evaluation will eventually need an argument.

The evaluation argument will show your reader what you have decided is the final value of the ‘thing’ you’re evaluating.

But in order to convince your reader that your evaluative argument is sound, you need to do some leg work.

The aim will be to show that you have provided a balanced and fair assessment before coming to your conclusion.

In order to appear balanced you should:

  • Discuss both the pros and cons of the thing
  • Discuss both the strengths and weaknesses of the thing
  • Look at the thing from multiple different perspectives
  • Be both positive and critical. Don’t make it look like you’re biased towards one perspective.

In other words, give every perspective a fair hearing.

You don’t want to sound like a propagandist. You want to be seen as a fair and balanced adjudicator.

Use Evaluation Criteria

One way to appear balanced is to use evaluation criteria.

Evaluation criteria help to show that you have assessed the ‘thing’ based on an objective measure.

Here are some examples of evaluation criteria for different subjects:

A Product

  • Strength under pressure
  • Longevity (ability to survive for a long time)
  • Ease of use
  • Ability to get the job done

A Waiter

  • Friendliness
  • Punctuality
  • Ability to predict my needs
  • Calmness under pressure
  • Attentiveness

A Bed and Breakfast

  • Breakfast options
  • Taste of food
  • Comfort of bed
  • Local attractions
  • Service from owner
  • Cleanliness

We can use evaluation criteria to frame our ability to conduct the analysis fairly.

This is especially true if you have to evaluate multiple different ‘things’. For example, if you’re evaluating three novels, you want to be able to show that you applied the same ‘test’ to all three books!

This will show that you gave each ‘thing’ a fair chance and looked at the same elements for each.

3. How to come up with an Evaluation Argument

After you have:

  • Looked at both good and bad elements of the ‘thing’, and
  • Used your evaluation criteria

You’ll then need to develop an evaluative argument. This argument shows your own overall perspective on the ‘thing’.

Remember, you will need to show your final evaluative argument is backed by objective analysis. You need to do it in order!

Analyze first. Evaluate second.

Here’s an example.

Let’s say you’re evaluating the quality of a meal.

You might say:

  • A strength of the meal was its presentation. It was well presented and looked enticing to eat.
  • A weakness of the meal was that it was overcooked. This decreased its flavor.
  • The meal was given a low rating on ‘cost’ because it was more expensive than the other comparative meals on the menu.
  • The meal was given a high rating on ‘creativity’. It was a meal that involved a thoughtful and inventive mix of ingredients.

Now that you’ve looked at some pros and cons and measured the meal based on a few criteria points (like cost and creativity), you’ll be able to come up with a final argument:

  • Overall, the meal was good enough for a middle-tier restaurant but would not be considered a high-class meal. There is a lot of room for improvement if the chef wants to win any local cooking awards.

Evaluative terms that you might want to use for this final evaluation argument include:

  • All things considered
  • With all key points in mind

4. Evaluation Essay Outline (with Examples)

Okay, so now you know what to do, let’s have a go at creating an outline for your evaluation essay!

Here’s what I recommend:

4.1 How to Write your Introduction

In the introduction, feel free to use my 5-Step INTRO method. It’ll be an introduction just like any other essay introduction.

And yes, feel free to explain what the final evaluation will be.

So, here it is laid out nice and simple.

Write one sentence for each point to make a 5-sentence introduction:

  • Interest: Make a statement about the ‘thing’ you’re evaluating that you think will be of interest to the reader. Make it a catchy, engaging point that draws the reader in!
  • Notify: Notify the reader of any background info on the thing you’re evaluating. This is your chance to show your depth of knowledge. What is a historical fact about the ‘thing’?
  • Translate: Re-state the essay question. For an evaluative essay, you can re-state it as something like: “This essay evaluates the book/ product/ article/ etc. by looking at its strengths and weaknesses and compares it against a set of marking criteria”.
  • Report: Say what your final evaluation will be. For example you can say “While there are some weaknesses in this book, overall this evaluative essay will show that it helps progress knowledge about Dinosaurs.”
  • Outline: Simply give a clear overview of what will be discussed. For example, you can say: “Firstly, the essay will evaluate the product based on an objective criteria. This criteria will include its value for money, fit for purpose and ease of use. Next, the essay will show the main strengths and weaknesses of the product. Lastly, the essay will provide a final evaluative statement about the product’s overall value and worth.”

If you want more depth on how to use the INTRO method, you’ll need to go and check out our blog post on writing quality introductions.

4.2 Example Introduction

This example introduction is for the essay question: Write an Evaluation Essay on Facebook’s Impact on Society.

“Facebook is the third most visited website in the world. It was founded in 2004 by Mark Zuckerberg in his college dorm. This essay evaluates the impact of Facebook on society and makes an objective judgement on its value. The essay will argue that Facebook has changed the world both for the better and worse. Firstly, it will give an overview of what Facebook is and its history. Then, it will examine Facebook on the criteria of: impact on social interactions, impact on the media landscape, and impact on politics.”

You’ll notice that each sentence in this introduction follows my 5-Step INTRO formula to create a clear, coherent 5-Step introduction.

4.3 How to Write your Body Paragraphs

The first body paragraph should give an overview of the ‘thing’ being evaluated.

Then, you should evaluate the pros and cons of the ‘thing’ being evaluated based upon the criteria you have developed for evaluating it.

Let’s take a look below.

4.4 First Body Paragraph: Overview of your Subject

This first paragraph should provide an objective overview of your subject’s properties and history. You should not be doing any evaluating just yet.

The goal for this first paragraph is to ensure your reader knows what it is you’re evaluating. Secondarily, it should show your marker that you have developed some good knowledge about it.

If you need to use more than one paragraph to give an overview of the subject, that’s fine.

Similarly, if your essay word length needs to be quite long, feel free to spend several paragraphs exploring the subject’s background and objective details to show off your depth of knowledge for the marker.

4.5 First Body Paragraph Example

Sticking with the essay question: Write an Evaluation Essay on Facebook’s Impact on Society, this might be your paragraph:

“Facebook has been one of the most successful websites of all time. It is the website that dominated the ‘Web 2.0’ revolution, which was characterized by two-way user interaction with the web. Facebook allowed users to create their own personal profiles and invite their friends to follow along. Since 2004, Facebook has attracted more than one billion people to create profiles in order to share their opinions and keep in touch with their friends.”

Notice here that I haven’t yet made any evaluations of Facebook’s merits?

This first paragraph (or, if need be, several of them) should be all about showing the reader exactly what your subject is – no more, no less.

4.6 Evaluation Paragraphs: Second, Third, Fourth and Fifth Body Paragraphs

Once you’re confident your reader knows what subject you’re evaluating, you’ll need to move on to the actual evaluation.

For this step, you’ll need to dig up the evaluation criteria we talked about in Point 2.

For example, let’s say you’re evaluating a President of the United States.

Your evaluation criteria might be:

  • Impact on world history
  • Ability to pass legislation
  • Popularity with voters
  • Morals and ethics
  • Ability to change lives for the better

Really, you could make up any evaluation criteria you want!

Once you’ve made up the evaluation criteria, you’ve got your evaluation paragraph ideas!

Simply turn each point in your evaluation criteria into a full paragraph.

How do you do this?

Well, start with a topic sentence.

For the criteria point ‘Impact on world history’ you can say something like: “Barack Obama’s impact on world history is mixed.”

This topic sentence will show that you’ll evaluate both pros and cons of Obama’s impact on world history in the paragraph.

Then, follow it up with explanations.

“While Obama campaigned to withdraw troops from Iraq and Afghanistan, he was unable to completely achieve this objective. This is an obvious negative for his impact on the world. However, as the first black man to lead the most powerful nation on earth, he will forever be remembered as a living milestone for civil rights and progress.”

Keep going, turning each evaluation criteria into a full paragraph.

4.7 Evaluation Paragraph Example

Let’s go back to our essay question: Write an Evaluation Essay on Facebook’s Impact on Society.

I’ve decided to use the evaluation criteria below:

  • impact on social interactions;
  • impact on the media landscape;
  • impact on politics

Naturally, I’m going to write one paragraph for each point.

If you’re expected to write a longer piece, you could write two paragraphs on each point (one for pros and one for cons).

Here’s what my first evaluation paragraph might look like:

“Facebook has had a profound impact on social interactions. It has helped people to stay in touch with one another from long distances and after they have left school and college. This is obviously a great positive. However, it can also be seen as having a negative impact. For example, people may be less likely to interact face-to-face because they are ‘hanging out’ online instead. This can have a negative impact on genuine one-to-one relationships.”

You might notice that this paragraph has a topic sentence, explanations and examples. It follows my perfect paragraph formula which you’re more than welcome to check out!

4.8 How to write your Conclusion

To conclude, you’ll need to come up with one final evaluative argument.

This evaluation argument provides an overall assessment. You can start with “Overall, Facebook has been…” and continue by saying whether (all things considered) it has been a positive or negative influence on society.

Remember, you can only come up with an overall evaluation after you’ve looked at the subject’s pros and cons based upon your evaluation criteria.

In the example below, I’m going to use my 5 C’s conclusion paragraph method. This will make sure my conclusion covers all the things a good conclusion should cover!

Like the INTRO method, the 5 C’s conclusion method should have one sentence for each point to create a 5-sentence conclusion paragraph.

The 5 C’s conclusion method is:

  • Close the loop: Return to a statement you made in the introduction.
  • Conclude: Show what your final position is.
  • Clarify: Clarify how your final position is relevant to the Essay Question.
  • Concern: Explain who should be concerned by your findings.
  • Consequences: End by noting in one final, engaging sentence why this topic is of such importance. The ‘concern’ and ‘consequences’ sentences can be combined.

4.9 Concluding Argument Example Paragraph

Here’s a possible concluding argument for our essay question: Write an Evaluation Essay on Facebook’s Impact on Society.

“The introduction of this essay highlighted that Facebook has had a profound impact on society. This evaluation essay has shown that this impact has been both positive and negative. Thus, it is too soon to say whether Facebook has been an overall positive or negative for society. However, people should pay close attention to this issue because it is possible that Facebook is contributing to the undermining of truth in media and positive interpersonal relationships.”

Note here that I’ve followed the 5 C’s conclusion method for my concluding evaluative argument paragraph.

5. Evaluation Essay Example Template

Below is a template you can use for your evaluation essay, based upon the advice I gave in Section 4:

Introduction

Use the 5-Step INTRO method to write an introduction. This introduction should clearly state what you are evaluating, the criteria that you will be using to evaluate it, and what your final evaluation will be.

Body Paragraph 1: Outline of the Subject

Before evaluating the subject or ‘thing’, make sure you use a paragraph or two to clearly explain what it is to the reader. This is your chance to show your depth of knowledge about the topic.

Body Paragraphs 2 – 5: Evaluate the Subject

Use the evaluation criteria you have decided upon to evaluate the subject. For each element of the criteria, write one paragraph looking at the pros and cons of the subject. You might want to use my perfect paragraph formula to write your paragraphs.

Conclusion

Use my 5 C’s conclusion method to write a 5-sentence conclusion. Make sure you show your final evaluative argument in the conclusion so your reader knows your final position on the issue.

6. 23+ Good Evaluation Essay Topics

Okay now that you know how to write an evaluation essay, let’s look at a few examples.

For each example I’m going to give you an evaluation essay title idea, plus a list of criteria you might want to use in your evaluation essay.

6.1 Evaluation of Impact

  • Evaluate the impact of global warming on the great barrier reef. Recommended evaluation criteria: Level of bleaching; Impact on tourism; Economic impact; Impact on lifestyles; Impact on sealife
  • Evaluate the impact of the Global Financial Crisis on poverty. Recommended evaluation criteria: Impact on jobs; Impact on childhood poverty; Impact on mental health rates; Impact on economic growth; Impact on the wealthy; Global impact
  • Evaluate the impact of having children on your lifestyle. Recommended evaluation criteria: Impact on spare time; Impact on finances; Impact on happiness; Impact on sense of wellbeing
  • Evaluate the impact of the internet on the world. Recommended evaluation criteria: Impact on connectedness; Impact on dating; Impact on business integration; Impact on globalization; Impact on media
  • Evaluate the impact of public transportation on cities. Recommended evaluation criteria: Impact on cost of living; Impact on congestion; Impact on quality of life; Impact on health; Impact on economy
  • Evaluate the impact of universal healthcare on quality of life. Recommended evaluation criteria: Impact on reducing disease rates; Impact on the poorest in society; Impact on life expectancy; Impact on happiness
  • Evaluate the impact of getting a college degree on a person’s life. Recommended evaluation criteria: Impact on debt levels; Impact on career prospects; Impact on life perspectives; Impact on relationships

6.2 Evaluation of a Scholarly Text or Theory

  • Evaluate a Textbook. Recommended evaluation criteria: clarity of explanations; relevance to a course; value for money; practical advice; depth and detail; breadth of information
  • Evaluate a Lecture Series, Podcast or Guest Lecture. Recommended evaluation criteria: clarity of speaker; engagement of attendees; appropriateness of content; value for money
  • Evaluate a journal article. Recommended evaluation criteria: length; clarity; quality of methodology; quality of literature review; relevance of findings for real life
  • Evaluate a Famous Scientist. Recommended evaluation criteria: contribution to scientific knowledge; impact on health and prosperity of humankind; controversies and disagreements with other scientists.
  • Evaluate a Theory. Recommended evaluation criteria: contribution to knowledge; reliability or accuracy; impact on the lives of ordinary people; controversies and contradictions with other theories.

6.3 Evaluation of Art and Literature

  • Evaluate a Novel. Recommended evaluation criteria: plot complexity; moral or social value of the message; character development; relevance to modern life
  • Evaluate a Play. Recommended evaluation criteria: plot complexity; quality of acting; moral or social value of the message; character development; relevance to modern life
  • Evaluate a Film. Recommended evaluation criteria: plot complexity; quality of acting; moral or social value of the message; character development; relevance to modern life
  • Evaluate an Artwork. Recommended evaluation criteria: impact on art theory; moral or social message; complexity or quality of composition

6.4 Evaluation of a Product or Service

  • Evaluate a Hotel or Bed and Breakfast. Recommended evaluation criteria: quality of service; flexibility of check-in and check-out times; cleanliness; location; value for money; wi-fi strength; noise levels at night; quality of meals
  • Evaluate a Restaurant. Recommended evaluation criteria: quality of service; menu choices; cleanliness; atmosphere; taste; value for money.
  • Evaluate a Car. Recommended evaluation criteria: fuel efficiency; value for money; build quality; likelihood to break down; comfort.
  • Evaluate a House. Recommended evaluation criteria: value for money; build quality; roominess; location; access to public transport; quality of neighbourhood
  • Evaluate a Doctor. Recommended evaluation criteria: Quality of service; knowledge; quality of equipment; reputation; value for money.
  • Evaluate a Course. Recommended evaluation criteria: value for money; practical advice; quality of teaching; quality of resources provided.

7. Concluding Advice


Evaluation essays are common in high school, college and university.

The trick for getting good marks in an evaluation essay is to show you have looked at both the pros and cons before making a final evaluation analysis statement.

You don’t want to look biased.

That’s why it’s a good idea to use objective evaluation criteria and to be generous in looking at both the positives and negatives of your subject.

Read Also: 39 Better Ways to Write ‘In Conclusion’ in an Essay

I recommend you use the evaluation template provided in this post to write your evaluation essay. However, if your teacher has given you a template, of course use theirs instead! You always want to follow your teacher’s advice because they’re the person who will be marking your work.

Good luck with your evaluation essay!

Chris


2 thoughts on “7 Steps for How to Write an Evaluation Essay (Example & Template)”


What an amazing article. I am returning to studying after several years and was struggling with how to present an evaluative essay. This article has simplified the process and provided me with the confidence to tackle my subject (theoretical approaches to development and management of teams).

I just wanted to ask whether the evaluation criteria has to be supported by evidence or can it just be a list of criteria that you think of yourself to objectively measure?

Many many thanks for writing this!


Usually we would want to see evidence, but ask your teacher what they’re looking for, as they may allow it, depending on the situation.

Critically Evaluating Research

Some research reports or assessments will require you to critically evaluate a journal article or piece of research. Below is a guide with examples of how to critically evaluate research and how to communicate your ideas in writing.

To develop the skill of being able to critically evaluate, when reading research articles in psychology read with an open mind and be active when reading. Ask questions as you go and see if the answers are provided. Initially skim through the article to gain an overview of the problem, the design, methods, and conclusions. Then read for details and consider the questions provided below for each section of a journal article.

Title

  • Did the title describe the study?
  • Did the key words of the title serve as key elements of the article?
  • Was the title concise, i.e., free of distracting or extraneous phrases?

Abstract

  • Was the abstract concise and to the point?
  • Did the abstract summarise the study’s purpose/research problem, the independent and dependent variables under study, methods, main findings, and conclusions?
  • Did the abstract provide you with sufficient information to determine what the study is about and whether you would be interested in reading the entire article?

Introduction

  • Was the research problem clearly identified?
  • Is the problem significant enough to warrant the study that was conducted?
  • Did the authors present an appropriate theoretical rationale for the study?
  • Is the literature review informative and comprehensive or are there gaps?
  • Are the variables adequately explained and operationalised?
  • Are hypotheses and research questions clearly stated? Are they directional? Do the author’s hypotheses and/or research questions seem logical in light of the conceptual framework and research problem?
  • Overall, does the literature review lead logically into the Method section?

Method

  • Is the sample clearly described, in terms of size, relevant characteristics (gender, age, SES, etc), selection and assignment procedures, and whether any inducements were used to solicit subjects (payment, subject credit, free therapy, etc)?
  • What population do the subjects represent (external validity)?
  • Are there sufficient subjects to produce adequate power (statistical validity)?
  • Have the variables and measurement techniques been clearly operationalised?
  • Do the measures/instruments seem appropriate as measures of the variables under study (construct validity)?
  • Have the authors included sufficient information about the psychometric properties (eg. reliability and validity) of the instruments?
  • Are the materials used in conducting the study or in collecting data clearly described?
  • Are the study’s scientific procedures thoroughly described in chronological order?
  • Is the design of the study identified (or made evident)?
  • Do the design and procedures seem appropriate in light of the research problem, conceptual framework, and research questions/hypotheses?
  • Are there other factors that might explain the differences between groups (internal validity)?
  • Were subjects randomly assigned to groups so there was no systematic bias in favour of one group? Was there a differential drop-out rate from groups so that bias was introduced (internal validity and attrition)?
  • Were all the necessary control groups used? Were participants in each group treated identically except for the administration of the independent variable?
  • Were steps taken to prevent subject bias and/or experimenter bias, eg, blind or double blind procedures?
  • Were steps taken to control for other possible confounds such as regression to the mean, history effects, order effects, etc (internal validity)?
  • Were ethical considerations adhered to, eg, debriefing, anonymity, informed consent, voluntary participation?
  • Overall, does the method section provide sufficient information to replicate the study?

Results

  • Are the findings complete, clearly presented, comprehensible, and well organised?
  • Are data coding and analysis appropriate in light of the study’s design and hypotheses? Are the statistics reported correctly and fully, eg. are degrees of freedom and p values given?
  • Have the assumptions of the statistical analyses been met, eg. does one group have very different variance to the others?
  • Are salient results connected directly to hypotheses? Are there superfluous results presented that are not relevant to the hypotheses or research question?
  • Are tables and figures clearly labelled? Well-organised? Necessary (non-duplicative of text)?
  • If a significant result is obtained, consider effect size (a worked example follows this checklist). Is the finding meaningful? If a non-significant result is found, could low power be an issue? Were there sufficient levels of the IV?
  • If necessary have appropriate post-hoc analyses been performed? Were any transformations performed; if so, were there valid reasons? Were data collapsed over any IVs; if so, were there valid reasons? If any data was eliminated, were valid reasons given?
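To make the effect-size question concrete, consider Cohen’s d, one common standardized effect-size measure: the difference between two group means divided by their pooled standard deviation. With purely hypothetical numbers:

$d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}} = \frac{78 - 72}{10} = 0.6$

Using the conventional rule-of-thumb benchmarks (roughly 0.2 = small, 0.5 = medium, 0.8 = large), a significant result with d = 0.6 is a moderate effect that is probably meaningful in practice, whereas a significant result with d = 0.05 may be statistically detectable in a large sample yet practically trivial.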

Discussion and Conclusion

  • Are findings adequately interpreted and discussed in terms of the stated research problem, conceptual framework, and hypotheses?
  • Is the interpretation adequate? i.e., does it go too far given what was actually done or not far enough? Are non-significant findings interpreted inappropriately?
  • Is the discussion biased? Are the limitations of the study delineated?
  • Are implications for future research and/or practical application identified?
  • Are the overall conclusions warranted by the data and any limitations in the study? Are the conclusions restricted to the population under study or are they generalised too widely?
  • Is the reference list sufficiently specific to the topic under investigation and current?
  • Are citations used appropriately in the text?

General Evaluation

  • Is the article objective, well written and organised?
  • Does the information provided allow you to replicate the study in all its details?
  • Was the study worth doing? Does the study provide an answer to a practical or important problem? Does it have theoretical importance? Does it represent a methodological or technical advance? Does it demonstrate a previously undocumented phenomenon? Does it explore the conditions under which a phenomenon occurs?

How to turn your critical evaluation into writing

Example from a journal article.


Critical Analysis and Evaluation

Many assignments ask you to critique and evaluate a source. Sources might include journal articles, books, websites, government documents, portfolios, podcasts, or presentations.

When you critique, you offer both negative and positive analysis of the content, writing, and structure of a source.

When you evaluate, you assess how successful a source is at presenting information, measured against a standard or certain criteria.

Elements of a critical analysis:

opinion + evidence from the article + justification

Your opinion is your thoughtful reaction to the piece.

Evidence from the article offers some proof to back up your opinion.

The justification is an explanation of how you arrived at your opinion or why you think it’s true.

How do you critique and evaluate?

When critiquing and evaluating someone else’s writing/research, your purpose is to reach an informed opinion about a source. In order to do that, try these three steps:

Step 1: Note your reactions to the source.

  • How do you feel?
  • What surprised you?
  • What left you confused?
  • What pleased or annoyed you?
  • What was interesting?

Step 2: Analyze the source.

  • What is the purpose of this text?
  • Who is the intended audience?
  • What kind of bias is there?
  • What was missing?
  • See our resource on analysis and synthesis (Move From Research to Writing: How to Think) for other examples of questions to ask.

Step 3: Choose evaluative language to express your informed opinion. For example:

  • sophisticated
  • interesting
  • undocumented
  • disorganized
  • superficial
  • unconventional
  • inappropriate interpretation of evidence
  • unsound or discredited methodology
  • traditional
  • unsubstantiated
  • unsupported
  • well-researched
  • easy to understand

Here is an example that puts opinion, evidence, and justification together:

  • Opinion: This article’s assessment of the power balance in cities is confusing.
  • Evidence: It first says that the power to shape policy is evenly distributed among citizens, local government, and business (Rajal, 232).
  • Justification: But then it goes on to focus almost exclusively on business. Next, in a much shorter section, it combines the idea of citizens and local government into a single point of evidence. This leaves the reader with the impression that the citizens have no voice at all. It is not helpful in trying to determine the role of the common voter in shaping public policy.

Sample criteria for critical analysis

Sometimes the assignment will specify what criteria to use when critiquing and evaluating a source. If not, consider the following prompts to approach your analysis. Choose the questions that are most suitable for your source.

  • What do you think about the quality of the research? Is it significant?
  • Did the author answer the question they set out to? Did the author prove their thesis?
  • Did you find contradictions to other things you know?
  • What new insight or connections did the author make?
  • How does this piece fit within the context of your course, or the larger body of research in the field?
  • The structure of an article or book is often dictated by standards of the discipline or a theoretical model. Did the piece meet those standards?
  • Did the piece meet the needs of the intended audience?
  • Was the material presented in an organized and logical fashion?
  • Is the argument cohesive and convincing? Is the reasoning sound? Is there enough evidence?
  • Is it easy to read? Is it clear and easy to understand, even if the concepts are sophisticated?


CAREER COLUMN | 08 October 2018

How to write a thorough peer review

Mathew Stiller-Reeve

Mathew Stiller-Reeve is a climate researcher at NORCE/Bjerknes Centre for Climate Research in Bergen, Norway, the leader of SciSnack.com, and a thematic editor at Geoscience Communication.

Scientists do not receive enough peer-review training. To improve this situation, a small group of editors and I developed a peer-review workflow to guide reviewers in delivering useful and thorough analyses that can really help authors to improve their papers.


doi: https://doi.org/10.1038/d41586-018-06991-0

This is an article from the Nature Careers Community, a place for Nature readers to share their professional experiences and advice. Guest posts are encouraged. You can get in touch with the editor at [email protected].



Research Paper: A step-by-step guide: 7. Evaluating Sources


Evaluation Criteria

It's very important to evaluate the materials you find to make sure they are appropriate for a research paper.  It's not enough that the information is relevant; it must also be credible.  You will want to find more than enough resources, so that you can pick and choose the best for your paper.   Here are some helpful criteria you can apply to the information you find:

Currency:

  • When was the information published?
  • Is the source out-of-date for the topic? 
  • Are there new discoveries or important events since the publication date?

Relevancy:

  • How is the information related to your argument? 
  • Is the information too advanced or too simple? 
  • Is the audience focus appropriate for a research paper? 
  • Are there better sources elsewhere?

Authority:

  • Who is the author? 
  • What is the author's credential in the related field? 
  • Is the publisher well-known in the field? 
  • Did the information go through the peer-review process or some kind of fact-checking?

Accuracy:

  • Can the information be verified? 
  • Are sources cited? 
  • Is the information factual or opinion based?
  • Is the information biased? 
  • Is the information free of grammatical or spelling errors?

Purpose:

  • What is the motive of providing the information: to inform? to sell? to persuade? to entertain?
  • Does the author or publisher make their intentions clear? Who is the intended audience?

Evaluating Web Sources

Most web pages are not fact-checked, so it's especially important to evaluate information you find on the web. Many articles on websites are fine sources of information, but many others are distorted or made up. Check out our media evaluation guide for tips on evaluating what you see on social media, news sites, blogs, and so on.

This three-part video series, in which university students, historians, and pro fact-checkers go head-to-head in checking out online information, is also helpful.



Evaluating Research – Process, Examples and Methods


Definition:

Evaluating Research refers to the process of assessing the quality, credibility, and relevance of a research study or project. This involves examining the methods, data, and results of the research in order to determine its validity, reliability, and usefulness. Evaluating research can be done by both experts and non-experts in the field, and involves critical thinking, analysis, and interpretation of the research findings.

Research Evaluation Process

The process of evaluating research typically involves the following steps:

Identify the Research Question

The first step in evaluating research is to identify the research question or problem that the study is addressing. This will help you to determine whether the study is relevant to your needs.

Assess the Study Design

The study design refers to the methodology used to conduct the research. You should assess whether the study design is appropriate for the research question and whether it is likely to produce reliable and valid results.

Evaluate the Sample

The sample refers to the group of participants or subjects who are included in the study. You should evaluate whether the sample size is adequate and whether the participants are representative of the population under study.

Review the Data Collection Methods

You should review the data collection methods used in the study to ensure that they are valid and reliable. This includes assessing both the measures and the procedures used to collect the data.

Examine the Statistical Analysis

Statistical analysis refers to the methods used to analyze the data. You should examine whether the statistical analysis is appropriate for the research question and whether it is likely to produce valid and reliable results.

Assess the Conclusions

You should evaluate whether the data support the conclusions drawn from the study and whether they are relevant to the research question.

Consider the Limitations

Finally, you should consider the limitations of the study, including any potential biases or confounding factors that may have influenced the results.

Evaluating Research Methods

Common methods for evaluating research are as follows:

  • Peer review: Peer review is a process where experts in the field review a study before it is published. This helps ensure that the study is accurate, valid, and relevant to the field.
  • Critical appraisal : Critical appraisal involves systematically evaluating a study based on specific criteria. This helps assess the quality of the study and the reliability of the findings.
  • Replication : Replication involves repeating a study to test the validity and reliability of the findings. This can help identify any errors or biases in the original study.
  • Meta-analysis: Meta-analysis is a statistical method that combines the results of multiple studies to provide a more comprehensive understanding of a particular topic (a brief formula sketch follows this list). This can help identify patterns or inconsistencies across studies.
  • Consultation with experts : Consulting with experts in the field can provide valuable insights into the quality and relevance of a study. Experts can also help identify potential limitations or biases in the study.
  • Review of funding sources: Examining the funding sources of a study can help identify any potential conflicts of interest or biases that may have influenced the study design or interpretation of results.
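As a brief sketch of how the combining works, one common approach (the fixed-effect, inverse-variance method) weights each study's effect estimate by the inverse of its variance, so that more precise studies count for more:

$\hat{\theta} = \frac{\sum_i w_i \hat{\theta}_i}{\sum_i w_i}, \qquad w_i = \frac{1}{\hat{\sigma}_i^2}$

Here $\hat{\theta}_i$ is the effect estimate from study $i$ and $\hat{\sigma}_i^2$ is its estimated variance; random-effects models extend this idea by adding a between-study variance component.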

Example of Evaluating Research

Here is a sample research evaluation for students:

Title of the Study: The Effects of Social Media Use on Mental Health among College Students

Sample Size: 500 college students

Sampling Technique : Convenience sampling

  • Sample Size: The sample size of 500 college students is a moderate sample size, which could be considered representative of the college student population. However, it would be more representative if the sample size were larger or if a random sampling technique were used (see the illustrative margin-of-error calculation after this list).
  • Sampling Technique : Convenience sampling is a non-probability sampling technique, which means that the sample may not be representative of the population. This technique may introduce bias into the study since the participants are self-selected and may not be representative of the entire college student population. Therefore, the results of this study may not be generalizable to other populations.
  • Participant Characteristics: The study does not provide any information about the demographic characteristics of the participants, such as age, gender, race, or socioeconomic status. This information is important because social media use and mental health may vary among different demographic groups.
  • Data Collection Method: The study used a self-administered survey to collect data. Self-administered surveys may be subject to response bias and may not accurately reflect participants’ actual behaviors and experiences.
  • Data Analysis: The study used descriptive statistics and regression analysis to analyze the data. Descriptive statistics provide a summary of the data, while regression analysis is used to examine the relationship between two or more variables. However, the study did not provide information about the statistical significance of the results or the effect sizes.
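As a rough illustration of what a sample of 500 buys you, assume (hypothetically, and unlike this study) a simple random sample and a proportion near p = 0.5, the worst case. The approximate 95% margin of error would be:

$ME \approx 1.96\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{500}} \approx 0.044$

That is about ±4.4 percentage points under ideal conditions; because convenience sampling violates the random-sampling assumption, the real uncertainty in the study's estimates is larger and harder to quantify.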

Overall, while the study provides some insights into the relationship between social media use and mental health among college students, the use of a convenience sampling technique and the lack of information about participant characteristics limit the generalizability of the findings. In addition, the use of self-administered surveys may introduce bias into the study, and the lack of information about the statistical significance of the results limits the interpretation of the findings.

Note: The example above is only a sample for students. Do not copy and paste it directly into your assignment; do your own research for academic purposes.

Applications of Evaluating Research

Here are some of the applications of evaluating research:

  • Identifying reliable sources : By evaluating research, researchers, students, and other professionals can identify the most reliable sources of information to use in their work. They can determine the quality of research studies, including the methodology, sample size, data analysis, and conclusions.
  • Validating findings: Evaluating research can help to validate findings from previous studies. By examining the methodology and results of a study, researchers can determine if the findings are reliable and if they can be used to inform future research.
  • Identifying knowledge gaps: Evaluating research can also help to identify gaps in current knowledge. By examining the existing literature on a topic, researchers can determine areas where more research is needed, and they can design studies to address these gaps.
  • Improving research quality : Evaluating research can help to improve the quality of future research. By examining the strengths and weaknesses of previous studies, researchers can design better studies and avoid common pitfalls.
  • Informing policy and decision-making : Evaluating research is crucial in informing policy and decision-making in many fields. By examining the evidence base for a particular issue, policymakers can make informed decisions that are supported by the best available evidence.
  • Enhancing education : Evaluating research is essential in enhancing education. Educators can use research findings to improve teaching methods, curriculum development, and student outcomes.

Purpose of Evaluating Research

Here are some of the key purposes of evaluating research:

  • Determine the reliability and validity of research findings : By evaluating research, researchers can determine the quality of the study design, data collection, and analysis. They can determine whether the findings are reliable, valid, and generalizable to other populations.
  • Identify the strengths and weaknesses of research studies: Evaluating research helps to identify the strengths and weaknesses of research studies, including potential biases, confounding factors, and limitations. This information can help researchers to design better studies in the future.
  • Inform evidence-based decision-making: Evaluating research is crucial in informing evidence-based decision-making in many fields, including healthcare, education, and public policy. Policymakers, educators, and clinicians rely on research evidence to make informed decisions.
  • Identify research gaps : By evaluating research, researchers can identify gaps in the existing literature and design studies to address these gaps. This process can help to advance knowledge and improve the quality of research in a particular field.
  • Ensure research ethics and integrity : Evaluating research helps to ensure that research studies are conducted ethically and with integrity. Researchers must adhere to ethical guidelines to protect the welfare and rights of study participants and to maintain the trust of the public.

Characteristics to Evaluate in Research

When evaluating research, consider the following characteristics:

  • Research question/hypothesis: A good research question or hypothesis should be clear, concise, and well-defined. It should address a significant problem or issue in the field and be grounded in relevant theory or prior research.
  • Study design: The research design should be appropriate for answering the research question and be clearly described in the study. The study design should also minimize bias and confounding variables.
  • Sampling : The sample should be representative of the population of interest and the sampling method should be appropriate for the research question and study design.
  • Data collection : The data collection methods should be reliable and valid, and the data should be accurately recorded and analyzed.
  • Results : The results should be presented clearly and accurately, and the statistical analysis should be appropriate for the research question and study design.
  • Interpretation of results : The interpretation of the results should be based on the data and not influenced by personal biases or preconceptions.
  • Generalizability: The study findings should be generalizable to the population of interest and relevant to other settings or contexts.
  • Contribution to the field : The study should make a significant contribution to the field and advance our understanding of the research question or issue.

Advantages of Evaluating Research

Evaluating research has several advantages, including:

  • Ensuring accuracy and validity : By evaluating research, we can ensure that the research is accurate, valid, and reliable. This ensures that the findings are trustworthy and can be used to inform decision-making.
  • Identifying gaps in knowledge : Evaluating research can help identify gaps in knowledge and areas where further research is needed. This can guide future research and help build a stronger evidence base.
  • Promoting critical thinking: Evaluating research requires critical thinking skills, which can be applied in other areas of life. By evaluating research, individuals can develop their critical thinking skills and become more discerning consumers of information.
  • Improving the quality of research : Evaluating research can help improve the quality of research by identifying areas where improvements can be made. This can lead to more rigorous research methods and better-quality research.
  • Informing decision-making: By evaluating research, we can make informed decisions based on the evidence. This is particularly important in fields such as medicine and public health, where decisions can have significant consequences.
  • Advancing the field : Evaluating research can help advance the field by identifying new research questions and areas of inquiry. This can lead to the development of new theories and the refinement of existing ones.

Limitations of Evaluating Research

Limitations of Evaluating Research are as follows:

  • Time-consuming: Evaluating research can be time-consuming, particularly if the study is complex or requires specialized knowledge. This can be a barrier for individuals who are not experts in the field or who have limited time.
  • Subjectivity : Evaluating research can be subjective, as different individuals may have different interpretations of the same study. This can lead to inconsistencies in the evaluation process and make it difficult to compare studies.
  • Limited generalizability: The findings of a study may not be generalizable to other populations or contexts. This limits the usefulness of the study and may make it difficult to apply the findings to other settings.
  • Publication bias: Research that does not find significant results may be less likely to be published, which can create a bias in the published literature. This can limit the amount of information available for evaluation.
  • Lack of transparency: Some studies may not provide enough detail about their methods or results, making it difficult to evaluate their quality or validity.
  • Funding bias : Research funded by particular organizations or industries may be biased towards the interests of the funder. This can influence the study design, methods, and interpretation of results.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer


12.7 Evaluation: Effectiveness of Research Paper

Learning Outcomes

By the end of this section, you will be able to:

  • Identify common formats and design features for different kinds of texts.
  • Implement style and language consistent with argumentative research writing while maintaining your own voice.
  • Determine how genre conventions for structure, paragraphing, tone, and mechanics vary.

When drafting, you follow your strongest research interests and try to answer the question on which you have settled. However, sometimes what began as a paper about one thing becomes a paper about something else. Your peer review partner will have helped you identify any such issues and given you some insight regarding revision. Another strategy is to compare and contrast your draft with a grading rubric similar to the one your instructor will use. It is a good idea to consult this rubric frequently throughout the drafting process.

The rubric assesses each draft on three traits: Critical Language Awareness, Clarity and Coherence, and Rhetorical Choices. The five score descriptors below run from the strongest performance to the weakest.

The text always adheres to the “Editing Focus” of this chapter: integrating sources and quotations appropriately as discussed in Section 12.6. The text also shows ample evidence of the writer’s intent to consciously meet or challenge conventional expectations in rhetorically effective ways. The writer’s position or claim on a debatable issue is stated clearly in the thesis and expertly supported with credible researched evidence. Ideas are clearly presented in well-developed paragraphs with clear topic sentences and relate directly to the thesis. Headings and subheadings clarify organization, and appropriate transitions link ideas. The writer maintains an objective voice in a paper that reflects an admirable balance of source information, analysis, synthesis, and original thought. Quotations function appropriately as support and are thoughtfully edited to reveal their main points. The writer fully addresses counterclaims and is consistently aware of the audience in terms of language use and background information presented.

The text usually adheres to the “Editing Focus” of this chapter: integrating sources and quotations appropriately as discussed in Section 12.6. The text also shows some evidence of the writer’s intent to consciously meet or challenge conventional expectations in rhetorically effective ways. The writer’s position or claim on a debatable issue is stated clearly in the thesis and supported with credible researched evidence. Ideas are clearly presented in well-developed paragraphs with topic sentences and usually relate directly to the thesis. Some headings and subheadings clarify organization, and sufficient transitions link ideas. The writer maintains an objective voice in a paper that reflects a balance of source information, analysis, synthesis, and original thought. Quotations usually function as support, and most are edited to reveal their main points. The writer usually addresses counterclaims and is aware of the audience in terms of language use and background information presented.

The text generally adheres to the “Editing Focus” of this chapter: integrating sources and quotations appropriately as discussed in Section 12.6. The text also shows limited evidence of the writer’s intent to consciously meet or challenge conventional expectations in rhetorically effective ways. The writer’s position or claim on a debatable issue is stated in the thesis and generally supported with some credible researched evidence. Ideas are presented in moderately developed paragraphs. Most, if not all, have topic sentences and relate to the thesis. Some headings and subheadings may clarify organization, but their use may be inconsistent, inappropriate, or insufficient. More transitions would improve coherence. The writer generally maintains an objective voice in a paper that reflects some balance of source information, analysis, synthesis, and original thought, although imbalance may well be present. Quotations generally function as support, but some are not edited to reveal their main points. The writer may attempt to address counterclaims but may be inconsistent in awareness of the audience in terms of language use and background information presented.

The text occasionally adheres to the “Editing Focus” of this chapter: integrating sources and quotations appropriately as discussed in Section 12.6. The text also shows emerging evidence of the writer’s intent to consciously meet or challenge conventional expectations in rhetorically effective ways. The writer’s position or claim on a debatable issue is not clearly stated in the thesis, nor is it sufficiently supported with credible researched evidence. Some ideas are presented in paragraphs, but they are unrelated to the thesis. Some headings and subheadings may clarify organization, while others may not; transitions are either inappropriate or insufficient to link ideas. The writer sometimes maintains an objective voice in a paper that lacks a balance of source information, analysis, synthesis, and original thought. Quotations usually do not function as support, often replacing the writer’s ideas or are not edited to reveal their main points. Counterclaims are addressed haphazardly or ignored. The writer shows inconsistency in awareness of the audience in terms of language use and background information presented.

The text does not adhere to the “Editing Focus” of this chapter: integrating sources and quotations appropriately as discussed in Section 12.6. The text also shows little to no evidence of the writer’s intent to consciously meet or challenge conventional expectations in rhetorically effective ways. The writer’s position or claim on a debatable issue is neither clearly stated in the thesis nor sufficiently supported with credible researched evidence. Some ideas are presented in paragraphs. Few, if any, have topic sentences, and they barely relate to the thesis. Headings and subheadings are either missing or unhelpful as organizational tools. Transitions generally are missing or inappropriate. The writer does not maintain an objective voice in a paper that reflects little to no balance of source information, analysis, synthesis, and original thought. Quotations may function as support, but most are not edited to reveal their main points. The writer may attempt to address counterclaims and may be inconsistent in awareness of the audience in terms of language use and background information presented.


Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute OpenStax.

Access for free at https://openstax.org/books/writing-guide/pages/1-unit-introduction
  • Authors: Michelle Bachelor Robinson, Maria Jerskey, featuring Toby Fulwiler
  • Publisher/website: OpenStax
  • Book title: Writing Guide with Handbook
  • Publication date: Dec 21, 2021
  • Location: Houston, Texas
  • Book URL: https://openstax.org/books/writing-guide/pages/1-unit-introduction
  • Section URL: https://openstax.org/books/writing-guide/pages/12-7-evaluation-effectiveness-of-research-paper

© Dec 19, 2023 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License . The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.

U.S. flag

An official website of the United States government

The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.

The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.

  • Publications
  • Account settings

Preview improvements coming to the PMC website in October 2024. Learn More or Try it out now .

Perspectives in Clinical Research (Perspect Clin Res), vol. 12(2), Apr-Jun 2021

Critical appraisal of published research papers – A reinforcing tool for research methodology: Questionnaire-based study

Snehalata Gajbhiye

Department of Pharmacology and Therapeutics, Seth GS Medical College and KEM Hospital, Mumbai, Maharashtra, India

Raakhi Tripathi

Urwashi Parmar, Nishtha Khatri, Anirudha Potey

1 Department of Clinical Trials, Serum Institute of India, Pune, Maharashtra, India

Background and Objectives:

Critical appraisal of published research papers is routinely conducted as a journal club (JC) activity in pharmacology departments of various medical colleges across Maharashtra, and it forms an important part of their postgraduate curriculum. The objective of this study was to evaluate the perception of pharmacology postgraduate students and teachers toward the use of critical appraisal as a reinforcing tool for research methodology. Evaluation of the performance of the in-house pharmacology postgraduate students in the critical appraisal activity constituted the secondary objective of the study.

Materials and Methods:

The study was conducted in two parts. In Part I, a cross-sectional questionnaire-based evaluation of perceptions toward the critical appraisal activity was carried out among pharmacology postgraduate students and teachers. In Part II of the study, JC score sheets of 2nd- and 3rd-year pharmacology students over the past 4 years were evaluated.

Results:

One hundred and twenty-seven postgraduate students and 32 teachers participated in Part I of the study. About 118 (92.9%) students and 28 (87.5%) faculties considered the critical appraisal activity to be beneficial for the students. JC score sheet assessments suggested that there was a statistically significant improvement in overall scores obtained by postgraduate students (n = 25) in their last JC as compared to the first JC.

Conclusion:

Journal article criticism is a crucial tool to develop a research attitude among postgraduate students. Participation in the JC activity led to the improvement in the skill of critical appraisal of published research articles, but this improvement was not educationally relevant.

INTRODUCTION

Critical appraisal of a research paper is defined as “The process of carefully and systematically examining research to judge its trustworthiness, value and relevance in a particular context.”[ 1 ] Since scientific literature is rapidly expanding with more than 12,000 articles being added to the MEDLINE database per week,[ 2 ] critical appraisal is very important to distinguish scientifically useful and well-written articles from imprecise articles.

Educational authorities like the Medical Council of India (MCI) and the Maharashtra University of Health Sciences (MUHS) have stated in the pharmacology postgraduate curriculum that students must critically appraise research papers. To impart training in these skills, MCI and MUHS have emphasized the introduction of a journal club (JC) activity for postgraduate (PG) students, wherein students review a published original research paper and state the merits and demerits of the paper. Abiding by this, pharmacology departments across various medical colleges in Maharashtra organize JCs at frequent intervals,[3,4] and students discuss varied aspects of the article with the teaching faculty of the department.[5] Moreover, this activity carries a significant weightage of marks in the pharmacology university examination. As postgraduate students attend this activity throughout their 3-year tenure, the authors perceived that this activity of critical appraisal of research papers could emerge as a tool for reinforcing knowledge of research methodology. Hence, a questionnaire-based study was designed to find out the perceptions of PG students and teachers.

There have been studies that have laid emphasis on the procedure of conducting critical appraisal of research papers and its application into clinical practice.[ 6 , 7 ] However, there are no studies that have evaluated how well students are able to critically appraise a research paper. The Department of Pharmacology and Therapeutics at Seth GS Medical College has developed an evaluation method to score the PG students on this skill and this tool has been implemented for the last 5 years. Since there are no research data available on the performance of PG Pharmacology students in JC, capturing the critical appraisal activity evaluation scores of in-house PG students was chosen as another objective of the study.

MATERIALS AND METHODS

Description of the journal club activity

JC is conducted in the Department of Pharmacology and Therapeutics at Seth GS Medical College once every 2 weeks. During the JC activity, postgraduate students critically appraise published original research articles on their completeness and aptness in terms of the following: study title, rationale, objectives, study design, methodology (study population, inclusion/exclusion criteria, duration, intervention, and safety/efficacy variables), randomization, blinding, statistical analysis, results, discussion, conclusion, references, and abstract. All postgraduate students attend this activity, while one of them critically appraises the article, having received the research paper from one of the faculty members 5 days before the day of the JC. Other faculty members also attend these sessions and facilitate the discussions. As the student comments on various sections of the paper, the same predecided faculty member who assigned the article (a single assessor) evaluates the student out of a total score of 100, split per section as follows: Introduction – 20 marks, Methodology – 20 marks, Discussion – 20 marks, Results and Conclusion – 20 marks, References – 10 marks, and Title, Abstract, and Keywords – 10 marks. However, there are no standard operating procedures to assess the performance of students at the JC.
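To make the mark split concrete, here is a minimal Python sketch of the scheme described above; the SECTION_MARKS mapping and score_journal_club helper are illustrative names invented for this example, not tooling used by the department.

```python
# Hypothetical sketch of the journal club mark scheme described above.
# SECTION_MARKS and score_journal_club are illustrative names, not from the paper.

SECTION_MARKS = {
    "Introduction": 20,
    "Methodology": 20,
    "Results and conclusion": 20,
    "Discussion": 20,
    "References": 10,
    "Title, abstract, and keywords": 10,
}

assert sum(SECTION_MARKS.values()) == 100  # the scheme totals 100 marks


def score_journal_club(section_scores: dict) -> int:
    """Sum a student's per-section marks, checking each stays within its maximum."""
    total = 0
    for section, maximum in SECTION_MARKS.items():
        obtained = section_scores.get(section, 0)
        if not 0 <= obtained <= maximum:
            raise ValueError(f"{section}: score {obtained} outside 0-{maximum}")
        total += obtained
    return total


# Example: 14/20 in each major section and 7/10 in the two minor ones -> 70/100
print(score_journal_club({
    "Introduction": 14, "Methodology": 14, "Results and conclusion": 14,
    "Discussion": 14, "References": 7, "Title, abstract, and keywords": 7,
}))
```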

Methodology

After seeking permission from the Institutional Ethics Committee, the study was conducted in two parts. Part I consisted of a cross-sectional questionnaire-based survey conducted from October 2016 to September 2017. A questionnaire to evaluate perceptions toward the activity of critical appraisal of published papers as a research methodology reinforcing tool was developed by the study investigators. The questionnaire consisted of 20 questions: 14 questions [see Figure 1] graded on a 3-point Likert scale (agree, neutral, and disagree), 1 multiple-choice selection question, 2 dichotomous questions, 1 semi-open-ended question, and 2 open-ended questions. Content validation for this questionnaire was carried out with the help of eight pharmacology teachers. The content validity ratio (CVR) per item was calculated, and each item in the questionnaire had a CVR of >0.75.[8] The perception questionnaire was either e-mailed or sent through WhatsApp to PG pharmacology students and teaching faculty in pharmacology departments at various medical colleges across Maharashtra. Informed consent was obtained by e-mail from all the participants.
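The paper reports a per-item content validity ratio above 0.75 but does not state the formula used. A common choice is Lawshe's CVR, defined as (n_e − N/2) / (N/2), where n_e is the number of experts rating an item essential and N is the total number of experts. Below is a minimal sketch under that assumption; the content_validity_ratio function name is hypothetical.

```python
# Hypothetical sketch: Lawshe's content validity ratio (assumed formula, not stated in the paper).

def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2); ranges from -1 to +1."""
    half = n_experts / 2
    return (n_essential - half) / half


# With the 8 validating teachers mentioned above:
print(content_validity_ratio(7, 8))  # 0.75 -> 7 of 8 experts rate the item essential
print(content_validity_ratio(8, 8))  # 1.0  -> all 8 experts agree
```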

Figure 1. Graphical representation of the percentage of students/teachers who agreed that critical appraisal of research helped them improve their knowledge on various aspects of research, perceived that faculty participation is important in this activity, and considered the critical appraisal activity beneficial for students. The numbers adjacent to the bars indicate the raw number of students/faculty who agreed, while brackets indicate percentages.

Part II of the study consisted of evaluating the performance of postgraduate students in the skill of critical appraisal of published papers. For this purpose, marks obtained by 2nd- and 3rd-year residents during JC sessions conducted over a period of 4 years, from October 2013 to September 2017, were recorded and analyzed. No data on personal identifiers of the students were captured.

Statistical analysis

Marks obtained by postgraduate students in their first and last JC were compared using the Wilcoxon signed-rank test, while marks obtained by 2nd- and 3rd-year postgraduate students were compared using the Mann–Whitney test, since the data were nonparametric. These statistical analyses were performed using GraphPad Prism statistical software (Version 7.0d; San Diego, California, USA). Data obtained from the perception questionnaire were entered into a Microsoft Excel sheet and expressed as frequencies (percentages) using descriptive statistics.
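The study used GraphPad Prism, but for readers who want to reproduce the same kinds of comparisons, the two tests can also be run with SciPy. The scores below are invented placeholders, not the study's data; this is a sketch only.

```python
# Illustration only: placeholder scores, not the study's data.
from scipy import stats

# Paired comparison (same students, first vs. last journal club): Wilcoxon signed-rank test
first_jc = [62, 70, 68, 75, 66, 71, 64, 73, 69, 65]
last_jc = [68, 74, 70, 80, 69, 75, 66, 78, 72, 70]
w_stat, w_p = stats.wilcoxon(first_jc, last_jc)
print(f"Wilcoxon signed-rank: statistic={w_stat}, p={w_p:.3f}")

# Unpaired comparison (2nd-year vs. 3rd-year students): Mann-Whitney U test
second_year = [70, 72, 68, 75, 71, 69, 73]
third_year = [71, 74, 69, 76, 72, 70, 75]
u_stat, u_p = stats.mannwhitneyu(second_year, third_year, alternative="two-sided")
print(f"Mann-Whitney U: statistic={u_stat}, p={u_p:.3f}")
```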

RESULTS

Participants who answered all items of the questionnaire were considered complete responders, and only completed questionnaires were analyzed. The questionnaire was sent through e-mail to 100 students and through WhatsApp to 68 students. Of the 100 students who received the questionnaire through e-mail, 79 responded completely, 8 were incomplete responders, and 13 did not respond. Of the 68 students who received the questionnaire through WhatsApp, 48 responded completely, 6 gave an incomplete response, and 14 did not respond. Hence, of the 168 postgraduate students who received the questionnaire, 127 responded completely (student response rate for analysis = 75.6%). The questionnaire was e-mailed to 33 faculty members and sent through WhatsApp to 25. Of the 33 faculty members who received the questionnaire through e-mail, 19 responded completely, 5 responded incompletely, and 9 did not respond at all. Of the 25 who received the questionnaire through WhatsApp, 13 responded completely, 3 were incomplete responders, and 9 did not respond at all. Hence, of a total of 58 faculty members who were contacted, 32 responded completely (faculty response rate for analysis = 55%). For Part I of the study, responses on the perception questionnaire from 127 postgraduate students and 32 postgraduate teachers were recorded and analyzed. None of the faculty who participated in the validation of the questionnaire participated in the survey. The numbers of responses obtained region-wise (Mumbai region and rest of Maharashtra region) are depicted in Table 1.

Table 1. Region-wise distribution of responses

Region | Students (n=127) | Faculty (n=32)
Mumbai colleges | 58 (45.7) | 18 (56.3)
Rest of Maharashtra colleges | 69 (54.3) | 14 (43.7)

Number of responses obtained from students/faculty belonging to Mumbai colleges and the rest of Maharashtra colleges. Brackets indicate percentages.

As per the data obtained on the Likert scale questions, 102 (80.3%) students and 29 (90.6%) teachers agreed that critical appraisal trains students in doing a review of literature before selecting a particular research topic. The majority of participants, i.e., 104 (81.9%) students and 29 (90.6%) teachers, also believed that the activity increases students' knowledge regarding various experimental evaluation techniques. Moreover, 112 (88.2%) students and 27 (84.4%) faculty considered that the critical appraisal activity results in improved skills in writing and understanding the methodology section of research articles in terms of inclusion/exclusion criteria, endpoints, and safety/efficacy variables. About 103 (81.1%) students and 24 (75%) teachers perceived that this activity results in refinement of the students' research work. About 118 (92.9%) students and 28 (87.5%) faculty considered the critical appraisal activity to be beneficial for the students. Responses to the 14 individual Likert scale items of the questionnaire are depicted in Figure 1.

With respect to the multiple choice selection question, 66 (52%) students and 16 (50%) teachers opined that faculty should select the paper, 53 (41.7%) students and 9 (28.1%) teachers stated that the papers should be selected by the presenting student himself/herself, while 8 (6.3%) students and 7 (21.9%) teachers expressed that some other student should select the paper to be presented at the JC.

The responses to the dichotomous questions were as follows: the majority of students, that is, 109 (85.8%), and 23 (71.9%) teachers perceived that a standard checklist for article review should be given to students before critical appraisal of a journal article. The open-ended questions of the questionnaire invited suggestions from the participants regarding ways of getting trained in critical appraisal skills and of improving the JC activity. Some of the suggestions given by faculty were as follows: increasing the frequency of the JC activity, discussion of cited articles and new guidelines related to them, selecting all types of articles for criticism rather than only randomized controlled trials, and regular yearly exams on article criticism. Students stated that regular and frequent article criticism activity, practice in writing a letter to the editor after criticism, active participation by peers and faculty, increasing the weightage of marks for critical appraisal of papers in university examinations (at present, marks are 50 out of 400), and formal training in research criticism from the 1st year of postgraduation could improve the critical appraisal program.

In Part II of this study, performance of the students on the skill of critical appraisal of papers was evaluated. Complete data of the first and last JC scores of a total of 25 students of the department were available, and when these scores were compared, it was seen that there was a statistically significant improvement in the overall scores ( P = 0.04), as well as in the scores obtained in methodology ( P = 0.03) and results section ( P = 0.02). This is depicted in Table 2 . Although statistically significant, the differences in scores in the methodology section, results section, and overall scores were 1.28/20, 1.28/20, and 4.36/100, respectively, amounting to 5.4%, 5.4%, and 4.36% higher scores in the last JC, which may not be considered educationally relevant (practically significant). The quantum of difference that would be considered practically significant was not decided a priori .

Table 2. Comparison of marks obtained by pharmacology residents in their first and last journal club

Section | First JC (n=25): Mean±SD | Median (IQR) | Last JC (n=25): Mean±SD | Median (IQR) | P value
Introduction (maximum: 20 marks) | 13.48±2.52 | 14 (12-16) | 14.28±2.32 | 14 (13-16) | 0.22
Methodology (maximum: 20 marks) | 13.36±3.11 | 14 (12-16) | 14.64±2.40 | 14 (14-16.5) | 0.03*
Results and conclusion (maximum: 20 marks) | 13.60±2.42 | 14 (12-15.5) | 14.88±2.64 | 15 (13.5-16.5) | 0.02*
Discussion (maximum: 20 marks) | 13.44±3.20 | 14 (11-16) | 14.16±2.78 | 14 (12.5-16) | 0.12
References (maximum: 10 marks) | 7.12±1.20 | 7 (6.5-8) | 7.06±1.28 | 7 (6-8) | 0.80
Title, abstract, and keywords (maximum: 10 marks) | 7.44±0.92 | 7 (7-8) | 7.78±1.12 | 8 (7-9) | 0.17
Overall score (maximum: 100 marks) | 68.44±11.39 | 72 (64-76) | 72.80±11.32 | 71 (68-82.5) | 0.04*

Marks are represented as mean±SD and median (IQR); P values are from the Wilcoxon signed-rank test. The maximum marks obtainable in each section are shown in parentheses. *Statistically significant (P<0.05). IQR=interquartile range, SD=standard deviation.

Scores of two groups, one consisting of 2nd-year postgraduate students (n = 44) and the other of 3rd-year postgraduate students (n = 32), were compared and revealed no statistically significant difference in overall score (P = 0.84). This is depicted in Table 3. Since the difference in the overall scores was a meager 0.84/100 (0.84%), it cannot be considered practically significant.

Table 3. Comparison of marks obtained by 2nd- and 3rd-year pharmacology residents in the activity of critical appraisal of research articles

Section | 2nd-year students (n=44): Mean±SD | Median (IQR) | 3rd-year students (n=32): Mean±SD | Median (IQR) | P value (Mann-Whitney test)
Introduction (maximum: 20 marks) | 14.09±2.41 | 14 (13-16) | 14.28±2.14 | 14 (13-16) | 0.7527
Methodology (maximum: 20 marks) | 14.30±2.90 | 14.5 (13-16) | 14.41±2.24 | 14 (13-16) | 0.8385
Results and conclusion (maximum: 20 marks) | 14.09±2.44 | 14 (12.5-16) | 14.59±2.61 | 14.5 (13-16) | 0.4757
Discussion (maximum: 20 marks) | 13.86±2.73 | 14 (12-16) | 14.16±2.71 | 14.5 (12.5-16) | 0.5924
References (maximum: 10 marks) | 7.34±1.16 | 8 (7-8) | 7.05±1.40 | 7 (6-8) | 0.2551
Title, abstract, and keywords (maximum: 10 marks) | 7.82±0.90 | 8 (7-8.5) | 7.83±1.11 | 8 (7-8.5) | 0.9642
Overall score (maximum: 100 marks) | 71.50±10.71 | 71.5 (66.5-79.5) | 72.34±10.85 | 73 (66-79.5) | 0.8404

Marks are represented as mean±SD and median (IQR); P values are from the Mann-Whitney test. The maximum marks obtainable in each section are shown in parentheses. P<0.05 was considered statistically significant. IQR=interquartile range, SD=standard deviation.

DISCUSSION

The present study gauged the perceptions of pharmacology postgraduate students and teachers toward the use of the critical appraisal activity as a reinforcing tool for research methodology. Both students and faculty (>50%) believed that the critical appraisal activity increases students' knowledge of principles of ethics, experimental evaluation techniques, CONSORT guidelines, statistical analysis, the concept of conflict of interest, and current trends and recent advances in pharmacology; trains them in doing a review of literature; and improves skills in protocol writing and referencing. In a study conducted by Crank-Patton et al., a survey of 278 general surgery program directors was carried out, and more than 50% indicated that JC was important to their training program.[9]

The grading template used in Part II of the study was based on the IMRaD structure. Hence, equal weightage was given to the Introduction, Methodology, Results, and Discussion sections, and lesser weightage was given to the References and Title, Abstract, and Keywords sections.[10] When the scores obtained by 25 students in their first and last JC were evaluated, there was a statistically significant improvement in the overall scores of the students in their last JC. However, the meager improvement in scores cannot be considered educationally relevant, as the authors expected the students to score >90% for the improvement to be considered educationally impactful. The above findings suggest that even though participation in the JC activity led to a steady increase in students' performance (~4%), the increment was not as expected. In addition, the students did not demonstrate excellent performance (>90%), with average scores being around 72% even in the last JC. This can probably be explained by the fact that students perform this activity in a routine setting and not in an examination setting. Unlike the scenario in an examination, students were aware that even if they performed at a mediocre level, there would be no repercussions.

A separate comparison of scores obtained by 44 students in their 2nd year and 32 students in their 3rd year of postgraduation was also done. The number of student evaluation sheets reviewed for this analysis was greater than the number reviewed to compare first and last JC scores. This is explained by the fact that many students were still in their 2nd year when this analysis was done, and the score data for their last JC, which would take place in the 3rd year, were not available. In addition, a few students were asked to present at the JC multiple times during the 2nd/3rd year of their postgraduation.

While evaluating the critical appraisal scores obtained by 2nd- and 3rd-year postgraduate students, it was found that although the 3rd-year students had a mean overall score greater than the 2nd-year students, this difference was not statistically significant. During the 1st year of the MD Pharmacology course, students at the study center attend JC once every 2 weeks. Even though the 1st-year students do not themselves present at the JC, they listen to and observe the criticism points stated by senior peers presenting at the JC, and thereby acquire a substantial amount of the knowledge required to critically appraise papers. By the time they become 2nd-year students, they are already well versed in the program, and this could have led to the similar overall mean scores of the 2nd-year students (71.50 ± 10.71) and 3rd-year students (72.34 ± 10.85). This finding suggests that attentive listening is as important as active participation in the JC. Moreover, although students are well acquainted with the process of criticism by their 3rd year, there is certainly scope for improvement in terms of the mean overall scores.

Similar results were obtained in a study conducted by Stern et al., in which 62 students in the internal medicine program at the New England Medical Center were asked to respond to a questionnaire, evaluate a sample article, and complete a self-assessment of competence in evaluation of research. Twenty-eight residents returned the questionnaire, and the composite score for the residents' objective assessment was not significantly correlated with postgraduate year or self-assessed critical appraisal skill.[11]

The article criticism activity provides students with practical experience of techniques taught in research methodology workshops. However, it should be supplemented with activities that assess improvement in designing and presenting studies, such as protocol and paper writing. Thus, critical appraisal plays a significant role in reinforcing good research practices among the new generation of physicians. Moreover, critical appraisal is an integral part of PG assessment, and although the current format of conducting JCs did not demonstrate a clinically meaningful improvement, the authors believe that it is important to continue this activity with certain modifications suggested by students who participated in this study. Students suggested that an increase in the frequency of the critical appraisal activity, accompanied by active participation by peers and faculty, could help in the betterment of this activity. This should be brought to the attention of the faculty, as students seem interested to learn. Critical appraisal should be a two-way teaching-learning process between students and faculty, not merely a requirement for satisfying the students' eligibility criteria for postgraduate university examinations. This activity is not only for the trainee doctors but is also a part of the overall faculty development program.[12]

In the present era, JCs have been used as a tool not only to teach critical appraisal skills but also to teach other necessary aspects such as research design, medical statistics, clinical epidemiology, and clinical decision-making.[13,14] A study conducted by Khan in 2013 suggested that the success of a JC program can be ensured if institutes develop defined JC objectives for developing students' learning capability and cultivate more skilled faculty.[15] A good JC is believed to facilitate relevant, meaningful scientific discussion and evaluation of research updates that will eventually benefit patient care.[12]

Although there is a lot of literature emphasizing the importance of JCs, there is a lack of studies that have evaluated the outcome of such activity. One such study, conducted by Ibrahim et al., assessed the importance of critical appraisal as an activity among surgical trainees in Nigeria. They reported that 92.42% of trainees considered the activity to be important or very important, and 48% of trainees stated that the activity helped in improving literature search.[16]

This study is unique since it is the first of its kind to evaluate how well students are able to critically appraise a research paper. Moreover, the study has taken into consideration the opinions of students as well as faculty, unlike the previous literature, which has laid emphasis only on students' perceptions. A limitation of this study is that the sample size for faculty was smaller than that for students, as it was not possible to convince distant faculty in other cities to complete the survey. Besides, there may be variation in the manner of conduct of the critical appraisal activity in pharmacology departments across the various medical colleges in the country. Another limitation of this study was that a single assessor graded a single student during one particular JC. Nevertheless, each student presented at multiple JCs and thereby came across multiple assessors. Since the articles addressed at different JCs were disparate, interobserver variability was not taken into account in this study. Furthermore, the authors did not make an a priori decision on the quantum of increase in scores that would be considered educationally meaningful.

CONCLUSION

Pharmacology students and teachers acknowledge the role of critical appraisal in improving the ability to understand the crucial concepts of research methodology and research conduct. In our institute, participation in the JC activity led to an improvement in the skill of critical appraisal of published research articles among the pharmacology postgraduate students. However, this improvement was not educationally relevant. The scores obtained by final-year postgraduate students in this activity were nearly 72%, indicating that there is still scope for improvement in this skill.

Financial support and sponsorship

Conflicts of interest

There are no conflicts of interest.

Acknowledgments

We would like to acknowledge the support rendered by the entire Department of Pharmacology and Therapeutics at Seth GS Medical College.


How to Write Evaluation Reports: Purpose, Structure, Content, Challenges, Tips, and Examples

This article explores how to write effective evaluation reports, covering their purpose, structure, content, and common challenges. It provides tips for presenting evaluation findings effectively and using evaluation reports to improve programs and policies. Examples of well-written evaluation reports and templates are also included.

Table of Contents

  • What is an Evaluation Report?
  • What is the Purpose of an Evaluation Report?
  • Importance of Evaluation Reports in Program Management
  • Structure of an Evaluation Report
  • Best Practices for Writing an Evaluation Report
  • Common Challenges in Writing an Evaluation Report
  • Tips for Presenting Evaluation Findings Effectively
  • Using Evaluation Reports to Improve Programs and Policies
  • Example Evaluation Report Templates
  • Conclusion: Making Evaluation Reports Work for You

An evaluation report is a document that presents the findings, conclusions, and recommendations of an evaluation, which is a systematic and objective assessment of the performance, impact, and effectiveness of a program, project, policy, or intervention. The report typically includes a description of the evaluation’s purpose, scope, methodology, and data sources, as well as an analysis of the evaluation findings and conclusions, and specific recommendations for program or project improvement.

Evaluation reports can help to build capacity for monitoring and evaluation within organizations and communities, by promoting a culture of learning and continuous improvement. By providing a structured approach to evaluation and reporting, evaluation reports can help to ensure that evaluations are conducted consistently and rigorously, and that the results are communicated effectively to stakeholders.

Evaluation reports may be read by a wide variety of audiences, including persons working in government agencies, staff members working for donors and partners, students and community organisations, and development professionals working on projects or programmes that are comparable to the ones evaluated.

Related: Difference Between Evaluation Report and M&E Reports .

The purpose of an evaluation report is to provide stakeholders with a comprehensive and objective assessment of a program or project’s performance, achievements, and challenges. The report serves as a tool for decision-making, as it provides evidence-based information on the program or project’s strengths and weaknesses, and recommendations for improvement.

The main objectives of an evaluation report are:

  • Accountability: To assess whether the program or project has met its objectives and delivered the intended results, and to hold stakeholders accountable for their actions and decisions.
  • Learning : To identify the key lessons learned from the program or project, including best practices, challenges, and opportunities for improvement, and to apply these lessons to future programs or projects.
  • Improvement : To provide recommendations for program or project improvement based on the evaluation findings and conclusions, and to support evidence-based decision-making.
  • Communication : To communicate the evaluation findings and conclusions to stakeholders , including program staff, funders, policymakers, and the general public, and to promote transparency and stakeholder engagement.

An evaluation report should be clear, concise, and well-organized, and should provide stakeholders with a balanced and objective assessment of the program or project’s performance. The report should also be timely, with recommendations that are actionable and relevant to the current context. Overall, the purpose of an evaluation report is to promote accountability, learning, and improvement in program and project design and implementation.

Evaluation reports play a critical role in program management by providing valuable information about program effectiveness and efficiency. They offer insights into the extent to which programs have achieved their objectives, as well as identifying areas for improvement.

Evaluation reports help program managers and stakeholders to make informed decisions about program design, implementation, and funding. They provide evidence-based information that can be used to improve program outcomes and address challenges.

Moreover, evaluation reports are essential in demonstrating program accountability and transparency to funders, policymakers, and other stakeholders. They serve as a record of program activities and outcomes, allowing stakeholders to assess the program’s impact and sustainability.

In short, evaluation reports are a vital tool for program managers and evaluators. They provide a comprehensive picture of program performance, including strengths, weaknesses, and areas for improvement. By utilizing evaluation reports, program managers can make informed decisions to improve program outcomes and ensure that their programs are effective, efficient, and sustainable over time.


The structure of an evaluation report can vary depending on the requirements and preferences of the stakeholders, but typically it includes the following sections:

  • Executive Summary : A brief summary of the evaluation findings, conclusions, and recommendations.
  • Introduction: An overview of the evaluation context, scope, purpose, and methodology.
  • Background: A summary of the programme or initiative that is being assessed, including its goals, activities, and intended audience(s).
  • Evaluation Questions : A list of the evaluation questions that guided the data collection and analysis.
  • Methodology: A description of the data collection methods used in the evaluation, including the sampling strategy, data sources, and data analysis techniques.
  • Findings: A presentation of the evaluation findings, organized according to the evaluation questions.
  • Conclusions : A summary of the main evaluation findings and conclusions, including an assessment of the program or project’s effectiveness, efficiency, and sustainability.
  • Recommendations : A list of specific recommendations for program or project improvements based on the evaluation findings and conclusions.
  • Lessons Learned : A discussion of the key lessons learned from the evaluation that could be applied to similar programs or projects in the future.
  • Limitations : A discussion of the limitations of the evaluation, including any challenges or constraints encountered during the data collection and analysis.
  • References: A list of references cited in the evaluation report.
  • Appendices : Additional information, such as detailed data tables, graphs, or maps, that support the evaluation findings and conclusions.

The structure of the evaluation report should be clear, logical, and easy to follow, with headings and subheadings used to organize the content and facilitate navigation.
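As a rough illustration of how the section list above could be turned into a reusable outline, here is a minimal sketch; the REPORT_SECTIONS list and report_skeleton function are hypothetical names, not part of any official template mentioned in this article.

```python
# Hypothetical sketch: build a plain-text outline from the section list described above.
REPORT_SECTIONS = [
    "Executive Summary", "Introduction", "Background", "Evaluation Questions",
    "Methodology", "Findings", "Conclusions", "Recommendations",
    "Lessons Learned", "Limitations", "References", "Appendices",
]


def report_skeleton(title: str) -> str:
    """Return a numbered outline with a placeholder under each section heading."""
    lines = [title, "=" * len(title), ""]
    for number, section in enumerate(REPORT_SECTIONS, start=1):
        lines.append(f"{number}. {section}")
        lines.append("   [To be drafted]")
    return "\n".join(lines)


print(report_skeleton("Evaluation Report: Community Health Programme"))
```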

In addition, the presentation of data may be made more engaging and understandable by the use of visual aids such as graphs and charts.

Writing an effective evaluation report requires careful planning and attention to detail. Here are some best practices to consider when writing an evaluation report:

Begin by establishing the report’s purpose, objectives, and target audience. A clear understanding of these elements will help guide the report’s structure and content.

Use clear and concise language throughout the report. Avoid jargon and technical terms that may be difficult for readers to understand.

Use evidence-based findings to support your conclusions and recommendations. Ensure that the findings are clearly presented using data tables, graphs, and charts.

Provide context for the evaluation by including a brief summary of the program being evaluated, its objectives, and intended impact. This will help readers understand the report’s purpose and the findings.

Include limitations and caveats in the report to provide a balanced assessment of the program’s effectiveness. Acknowledge any data limitations or other factors that may have influenced the evaluation’s results.

Organize the report in a logical manner, using headings and subheadings to break up the content. This will make the report easier to read and understand.

Ensure that the report is well-structured and easy to navigate. Use a clear and consistent formatting style throughout the report.

Finally, use the report to make actionable recommendations that will help improve program effectiveness and efficiency. Be specific about the steps that should be taken and the resources required to implement the recommendations.

By following these best practices, you can write an evaluation report that is clear, concise, and actionable, helping program managers and stakeholders to make informed decisions that improve program outcomes.


Writing an evaluation report can be a challenging task, even for experienced evaluators. Here are some common challenges that evaluators may encounter when writing an evaluation report:

  • Data limitations: One of the biggest challenges in writing an evaluation report is dealing with data limitations. Evaluators may find that the data they collected is incomplete, inaccurate, or difficult to interpret, making it challenging to draw meaningful conclusions.
  • Stakeholder disagreements: Another common challenge is stakeholder disagreements over the evaluation’s findings and recommendations. Stakeholders may have different opinions about the program’s effectiveness or the best course of action to improve program outcomes.
  • Technical writing skills: Evaluators may struggle with technical writing skills, which are essential for presenting complex evaluation findings in a clear and concise manner. Writing skills are particularly important when presenting statistical data or other technical information.
  • Time constraints: Evaluators may face time constraints when writing evaluation reports, particularly if the report is needed quickly or the evaluation involved a large amount of data collection and analysis.
  • Communication barriers: Evaluators may encounter communication barriers when working with stakeholders who speak different languages or have different cultural backgrounds. Effective communication is essential for ensuring that the evaluation’s findings are understood and acted upon.

By being aware of these common challenges, evaluators can take steps to address them and produce evaluation reports that are clear, accurate, and actionable. This may involve developing data collection and analysis plans that account for potential data limitations, engaging stakeholders early in the evaluation process to build consensus, and investing time in developing technical writing skills.

Presenting evaluation findings effectively is essential for ensuring that program managers and stakeholders understand the evaluation’s purpose, objectives, and conclusions. Here are some tips for presenting evaluation findings effectively:

  • Know your audience: Before presenting evaluation findings, ensure that you have a clear understanding of your audience’s background, interests, and expertise. This will help you tailor your presentation to their needs and interests.
  • Use visuals: Visual aids such as graphs, charts, and tables can help convey evaluation findings more effectively than written reports. Use visuals to highlight key data points and trends.
  • Be concise: Keep your presentation concise and to the point. Focus on the key findings and conclusions, and avoid getting bogged down in technical details.
  • Tell a story: Use the evaluation findings to tell a story about the program’s impact and effectiveness. This can help engage stakeholders and make the findings more memorable.
  • Provide context: Provide context for the evaluation findings by explaining the program’s objectives and intended impact. This will help stakeholders understand the significance of the findings.
  • Use plain language: Use plain language that is easily understandable by your target audience. Avoid jargon and technical terms that may confuse or alienate stakeholders.
  • Engage stakeholders: Engage stakeholders in the presentation by asking for their input and feedback. This can help build consensus and ensure that the evaluation findings are acted upon.

By following these tips, you can present evaluation findings in a way that engages stakeholders, highlights key findings, and ensures that the evaluation’s conclusions are acted upon to improve program outcomes.

Evaluation reports are crucial tools for program managers and policymakers to assess program effectiveness and make informed decisions about program design, implementation, and funding. By analyzing data collected during the evaluation process, evaluation reports provide evidence-based information that can be used to improve program outcomes and impact.

One of the primary ways that evaluation reports can be used to improve programs and policies is by identifying program strengths and weaknesses. By assessing program effectiveness and efficiency, evaluation reports can help identify areas where programs are succeeding and areas where improvements are needed. This information can inform program redesign and improvement efforts, leading to better program outcomes and impact.

Evaluation reports can also be used to make data-driven decisions about program design, implementation, and funding. By providing decision-makers with data-driven information, evaluation reports can help ensure that programs are designed and implemented in a way that maximizes their impact and effectiveness. This information can also be used to allocate resources more effectively, directing funding towards programs that are most effective and efficient.

Another way that evaluation reports can be used to improve programs and policies is by disseminating best practices in program design and implementation. By sharing information about what works and what doesn’t work, evaluation reports can help program managers and policymakers make informed decisions about program design and implementation, leading to better outcomes and impact.

Finally, evaluation reports can inform policy development and improvement efforts by providing evidence about the effectiveness and impact of existing policies. This information can be used to make data-driven decisions about policy development and improvement efforts, ensuring that policies are designed and implemented in a way that maximizes their impact and effectiveness.

In summary, evaluation reports are critical tools for improving programs and policies. By providing evidence-based information about program effectiveness and efficiency, evaluation reports can help program managers and policymakers make informed decisions, allocate resources more effectively, disseminate best practices, and inform policy development and improvement efforts.

There are many different templates available for creating evaluation reports. Here are some examples of template evaluation reports that can be used as a starting point for creating your own report:

  • The National Science Foundation Evaluation Report Template – This template provides a structure for evaluating research projects funded by the National Science Foundation. It includes sections on project background, research questions, evaluation methodology, data analysis, and conclusions and recommendations.
  • The CDC Program Evaluation Template – This template, created by the Centers for Disease Control and Prevention, provides a framework for evaluating public health programs. It includes sections on program description, evaluation questions, data sources, data analysis, and conclusions and recommendations.
  • The World Bank Evaluation Report Template – This template, created by the World Bank, provides a structure for evaluating development projects. It includes sections on project background, evaluation methodology, data analysis, findings and conclusions, and recommendations.
  • The European Commission Evaluation Report Template – This template provides a structure for evaluating European Union projects and programs. It includes sections on project description, evaluation objectives, evaluation methodology, findings, conclusions, and recommendations.
  • The UNICEF Evaluation Report Template – This template provides a framework for evaluating UNICEF programs and projects. It includes sections on program description, evaluation questions, evaluation methodology, findings, conclusions, and recommendations.

These templates provide a structure for creating evaluation reports that are well-organized and easy to read. They can be customized to meet the specific needs of your program or project and help ensure that your evaluation report is comprehensive and includes all of the necessary components.

  • World Health Organisation Reports
  • Checklist for Assessing USAID Evaluation Reports

In conclusion, evaluation reports are essential tools for program managers and policymakers to assess program effectiveness and make informed decisions about program design, implementation, and funding. By analyzing data collected during the evaluation process, evaluation reports provide evidence-based information that can be used to improve program outcomes and impact.

To make evaluation reports work for you, it is important to plan ahead and establish clear objectives and target audiences. This will help guide the report’s structure and content and ensure that the report is tailored to the needs of its intended audience.

When writing an evaluation report, it is important to use clear and concise language, provide evidence-based findings, and offer actionable recommendations that can be used to improve program outcomes. Including context for the evaluation findings and acknowledging limitations and caveats will provide a balanced assessment of the program’s effectiveness and help build trust with stakeholders.

Presenting evaluation findings effectively requires knowing your audience, using visuals, being concise, telling a story, providing context, using plain language, and engaging stakeholders. By following these tips, you can present evaluation findings in a way that engages stakeholders, highlights key findings, and ensures that the evaluation’s conclusions are acted upon to improve program outcomes.

Finally, using evaluation reports to improve programs and policies requires identifying program strengths and weaknesses, making data-driven decisions, disseminating best practices, allocating resources effectively, and informing policy development and improvement efforts. By using evaluation reports in these ways, program managers and policymakers can ensure that their programs are effective, efficient, and sustainable over time.



Writing a Research Paper (Library Research Guide): Evaluate Sources
How Will This Help Me?

Evaluating your sources will help you:

  • Determine the credibility of information
  • Rule out questionable information
  • Check for bias in your sources

In general, a website is hosted in a domain that tells you what type of site it is.

  • .com = commercial
  • .net = network provider
  • .org = organization
  • .edu = education
  • .mil = military
  • .gov = U.S. government

Commercial sites want to persuade you to buy something, and organizations may want to persuade you to see an issue from a particular viewpoint. 

Useful information can be found on all kinds of sites, but you must consider carefully whether the source is useful for your purpose and for your audience.
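As a rough illustration only (domain type is a heuristic, not a guarantee of quality), the list above can be expressed as a small lookup; the DOMAIN_TYPES dictionary and domain_type helper below are hypothetical names invented for this sketch.

```python
# Hypothetical helper: classify a URL by its top-level domain, per the list above.
from urllib.parse import urlparse

DOMAIN_TYPES = {
    "com": "commercial", "net": "network provider", "org": "organization",
    "edu": "education", "mil": "military", "gov": "U.S. government",
}


def domain_type(url: str) -> str:
    """Return the rough site type implied by the URL's top-level domain."""
    host = urlparse(url).netloc
    tld = host.rsplit(".", 1)[-1].lower() if host else ""
    return DOMAIN_TYPES.get(tld, "unknown")


print(domain_type("https://guides.lib.k-state.edu/writingresearchpaper"))  # education
print(domain_type("https://www.example.com/article"))                      # commercial
```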

Content Farms

Content farms are websites that exist to host ads. They post about popular web searches to try to drive traffic to their sites. They are rarely good sources for research.

  • Web’s “Content Farms” Grow Audiences For Ads. This article by Zoe Chace at National Public Radio describes how how-to sites try to drive more traffic to their pages so that visitors see the ads they host.

Fact Checking

Fact checking can help you verify the reliability of a source. The following sites may not have all the answers, but they can help you look into the sources for statements made in U.S. politics.

  • FactCheck.org: This site monitors the accuracy of statements made in speeches, debates, interviews, and more, and links to sources so readers can see the information for themselves. The site is a project of the Annenberg Public Policy Center of the University of Pennsylvania.
  • PolitiFact: This resource evaluates the accuracy of statements made by elected officials, lobbyists, and special interest groups and provides sources for its evaluations. PolitiFact is currently run by the nonprofit Poynter Institute for Media Studies.

Evaluate Sources With the Big 5 Criteria

The Big 5 Criteria can help you evaluate your sources for credibility:

  • Currency: Check the publication date and determine whether it is sufficiently current for your topic.
  • Coverage (relevance): Consider whether the source is relevant to your research and whether it covers the topic adequately for your needs.
  • Authority: Discover the credentials of the authors of the source and determine their level of expertise and knowledge about the subject.
  • Accuracy: Consider whether the source presents accurate information and whether you can verify that information. 
  • Objectivity (purpose): Think about the author's purpose in creating the source and consider how that affects its usefulness to your research. 

Evaluate Sources With the CRAAP Test

Another way to evaluate your sources is the CRAAP Test, which means evaluating the following qualities of your sources: Currency, Relevance, Authority, Accuracy, and Purpose.

A short video (2:17) from Western Libraries, “Evaluating Sources” (CC BY-NC-ND 3.0), explains the CRAAP Test.

Evaluate Websites

Evaluating websites follows the same process as for other sources, but finding the information you need to make an assessment can be more challenging with websites. The following guidelines can help you decide if a website is a good choice for a source for your paper. 

  • Currency. A useful site is updated regularly and lets visitors know when content was published on the site. Can you tell when the site was last updated? Can you see when the content you need was added? Does the site show signs of not being maintained (broken links, out-of-date information, etc.)?
  • Relevance. Think about the target audience for the site. Is it appropriate for you or your paper's audience?
  • Authority. Look for an About Us link or something similar to learn about the site's creator. The more you know about the credentials and mission of a site's creators, as well as their sources of information, the better idea you will have about the site's quality.
  • Accuracy. Does the site present references or links to the sources of information it presents? Can you locate these sources so that you can read and interpret the information yourself?
  • Purpose. Consider the reason why the site was created. Can you detect any bias? Does the site use emotional language? Is the site trying to persuade you about something?

Identify Political Perspective

News outlets, think tanks, organizations, and individual authors can present information from a particular political perspective. Keep this in mind when determining whether sources are useful for your paper.


Check a news outlet's website, usually under About Us or Contact Us, for information about their reporters and authors. For example, USA Today has the USA Today Reporter Index, and the LA Times has an Editorial & Newsroom Contacts page. Reading a profile or bio for a reporter or looking at other articles by the author may tell you whether that person favors a particular viewpoint.

If a particular organization is mentioned in an article, learn more about the organization to identify potential biases. Think tanks and other associations usually exist for a reason. Searching news articles about the organization can help you determine their political leaning. 

Bias is not always bad, but you must be aware of it. Knowing the perspective of a source helps contextualize the information presented. 


Evaluation Research: Definition, Methods and Examples



What is evaluation research?

Evaluation research, also known as program evaluation, refers to a research purpose rather than a specific method: the systematic assessment of the worth or merit of the time, money, effort, and resources spent in order to achieve a goal.

Evaluation research is closely related to, but slightly different from, more conventional social research. It uses many of the same methods, but because it takes place within an organizational context, it requires team skills, interpersonal skills, management skills, political savvy, and other skills that conventional social research demands far less of. Evaluation research also requires keeping the interests of the stakeholders in mind.

Evaluation research is a type of applied research, so it is intended to have some real-world effect. Many methods, such as surveys and experiments, can be used to carry it out. It is a rigorous, systematic process that involves collecting and analyzing data about organizations, processes, projects, services, and/or resources. Evaluation research enhances knowledge and decision-making and leads to practical applications.


Why do evaluation research?

The common goal of most evaluations is to extract meaningful information from the audience and provide valuable insights to evaluators such as sponsors, donors, client groups, administrators, staff, and other relevant constituencies. Most often, feedback is perceived as valuable if it helps in decision-making. However, evaluation research does not always create an impact that can be applied elsewhere; sometimes it fails to influence short-term decisions. It is equally true that a study that initially seems to have no influence can have a delayed impact once the situation becomes more favorable. In spite of this, there is general agreement that the major goal of evaluation research should be to improve decision-making through the systematic use of measurable feedback.

Below are some of the benefits of evaluation research

  • Gain insights about a project or program and its operations

Evaluation research lets you understand what works and what doesn't: where you were, where you are, and where you are headed. You can find areas for improvement and identify strengths, which helps you figure out what to focus on and whether there are any threats to your business. You can also find out whether there are sectors in the market that remain untapped.

  • Improve practice

It is essential to gauge your past performance and understand what went wrong in order to deliver better services to your customers. Unless communication is two-way, there is no way to improve what you have to offer. Evaluation research gives your employees and customers an opportunity to express how they feel and whether there is anything they would like to change. It also lets you modify or adapt a practice to increase its chances of success.

  • Assess the effects

After evaluating your efforts, you can see how well you are meeting objectives and targets. Evaluations let you measure whether the intended benefits are really reaching the targeted audience and, if so, how effectively.

  • Build capacity

Evaluations help you analyze demand patterns and predict whether you will need more funds, upgraded skills, or more efficient operations. They let you find gaps in the production-to-delivery chain and possible ways to fill them.

Methods of evaluation research

All market research methods involve collecting and analyzing data, judging the validity of the information, and deriving relevant inferences from it. Evaluation research comprises planning, conducting, and analyzing the results, which includes the use of data collection techniques and statistical methods.

Some popular evaluation methods are input measurement, output or performance measurement, impact or outcome assessment, quality assessment, process evaluation, benchmarking, standards, cost analysis, organizational effectiveness, program evaluation methods, and LIS-centered methods. A few types of evaluation, such as descriptive studies, formative evaluations, and implementation analyses, do not always result in a meaningful assessment. Evaluation research is concerned above all with the information-processing and feedback functions of evaluation.

These methods can be broadly classified as quantitative and qualitative methods.

Quantitative methods answer the questions below and are used to measure anything tangible.

  • Who was involved?
  • What were the outcomes?
  • What was the price?

The best way to collect quantitative data is through surveys, questionnaires, and polls. You can also create pre-tests and post-tests, review existing documents and databases, or gather clinical data.

Surveys are used to gather the opinions, feedback, or ideas of your employees or customers and consist of various question types. They can be conducted face-to-face, by telephone, by mail, or online. Online surveys do not require human intervention and are far more efficient and practical. You can see the results on the research tool's dashboard and dig deeper using filter criteria based on factors such as age, gender, and location. You can also add survey logic such as branching, quotas, chained surveys, and looping to the questionnaire, reducing the time it takes both to create and to respond to the survey. You can also generate reports that apply statistical formulas and present data in a form that can be readily absorbed in meetings.
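As a rough illustration of how branching logic works, the sketch below routes a respondent to a different follow-up question depending on their answer. It is a toy example in plain Python, not QuestionPro's survey builder or API; the questions, options, and routing rules are all invented.

```python
# Illustrative only: branching logic for a two-path follow-up question.
# Questions, options, and routing rules are invented for this example.

questions = {
    "q1": {"text": "How satisfied are you with our product?",
           "options": ["Satisfied", "Dissatisfied"],
           "next": {"Satisfied": "q2", "Dissatisfied": "q3"}},
    "q2": {"text": "What do you like most?", "options": None, "next": {}},
    "q3": {"text": "What should we improve?", "options": None, "next": {}},
}

def run_survey(answers, start="q1"):
    """Walk the branching survey using pre-recorded answers."""
    qid, asked = start, []
    while qid:
        question = questions[qid]
        asked.append(question["text"])
        answer = answers.get(qid)
        qid = question["next"].get(answer)  # None ends the survey
    return asked

# A dissatisfied respondent is routed to the improvement question.
print(run_survey({"q1": "Dissatisfied", "q3": "Slow support"}))
```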


Quantitative data measure the depth and breadth of an initiative, for instance, the number of people who participated in a non-profit event or the number of people who enrolled in a new course at a university. Quantitative data collected before and after a program can show its results and impact.
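As a concrete illustration of the before-and-after idea, the sketch below compares the mean of a pre-test with the mean of a post-test. The scores are invented, and a real evaluation would also check whether the difference is statistically meaningful rather than relying on raw means alone.

```python
# Illustrative only: comparing invented pre- and post-program scores.
from statistics import mean

pre_scores = [52, 60, 47, 55, 58, 49]    # hypothetical pre-test results
post_scores = [61, 72, 58, 66, 70, 57]   # hypothetical post-test results

pre_avg, post_avg = mean(pre_scores), mean(post_scores)
print(f"Pre-program mean:  {pre_avg:.1f}")
print(f"Post-program mean: {post_avg:.1f}")
print(f"Average change:    {post_avg - pre_avg:+.1f}")
```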

The accuracy of quantitative data used for evaluation research depends on how well the sample represents the population, the ease of analysis, and the data's consistency. Quantitative methods can fail if the questions are not framed correctly or are not distributed to the right audience. Quantitative data also do not provide an understanding of context and may not be apt for complex issues.


Qualitative research methods are used where quantitative methods cannot solve the research problem, i.e., to measure intangible values. They answer questions such as:

  • What is the value added?
  • How satisfied are you with our service?
  • How likely are you to recommend us to your friends?
  • What will improve your experience?


Qualitative data are collected through observation, interviews, case studies, and focus groups. The steps in a qualitative study involve examining, comparing and contrasting, and understanding patterns. Analysts draw conclusions by identifying themes, clustering similar data, and finally reducing the data to points that make sense.
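One simple way to make the theme-identification step transparent is to tally how often each coded theme appears across responses. The sketch below is a toy illustration with invented codes; it is not a substitute for careful qualitative coding.

```python
# Illustrative only: tallying invented theme codes from coded interview notes.
from collections import Counter

coded_responses = [
    ["pricing", "support"],
    ["support", "onboarding"],
    ["pricing"],
    ["onboarding", "support", "pricing"],
]

theme_counts = Counter(code for response in coded_responses for code in response)
for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned {count} times")
```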

Observations may help explain behaviors as well as the social context that quantitative methods generally do not uncover. Observations of behavior and body language can be made by watching a participant or by recording audio or video. Structured interviews can be conducted with people alone or in groups under controlled conditions, or participants may be asked open-ended qualitative research questions. Qualitative research methods are also used to understand a person's perceptions and motivations.


The strength of this method is that group discussion can generate ideas and stimulate memories, with topics cascading as the discussion unfolds. The accuracy of qualitative data depends on how well contextual data explain complex issues and complement quantitative data. Qualitative data help answer "why" and "how" after "what" has been answered. Their limitations for evaluation research are that they are subjective, time-consuming, costly, and difficult to analyze and interpret.


Survey software can be used for both evaluation research methods. You can use the sample questions below and send a survey in minutes using research software. Using a tool for research simplifies the process, from creating a survey and importing contacts to distributing the survey and generating reports that aid in research.

Examples of evaluation research

Evaluation research questions lay the foundation of a successful evaluation. They define the topics that will be evaluated. Keeping evaluation questions ready not only saves time and money, but also makes it easier to decide what data to collect, how to analyze it, and how to report it.

Evaluation research questions should be developed and agreed on in the planning stage; however, ready-made research templates can also be used.

Process evaluation research question examples:

  • How often do you use our product in a day?
  • Were approvals taken from all stakeholders?
  • Can you report the issue from the system?
  • Can you submit the feedback from the system?
  • Was each task done as per the standard operating procedure?
  • What were the barriers to the implementation of each task?
  • Were any improvement areas discovered?

Outcome evaluation research question examples:

  • How satisfied are you with our product?
  • Did the program produce intended outcomes?
  • What were the unintended outcomes?
  • Has the program increased the knowledge of participants?
  • Were the participants of the program employable before the course started?
  • Do participants of the program have the skills to find a job after the course ended?
  • Is the knowledge of participants better compared to those who did not participate in the program?


How to Write and Publish a Research Paper for a Peer-Reviewed Journal

Clara Busse and Ella August

Open access. Published 30 April 2020. Journal of Cancer Education 36, 909–913 (2021).

Communicating research findings is an essential step in the research process. Often, peer-reviewed journals are the forum for such communication, yet many researchers are never taught how to write a publishable scientific paper. In this article, we explain the basic structure of a scientific paper and describe the information that should be included in each section. We also identify common pitfalls for each section and recommend strategies to avoid them. Further, we give advice about target journal selection and authorship. In the online resource 1 , we provide an example of a high-quality scientific paper, with annotations identifying the elements we describe in this article.


Introduction

Writing a scientific paper is an important component of the research process, yet researchers often receive little formal training in scientific writing. This is especially true in low-resource settings. In this article, we explain why choosing a target journal is important, give advice about authorship, provide a basic structure for writing each section of a scientific paper, and describe common pitfalls and recommendations for each section. In the online resource 1 , we also include an annotated journal article that identifies the key elements and writing approaches that we detail here. Before you begin your research, make sure you have ethical clearance from all relevant ethical review boards.

Select a Target Journal Early in the Writing Process

We recommend that you select a “target journal” early in the writing process; a “target journal” is the journal to which you plan to submit your paper. Each journal has a set of core readers and you should tailor your writing to this readership. For example, if you plan to submit a manuscript about vaping during pregnancy to a pregnancy-focused journal, you will need to explain what vaping is because readers of this journal may not have a background in this topic. However, if you were to submit that same article to a tobacco journal, you would not need to provide as much background information about vaping.

Information about a journal’s core readership can be found on its website, usually in a section called “About this journal” or something similar. For example, the Journal of Cancer Education presents such information on the “Aims and Scope” page of its website, which can be found here: https://www.springer.com/journal/13187/aims-and-scope .

Peer reviewer guidelines from your target journal are an additional resource that can help you tailor your writing to the journal and provide additional advice about crafting an effective article [ 1 ]. These are not always available, but it is worth a quick web search to find out.

Identify Author Roles Early in the Process

Early in the writing process, identify authors, determine the order of authors, and discuss the responsibilities of each author. Standard author responsibilities have been identified by The International Committee of Medical Journal Editors (ICMJE) [ 2 ]. To set clear expectations about each team member’s responsibilities and prevent errors in communication, we also suggest outlining more detailed roles, such as who will draft each section of the manuscript, write the abstract, submit the paper electronically, serve as corresponding author, and write the cover letter. It is best to formalize this agreement in writing after discussing it, circulating the document to the author team for approval. We suggest creating a title page on which all authors are listed in the agreed-upon order. It may be necessary to adjust authorship roles and order during the development of the paper. If a new author order is agreed upon, be sure to update the title page in the manuscript draft.

In the case where multiple papers will result from a single study, authors should discuss who will author each paper. Additionally, authors should agree on a deadline for each paper and the lead author should take responsibility for producing an initial draft by this deadline.

Structure of the Introduction Section

The introduction section should be approximately three to five paragraphs in length. Look at examples from your target journal to decide the appropriate length. This section should include the elements shown in Fig.  1 . Begin with a general context, narrowing to the specific focus of the paper. Include five main elements: why your research is important, what is already known about the topic, the “gap” or what is not yet known about the topic, why it is important to learn the new information that your research adds, and the specific research aim(s) that your paper addresses. Your research aim should address the gap you identified. Be sure to add enough background information to enable readers to understand your study. Table 1 provides common introduction section pitfalls and recommendations for addressing them.

Figure 1: The main elements of the introduction section of an original research article. Often, the elements overlap.

Methods Section

The purpose of the methods section is twofold: to explain how the study was done in enough detail to enable its replication and to provide enough contextual detail to enable readers to understand and interpret the results. In general, the essential elements of a methods section are the following: a description of the setting and participants, the study design and timing, the recruitment and sampling, the data collection process, the dataset, the dependent and independent variables, the covariates, the analytic approach for each research objective, and the ethical approval. The hallmark of an exemplary methods section is the justification of why each method was used. Table 2 provides common methods section pitfalls and recommendations for addressing them.

Results Section

The focus of the results section should be associations, or lack thereof, rather than statistical tests. Two considerations should guide your writing here. First, the results should present answers to each part of the research aim. Second, return to the methods section to ensure that the analysis and variables for each result have been explained.

Begin the results section by describing the number of participants in the final sample and details such as the number who were approached to participate, the proportion who were eligible and who enrolled, and the number of participants who dropped out. The next part of the results should describe the participant characteristics. After that, you may organize your results by the aim or by putting the most exciting results first. Do not forget to report your non-significant associations. These are still findings.

Tables and figures capture the reader’s attention and efficiently communicate your main findings [ 3 ]. Each table and figure should have a clear message and should complement, rather than repeat, the text. Tables and figures should communicate all salient details necessary for a reader to understand the findings without consulting the text. Include information on comparisons and tests, as well as information about the sample and timing of the study in the title, legend, or in a footnote. Note that figures are often more visually interesting than tables, so if it is feasible to make a figure, make a figure. To avoid confusing the reader, either avoid abbreviations in tables and figures, or define them in a footnote. Note that there should not be citations in the results section and you should not interpret results here. Table 3 provides common results section pitfalls and recommendations for addressing them.

Discussion Section

In contrast to the introduction section, the discussion should take the form of a right-side-up triangle, beginning with the interpretation of your results and moving to general implications (Fig. 2). This section typically begins with a restatement of the main findings, which can usually be accomplished with a few carefully crafted sentences.

Figure 2: Major elements of the discussion section of an original research article. Often, the elements overlap.

Next, interpret the meaning or explain the significance of your results, lifting the reader’s gaze from the study’s specific findings to more general applications. Then, compare these study findings with other research. Are these findings in agreement or disagreement with those from other studies? Does this study impart additional nuance to well-accepted theories? Situate your findings within the broader context of scientific literature, then explain the pathways or mechanisms that might give rise to, or explain, the results.

Journals vary in their approach to strengths and limitations sections: some are embedded paragraphs within the discussion section, while some mandate separate section headings. Keep in mind that every study has strengths and limitations. Candidly reporting yours helps readers to correctly interpret your research findings.

The next element of the discussion is a summary of the potential impacts and applications of the research. Should these results be used to optimally design an intervention? Does the work have implications for clinical protocols or public policy? These considerations will help the reader to further grasp the possible impacts of the presented work.

Finally, the discussion should conclude with specific suggestions for future work. Here, you have an opportunity to illuminate specific gaps in the literature that compel further study. Avoid the phrase “future research is necessary” because the recommendation is too general to be helpful to readers. Instead, provide substantive and specific recommendations for future studies. Table 4 provides common discussion section pitfalls and recommendations for addressing them.

Follow the Journal’s Author Guidelines

After you select a target journal, identify the journal’s author guidelines to guide the formatting of your manuscript and references. Author guidelines will often (but not always) include instructions for titles, cover letters, and other components of a manuscript submission. Read the guidelines carefully. If you do not follow the guidelines, your article will be sent back to you.

Finally, do not submit your paper to more than one journal at a time. Even if this is not explicitly stated in the author guidelines of your target journal, it is considered inappropriate and unprofessional.

Your title should invite readers to continue reading beyond the first page [ 4 , 5 ]. It should be informative and interesting. Consider describing the independent and dependent variables, the population and setting, the study design, the timing, and even the main result in your title. Because the focus of the paper can change as you write and revise, we recommend you wait until you have finished writing your paper before composing the title.

Be sure that the title is useful for potential readers searching for your topic. The keywords you select should complement those in your title to maximize the likelihood that a researcher will find your paper through a database search. Avoid using abbreviations in your title unless they are very well known, such as SNP, because it is more likely that someone will use a complete word rather than an abbreviation as a search term to help readers find your paper.

After you have written a complete draft, use the checklist (Fig. 3 ) below to guide your revisions and editing. Additional resources are available on writing the abstract and citing references [ 5 ]. When you feel that your work is ready, ask a trusted colleague or two to read the work and provide informal feedback. The box below provides a checklist that summarizes the key points offered in this article.

Figure 3: Checklist for manuscript quality.

References

  1. Michalek AM (2014) Down the rabbit hole…advice to reviewers. J Cancer Educ 29:4–5
  2. International Committee of Medical Journal Editors. Defining the role of authors and contributors: who is an author? http://www.icmje.org/recommendations/browse/roles-and-responsibilities/defining-the-role-of-authosrs-and-contributors.html. Accessed 15 January 2020
  3. Vetto JT (2014) Short and sweet: a short course on concise medical writing. J Cancer Educ 29(1):194–195
  4. Brett M, Kording K (2017) Ten simple rules for structuring papers. PLoS Comput Biol. https://doi.org/10.1371/journal.pcbi.1005619
  5. Lang TA (2017) Writing a better research article. J Public Health Emerg. https://doi.org/10.21037/jphe.2017.11.06

Acknowledgments

Ella August is grateful to the Sustainable Sciences Institute for mentoring her in training researchers on writing and publishing their research.

Code Availability

Not applicable.

Author information

Authors and affiliations

Department of Maternal and Child Health, University of North Carolina Gillings School of Global Public Health, 135 Dauer Dr, 27599, Chapel Hill, NC, USA

Clara Busse & Ella August

Department of Epidemiology, University of Michigan School of Public Health, 1415 Washington Heights, Ann Arbor, MI, 48109-2029, USA

Ella August


Corresponding author

Correspondence to Ella August .

Ethics declarations

Conflicts of interest

The authors declare that they have no conflict of interest.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Busse, C., August, E. How to Write and Publish a Research Paper for a Peer-Reviewed Journal. J Canc Educ 36 , 909–913 (2021). https://doi.org/10.1007/s13187-020-01751-z



Keywords: Manuscripts, Scientific writing

Evaluation Essay

Creating an essay is part of every student's academic journey. There are different kinds of essays that can be part of a student writing task, and one of them is the evaluation essay. What sets an evaluation essay apart from other academic essays is that it is also used in different undertakings within the corporate and professional environment. Evaluation essays are not limited to educational purposes; they can also be beneficial in the fields of business, research, and community development.

An evaluation essay contains an objective assessment written by an individual who is fully knowledgeable about the subject he or she is writing about. More than that, this essay relays a sound judgment about a specific subject matter or topic of discussion. Evaluation essays are based on evaluative writing that is commonly created in accordance with a set of criteria or value measurements. Referring to sample evaluation essays can also help when you write your own.


Things to Remember When Writing an Evaluation Essay

An evaluation essay should always be direct and specific, as it contains factual information that readers need to know. To avoid common essay mistakes, keep the following points in mind when writing an evaluation essay.

  • When writing an evaluation essay, always back up your claims with evidence so that you can support the evaluation being made. Be objective with the content you present: your opinion matters, but make sure it is grounded in reality. Evaluation essays work best when readers can identify the sources you used to arrive at the assessment they are reading. Providing enough evidence makes your evaluation essay more credible and relevant.
  • Be specific about the kind of evaluation essay you are creating. An evaluation essay can only be effective if you are aware of the purpose for which you are writing it. Presenting details, comments, and information that relate directly to that purpose helps you create a highly usable output. There are different kinds of evaluation essays, and each differs depending on the purpose of its creation. Produce an effective evaluation essay by directly addressing the needs of your readers.
  • Always be clear when presenting your evaluation. Since the main purpose of an evaluation essay is to relay your viewpoint about a specific subject, be precise and concise when delivering your message. Explain how you arrived at the evaluation, including the factors you considered throughout the evaluation and writing process.


Purposes of an Evaluation Essay

There is a wide variety of evaluation essay examples created for particular purposes. Evaluation essays can cover many topics, which is why they are used in a range of industries and processes. The different kinds of evaluation essays can be used for the following instances and activities:

  • To create a book report or a review of a book’s content and how it has affected the reader
  • To identify critical points of a written work, whether it is a poem, another essay, or a research paper
  • To create a literature or literary review to fully identify the content of a literary piece
  • To give critique about an initial analysis or a full process
  • To support the processes of employment regularization or employee promotion
  • To assess and analyze the results of a reading activity
  • To add value to a recommendation letter
  • To analyze a research topic that can fully affect the entire research activity
  • To evaluate the work performance of either a student or an employee
  • To identify the strengths and weaknesses of an individual through a self-evaluation

Given the different ways an evaluation essay can be used, it is safe to say that many fields of expertise can benefit from this document. When creating your own evaluation essay, always keep in mind that its content must be relevant to the message you would like to share with your target readers.


Steps in Writing an Evaluation Essay

If you want to create an evaluation essay, be strategic about how you present the information used in the writing activity. Your evaluation essay will only be fully effective if there is an organized discussion of your evaluation as well as the facts that support your thesis statement.

Here is a basic essay writing guide that you may follow when writing an evaluation essay:

  • Be aware of your topic. The first thing you need to do when writing an evaluation essay is to become knowledgeable about the topic you will write about. As much as possible, research the subject of discussion so you can easily identify the characteristics you can evaluate and the criteria you will use for evaluation.
  • Make sure to have a set of criteria that can guide your evaluation. Once you are aware of your topic, set the criteria that will serve as the basis for your evaluation. If you identify the criteria that best fit the specific evaluation, your evaluation essay will be stronger and more effective.
  • Refer to samples and templates of evaluation essays. It helps to look at different kinds of evaluation essay samples and templates. These documents can familiarize you with what an evaluation essay is and how its details should be arranged and presented.
  • Create an evaluation essay draft. It is up to you whether to use a template as your guide; you can also just browse through samples and start your evaluation essay from scratch. One thing we highly suggest is to make a draft or an outline of the discussion you would like to have. This helps ensure that all the necessary information makes it into your final evaluation essay.
  • Start writing the content of your evaluation essay. With the help of your draft, write a thesis in the first paragraph of your essay. This is where you introduce the topic you will evaluate and state whether you view the subject positively or negatively. How you frame the thesis statement will depend on the context in which the essay will be used.
  • Incorporate evidence into your discussion so you can support your claims and opinions. After your thesis statement and the discussion of important details, the next paragraphs should contain your opinions as well as the evidence you used as references. End your evaluation essay with a firm statement of your conclusion.


Evaluation Essay as an Important Written Document

An evaluation essay should be taken seriously, especially when its content can affect other people or even an entire community. Since an evaluation essay is not only part of college essay examples but can also be used in business and corporate processes, you have to understand the weight of its effectiveness. Whether it is a self-evaluation essay or a project evaluation essay, always keep in mind that you should put together the evident facts and your statements in a professional and objective manner.

Whether it is a last-minute essay or a thoughtfully planned evaluation essay, being aware of the items discussed in this post can help you further improve the content and structure of your work. It will also be easier for you to come up with an evaluation that your readers can trust. Present all the details you need to discuss in an organized and informative manner so you can produce an evaluation essay that truly works.


The Predictive Edge: ChatGPT, LLMs, ML, and Asset Pricing


Useful Research Prompts - Updated Constantly

A collection of prompts I keep using over and over.


Feel free to send me some of your favorites or post them in the comments.

I’ll indicate which LLM (Claude or ChatGPT) is better for each prompt. So far, Claude wins at anything related to writing or thinking. Also, buy my book!

I’ll split it into sections:

  • Prompts for discussions and draft evaluation
  • Collection of helpful prompts when writing a paper
  • A prompt for idea evaluation

The horizontal dividers separate prompts from each other. Prompts are in quotes (the blue bar).

1. Discussions/Overall Paper Evaluation - Claude

When discussing a paper at an early stage. (The guidance is for the LLM, so that it has the summary available.)

I am discussing this paper. Guide me slowly through what they do, their method, and their contribution. Then, provide some areas of opportunities and how to address them. It is an early version, it is better to see focus on the opportunities.

Then follow with

Explain each part of what they do and their method in-depth

Continue with:

I am discussing the paper. Give me discussion slides based on all of the above. I need them to be self contained.

Obviously, it is not at the human level yet, but it helps brainstorm.

For AI/ML papers. I mostly use this for a quick look at my students' ideas.

Be honest, critical, and fair. Does this paper satisfy the necessary conditions for a finance AI/ML paper to be published? Elaborate on each point: The paper explores an interesting question. The paper shows that the proposed method works well and better than existing techniques applied in the literature. The method has to work out-of-sample (or after the knowledge cutoff date if using LLMs). Results using the new method give us new and different economic insights compared to old methods not just a more precise measure. Not just better results or showing that we can achieve something. What are the new insights?

Then, follow up with:

These are not sufficient conditions. What other issues the paper has that should be addressed in order for the paper to be publishable in a top-3 finance journal. First, list them. Then check whether the paper already addresses the points.

When evaluating your own paper holistically.

Pretend you are an extremely harsh referee for a top-3 finance journal. Write a comprehensive and critical referee report on this paper. First, write the summary of the paper. Then, focus on the major weaknesses of the paper. You must elaborate extensively on each potential weakness and back it up by referring to the specific text in the paper. You are doing this to help me anticipate any extremely critical referee report before submitting my paper, so I need you to be very strict. Pick on anything fair.

Follow up with:

Which of these are addressable? How?

2. Writing Help - Claude

I am worried I am not putting enough economic lessons in my paper. Is this correct?

You can copy and paste Google Slides in text format or LaTeX directly.

Here are my slides. And my motivation for the paper: Do I motivate the slides well? Is there anything missing?
Help with grammar and fluency. Leave the citations and latex as is
Better title? 10 options
Create abstract and keywords
I am working on my paper. I think I need to streamline this section better. Let's think of the layout first.
Let's start. Let's do paragraph format, not lists

3. Idea Evaluation - Claude

I am writing an article. Very speculative. Let's evaluate and brainstorm

The next prompt helps evaluate new potential ideas. Again, I use it mainly for the PhDs. Ideally, the input would have the following format.

  • A brief description and motivation of your idea
  • What are the three closest papers, and how would your paper contribute beyond what is available in the existing literature?
  • Why is your idea interesting and not a robustness check or simple extension?
  • The specific data you would use and how you would access this data
  • The specific variables of interest and how you would construct them
  • The exact main regression(s) (what y on what x)
  • What do you expect to get from the regression(s)?
  • Why would the results be interesting?

Then, attach the three closest papers’ abstracts or relevant sections. LLMs are not good at literature reviews, so you must add these yourself.

You are a leading expert in economics and finance research. Your task is to critically evaluate the potential of a given research idea, focusing primarily on its marginal contribution to the field. The user will provide information about the idea and the three closest related papers. Analyze the idea based on the following criteria: 1. Novelty and Marginal Contribution: - How does this idea differ substantially from the three closest papers provided? - Does it offer genuinely new insights, or is it merely an extension or application in a different setting? - What specific, significant contribution does this idea make to the field? - Does it improve our understanding of finance or economics? The idea should not be a robustness check or a simple extension. Examples of robustness checks: Applying an advanced technique and obtaining a similar result as previous papers. Changing the setup to a different region or international setting. Checking if the model from an existing paper is robust to parameter changes. 2. Methodological Innovation: - Does the idea propose any new methodological approaches? - If using existing methods, does it apply them in a novel way that generates new knowledge? - If in corporate finance, does it have a source exogenous variation? 3. Theoretical or Empirical Advancement: - How does this idea push the theoretical or empirical boundaries of the field? - Does it challenge or extend existing paradigms in a meaningful way? 4. Potential Impact: - If the research yields the expected results, how would it change our understanding of key economic or financial phenomena? - Could the findings influence policy, practice, or future research directions significantly? 5. Feasibility and Data: - Are the required data sources clearly identified? - Is the proposed methodology realistic and appropriate for addressing the research question? 6. Relevance and Timeliness: - How relevant is this idea to current academic debates or real-world issues in economics or finance? - Is it addressing a gap in the literature that is recognized as important by the field? Provide a concise, critical evaluation (maximum 250 words) addressing these points. Be direct, honest, and specific in your assessment. If the idea lacks sufficient novelty or marginal contribution, clearly explain why, referencing the closest papers provided. If it shows promise, highlight its unique strengths and any areas that need further development to maximize its contribution. After your evaluation, provide an overall rating of the idea's potential on a scale of 1 to 10, where: 1-3: Low potential, lacks significant novelty or marginal contribution 4-6: Moderate potential, needs substantial development to differentiate from existing literature 7-8: Good potential, offers notable contribution with some refinement needed 9-10: Excellent potential, highly novel with significant marginal contribution Rating: [Your rating here]
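If you reuse this idea-evaluation prompt often, it can save time to assemble it programmatically from your idea description and the three closest abstracts. The sketch below is not from the original post: it only builds the final prompt string, the helper name and inputs are invented, and sending the result to Claude or ChatGPT is left to whichever client you normally use.

```python
# Illustrative only: assembling the idea-evaluation prompt from its parts.
# The evaluation instructions would be the long prompt quoted above;
# it is truncated here to keep the example short.

EVALUATION_INSTRUCTIONS = (
    "You are a leading expert in economics and finance research. "
    "Your task is to critically evaluate the potential of a given research idea..."
)

def build_idea_prompt(idea_description, closest_abstracts):
    """Combine the instructions, the idea, and the three closest abstracts."""
    if len(closest_abstracts) != 3:
        raise ValueError("Provide exactly three closest-paper abstracts.")
    papers = "\n\n".join(
        f"Closest paper {i + 1}:\n{abstract}"
        for i, abstract in enumerate(closest_abstracts)
    )
    return f"{EVALUATION_INSTRUCTIONS}\n\nResearch idea:\n{idea_description}\n\n{papers}"

# Hypothetical inputs; paste the real abstracts yourself, since LLMs are
# unreliable at retrieving literature on their own.
prompt = build_idea_prompt(
    "Do retail option traders move prices around earnings announcements?",
    ["Abstract of paper A...", "Abstract of paper B...", "Abstract of paper C..."],
)
print(prompt[:200])
```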



Writing a Research Paper Introduction | Step-by-Step Guide

Published on September 24, 2022 by Jack Caulfield. Revised on September 5, 2024.

The introduction to a research paper is where you set up your topic and approach for the reader. It has several key goals:

  • Present your topic and get the reader interested
  • Provide background or summarize existing research
  • Position your own approach
  • Detail your specific research problem and problem statement
  • Give an overview of the paper’s structure

The introduction looks slightly different depending on whether your paper presents the results of original empirical research or constructs an argument by engaging with a variety of sources.

The five steps in this article will help you put together an effective introduction for either type of research paper.


Step 1: Introduce Your Topic

The first job of the introduction is to tell the reader what your topic is and why it’s interesting or important. This is generally accomplished with a strong opening hook.

The hook is a striking opening sentence that clearly conveys the relevance of your topic. Think of an interesting fact or statistic, a strong statement, a question, or a brief anecdote that will get the reader wondering about your topic.

For example, the following could be an effective hook for an argumentative paper about the environmental impact of cattle farming: "Are cows responsible for climate change?"

A more empirical paper investigating the relationship of Instagram use with body image issues in adolescent girls might use the following hook: "The rise of social media has been accompanied by a sharp increase in the prevalence of body image issues among women and girls."

Don’t feel that your hook necessarily has to be deeply impressive or creative. Clarity and relevance are still more important than catchiness. The key thing is to guide the reader into your topic and situate your ideas.


Step 2: Describe the Background

This part of the introduction differs depending on what approach your paper is taking.

In a more argumentative paper, you’ll explore some general background here. In a more empirical paper, this is the place to review previous research and establish how yours fits in.

Argumentative paper: Background information

After you’ve caught your reader’s attention, specify a bit more, providing context and narrowing down your topic.

Provide only the most relevant background information. The introduction isn’t the place to get too in-depth; if more background is essential to your paper, it can appear in the body .

Empirical paper: Describing previous research

For a paper describing original research, you’ll instead provide an overview of the most relevant research that has already been conducted. This is a sort of miniature literature review —a sketch of the current state of research into your topic, boiled down to a few sentences.

This should be informed by genuine engagement with the literature. Your search can be less extensive than in a full literature review, but a clear sense of the relevant research is crucial to inform your own work.

Begin by establishing the kinds of research that have been done, and end with limitations or gaps in the research that you intend to respond to.

Step 3: Establish Your Research Problem

The next step is to clarify how your own research fits in and what problem it addresses.

Argumentative paper: Emphasize importance

In an argumentative research paper, you can simply state the problem you intend to discuss, and what is original or important about your argument.

Empirical paper: Relate to the literature

In an empirical research paper, try to lead into the problem on the basis of your discussion of the literature. Think in terms of these questions:

  • What research gap is your work intended to fill?
  • What limitations in previous work does it address?
  • What contribution to knowledge does it make?

You can make the connection between your problem and the existing research using phrases like the following.

  • Although … has been studied in detail, insufficient attention has been paid to … (you will address a previously overlooked aspect of your topic).
  • The implications of the … study deserve to be explored further (you will build on something suggested by a previous study, exploring it in greater depth).
  • It is generally assumed that … However, this paper suggests that … (you will depart from the consensus on your topic, establishing a new position).

Step 4: Specify Your Objective(s)

Now you’ll get into the specifics of what you intend to find out or express in your research paper.

The way you frame your research objectives varies. An argumentative paper presents a thesis statement, while an empirical paper generally poses a research question (sometimes with a hypothesis as to the answer).

Argumentative paper: Thesis statement

The thesis statement expresses the position that the rest of the paper will present evidence and arguments for. It can be presented in one or two sentences, and should state your position clearly and directly, without providing specific arguments for it at this point.

Empirical paper: Research question and hypothesis

The research question is the question you want to answer in an empirical research paper.

Present your research question clearly and directly, with a minimum of discussion at this point. The rest of the paper will be taken up with discussing and investigating this question; here you just need to express it.

A research question can be framed either directly or indirectly.

  • This study set out to answer the following question: What effects does daily use of Instagram have on the prevalence of body image issues among adolescent girls?
  • We investigated the effects of daily Instagram use on the prevalence of body image issues among adolescent girls.

If your research involved testing hypotheses , these should be stated along with your research question. They are usually presented in the past tense, since the hypothesis will already have been tested by the time you are writing up your paper.

For example, the following hypothesis might respond to the research question above: "It was hypothesized that daily Instagram use would be associated with an increase in body image concerns and a decrease in self-esteem ratings."


Step 5: Map Out Your Paper

The final part of the introduction is often dedicated to a brief overview of the rest of the paper.

In a paper structured using the standard scientific “introduction, methods, results, discussion” format, this isn’t always necessary. But if your paper is structured in a less predictable way, it’s important to describe the shape of it for the reader.

If included, the overview should be concise, direct, and written in the present tense.

  • This paper will first discuss several examples of survey-based research into adolescent social media use, then will go on to …
  • This paper first discusses several examples of survey-based research into adolescent social media use, then goes on to …


Research Paper Introduction Examples

Full examples of research paper introductions are shown below, first for an argumentative paper and then for an empirical paper.


Are cows responsible for climate change? A recent study (RIVM, 2019) shows that cattle farmers account for two thirds of agricultural nitrogen emissions in the Netherlands. These emissions result from nitrogen in manure, which can degrade into ammonia and enter the atmosphere. The study’s calculations show that agriculture is the main source of nitrogen pollution, accounting for 46% of the country’s total emissions. By comparison, road traffic and households are responsible for 6.1% each, the industrial sector for 1%. While efforts are being made to mitigate these emissions, policymakers are reluctant to reckon with the scale of the problem. The approach presented here is a radical one, but commensurate with the issue. This paper argues that the Dutch government must stimulate and subsidize livestock farmers, especially cattle farmers, to transition to sustainable vegetable farming. It first establishes the inadequacy of current mitigation measures, then discusses the various advantages of the results proposed, and finally addresses potential objections to the plan on economic grounds.

The rise of social media has been accompanied by a sharp increase in the prevalence of body image issues among women and girls. This correlation has received significant academic attention: Various empirical studies have been conducted into Facebook usage among adolescent girls (Tiggermann & Slater, 2013; Meier & Gray, 2014). These studies have consistently found that the visual and interactive aspects of the platform have the greatest influence on body image issues. Despite this, highly visual social media (HVSM) such as Instagram have yet to be robustly researched. This paper sets out to address this research gap. We investigated the effects of daily Instagram use on the prevalence of body image issues among adolescent girls. It was hypothesized that daily Instagram use would be associated with an increase in body image concerns and a decrease in self-esteem ratings.

The introduction of a research paper includes several key elements:

  • A hook to catch the reader’s interest
  • Relevant background on the topic
  • Details of your research problem and your problem statement
  • A thesis statement or research question
  • Sometimes an overview of the paper

Don’t feel that you have to write the introduction first. The introduction is often one of the last parts of the research paper you’ll write, along with the conclusion.

This is because it can be easier to introduce your paper once you’ve already written the body; you may not have the clearest idea of your arguments until you’ve written them, and things can change during the writing process.

The way you present your research problem in your introduction varies depending on the nature of your research paper. A research paper that presents a sustained argument will usually encapsulate this argument in a thesis statement.

A research paper designed to present the results of empirical research tends to present a research question that it seeks to answer. It may also include a hypothesis: a prediction that will be confirmed or disproved by your research.
