Formal Critique

There are many acceptable formats for critiquing the scientific literature. The purpose of this outline and the following article, "Bringing up scientists in the art of critiquing research," is to provide ideas and guidance to those who may not already have a preferred writing style.

Suggested Critique Format:

Heading
• Critique/Review
• Reviewer
• Your Major, Department, University, City, State
• Article Title
• Author(s)
• Journal (include name of journal, volume, pages)

Overview/Brief Description
Without going into detail or injecting your personal opinion, provide a summary of the study. Include the following information:
• The objective of the study and why the author believes it to be significant.
• How the author investigated his/her objective.
• The most noteworthy results.
• The author's main conclusions.

Strengths and Weaknesses (1-2 paragraphs on each)
The following is a list of important aspects to consider when evaluating scientific literature. Not all may be relevant to a particular study. Consider these points and address a subset of them, or other relevant points, in the strengths or weaknesses portion of your review. Remember that justification is key when addressing strengths and weaknesses.
• How appropriate is the title?
• Does the study have scientific merit? Did it contribute to society's understanding of ecology?
• Were the methods sufficient to test the author's objective?
• Could the study be duplicated to acquire similar results?
• Are figures and tables well organized and necessary? Do the results address the main objectives of the study?
• Are the conclusions supported by the results?
• Is the paper well written and easy to follow?
• Does this study inspire additional research? If so, provide examples.

*Clarity and conciseness should be emphasized in your critiques. Avoid scientific jargon. The critique is limited to 2 pages (double spaced).
Bringing up scientists in the art of critiquing research
by Barbara J. Kuyper (from BioScience 1991. 41(4):248-249)

In addition to factual knowledge of a given discipline, scientifically literate college graduates need analytical skills to interpret, apply, and communicate the scientific information they have acquired (AAAS 1990, NAS 1989). For research scientists, analytical skills are essential in writing, critiquing, revising, and defending research proposals and articles and reviewing the research of other scientists. Critical thinking and writing are activities integral, rather than peripheral, to scientific research. As Sidney Perkowitz (1989) of Emory University writes, "I have learned that when I write a research paper I do far more than summarize conclusions already neatly stored in my mind. Rather, the writing process is where I carry out the final comprehension, analysis, and synthesis of my results" (p. 353). But graduate students rarely receive formal training in thinking or writing about research. Many become good scientists who are nonetheless severely handicapped in communicating their own research and in eliciting useful assessments of it from others. With a good analytical mind and a few other tools at hand, however, a scientist at any career stage can learn the art of critiquing research.

Critical assessment of research articles

Traditionally, the scientific method involves formulating a hypothesis, designing an experiment to test the hypothesis, collecting data, and interpreting the data. The structure of research articles (called IMRAD) parallels this sequence: introduction, including statement of objective; methods; results; and discussion. The model for conducting research and the structure for presenting it have variations, but the basic analogy remains. Research is conducted and presented by the scientific method, and it can also be analyzed by using the same logical sequence of steps.
Critical assessment of a research article appropriately occurs at several stages. The author critiques the first draft and revises it accordingly. Friendly colleagues review the revised draft, and the author revises the manuscript again in the light of their suggestions. These pre-submission critiques and revisions are intended to improve the written presentation of research, short-circuit unfavorable reviews, and decrease time to publication. On submission, the article undergoes peer review to determine acceptability for publication. When an article enters the scientific literature, it becomes open to scrutiny by other scientists, as well as by journalists, politicians, and the general public, and at this stage a scientist's reputation can be firmly established or irrevocably damaged.

The value of being able to self-critique manuscripts and to have confidence in the critique cannot be overemphasized. A scientist should ask, "What was my bias in carrying out procedures or in collecting data? Did I want my results to happen?" Scientists are human and thus subjective, and awareness of one's own subjectivity is essential in preparing objective research results for presentation to the scientific community (Harper 1990). For the same reason, scientists need to learn how to elicit useful critiques from colleagues. "Is my bias showing? Can you tell what I'm most afraid of? Can you detect any weaknesses in my experimental design or methodology that an incisive reader will most certainly expose if you don't? As a friendly colleague, I'd like you to tell me before a journalist tells the world!"

Developing skills in critiquing research

Some tools are needed for training scientists to critique their own and their colleagues' research articles. An analytical mind-set is basic to all facets of scientific research, including critical analysis of the scientific literature.
In editing manuscripts for research scientists, I prepare a written summary that assesses the article section by section. This editorial critique is designed to give the author an overview of the manuscript rather than getting bogged down in editorial clean-up work or a sentence-by-sentence analysis. A colleague's written critique also provides an overview, but it emphasizes design and interpretation of research rather than presentation. The checklist, a traditional editors' tool, is also useful in scrutinizing scientific manuscripts from authors', statisticians', and reviewers' standpoints (Applewhite 1979, CBE Style Manual Committee 1983, Gardner et al. 1986, Squires 1990). I have developed a checklist for critiquing a research article at an early draft stage that both the author and in-house reviewers can use (see box). The checklist focuses on structure, or organization, and its interrelationship with content. It is based on the IMRAD structure but can be modified for other types of journal articles.

In assessing articles with the aid of the checklist, fluorescent color markers are useful tools that give authors and reviewers something useful (and playful) to do. I use a yellow marker to call attention to statements of objectives at various points in the manuscript (and discrepancies among them) and a rose marker to identify undefined or misused terms. A critique of the introduction alone (steps 1-4) sometimes unravels the entire article. Discrepancies between the title of the article and the stated objective at the end of the introduction throb in the fluorescent color. The researcher may discover an ambiguity in thinking about the purpose of the research that was previously concealed but is now glaringly obvious. A careful scrutiny of research methods (steps 5-8) may expose fatal flaws in sample selection or experimental design that invalidate the results.
This disturbing revelation can be beneficial over the long run, however, if it helps the scientist to cut losses and move on to better-defined research. A review of methods on completion of a research project can also emphasize the importance of choosing an appropriate experimental design at the onset and evaluating the research project as it develops. The results, particularly as presented in tables and illustrations, almost inevitably require drastic redesign and revision. Selecting, aligning, and labeling data appropriately in tables require as much thought as does the textual description of results. Ideally, the author has designed the tables before writing the results section, and steps 9-12 on the checklist direct reviewers to examine the tables first. A table should be self-explanatory, with a title that accurately and concisely describes content and column headings that accurately describe information in the cells. Instructions for preparing scientific tables (CBE Style Manual Committee 1983) and illustrations (CBE Scientific Illustration Committee 1988) are invaluable tools in writing and revising research articles.

Authors often seem mentally fatigued by the time they have defined in writing what their research was really about, struggled with statistical analysis of data, sorted out meaningful results, and revised tables again and again. Consequently, the discussion often degenerates into a feeble rewording of results rather than an interpretation of the research and its status in relation to other studies in the field. In critiquing the discussion section (steps 13-16), the author can easily detect mere repetition of results. To validate and refine interpretation, however, a colleague's probing questions are probably more fruitful at this stage than is self-examination. The overview section of the checklist (steps 17-20) requires the author or reviewer to step back and reconsider the manuscript as a whole.
Does the author think and write logically? Is the organizational sequence of the paper logical and appropriate to content? Are the objectives and results of the research stated clearly? Does the article fit the stated purpose of the journal to which it is being submitted?

Conclusions

After all is said and done, critiquing research is intellectual fun. The ability to scrutinize a piece of writing with a critical eye requires time for leisurely contemplation, an analytical mind (the scientific mind?), a zest for arguing with colleagues, and the ability to set ego aside. If we do not assess our own research, journal reviewers and subsequent readers will do it for us, with the potential for much more badly bruised egos and scientific reputations.

Acknowledgments

I thank Stephen B. Kritchevsky, Department of Biostatistics and Epidemiology, University of Tennessee, Memphis, and Jerry M. Williams, Department of Horticulture, Virginia Polytechnic Institute and State University, for critiquing this manuscript.

References cited

American Association for the Advancement of Science (AAAS). 1990. The Liberal Art of Science: Agenda for Action. AAAS, Washington, D.C.
Applewhite, L. 1979. Examination of the medical/scientific manuscript. Journal of Technical Writing and Communication 9:17-25.
CBE Scientific Illustration Committee. 1988. Illustrating Science: Standards for Publication. Council of Biology Editors, Bethesda, MD.
CBE Style Manual Committee. 1983. CBE Style Manual: A Guide for Authors, Editors, and Publishers in the Biological Sciences. 5th edition. Council of Biology Editors, Bethesda, MD.
Gardner, M.J., D. Machin, and M.J. Campbell. 1986. Use of check lists in assessing the statistical content of medical studies. Br. Med. J. 292:810-812.
Harper, A.E. 1990. Critical evaluation - the only reliable road to knowledge. BioScience 40:46-47.
National Academy of Sciences (NAS), Committee on the Conduct of Science. 1989. On Being a Scientist.
National Academy Press, Washington, D.C.
Perkowitz, S. 1989. Commentary: can scientists learn to write? Journal of Technical Writing and Communication 19:353-356.
Squires, B.P. 1990. Statistics in biomedical manuscripts: what editors want from authors and peer reviewers. Can. Med. Assoc. J. 142:213-214.

Barbara J. Kuyper is an assistant professor in the Department of Health Informatics, University of Tennessee, Memphis, TN 38163. She is responsible for developing the scientific writing component of a curriculum for graduate students planned to include training in information science, analytical skills, scientific communication, and the roles and responsibilities of scientists in the world community. She teaches a graduate course on writing journal articles and a faculty workshop on critiquing research articles. © 1991 American Institute of Biological Sciences.

Checklist for critiquing a research article

Title:
Author:

Introduction
1. Read the statement of purpose at the end of the introduction. What was the objective of the study?
2. Consider the title. Does it precisely state the subject of the paper?
3. Read the statement of purpose in the abstract. Does it match that in the introduction?
4. Check the sequence of statements in the introduction. Does all information lead directly to the purpose of the study?

Methods
5. Review all methods in relation to the objective of the study. Are the methods valid for studying this problem?
6. Check the methods for essential information. Could the study be duplicated from the information given?
7. Review the methods for possible fatal flaws. Is the sample selection adequate? Is the experimental design appropriate?
8. Check the sequence of statements in the methods. Does all information belong in the methods? Can the methods be subdivided for greater clarity?

Results
9. Scrutinize the data, as presented in tables and illustrations. Does the title or legend accurately describe content?
Are column headings and labels accurate? Are the data organized for ready comparison and interpretation?
10. Review the results as presented in the text while referring to data in the tables and illustrations. Does the text complement, and not simply repeat, data? Are there discrepancies in results between text and tables?
11. Check all calculations and presentation of data.
12. Review the results in the light of the stated objective. Does the study reveal what the researcher intended?

Discussion
13. Check the interpretation against the results. Does the discussion merely repeat the results? Does the interpretation arise logically from the data, or is it too far-fetched? Have shortcomings of the research been addressed?
14. Compare the interpretation to related studies cited in the article. Is the interpretation at odds or in line with other researchers' thinking?
15. Consider the published research on this topic. Have all key studies been considered?
16. Reflect on directions for future research. Has the author suggested further work?

Overview
17. Consider the journal for which the article is intended. Are the topic and format appropriate for that journal?
18. Reread the abstract. Does it accurately summarize the article?
19. Check the structure of the article (first headings and then paragraphing). Is all material organized under the appropriate heading? Are sections subdivided logically into subsections or paragraphs?
20. Reflect on the author's thinking and writing style. Does the author present this research logically and clearly?

Explicit Authorship
by Carlos Galindo-Leal (from Bull. Ecol. Soc. Amer. 1996. 77(4):216-220)