10 tips for reviewing scientific manuscripts – and 5 red flags
A seasoned journal editor and professor gives advice to peer reviewers
10 tips for reviewing
1. Ensure that the subject is within your area of expertise. If you are an interventional cardiologist, for example, it would probably be best to decline an invitation to review a manuscript on the pathogenesis of an arrhythmia.
2. Read the abstract first to see if what the authors are stating makes logical sense and if it is written in a way that is comprehensible. Some manuscripts involve excellent work and interesting observations, but they are so poorly written that it is difficult to understand what the author is saying. This is a relatively common problem with authors whose native language is not English. If the work reported in the manuscript looks interesting and valuable, it should be sent back for editing by a native English speaker or professional translator.
3. Determine whether the observation made and reported is new or whether it reproduces previously published observations. Clearly, the more original the observation, the more likely the manuscript should be accepted for publication.
4. Examine the tables and figures to see whether the legends are clear and whether the tables and figures demonstrate what is stated in the text. Frequently, material presented in a table does not need to be repeated in detail in the results section.
5. Look to see if the statistical analysis makes sense. Are the differences reported of sufficient magnitude to be of biological or clinical significance? Sometimes a statistically significant difference between two or more groups of patients is so small as to be (as my mentor, Lewis Dexter, used to say) “biologically insignificant.”
6. Examine the methods to make sure the authors knew what they were doing. If their laboratory analyses were just run on a commercial kit without input from someone in the hospital or medical school laboratory, these results may be of lower quality and higher variability. Make sure the study is based on a sufficient number of patients or measurements. Ask a biostatistician to review the manuscript if there is any question of the reliability of the analyses performed.
7. Read the discussion and see whether it makes sense and reflects what the data in the article actually show. Look for unnecessary conjecture or conclusions that are not founded on the evidence presented.
8. Note whether the manuscript is concise and well organized. Most of the manuscripts I receive could be shortened, and would be improved by it.
9. Note whether the quality of the figures or photos is adequate for accurate reproduction. If not, ask the journal staff (for example, the managing editor) what is required. Then, as part of your review, you can recommend that the authors have an expert at their institution reformat the figures to meet the specified requirements.
10. Please take this job seriously. It is a professional honor to be invited to review a scientific manuscript; the journal’s reputation depends in part on this peer review process.
5 reasons to pause
Stop and contact the journal’s editorial staff if you can answer “yes” to any of these questions:
1. Have the authors neglected to follow the instructions that are part of the journal’s submission criteria?
2. Are there potential conflicts of interest either declared or not declared but known by the reviewer? If the review is not blinded, i.e., you know who the authors are, do they have a “track record” of working in this area, and are they from a reputable institution?
3. Is there any doubt that appropriate informed consent was obtained (for human experiments), or missing documentation that a human or animal protection committee reviewed the protocol before the study began?
4. Is the manuscript full of typographical errors or mistakes in references, implying a sloppy job of putting it together?
5. Is there a chance that scientific fraud or plagiarism is involved in this manuscript? Subjectively, do you believe what the authors are telling you, or do you suspect some consistent error in the hypothesis, methods, analysis of the data, etc.?