
Writing Papers

To me, writing papers is a primary mechanism for doing research, not just reporting it. Often starting a paper draft before beginning the project makes sense—this helps us consider what experiments to perform to make our case.

Write extensively and use complete sentences—even in notes. Writing facilitates thinking.

Disseminating our work is key to our job as scientists. Authoring papers isn't a numbers game, but we can't ignore that publication number and quality are perhaps the most important criteria by which we're evaluated. The writing process also helps with thinking and shapes projects.

"Interesting and unpublished" is equivalent to "non-existent."

G.M. Whitesides

In writing papers, we should aim to:

  • Have one clear message per paper.
  • Remember that giving credit to others doesn't diminish the credit we receive for our paper.
  • Convey intuition first, not second. Use examples.

Process

  1. Create an Overleaf project as soon as you have evidence you won't abandon the project
  2. Create a paper file based on our template
  3. Brain dump ideas, figures, and text in the paper file as you progress. Periodically rewrite parts and reorganize. Use the draft to guide experiments: do you find interesting angles? Do you need specific figures to make your point? Create figures in the final size from the start (most journals lay out figures at single-column (8 cm) or double-column (16 cm) width)
  4. Iterate continuously
  5. Clean up references
  6. Separate sections into files as you write the full draft

Checklist

  • Did I read the entire piece aloud?
  • Did I carefully check that a patent application is not relevant? If unsure, raise the issue with Kevin.

Formatting

  • Did you run a spellchecker such as LanguageTool?
  • Did you check for conciseness with Hemingway app (https://hemingwayapp.com) and The Writer’s Diet?

General Content

  • Is there one clear message?
  • Is every word, sentence, and paragraph understandable by a non-expert? Assume a first-year undergraduate as the reader
  • Does the cure match the disease? Does your introduction's problem statement match your solution?
  • Do you show, not tell? Do you use specifics instead of abstractions? Did you present ideas as observable truths, not arguments?
  • Are your claims refutable?
  • Are you making claims you are not 100% sure about? If yes, delete them (or indicate uncertainty).
  • Does every claim have linked evidence?
  • Do you use active voice?
  • Do you avoid nominalizations where you could use active verbs?
  • ❌ "there was an appearance of improvement" → ✅ "it improved"
  • Can you delete any word, sentence, or paragraph without changing meaning? If yes, delete it
  • Are there adjectives and adverbs without objective reference? (e.g., "impressive," "best")
  • Does the outline follow logical flow instead of chronological order?
  • Did you avoid excessive metacommentary ("In this section I will discuss...")?
  • Did you replace weak verb-noun combinations with strong verbs?
  • ❌ "make a decision" → ✅ "decide"
  • ❌ "perform an analysis" → ✅ "analyze"
  • Do all pronouns have clear antecedents?
  • ❌ "This shows significant promise" → ✅ "This approach shows significant promise"
  • ❌ "It was observed that..." → ✅ "We observed that..."
  • Do you avoid jumping to conclusions without building dramatic tension?
  • ❌ "Using compound X, we achieved 95% efficiency"
  • ✅ "Standard approaches failed due to Y limitation. By addressing Y through X, we achieved 95% efficiency"

Introduction

  • Does your first sentence say something so general that it could be prepended to any paper in the field? If yes, delete it

Content Verification

  • Did you double-check every number in text, tables, and plots? Did you record the source for each number in the LaTeX source?

Abstract

  • Did you follow the Nature Summary Paragraph Template guidelines?

References

  • Are all preprints shown in the same style? We default to using article as the BibTeX type with journal = {arXiv preprint arXiv:<arxivnumber>} (see the example after this list)
  • Are there spelling mistakes?
  • Are journal names consistently abbreviated? We default to abbreviated journal names
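
A minimal BibTeX entry in this style could look like the following (the author, title, year, and arXiv number are placeholders):

    @article{doe2024example,
      author  = {Doe, Jane and Smith, John},
      title   = {An Example Preprint Title},
      journal = {arXiv preprint arXiv:2401.00001},
      year    = {2024}
    }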

Pet peeves

  • Use a comma after introductory words such as “however”, “therefore”, and “additionally”
  • We use the Oxford comma to avoid confusion (“a, b, and c” versus “a, b and c”)
  • Numbers less than ten should be spelled out
  • All units have a space between the number and the unit (not “400uM” but “400 uM”)
  • I tend to avoid words like “transformative”, “breakthrough”
  • In our field, the meaning of ‘significant’ is ‘statistically significant’, not ‘large’.
  • Distinguish between percent and percentage points: when a rate changes from 20% to 30%, it increased by 10 percentage points (not 10%), or alternatively by 50% relative to the original value. Do not use "%" or "percent" when you mean "percentage points". Use "percentage points" when you mean the difference between two percentages.

Authorship

Authorship is complicated. Science is a team effort, but current evaluation mechanisms focus on attributing achievements to individuals. Contributions to scientific outputs are multidimensional—from ideation and implementation to writing and reviewing. Current author lists force us to project this multidimensional data onto one line, which loses information.

While this system exists, we must work within it using two approaches:

  • Be transparent in listing every member's contributions
  • Use shared authorships liberally while ensuring the author list faithfully represents impact

Key principles:

  • First and last authors are people without whom the work wouldn't exist: they made it happen
  • Shared authorship is valuable: we lose little by sharing the first-author position, but we can better represent that the work wouldn't have happened without any of the individuals in the shared author set

Reference: Consult DFG guidelines on good scientific practice.

LaTeX

Whenever possible, I want to use LaTeX for authoring papers, ideally using Overleaf.

Start with this guide if you don't know how to use LaTeX.

When using LaTeX, I like to do the following things (a combined preamble sketch follows the list):

  • use microtype - This package makes the text look nicer
  • use cref (from the cleveref package) - It makes referencing easier, e.g., \Cref{fig:figure_1} instead of Fig.~\ref{fig:figure_1}
  • use siunitx for units, e.g. \SI{10}{\kilo\gram}
  • use

    \clubpenalty=10000
    \widowpenalty=10000
    \displaywidowpenalty=10000
    
  • use booktabs and tabularx for nice tables (no vertical lines)

  • where possible, use pdf for figures and try to create figures in one or two-column size (but nothing in between)
  • Use \usepackage[backref=page]{hyperref} instead of simply \usepackage{hyperref} to make it easier for a reader to jump to the references and back.
  • In each figure and table environment, leave a reference to the script you used to generate it as well as a link to the data source.
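
As a rough sketch (illustrative only, not a required setup), a preamble combining these suggestions could look like this:

    \documentclass{article}

    \usepackage{microtype}              % nicer typesetting
    \usepackage{siunitx}                % units, e.g. \SI{10}{\kilo\gram}
    \usepackage{booktabs}               % \toprule, \midrule, \bottomrule for tables
    \usepackage{tabularx}
    \usepackage[backref=page]{hyperref} % back-references from the bibliography
    \usepackage{cleveref}               % load after hyperref; provides \cref and \Cref

    % avoid orphan and widow lines
    \clubpenalty=10000
    \widowpenalty=10000
    \displaywidowpenalty=10000

    \begin{document}

    The sample weighs \SI{10}{\kilo\gram}.

    \begin{table}
      \centering
      \caption{Example booktabs table (no vertical lines).}
      \label{tab:example}
      \begin{tabular}{ll}
        \toprule
        Sample & Mass \\
        \midrule
        A      & \SI{10.0}{\kilo\gram} \\
        B      & \SI{12.5}{\kilo\gram} \\
        \bottomrule
      \end{tabular}
    \end{table}

    \Cref{tab:example} lists the sample masses.

    \end{document}

Note that cleveref should be loaded after hyperref; the reverse order breaks cross-referencing.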

Rebuttals

The reviewer is almost always right. This doesn't mean their ratings are correct or their reasoning must make sense, but there's always some reason behind their opinion. Even if we think we explained something clearly: at least one reviewer didn't understand it, so we need to clarify. Search for the reason behind the reviewer's opinion and address it directly. Thank reviewers when you mean it—when they genuinely improved the paper. Don't be dishonest or repeat thanks mechanically.

Rebuttals for Journals

  • In the point-by-point response, most of the text should be quotes of the changes made to the main text/appendix. Do not give a lengthy discussion: if the reviewer wanted something clarified or added, it is something readers also deserve to see.
  • In the point-by-point response, it can be useful to lead with some comments about overarching major changes you want to highlight. This is particularly useful if a lot of rework happened in the revision.
  • You will need to create a document with highlighted differences. You can use latexdiff for this. If you include multiple LaTeX files in one main file, you might need to use latexpand to collect all LaTeX content into one file. You can directly use latexdiff on Overleaf.

Rebuttals for ML Conferences

Know Your Audiences

  • Reviewers: Have read your paper (to varying degrees) but may have forgotten details or misunderstood them initially.
  • Area Chairs (ACs): Likely less familiar with your work. Assume they'll only read the reviews and rebuttal.

Your Goals

  • For reviewers: Clarify doubts, answer questions, correct misunderstandings, push back on mischaracterizations, and demonstrate good-faith effort to incorporate feedback.
  • For ACs: Convince them you made a good-faith effort, present a representative summary of reviews, show how reviewer concerns were addressed, call out bad-faith reviewing, and help them make decisions. Critical insight: ACs make the decisions, but we often focus too much on reviewers. Provide evidence you addressed concerns based on the rebuttal alone—people should understand what's happening without knowing the paper. Always mention the positives reviewers highlighted. Start with these.

The Process

  1. The rebuttal process begins right after you submit the paper. Since you might have rushed for the deadline, you might be aware of issues that need addressing. Use the time before the rebuttal to fix these issues and already anticipate reviewer comments.
  2. Clear your calendar for the rebuttal period. You need time to think, write, and revise.
  3. Itemize reviewer comments immediately after receiving them in a shared document. Use color coding:
    • Red – requires extra experiments
    • Orange – requires writing changes
    • Green – noteworthy praise of the paper
  4. Brain dump possible responses right after that
  5. Write the draft rebuttal: Start with the positives, then address concerns by importance (and quality of response). Let reviewers speak for themselves: quote their comments, then respond
  6. Review and revise
  7. Post your rebuttals as soon as possible. This gives reviewers enough time to read (and raise their score!). Key difference from journal rebuttals: Do the "show" part in the rebuttal, not the paper. Don't say "we will explain XY in the paper"; explain it in the rebuttal and mention you'll add it to the paper.

Rebuttals for Journals

Apply "show, don't tell" to rebuttals. Don't discuss with reviewers—show them changes in the main text responding to their comments. Nearly always, some change addresses their comment and improves the paper. Response text should rarely be longer than the quoted change in the main text.

The Process

  1. Itemize reviewer comments immediately in a shared document using our template. Create a new folder in the Overleaf project
  2. Brain dump possible responses right after that
  3. Write the point-by-point response (a rough layout sketch follows this list):
    • Reference responses to other reviewers when relevant
    • Include all reviewers' comments as provided
    • Respond to each comment before moving to the next
    • Address all points
    • Make it self-contained: include all main text changes in the response
    • Begin each response with a direct answer to the point raised
    • Add a summary of the main changes made in response to comments
  4. Review and revise
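
Our point-by-point template lives in the Overleaf project; purely as an illustrative sketch (the macros and colors below are hypothetical, not our actual template), the layout could look like this:

    \documentclass{article}
    \usepackage{xcolor}

    % hypothetical helper macros for quoting reviewers and marking changed text
    \newcommand{\reviewercomment}[1]{\par\noindent\textbf{Reviewer comment:} \textit{#1}\par}
    \newcommand{\response}[1]{\par\noindent\textbf{Response:} #1\par}
    \newenvironment{changed}{\begin{quote}\color{blue}}{\end{quote}}

    \begin{document}

    \section*{Response to Reviewer 1}

    \reviewercomment{The relationship between X and Y is unclear.}
    \response{We agree and have clarified this in Section~2. The revised text reads:}
    \begin{changed}
      X increases with Y because \ldots
    \end{changed}

    \end{document}

This keeps the structure the checklist asks for: the reviewer's comment, a direct answer, and the quoted change from the main text.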


Galley Proofs

Galley proofs represent the final opportunity to catch errors before publication. Publishers often modify figures, formatting, and occasionally introduce errors during the production process. A systematic review process is critical to maintaining scientific integrity.

Prerequisites

To enable comprehensive verification, maintain these standards during manuscript preparation:

During Figure/Table Creation

  • Embed source references in LaTeX comments for each figure and table

    % Figure generated by: scripts/analysis/plot_performance.py
    % Data source: data/processed/model_results_2024.csv
    % Last updated: 2024-12-15
    \includegraphics{figures/performance_comparison.pdf}
    

  • Maintain clean, documented analysis scripts that can be re-run

  • Preserve exact data files used for each analysis
  • Version control all analysis code and track which version generated each figure
  • Document computational environment (package versions, etc.) for reproducibility

File Organization Standards

  • Standardized directory structure for data, scripts, and outputs
  • Clear naming conventions linking figures to generating scripts
  • Archived snapshots of analysis environment at submission time

Note that much of this would be automated if you used a tool like showyourwork.

Workflow

  1. When galley proofs arrive, the first author must:
    • Create a shared tracking document (Google Sheets or similar) with all reviewers. This should include:
      • Reviewer names
      • Checklist items (make separate entries for every figure and table as well as one for the data points in the text)
      • Status of each item (e.g., "Not started", "In progress", "Completed")
    • Set a deadline for review completion
    • Assign reviewers (minimum three people; the first and corresponding authors must always be included)
    • Distribute original manuscript files for comparison alongside the galley proofs

Reviewers should budget at least four hours for this process.

  2. Systematic Review Process. Each reviewer must independently complete the following checklist:

    Content Verification

    • Trace every figure back to source data and code
    • Verify figure against original submitted version (minimum standard)
    • Ideally, re-run analysis code to confirm figure accuracy
    • Check that data files match what was actually used
    • Validate all labels, legends, axis titles are correct
    • Confirm figure order and numbering
    • Verify figure quality and resolution
    • Trace every table back to source data
    • Verify table against original submitted version (minimum standard)
    • Ideally, re-run script that generated the table
    • Confirm headers and formatting match source
    • Verify table order and numbering
    • Cross-check all in-text citations
    • Verify author names, journals, and publication years
    • Validate all equations and mathematical expressions
    • Confirm all author names, affiliations, and contact information

    Technical Elements

    • Verify acknowledgments and funding information
    • Confirm supplementary material references

    Quality Control

    • Read entire paper aloud or carefully to catch typographical errors. Consider using text-to-speech tools for this. It is often easier to catch errors this way. Alternatively, or in addition, read the paper backwards, starting from the end.
    • Check for formatting inconsistencies
    • Verify all hyperlinks and DOIs work correctly

    Every reviewer must track the status of their checklist item in the shared document.

  3. Final consolidation. Before approving proofs, the first author must:

    • Verify that all checklist items are completed and that changes have been implemented
    • Archive all document versions, raw data and scripts. Use a public repository like Zenodo or Figshare to ensure long-term access.

Affiliations

FSU employees

  • Laboratory of Organic and Macromolecular Chemistry (IOMC), Friedrich Schiller University Jena, Humboldtstrasse 10, 07743 Jena, Germany

HZB employees via HIPOLE

  • HIPOLE Jena (Helmholtz Institute for Polymers in Energy Applications Jena), Lessingstrasse 12-14, 07743 Jena, Germany

  • Helmholtz-Zentrum Berlin für Materialien und Energie GmbH, Hahn-Meitner-Platz 1, 14109 Berlin, Germany

Additional for Kevin

  • Center for Energy and Environmental Chemistry Jena (CEEC Jena), Friedrich Schiller University Jena, Philosophenweg 7a, 07743 Jena, Germany
  • Jena Center for Soft Matter (JCSM), Friedrich Schiller University Jena, Philosophenweg 7, 07743 Jena, Germany
