Self-review template
Structured feedback for your own (or a peer’s) project
This course has no instructor and no grades, so feedback is something you generate for yourself. Reading a project’s disclosure carefully (your own or a peer’s) is the single best way to internalise the disclosure rubric and to notice the gap between what you did and what you wrote.
The template below mirrors the four-dimensional disclosure rubric, plus two open-ended prompts. Use it on your own work first. If you’re taking the course with a study group, also use it on each other’s projects.
How to use this template
Self-review (the default). Set the project aside for a day after you think it’s done. Then come back, read the project as if a stranger wrote it, and fill in the template against your own work. The 24-hour gap is doing real work. Fresh eyes catch the gaps your tired eyes wrote.
Peer review (optional). If you’re taking this course with a study group:
- Read the project (the analysis, brief, or protocol) before you read the disclosure.
- Then read the disclosure paragraph at the bottom of the project.
- Fill in the four rubric questions and the two open-ended prompts below.
- Share your review with the author. There’s no submission system. This is about feedback, not bureaucracy.
Aim for 250 to 400 words total. Be specific, be useful, be kind. See the note on tone at the end.
Template (copy and fill in)
Reviewer: [your name, or “self-review” if reviewing your own work]
Reviewing: [project title, yours or the author’s]
Date: [YYYY-MM-DD]
1. Tools listed?
The disclosure names which tools were used (specific tool, model or version, access tier).
Brief note (one or two sentences): …
2. Use described?
The disclosure says what the AI did concretely (drafted X, suggested Y, rewrote Z), not just “for help”.
Brief note: …
3. Verification stated?
The disclosure names what was checked and how (ran tests, verified citations on PubMed, ran code on test data, cross-referenced documentation).
Brief note: …
4. Rejections noted?
The disclosure lists at least one AI suggestion that was rejected, with a reason.
Brief note: …
5. One thing this project does well
One to three sentences. Be specific. Point at a section, a decision, or a sentence.
6. One thing to improve next time
One to three sentences. Frame as a discernment failure or a missing diligence step, not a personal critique. See the tone note below.
A note on tone
Discernment is a skill, not a personality trait. The same person can practise good discernment on Tuesday and miss a fabricated citation on Wednesday. What matters is that the verification habit exists, not that any one learner “is good at” AI critique.
When you flag something missed, name the move that would have caught it (“a step-3 DOI resolution would have caught this; the link goes to a different paper”), not the person (“you didn’t verify”). The first comment is actionable. The second feels like an accusation, even when it is not meant that way. This applies just as much when you’re reviewing your own work. Be clinical, not self-flagellating.
The disclosure dimensions are intentionally about practice, not quality. A project can be excellent science with a thin disclosure. That gap is exactly what the review should flag, and flagging it is what makes the rubric stick the next time you write a disclosure.