Candidate evaluation forms: Templates for fair hiring

TL;DR

  • Candidate evaluation forms turn interviews into evidence, so hiring decisions are consistent, quick, and fair.
  • Use role-relevant criteria, a clear scoring rubric, and space for verbatim evidence for every interviewer.
  • Standardise across stages: CV screen, structured interview, task, and final review.
  • Track pass-through, average scores by criterion, and rater alignment to spot bias and improve the hiring process.
  • Sapia.ai can handle the first mile with mobile-first structured interviews, explainable scoring, and interview scheduling, while hiring managers stay in charge of decisions.

A blind, automated evaluation such as a structured AI interview is a strong first step, because it removes much of the human bias from initial screening. Then follow with a structured evaluation process that helps interviewers stay objective.

A solid candidate evaluation form is one of the simplest ways to improve hiring quality. It gives interviewers a shared structure, helps you compare candidates on evidence rather than instinct, and protects the candidate experience by keeping outcomes timely and transparent. 

What is a candidate evaluation form?

A candidate evaluation form is a single source of truth that captures how interviewers assess skills, behaviours, and values against a role profile. It usually includes a rubric, anchored questions, numeric ratings, and space for short notes that quote what the candidate said or did. Consistent use reduces unconscious bias, improves the quality of discussion, and speeds up offers without sacrificing fairness.

Expert advice on what makes effective candidate evaluation forms

Before you jump to templates, it helps to align on what good looks like.

1) Role profile and success criteria

  • Summary of the job and the outcomes that matter in the first 90 days.
  • The few must-have skills and behaviours that predict success.

2) Structured prompts and tasks

  • 4 to 6 behavioural questions tied to real work.
  • 1 short, role-relevant task or work sample.

3) Scoring rubric

  • A 1 to 5 or 1 to 4 scale, with behavioural anchors for each level.
  • Weighting for critical criteria if needed.

4) Evidence boxes

  • Space to capture verbatim snippets or observable actions.
  • Notes must reference the prompt or task, not general impressions.

5) Red flags and accommodations

  • A checkbox to call out any job-relevant risks, plus space to note adjustments provided during the interview.

6) Overall recommendation

  • Hire, Hold, or No hire, with one-sentence justification that points back to evidence.

Candidate evaluation form templates

You can lift any of these templates straight into your ATS, an Excel workbook, or a Google Sheet. Use one form per interviewer, then combine into a single view for the panel.

General candidate evaluation template

Use for phone screens or first interviews.

Candidate:
Role:
Interviewer:
Stage: First in-person or phone interview, Task review

Criteria and rubric (1–5)

  • Customer focus
  • Problem solving
  • Teamwork and communication
  • Role-specific knowledge
  • Values and motivation

Behavioural prompts

  1. Tell me about a time you handled a difficult customer or stakeholder. What happened and what did you do?
  2. Walk me through a recent problem you solved end-to-end.
  3. Give an example of working with a team under time pressure.
  4. What attracted you to this role and company?

Ratings

  • Customer focus: [1–5]
  • Problem solving: [1–5]
  • Teamwork and communication: [1–5]
  • Role knowledge: [1–5]
  • Values and motivation: [1–5]

Evidence notes

  • Prompt 1 evidence:
  • Prompt 2 evidence:
  • Prompt 3 evidence:
  • Prompt 4 evidence:

Red flags or concerns
Accommodation provided
Overall recommendation: Hire, Hold, No hire
One-sentence rationale:

Candidate interview evaluation form, by stage

Use this to standardise second-round or panel interviews.

Stage: Panel interview
Weighting: Problem solving 30 per cent, Collaboration 25 per cent, Role knowledge 25 per cent, Values 20 per cent

Problem solving
  • 1: Jumps to solution without clarifying
  • 3: Breaks problem into parts, tests assumptions
  • 5: Builds options, quantifies impact, selects best route
  • Score:

Collaboration
  • 1: Talks about “I” only
  • 3: Shares credit, uses feedback
  • 5: Shows conflict resolution, proactive support across functions
  • Score:

Role knowledge
  • 1: Vague, general terms
  • 3: Working knowledge, some gaps
  • 5: Confident detail, up to date on tools and trends
  • Score:

Values and motivation
  • 1: Misaligned drivers
  • 3: Basic alignment
  • 5: Clear alignment to mission and ways of working
  • Score:

Task evidence
Panel notes
Decision and rationale
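As a sanity check on the panel arithmetic, the stated weights can be applied to criterion scores like this. A minimal sketch: the weights mirror the example weighting above, and the scores are hypothetical.

```python
# Weighted panel score: each criterion score (1-5) multiplied by its weight.
# Weights mirror the example weighting above; scores are illustrative only.
weights = {
    "Problem solving": 0.30,
    "Collaboration": 0.25,
    "Role knowledge": 0.25,
    "Values and motivation": 0.20,
}

scores = {  # one interviewer's 1-5 ratings (hypothetical)
    "Problem solving": 4,
    "Collaboration": 3,
    "Role knowledge": 5,
    "Values and motivation": 4,
}

# Weights must total 100 per cent before combining.
assert abs(sum(weights.values()) - 1.0) < 1e-9

weighted_total = sum(scores[c] * w for c, w in weights.items())
print(f"Weighted score: {weighted_total:.2f} / 5")
```

This keeps the final number on the same 1–5 scale as the rubric, so panels can compare weighted and unweighted views side by side.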

Candidate evaluation survey

Use post-interview to measure the candidate experience.

Ideally, send it shortly after the interview so candidates have time and space to respond. Keep feedback anonymous to protect candidate privacy and encourage honesty.

Ask 3 quick questions on a 1–5 scale:

  1. The process felt clear and respectful.
  2. The interview questions felt relevant.
  3. I understand next steps and timeline.

Add one open question:

  • Is there anything we could improve for future candidates?

Scoring rubrics and examples

Clear rubrics are the difference between opinion and evidence. Keep them short and behaviour-based.

Behavioural rubric, 1–5 scale

  • 1 – Emerging: needs frequent guidance, light examples.
  • 3 – Proficient: clear, relevant examples with measurable outcomes.
  • 5 – Strong: complex examples, anticipates risks, quantifies impact, transferable playbook.

Skills rubric, 1–4 scale

  • 1 – Limited familiarity
  • 2 – Working knowledge
  • 3 – Advanced, independent
  • 4 – Expert, teaches others

Values and motivation

  • 1 – Motivated by factors that conflict with role demands
  • 3 – Reasonable fit, some alignment to values
  • 5 – Strong alignment, cites evidence of past behaviour that matches company culture

Include two or three sample notes to coach interviewers on good evidence capture: quote a sentence the candidate used, or list the concrete steps they took, rather than summarising with adjectives.

Candidate evaluation templates by role

Role-specific forms help interviewers probe the right work. Adjust prompts and tasks to fit.

Retail associate

Top criteria: Service recovery, pace, reliability, teamwork.
Task: Prioritise five stockroom tasks for the last 30 minutes of shift and explain the order.
Prompt: A queue forms and stock is low. What is your first move and why?

Software engineer

Top criteria: Problem solving, code quality, collaboration, learning mindset.
Task: Short refactor or debugging exercise with unit tests.
Prompt: Walk me through a recent system you redesigned. What trade-offs did you consider?

Customer service adviser

Top criteria: Empathy, written clarity, resilience, product learning.
Task: Draft a short response to a delayed order with two policy constraints.
Prompt: Tell me about a time you turned a frustrated customer into a promoter.

These examples double as a candidate evaluation sample set that teams can iterate on.

How to use the forms in practice

A little process discipline goes a long way.

Before the interview

  • Share the role profile, criteria, and forms with interviewers.
  • Align on who owns which prompts to avoid overlap.
  • Brief interviewers on the rubric and the meaning of each score.

During the interview

  • Ask the same core questions of every candidate.
  • Capture short, verbatim notes linked to each prompt or task.
  • Use the rubric to score immediately after each answer, not from memory later.

After the interview

  • Submit individual forms first.
  • Hold a short debrief to compare scores and evidence.
  • Record a final decision with one-sentence rationale and next actions.

Reducing bias and increasing fairness

Structured forms help reduce unconscious bias and affinity bias by keeping assessors focused on job-related evidence. Use consistent questions, a shared rubric, and documented decision criteria. For inclusive hiring, publish your adjustments process on the careers page, and provide alternatives where appropriate.

Scaling with Excel and AI tools

You can run a candidate interview evaluation form in Excel or Google Sheets with data validation for 1–5 scores, drop-downs for stage and recommendation, and basic conditional formatting to flag outliers. For panels that handle multiple requisitions, store templates centrally in your applicant tracking system and pre-fill role criteria to save time.
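If you export form responses from a spreadsheet or ATS, the same validation rules can be expressed in plain code. A minimal sketch, with hypothetical field names and stage labels:

```python
# Check exported form rows against the rubric rules described above:
# ratings must be whole numbers 1-5, and stage / recommendation must come
# from fixed lists. Field names here are hypothetical.
ALLOWED_STAGES = {"CV screen", "First interview", "Task review", "Panel interview"}
ALLOWED_RECOMMENDATIONS = {"Hire", "Hold", "No hire"}

def validate_row(row: dict) -> list[str]:
    """Return a list of problems found in one form row (empty means valid)."""
    problems = []
    rating = row.get("rating")
    if not isinstance(rating, int) or not 1 <= rating <= 5:
        problems.append(f"rating {rating!r} is not a whole number between 1 and 5")
    if row.get("stage") not in ALLOWED_STAGES:
        problems.append(f"stage {row.get('stage')!r} is not a recognised stage")
    if row.get("recommendation") not in ALLOWED_RECOMMENDATIONS:
        problems.append(f"recommendation {row.get('recommendation')!r} is invalid")
    return problems

# An out-of-range rating is flagged; the other fields pass.
row = {"rating": 6, "stage": "Panel interview", "recommendation": "Hire"}
print(validate_row(row))
```

In a spreadsheet, the equivalent is a whole-number data-validation rule between 1 and 5 on the rating column, plus list-based drop-downs for stage and recommendation.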

For larger teams and high-volume roles, consider using AI tools for bulk candidate evaluation at the first mile. Sapia.ai’s structured, mobile AI interview produces explainable scores against your rubric and integrates with interview scheduling, which keeps candidates engaged and the hiring team focused on decisions rather than administration.

What to measure from your forms

Keep the scoreboard small and review it weekly.

  • Average score by criterion across candidates at the same stage.
  • Rater variance across interviewers to spot calibration gaps.
  • Pass-through by stage, including time to decision.
  • Correlation between interview scores and early success measures, such as 90-day ramp.
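The first two of these reduce to simple aggregates over form rows. A stdlib sketch, over hypothetical form data:

```python
from statistics import mean, pstdev

# Hypothetical form rows: (candidate, interviewer, criterion, score 1-5)
rows = [
    ("A", "r1", "Problem solving", 4), ("A", "r2", "Problem solving", 2),
    ("B", "r1", "Problem solving", 3), ("B", "r2", "Problem solving", 3),
    ("A", "r1", "Collaboration", 5),   ("A", "r2", "Collaboration", 4),
]

def avg_by_criterion(rows):
    """Average score by criterion across candidates at the same stage."""
    by_crit = {}
    for _, _, crit, score in rows:
        by_crit.setdefault(crit, []).append(score)
    return {crit: mean(scores) for crit, scores in by_crit.items()}

def rater_spread(rows):
    """Spread of scores for the same candidate and criterion.

    A large spread signals a calibration gap worth raising in the debrief.
    """
    by_pair = {}
    for cand, _, crit, score in rows:
        by_pair.setdefault((cand, crit), []).append(score)
    return {pair: pstdev(scores) for pair, scores in by_pair.items() if len(scores) > 1}

print(avg_by_criterion(rows))
print(rater_spread(rows))
```

Pass-through and the correlation with 90-day outcomes need data joined from later stages, but the same pattern applies: group form rows by stage or cohort, then aggregate.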

These metrics help you develop better prompts, empower interviewers, and align hiring with real outcomes.

Download-ready copy of the candidate evaluation form

Paste this into your ATS or document editor and customise the criteria to your context.

Header
Candidate, Role, Interviewer, Stage, Date

Criteria and weightings
Criterion 1 [weight]
Criterion 2 [weight]
Criterion 3 [weight]
Criterion 4 [weight]
Criterion 5 [weight]

Prompts and evidence
Q1, Rating [1–5], Evidence
Q2, Rating [1–5], Evidence
Q3, Rating [1–5], Evidence
Task, Rating [1–5], Evidence

Red flags
Accommodation provided
Overall recommendation: Hire, Hold, No hire
Rationale: one sentence that references evidence

Conclusion

A clear candidate evaluation form makes hiring decisions faster, fairer, and easier to defend. Start with the outcomes that define success in the role, turn them into structured prompts and a simple rubric, and train interviewers to capture short, verbatim evidence. Use the same form at each stage, then review a small set of metrics to learn and improve.

If you want to see how a structured, mobile-first first mile can plug directly into your forms, book a Sapia.ai demo. You will keep people in charge of decisions, while candidates get a clear, consistent process from first interview to offer.

FAQs

What is a candidate evaluation form interview template used for?

It standardises how interviewers score skills, behaviours, and values. You get consistent data that speeds the final decision and improves fairness.

Do we need different candidate evaluation forms by stage?

Keep one core template but tweak prompts and weightings for CV screen, interview, and task review. Use a CV evaluation form for the screen, then a behavioural and task form for interviews.

How many criteria should a candidate evaluation template include?

Four to six. Go deeper with better prompts and a task rather than adding more checkboxes.

What scale is best?

Use 1–5 for behaviours, and 1–4 for skills to reduce fence-sitting. Always include behavioural anchors.

Where does Sapia.ai fit with forms and scoring?

Sapia.ai can run the structured first interview, generate explainable scores aligned to your rubric, and handle interview scheduling. You still review the evidence and make the decision.
